CONFIGURATION AND PROGRAMMING OF THE FANUC IRVISION VISION SYSTEM FOR APPLICATIONS IN THE DYNAMIC ENVIRONMENT OF MANIPULATED ELEMENTS


International Journal of Modern Manufacturing Technologies
ISSN 2067-3604, Vol. XII, No. 1 / 2020

Mirosław Marny, Mariusz Piotr Hetmanczyk
The Silesian University of Technology, Faculty of Mechanical Engineering,
Department of Engineering Processes Automation and Integrated Manufacturing Systems,
Konarskiego 18A St., 44-100 Gliwice, Poland
Corresponding author: Mariusz Piotr Hetmanczyk, mariusz.hetmanczyk@polsl.pl

Abstract: The article presents the configuration process of a vision system with a fixed camera and the identification of the position of manipulated components in the coordinate system of the robot's scene. In the research phase, a 2D vision system was used, which determines the location of a detail in the form of X and Y coordinates, as well as its orientation around the Z axis (defined as the R parameter). The camera configuration, the definition of the robot's TCP point, the definition of the robot's scene and the camera calibration procedure are discussed in detail. Subsequently, the teaching process of the calibration pattern and the definition of the reference position are described. The authors also present the basic steps of elementary image analysis related to processing, recognition of the learned patterns and locating them in captured images.

Key words: vision systems, robotics, Artificial Intelligence, image analysis, automation of manipulation processes.

1. INTRODUCTION

The main tasks of modern industrial robots are palletizing, packaging, welding, pressure welding, cutting, gluing, assembly and many others. A robot can accomplish the required tasks using knowledge about the environment, defined by the control algorithm in an unchangeable manner (Cubero, 2006). Such an approach assumes the total invariability of the robot scene, or the possibility of modifications occurring in a strictly predictable way. Modern industrial robots (Rashitov and Ivanou, 2019) usually work in a dynamically changing environment, in which the positions of manipulated components are not repeatable and, additionally, extraordinary cases occur that could not be anticipated by the programmer (Connolly, 2007). Current development trends require the implementation of devices characterized by an increasing degree of intelligence, autonomy and interaction with the dynamically changing industrial environment. Industrial vision systems can be classified among the main assisting systems that enable these needs to be partially met (Golnabia and Asadpourb, 2007). Vision systems allow robots to interact with the environment (Košecká et al., 2015), orient themselves in it, determine the basic properties of objects in a simple way and provide a higher level of autonomy (compared with other groups of industrial sensors).

The manufacturers of industrial robots meet the growing market expectations by implementing vision system solutions dedicated to their own robot controllers. One such example is the FANUC iRVision vision system (B-82774EN-3/03, 2019). The use of the iRVision system minimizes the time and number of activities performed during the implementation phase, eliminating at the same time the need to develop advanced image analysis algorithms (Jiang et al., 2019; Cholewa, A., 2018), as well as the need to configure the communication between the vision system and the robot controller. The user has to perform only basic configuration tasks, teach the vision system and develop the structure of the robot's control algorithm (B-82774EN-3/01, 2019). The initial preparation and commissioning of the application should also take into account many additional factors, related mainly to functional safety and the optimization of work.

2. SYSTEM CONFIGURATION

2.1. Identification of main functional parameters and selection of a vision camera

As a part of the research scope, the configuration of a 2D vision system was assumed, for the purpose of recognizing and locating two types of objects (showing significant similarity of features, Figure 1), in order to sort them onto separate storage pallets (Lyshevski, 2008).

The selection of the camera (especially the parameters of the optical system) is crucial for a satisfactory reproduction of the features of real objects (Xinguo and Guangjun Zhang, 2013).

In particular, parameters such as the minimum size of the detail reproduced in the recorded image, the maximum field of view and the distance of the camera lens from the surface of the detail should be taken into account (Yahui and Xianzhong, 2011). Based on these data it is possible to select the resolution, the size of the matrix, as well as the focal length of the camera lens (Zhuang and Roth, 2018).

Fig. 1. View of objects subjected to the sorting process using a vision system, where: 1, 2 - objects of the first and second type, 3 - storage pallet

The functional parameters of the camera were selected on the basis of the following equations (1-4), in the case of:

height of the field of view (PH):

    PH = [(dOC * CW) - (dW * CW)] / dW    (1)

where:
PH - height of the field of view [mm],
CW - height of the CCD matrix [mm],
dW - focal length of the lens (selected in relation to the PH height) [mm],
dOC - distance of the observed object from the camera [mm];

width of the field of view (PS):

    PS = [(dOC * CS) - (dS * CS)] / dS    (2)

where:
PS - width of the field of view [mm],
CS - width of the CCD matrix [mm],
dS - focal length of the lens (selected in relation to the PS width) [mm];

minimum height of the observed object (wH):

    wH = PH / (0.5 * LPH)    (3)

where:
wH - minimum height of the observed object [mm],
LPH - number of pixels along the matrix height dimension;

minimum width of the observed object (wS):

    wS = PS / (0.5 * LPW)    (4)

where:
wS - minimum width of the observed object [mm],
LPW - number of pixels along the matrix width dimension.

Based on the overall dimensions of the test station, the parameters of the vision camera were calculated (Table 1).

Table 1. Calculated configuration parameters of the vision system (on the basis of equations 1-4)

    Parameter                                    Calculated value [mm]
    Height of the field of view (PH)             356.4
    Width of the field of view (PS)              475.2
    Minimum height of the observed object (wH)   1.4
    Minimum width of the observed object (wS)    1.4

In the application, the Sony XC-56 monochrome camera connected to the robot controller was used. The specification of the camera is shown in Table 2.

Table 2. Specifications of the SONY XC-56 camera

    Parameter                     Value / feature of the parameter
    Image device                  1/3 type IT progressive scan CCD
    Effective picture elements    659 (H) x 494 (V)
    CCD vertical drive frequency  15.734 kHz ±1%
    Scanning system               Normal: 525 lines, non-interlace, 1/30 s;
                                  binning: 263 lines, 1/60 s
    Output signal frequency       29.97 Hz (normal mode), 59.94 Hz (binning mode)
    Sensitivity                   400 lux (F8, Fix Gain (0 dB))
    Min. illumination             0.5 lux (F1.4, Manual Gain Max)
    S/N ratio                     58 dB
    White clip                    820 mV ±70 mV (F1.4, Fix Gain)
    Normal shutter speed          OFF to 1/15000 s, switchable at rear panel
    External shutter speed        1/4 to 1/100000 s
    High-rate scanning            R/R mode, binning off: max 120 frames/s;
                                  R/R mode, binning on: max 180 frames/s;
                                  external trigger shutter mode (MODE 1), binning off: max 120 frames/s;
                                  external trigger shutter mode (MODE 1), binning on: max 180 frames/s

An industrial robot can also carry out manipulation tasks without a vision system, using the knowledge of the environment contained in the control programme. However, this approach assumes the invariability of the sequence of performed tasks, or making changes in an algorithmized manner.

2.2 Configuration of the vision system in the aspect of cooperation with the industrial robot

The main tasks of the vision system in the presented application include: identification and verification of the geometric features of objects located on the robot stage, determining the location and orientation of objects, and navigation and control of the robot's
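As a sanity check on equations (1)-(4), the values in Table 1 can be reproduced with a short script. The sensor dimensions (4.8 x 3.6 mm) follow from the 1/3-type CCD and the 659 x 494 pixel count of the SONY XC-56 (Table 2); the 12 mm focal length and 1200 mm object distance used below are assumptions back-calculated so as to be consistent with Table 1, since the text does not state them explicitly.

```python
def field_of_view(sensor_mm, focal_mm, distance_mm):
    # Equations (1)-(2): P = [(d_OC * C) - (d * C)] / d = C * (d_OC - d) / d
    return sensor_mm * (distance_mm - focal_mm) / focal_mm

def min_object_size(fov_mm, pixels):
    # Equations (3)-(4): w = P / (0.5 * L_P), i.e. the object must
    # span at least two pixels of the matrix to be resolvable
    return fov_mm / (0.5 * pixels)

# Assumed optics (hypothetical): 12 mm lens, 1200 mm working distance;
# 1/3" CCD = 4.8 x 3.6 mm, 659 (H) x 494 (V) effective pixels (Table 2)
PH = field_of_view(3.6, 12.0, 1200.0)   # height of the field of view
PS = field_of_view(4.8, 12.0, 1200.0)   # width of the field of view
wH = min_object_size(PH, 494)           # minimum object height
wS = min_object_size(PS, 659)           # minimum object width
print(round(PH, 1), round(PS, 1), round(wH, 1), round(wS, 1))
# -> 356.4 475.2 1.4 1.4 (matches Table 1)
```

The 2-pixel criterion in equations (3)-(4) is a lower bound for detectability; pattern matching in practice needs details several times larger than this minimum.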

kinematic system and gripper.

The measurement of the robot's displacement, relative to the reference position, can be estimated in the coordinate system associated with (Figure 2): the robot scene (Fixed Frame Offset; used in the case of displacement of the gripped detail) or the tool (Tool Offset; an approach used when the position of the detail relative to the gripping tool can change, or when relative displacement during grasping is possible, e.g. vacuum, needle and magnetic grippers).

Fig. 2. The methods of identifying the position of objects in the coordinate system: a) Fixed Frame Offset, b) Tool Offset

The vision camera can be permanently mounted on a fixed bracket or directly on the wrist of an industrial robot (Figure 3).

Fig. 3. Vision camera mounting methods: a) fixed bracket oriented relative to the global coordinate system of the robot, b) mounting on the robot wrist

Mounting on the robot wrist maximizes the area that can be covered by the camera lens, and gives the ability to capture images from different positions and distances. In this case, however, the position of the detail depends not only on the location of the camera, but also on the current position of the robot wrist (which increases the complexity of the calculations). In addition, the movement of the robot during shooting may cause blur (Michalski, 2018).

The advantages of permanent mounting include image recording with fixed values of the camera parameters (e.g. distance, focal length, etc.). The described solution additionally allows the object identification process to be accelerated, since image processing can be performed during the robot's other activities.

Due to the characteristics and requirements of the application, the vision system configuration with a fixed camera and determination of the detail position in the coordinate system of the robot scene was used (Figure 4).

Fig. 4. View of the vision system configuration used for the research; where: 1 - camera, 2 - optical axis of the camera, 3 - height of the workpiece in the Z direction (Z coordinate of the measurement plane viewed from the XY plane of the application user frame), 4 - manipulated object, 5 - storage pallet, 6 - reference axis system (Application User Frame), 7 - robot, 8 - gripper

Figure 5 shows the view of the complete stand, including the robot instrumentation (Michalski, 2017). In addition, a SICK WTB4S photoelectric sensor with an adjustable output threshold value was mounted on the gripper (Figure 5b). The photoelectric sensor was used to control the distance between the robot's wrist and the manipulated parts, which enables precise positioning of the gripper's jaws.

Fig. 5. Views of: a) the test stand with the FANUC LR Mate 200iD/4S series robot, b) the robot wrist with the SCHUNK EGP 40 gripper and the SICK WTB4S photorelay, c) the SONY XC-56 camera, d) a 3D model of the test stand built in the SIEMENS NX11 environment
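The Fixed Frame Offset approach selected for this application can be illustrated with a minimal 2D sketch (hypothetical values, not the controller's internal routine): the offset found by the vision system, expressed as the X, Y and R parameters in the scene frame, is composed with the taught reference pose as a planar rigid transform.

```python
import math

def apply_fixed_frame_offset(ref_pose, offset):
    """Compose a vision offset with a taught reference pose.

    Both poses are (x, y, r) in the robot-scene frame, with r in
    degrees around the Z axis. The offset describes how the detail
    has moved with respect to the reference position, so the taught
    pose is rotated and translated by it (a 2D rigid transform).
    """
    x, y, r = ref_pose
    ox, oy, o_rot = offset
    a = math.radians(o_rot)
    nx = ox + x * math.cos(a) - y * math.sin(a)
    ny = oy + x * math.sin(a) + y * math.cos(a)
    return (nx, ny, r + o_rot)

# Detail shifted by 25 mm in X, 10 mm in Y and rotated 90 deg
# relative to the reference position (hypothetical values):
corrected = apply_fixed_frame_offset((100.0, 50.0, 0.0), (25.0, 10.0, 90.0))
print(tuple(round(v, 3) for v in corrected))   # -> (-25.0, 110.0, 90.0)
```

The Tool Offset variant differs only in the frame in which the composition is carried out: the correction is applied on the tool side of the kinematic chain rather than on the scene side.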

2.3 The procedure for configuring the vision system and the industrial robot

The image processing algorithm presented in Figure 6(a) has been adopted for the operation of the vision system. The assumed procedure for configuring the vision system (connected to the industrial robot controller) includes all the necessary steps leading to a fully functional system (Figure 6(b)).

Fig. 6. View of: a) the vision processing algorithm, b) the vision system configuration procedure

Correct operation of the vision system required the configuration of the connection to the robot controller via an Ethernet network (the controller is identified by its IP address). All the necessary steps have been carried out using the WEB SERVER application (which also allows viewing the configuration data and the current parameter values, as well as launching the iRVision system). The first stage was the configuration of the camera type and its parameters (Figure 7).

Fig. 7. View of the camera main parameters configuration screen

In accordance with the proposed conceptual models, additional instrumentation was made, including a pin that allows accurate determination of the coordinate systems (of the camera and of the robot scene) and a vision system calibration board.

Fig. 9. View of: a) the pin designed for the definition of coordinate systems, b) the board with a calibration pattern

The next configuration step includes the definition of the TCP (Tool Center Point) of the auxiliary tool, which at a later stage was used to determine the coordinate systems of the manipulator working space and of the vision system associated with the calibration screen (in this case the four-point method was used; Figure 8).

Fig. 8. View of: a) the definition of the TCP point relative to the calibration pin in position No. 1, b) the definition of the TCP point relative to the calibration pin in position No. 2, c) the dialog box with the saved TCP point

In the next step the AUF coordinate system (Application User Frame) has been defined, using the three-point method (Figure 9). The three positions define, respectively: the landmark (origin) of the coordinate system, the positive direction of the X axis and the positive direction of the Y axis.
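The three-point frame definition described above can be sketched numerically. The helper below is a hypothetical illustration, not the controller's internal routine: it orthogonalizes the taught Y direction against X (Gram-Schmidt) and completes the Z axis with a cross product, so the taught Y point does not have to be exactly perpendicular to the X axis.

```python
import numpy as np

def frame_from_three_points(origin, x_point, y_point):
    """Build a coordinate frame from three taught positions.

    origin  - the landmark (origin) of the frame
    x_point - a point on the positive X axis
    y_point - a point on the positive Y axis (orthogonalized against X)
    Returns (R, t): a 3x3 rotation matrix whose columns are the X, Y
    and Z unit axes of the new frame, and the translation t = origin.
    """
    origin = np.asarray(origin, dtype=float)
    ex = np.asarray(x_point, dtype=float) - origin
    ex /= np.linalg.norm(ex)
    v = np.asarray(y_point, dtype=float) - origin
    ey = v - np.dot(v, ex) * ex          # remove the component along X
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)                # Z completes the right-handed frame
    return np.column_stack([ex, ey, ez]), origin

# Taught positions in world coordinates (hypothetical values):
R, t = frame_from_three_points([400.0, 100.0, 20.0],
                               [500.0, 100.0, 20.0],
                               [400.0, 180.0, 20.0])
print(R)   # identity here, since the taught frame is axis-aligned
```

A point p expressed in world coordinates maps into the new frame as R.T @ (p - t), which is how positions reported by the vision system can be related to the AUF.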

The calibration board has been glued onto a flat and rigid surface to minimize distortions and calibration errors. The completed auxiliary tools allow for the correct configuration and teaching of the vision system.

The direction of the Z axis is determined according to the rule of a clockwise coordinate system (Figure 10).

Fig. 10. View of the robot wrist orientation in the phase of defining the characteristic points of the coordinate system of the robot scene: a) the landmark of the coordinate system, b) the defining point of the X axis, c) the defining point of the Y axis, d) the dialog box with the saved parameters of the coordinate system

The characteristic points on the calibration board were indicated in an analogous way, including the starting point of the coordinate system (Figure 11).

Fig. 11. Definition of the characteristic points on the calibration board: a) landmark, b) the defining point of the X axis, c) the defining point of the Y axis, d) starting point, e) the dialog box with the coordinate system defined with the usage of the four-point method

The last step of the configuration stage is the calibration of the vision system. During this process, the vision system calculates the camera position (relative to
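Conceptually, what the grid calibration yields for a fixed 2D camera is a mapping from pixel coordinates to scene coordinates on the measurement plane. A minimal sketch of this idea is a least-squares affine fit over matched grid points; this is a simplified illustration, not FANUC's actual calibration model, which additionally compensates for lens distortion and camera tilt.

```python
import numpy as np

def fit_pixel_to_scene(pixel_pts, scene_pts):
    """Fit an affine map  scene = A @ pixel + b  from matched
    calibration-grid points, in the least-squares sense."""
    P = np.asarray(pixel_pts, dtype=float)
    S = np.asarray(scene_pts, dtype=float)
    # Homogeneous design matrix [u, v, 1] for each grid point
    M = np.column_stack([P, np.ones(len(P))])
    X, *_ = np.linalg.lstsq(M, S, rcond=None)
    return X[:2].T, X[2]          # A (2x2), b (2,)

# Hypothetical grid: 0.5 mm per pixel, scene origin at (100, 50) mm
pixel = [[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]]
scene = [[100.0, 50.0], [150.0, 50.0], [100.0, 100.0], [150.0, 100.0]]
A, b = fit_pixel_to_scene(pixel, scene)

# A detected feature at pixel (40, 20) then maps to (120, 60) mm
# on the measurement plane:
print(A @ np.array([40.0, 20.0]) + b)
```

Once this mapping is known, every position found in a captured image can be converted directly into the AUF coordinates used by the robot's motion instructions.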
