Vision and Robot
BFP-A3659-A
MELFA robot seminar schedule

Day 1
・ Seminar introduction / Orientation
・ Break (10 min)
・ Fixed downward-facing camera: Tool settings, Calibration (1 hr)
・ Lunch
・ Vision sensor commands / Status variables; Fixed downward-facing camera: Create identification jobs, Teaching (1 hr 50 min)
・ Break (10 min)
・ Fixed downward-facing camera: Manual operation, Automatic operation (1 hr 55 min)
・ End

Day 2
・ Fixed downward-facing camera: Automatic operation
・ Break (10 min)
・ Hand eye: Calibration (1 hr)
・ Lunch
・ Hand eye: Create identification jobs, Teaching (1 hr 40 min)
・ Break (10 min)
・ Hand eye: Manual operation (1 hr 35 min)
・ End
Teaching of the robot should only be performed by individuals who have undergone special
safety training.
(The same applies to maintenance work with the robot power ON.)
→ Conduct safety education.
Prepare work regulations indicating robot operation methods and procedures, and measures to
be taken when errors occur or when rebooting robots. Observe these rules at all times.
(The same applies to maintenance work with the robot power ON.)
→ Prepare work regulations.
Only teach the robot after first equipping the controller with a device capable of stopping
operation immediately.
(The same applies to maintenance work with the robot power ON.)
→ Equip the controller with an Emergency Stop button.
Notify others that the robot is being taught by affixing a sign to the Start switch.
(The same applies to maintenance work with the robot power ON.)
→ Indicate that the robot is being taught.
Install fences or enclosures around robots to prevent contact between robots and workers
during operation.
→ Install safety fences.
Stipulate a specific signaling method to be used among related workers when starting
operation.
→ Operation start signal
Shut off the power when maintaining the robot. Notify others that the robot is under
maintenance by affixing a sign to the Start switch.
→ Indicate that maintenance work is being performed.
Before starting operation, conduct an inspection of robots, Emergency Stop buttons, and any
other related devices to ensure that there are no abnormalities.
→ Inspection before starting operation
The following precautions are taken from the separate Safety Manual.
Refer to the Safety Manual for further details.
Design interlocks such as individual device operation rights when operating the robot
automatically with multiple control devices (GOTs, programmable controllers and push-button
switches).
Caution: Install and use the robot on a secure and stable platform.
Positional displacement or vibrations may occur if the robot is unstable.
Ensure that cables are kept as far away from noise sources as possible.
Positional displacement or malfunction may occur if cables are close to noise sources.
Ensure that the weight of the workpiece, including the hand, does not exceed the rated load
or permissible torque.
Failure to observe this may result in alarms or breakdown.
Ensure that hands and tools are attached properly, and that workpieces are gripped securely.
Failure to observe this may result in bodily injury or property damage if objects are sent flying
or released during operation.
If performing teaching work inside the robot operation range, always ensure complete control
over the robot beforehand. Failure to observe this may result in bodily injury or property
damage if the robot is able to start with external commands.
Jog the robot with the speed set as low as possible, and never take your eyes off the robot.
Failure to observe this may result in collision with workpieces or surrounding equipment.
Always check robot movement in step operation before commencing auto operation following
program editing.
Failure to observe this may result in collision with surrounding equipment due to programming
mistakes, etc.
Ensure that the door of the safety fence locks or that the robot automatically stops if
someone attempts to open it during auto operation.
Failure to observe this may result in bodily injury.
Do not perform unauthorized modifications or use maintenance parts other than those
stipulated.
Failure to observe this may result in breakdown or malfunction.
When moving the robot arm by hand, never insert hands or fingers into openings. Depending
on the posture of the robot, hands or fingers may become jammed.
Do not stop the robot or perform an emergency stop by turning OFF the main power of the
robot controller.
Robot accuracy may be adversely affected if the main power of the robot controller is turned
OFF during auto operation. Furthermore, the robot arm may collide with surrounding
equipment if it falls or moves under its own inertia.
When rewriting internal robot controller information such as programs or parameters, do not
turn OFF the main power of the robot controller.
If the main power of the robot controller is turned OFF while rewriting programs or
parameters during auto operation, internal robot controller information may be corrupted.
Do not connect a Handy GOT when using this product's GOT direct connection function. The
Handy GOT can operate the robot automatically regardless of whether the operation rights are
enabled or disabled. This could lead to property damage or bodily injury.
Do not disconnect SSCNET III cables when either the multi CPU system or the servo
amplifier is ON. Do not look directly at light emitted from the end of SSCNET III connectors
or SSCNET III cables. Doing so may cause discomfort. (SSCNET III employs a Class 1 or
equivalent light source as specified in JIS C6802 and IEC 60825-1.)
Do not disconnect SSCNET III cables while the controller is ON. Do not look directly at light
emitted from the end of SSCNET III connectors or SSCNET III cables. Doing so may cause
discomfort.
(SSCNET III employs a Class 1 or equivalent light source as specified in JIS C6802 and IEC 60825-1.)
Replace the caps of SSCNET III connectors after they have been disconnected to prevent
them from being adversely affected by dust and foreign matter.
Take care not to wire devices incorrectly. Incorrect wiring may cause malfunctions such as the
inability to terminate an emergency stop.
After wiring devices such as the teaching pendant Emergency Stop switch, door switch and
any emergency stop devices prepared by the customer, ensure that they are functioning
correctly to prevent accidents from occurring.
Not all commercial devices such as computers and network hubs will function correctly when
connected to the controller's USB port. They may also be affected by temperatures and
electronic noise found in FA environments.
When using commercial devices, protection against EMI (electromagnetic interference) and
the use of ferrite cores etc. may be required to ensure devices function correctly. Mitsubishi
Electric does not guarantee commercial devices will be compatible with our products. Neither
will we carry out maintenance on them.
If needed, create a way to keep the robot system safe from unauthorized access from
external devices connected to the network.
In addition, implement a firewall to keep the robot system safe from unauthorized access from
external devices connected to the Internet.
■ Revision history
・ Distribution of this document, in-part or whole, without prior authorization is strictly prohibited.
・ Information contained in this document is subject to change without notice.
・ Specifications are based on the results of in-house standardized tests.
・ This is an original Mitsubishi document.
・ All trademarks/registered trademarks of company and product names mentioned in this document
are the property of their respective owners.
・ ® and ™ marks have been omitted in this document.
Main applications
Recognize and accurately pick randomly oriented workpieces from supply platforms and
roughly aligned workpieces on plastic trays.
Confirm the point of assembly and accurately place or assemble the workpiece at that point.
Adjust the position of the hand if the workpiece is not being held properly.
Read 2D codes and pick the product related to that code from the production line.
Detect abnormalities.
Robots alone cannot inspect parts. For inquiries related to using vision sensors for flaw detection
(scratches etc.), contact your local sales representative.
Examples
■ Terminology
The following terms are used in this document.
・ In-Sight Explorer (In-Sight Explorer for MELSENSOR): Software used for setting up the vision sensor (created by Cognex Corp.). In-Sight Explorer is required to configure vision sensor settings and set up vision applications. To operate this software, a camera needs to be connected to the computer the software is installed on. * In this document, In-Sight Explorer for MELSENSOR is expressed as "In-Sight Explorer".
・ EasyBuilder®: Helps users program their vision system. (Created by Cognex Corp.)
・ PatMax®: A positioning tool which utilizes advanced geometric pattern verification technology and algorithms developed in-house by Cognex (U.S. patent granted). It is able to precisely identify the position of objects and is unaffected by shadows and changes in the angle or size of an object.
・ Job: Contains the program data of the 2D vision project. Jobs are needed to manage data and communications transmitted between the robot and 2D vision sensor(s). MELFA BASIC robot programs can control job access, imaging triggers, and data acquisition of jobs created in In-Sight Explorer.
・ Calibration: Work that involves converting the coordinates of the vision sensor camera into robot coordinates.
・ Trigger: Captures images from the camera.
・ Hand eye: A camera attached to the end of the robot arm which is used for taking measurements and recognizing workpieces.
・ Fixed camera: A camera attached to a frame which is used for taking measurements and recognizing workpieces. Unlike the hand eye, fixed cameras cannot move.
* For further information on robot operations, robot programming, and vision sensors, refer to the PDF robot
manual. The latest PDF robot manual can be downloaded from the Mitsubishi Electric FA website.
(Mitsubishi Electric FA website: www.MitsubishiElectric.co.jp/fa)
Related manuals
・ Detailed explanations of functions and operations: Explanations of functions and how to use them, MELFA BASIC commands, external I/O devices, and parameters.
・ Troubleshooting: Reasons for errors and how to deal with them when they arise.
・ RT ToolBox3/RT ToolBox3 mini instruction manuals: Instruction manuals for the comprehensive robot engineering assistance software (RT ToolBox3, RT ToolBox3 mini, RT ToolBox3 Pro).
Chapter 1 - Fundamentals of 2D vision sensors
1.1 Specifications of 2D vision sensors
There are two series of Mitsubishi Electric 2D vision sensors (MELSENSOR): the compact, wire-saving,
stand-alone VS80 series, and the VS70 series, which has built-in lights.
In this seminar we will use VS80 series vision sensors and learn the basics of systems that incorporate
robots and vision sensors.
[Screenshot: In-Sight Explorer window layout — Application Steps, Status bar, Tools, and Settings panes.]
[System configuration 1: a computer for adjustments (with In-Sight Explorer and RT ToolBox3 installed), a PoE hub with power supply, and the 2D vision sensor (MELSENSOR VS80 series). The computer and the robot controller are connected to the hub with LAN cables (warning: do not connect these to a PoE port); the vision sensor is connected with a PoE LAN cable (made by Cognex). The robot controller is connected to the robot by the machine cable and to the teaching pendant (R32TB or R56TB).]
[System configuration 2: the same as configuration 1, with a programmable controller (MELSEC iQ-R series or MELSEC Q series) connected to the robot controller via an SSCNET III cable.]
1. Fixed downward-facing camera
[Figure: robot RV-2FR-R with a fixed downward-facing camera, a hand eye, the robot hand, and the workpiece destination platform.]
○ Using a fixed downward-facing camera is the standard application of a vision sensor. Combining a vision
sensor with a robot's multi-task function allows the vision sensor to search for another workpiece while
the robot is outside its field of view (FOV), i.e., when the robot is transporting a workpiece. This helps
improve cycle time.
▲ However, because the camera's position and FOV are fixed, a different camera with a narrow FOV and
a high resolution may be required for high-precision tasks. This leads to increased costs and limits the
types of systems that can be constructed.
2. Hand eye
○ Unlike a fixed downward-facing camera, the hand eye's FOV can be narrowed.
○ This is because the hand eye's imaging position is not restricted to the position where the workpiece is
picked up; it can take images anywhere within the robot's operating range.
▲ The downside of a hand eye is that it has a longer cycle time than a fixed downward-facing camera, as it
cannot do other things while processing identification data.
3. Fixed upward-facing camera
○ A fixed upward-facing camera is generally used to help adjust the position of the workpiece the robot is
holding.
○ In theory, it could be used for high-precision work if the camera has a good view of the workpiece.
○ The camera could also be used if it does not matter whether the hand deviates from its position when it
grasps the workpiece.
▲ A fixed upward-facing camera, however, is rarely used on its own; it is assumed that it will be used
with either one or both of the two cameras mentioned above. This means that system costs and camera
processing times cannot be reduced.
Steps 1 to 4 of the communication settings are used for both the hand eye and the
fixed downward-facing camera.
Steps 5 to 9 are used to configure each camera; each camera is configured differently.
Chapter 2 - Communication settings
[Figure: system overview — programmable controller, fixed upward-facing camera (Under_VS80M-200-E), and robot controller; the communication settings are covered in this chapter.]
[Screenshots: in the Windows Control Panel, open Local Area Connection Status → Ethernet status → Ethernet properties, then set the IP address (the address the robot connects to) and the subnet mask (255.255.255.0).]
1) Start RT ToolBox3 from the desktop icon.
2) Click New in the menu bar on the Home tab. The New Workspace window will appear.
3) Specify a place to save the workspace, then enter a workspace name and click OK.
1) Select the robot and controller that will be used for the project.
The series, type and maximum load of the robot and controller can be narrowed down using the
drop-down boxes.
2) Select the model from the list that appears below the drop-down boxes and click Next >.
The RV-2FR-R will be used in this seminar. Select the following from the drop-down boxes.
・ Series: FR-R series CR800-R
・ Type: RV Vertical type (6 axis)
・ Maximum load: 2 kg
・ Robot model: RV-2FR-R
[Screenshots: the selected robot model; the communication settings — set the connection method to Online and configure the OPT13 Ethernet device.]
1) Follow Steps 1.1 and 1.2, then configure the settings as shown below.
2) In the Ethernet window, click Write to write the settings to the robot controller.
3) A dialog box with the message "Are you sure you want to restart the robot controller?" will appear. Click
Yes. The robot controller will restart.
[Screenshots: the project tree and the parameter NVTRGTMG in the parameter list.]
* This completes the connection settings for the robot and camera.
Additional
Parameter NVTRGTMG
If parameter NVTRGTMG is set to the default value "2", the NVRun command*1 starts processing the
next command without waiting for image processing to finish. Therefore, the results of previous
identification data may be read if the EBRead command*2 is executed immediately.
*1 NVRun command: A command which specifies the job, which vision sensor to control and when to
activate a trigger.
*2 EBRead command: A command which reads data from the camera.
For information on NVTRGTMG, refer to "Appendix 3 2D Vision sensor parameters".
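As a reference, the pattern below sketches how these commands are combined in the programs used later in this seminar (for example UVS2.prg). It is only a minimal illustration; it assumes the COM port "COM3:" and the job name "sample.job" used elsewhere in this document.

◇ Example (minimal sketch)
If M_NvOpen(1)<>1 Then
NVOpen "COM3:" As #1 ' Opens the communication line to the vision sensor
EndIf
NVRun #1,"sample.job" ' Triggers the camera and runs the identification job
EBRead #1,,MNUM,PVS ' Reads the number of workpieces found (MNUM) and the identified position (PVS)

Because this seminar sets NVTRGTMG to "1" (as instructed in the adjustment steps), EBRead here reads the results of the NVRun executed just before it.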
1) There will be times when In-Sight Explorer detects cameras on startup and times when it does not.
[Screenshots: the list of detected cameras and the Add Sensor/Device To Network window; cameras are identified by their MAC address.]
1) Complete Step 1.2 for the other cameras to configure the network settings.
2) After configuring the settings, click Close.
The Add Sensor/Device To Network window will close, and then the cameras will restart.
2)
1) After the cameras have restarted, the names of the cameras will appear under Select an In-Sight Sensor
or Emulator.
2) Select the camera you want to use and click Connect to connect the camera.
3) Select Set Up Image from the Application Steps window.
4) Click Trigger under Acquire/Load Image to display the image of the selected camera.
* If a camera cannot be connected, ensure that the IP address, Subnet Mask, and Telnet Port (Port No.)
are the same as the settings that are stated in "2.3.2 Robot and camera connection settings".
How to check
Go to Sensor on the menu bar and select Network Settings.
If the IP Address, Subnet Mask, and Telnet Port (Port No.) are not the same as the settings that are
stated in "2.3.2 Robot and camera connection settings", change them and click OK.
Reselect the camera, then click Connect and check that the camera is connected.
1) Go to System on the menu bar and select Options. The Options window will appear.
2) Select User Interface from the menu on the left.
3) Select the Use English Symbolic Tags for EasyBuilder check box. Click Apply then OK.
If the Use English Symbolic Tags for EasyBuilder check box is not selected, data
communication will become unstable when a program is executed.
Additional
Common methods of acquiring images
The following methods are the main methods used for acquiring images.
(1) Trigger
The camera acquires one image when the Trigger button is pressed.
When the trigger control is set to Manual, images can be acquired by pressing the F5 button.
Image acquisition settings can be configured in the Edit Acquisition Settings window, accessible by
clicking on Set Up Image in the Application steps window.
Main trigger control methods
・ Camera: Images are acquired upon the detection of a rising edge signal on the hardware input
port of the vision system.
・ Continuous: Images are acquired continuously using the intervals set in Trigger Intervals.
・ External: Images are acquired on the command of an external device.
・ Manual: Images are acquired when the function key (F5) is pressed.
(2) Live Video
Camera images are displayed in real time. Live view is also used when adjusting the lens.
1) Move the robot arm so that the hand camera is above where the workpiece is to be picked. Move the
robot to a height which does not change the required FOV.
[Figure: workpiece suction platform, hand eye, the height at which the workpiece is identified, identification range; Continuous trigger button on the toolbar; good and bad examples.]
Fig. 3.1 shows that in a typical vision system, the robot needs a lot of information in advance to complete
the operations (imaging/picking). The robot needs the following two pieces of information to perform the
operations shown above.
1) Where is the position of the workpiece recognized by the vision sensor?
2) How should the robot approach the workpiece found by the vision sensor?
However, with regard to question 1, the robot does not know where the vision sensor is installed or where it is
looking.
The vision sensor uses In-Sight Explorer to identify the position of the workpiece within the FOV. However,
the vision sensor does not know where the workpiece is in relation to the robot's origin.
[Figure: the vision sensor reports only pixel information (X: ### pixels, Y: ### pixels, C: ### degrees). The vision sensor looks for the workpiece, but the robot does not know where that position is, so the robot cannot move to it.]
The following things need to be done in order to answer questions 1 and 2 on the previous page.
・ Convert the position of the workpiece recognized by the vision sensor into robot coordinates.
・ Teach the robot the approach position for the master workpiece which has been converted into robot
coordinates.
Figure out the relative position of the workpiece with respect to the robot.
* Converting the position of the workpiece recognized by the vision sensor into robot coordinates is called
"calibration".
We will use the program "TLUP" to set a control point that is used for calibration.
(Refer to 3.2.1 Program for setting a control point used for calibration.)
Fig. 3.5 and Fig. 3.6 depict the teaching of P0 and P90 using TLUP.
A pointed object is also placed on the platform. When the robot is taught to rotate the tool's C axis
90° around the tip of the object on the platform, it calculates where that point is within its own tool
coordinates. It then sets those coordinates as the tool data, even though they consist of XY components only.
Executing TLUP (control point setting program) sets the tip of the calibration tool in the robot hand as
the control point. This control point is then used for calibration.
[Figure: flange center and the control point used for calibration.
P_TLUP = (X, Y, 0, 0, 0, 0): position of the tip of the calibration tool from the center of the flange (X, Y components only).]
[Figure: calibration relates the camera coordinates (XV, YV), with their origin at the vision sensor, to the robot coordinates (XR, YR), with their origin at the robot center. A calibration sheet provides point pairs, e.g.:
1. camera (XXX.XX, YYY.YY) - robot (XXX.XX, YYY.YY)
2. camera (XXX.XX, YYY.YY) - robot (XXX.XX, YYY.YY)
... (five pairs in total).]
Fig. 3.12: Position of features does not match in robot and vision
In order to check the robot coordinates, move the robot arm to the position of the feature. Then read the
robot's current position using the teaching pendant or RT ToolBox3. The current position then becomes
the position of the feature.
Fig. 3.13 shows a workpiece that has been placed within the camera's FOV and the robot arm being moved
to the position of the feature (XY coordinates).
The center of the flange (control point) is aligned with the feature.
Fig. 3.13: Alignment of the control point with the workpiece feature
It is likely that you will question where the robot control point is during this step. As there is no marking on
the center of the flange, it is very difficult to align the center of the flange with the feature on the workpiece
while using the robot in JOG mode.
Step 2: Move the robot arm in JOG mode to the position of the feature.
Step 3: Set the position where the feature and control point meet to the
position of the feature in the robot coordinates system. Check the
position data with the teaching pendant/RT ToolBox3.
Fig. 3.14: Alignment of the position of the feature with the robot
[Figure labels: workpiece pick-up point, workpiece position, robot origin, +X/+Y/+Z.]
Fig. 3.15: Relative position of the robot's pick-up point and the position of the workpiece.
Typically, the same picking method is used to pick workpieces of the same type.
The workpiece pick up point, which the robot considers to be the origin of the workpiece, is at a fixed point
in space relative to the actual position of the workpiece.
The workpiece pick-up point (PTRG) can be thought of in terms of the point that the robot considers to be the
position of the workpiece (PVS): that is, at what coordinates in the workpiece coordinate system (A, B, C, X, Y, Z)
the robot should pick up the workpiece.
[Figure: workpiece, workpiece position coordinates (PVS).]
PH = Inv(PVS) * PWK
When working with multiple workpieces of the same type, select one of the workpieces to be the
master workpiece and teach the robot PWK and PVS. After PH has been calculated, it is possible
for the robot to pick other workpieces in different positions accurately using PH (the calculation of
the relationship between the position of the workpiece and the workpiece pick up point). This is
only possible if the calculated PH from the master workpiece can be applied to the other pieces.
Fig. 3.17: Master workpiece pick-up point (PWK) and master workpiece
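The two formulas are used in two separate phases, which the sample programs later in this chapter (UVS1.prg and UVS2.prg) implement in full. The fragment below is only a sketch of that split; the variable name PVS2 for a newly detected workpiece is illustrative and does not appear in the sample programs.

◇ Sketch
' Teaching phase (master workpiece):
' PVS = position of the master workpiece output by the vision sensor
' PWK = pick-up (suction) position taught by jogging the robot
PH=Inv(PVS)*PWK ' Relationship between the detected position and the pick-up point
' Operation phase (another workpiece of the same type):
' PVS2 = position of the new workpiece output by the vision sensor (illustrative name)
PTRG=PVS2*PH ' Pick-up point for the new workpiece
Mov PTRG,-50 ' Approach 50 mm above the pick-up point
Mvs PTRG ' Move straight down to the pick-up point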
◇ Program UBP.prg
1 '---------------------------------------------------
2 ' User external variable definition program UBP.prg
3 '---------------------------------------------------
4 Def Pos P_TLUP ' Fixed downward-facing camera: Control point data used for calibration
5 Def Pos P_TLLW ' Fixed upward-facing camera: Control point data used for calibration
6 Def Pos P_CMTL ' Hand camera: Camera center position data for calibration
7 '
8 Def Pos P_PH01 ' Fixed downward-facing camera: Coefficient for calculating the handling position from the position of the identified workpiece
9 Def Pos P_PH02 ' Fixed upward-facing camera: Coefficient for calculating the handling position from the position of the identified workpiece
10 Def Pos P_PH03 ' Hand camera: Coefficient for calculating the handling position from the position of the identified workpiece
11 '
12 Def Pos P_WRK01 ' Fixed downward-facing camera: Master workpiece grasp position
13 Def Pos P_WRK02 ' Fixed upward-facing camera: Master workpiece grasp position
14 Def Pos P_WRK03 ' Hand camera: Master workpiece grasp position
15 '
16 Def Pos P_PVS01 ' Fixed downward-facing camera: Master workpiece identification point
17 Def Pos P_PVS02 ' Fixed upward-facing camera: Master workpiece identification point
18 Def Pos P_PVS03 ' Hand camera: Master workpiece identification point
19 '
20 Def Pos P_PHome ' Safe position
21 Def Pos P_CLPos ' Hand camera: Reference point to start calibration
22 Def Pos P_CMH ' Hand camera: Offset from the surface of the identified workpiece to the imaging point
23 Def Pos P_HVSP1 ' Hand camera: Default imaging point
24 Def Pos P_LVSP1 ' Fixed upward-facing camera: Default imaging point
25 Def Pos P_MPUT1 ' Fixed upward-facing camera: Master workpiece destination position
26 '
27 Def Char C_C01 ' Fixed downward-facing camera: COM name
28 Def Char C_C02 ' Fixed upward-facing camera: COM name
29 Def Char C_C03 ' Hand camera: COM name
30 '
31 Def Char C_J01 ' Fixed downward-facing camera: Job name
32 Def Char C_J02 ' Fixed upward-facing camera: Job name
(End of UBP.prg)
1 '----------------------------------------------------------------------------
2 ' MELFA Sample Program, "XY-tool setting program" TLUP.prg
3 '----------------------------------------------------------------------------
4 Def Pos P_TLUP ' Fixed downward-facing camera: Control point data used for calibration
5 Loadset 1,1
6 OAdl On
7 Servo On
8 Wait M_Svo=1
9 '----- HAND INIT -----
10 '---- Wait M_Svo=1
11 '---- If M_EHOrg=0 Then ' If hand has not returned to origin:
12 '---- EHOrg 1 ' returns Hand 1 to origin.
13 '---- Wait M_EHOrg=1 ' waits for Hand 1 to return to origin.
14 '---- EndIf
15 '---- EHOpen 1,100,100 ' Opens Hand 1 (speed = 100%, Force = 20%)
16 '---- Wait M_EHBusy=0 ' Checks if operation is complete
17 '---------------------
18 PTool=P_Tool
19 '---- Align hand. Align the tip of the calibration tool held in the hand with the tip of the object on the platform.
20 '
21 P0=P_Fbc
22 P91=P0*(+0.00,+0.00,+0.00,+0.00,+0.00,+90.00)
23 Mvs P91
24 ' Moving the hand in only the X and Y axes, use the teaching pendant's XYZ jog mode to align the tip of the calibration tool held in the robot hand with the tip of the object on the platform.
25 P90=P_Fbc
26 PTL=P_Zero
27 PT=Inv(P90)*P0
28 PTL.X=(PT.X+PT.Y)/2
29 PTL.Y=(-PT.X+PT.Y)/2
30 P_TLUP=PTool*PTL
31 Tool P_TLUP
32 Hlt
33 End
P_TLUP=(+0.62,-0.61,+0.00,+0.00,+0.00,+0.00,+0.00,+0.00)(0,0)
PTool=(+0.00,+0.00,+0.00,+0.00,+0.00,+0.00,+0.00,+0.00)(0,0)
P0=(+248.75,+204.28,+293.04,+180.00,+0.00,-137.58,+0.00,+0.00)(7,0)
P91=(+248.75,+204.28,+293.04,+180.00,+0.00,+132.42,+0.00,+0.00)(7,0)
P90=(+249.58,+203.36,+293.04,+180.00,+0.00,+132.42,+0.00,+0.00)(7,0)
PTL=(+0.62,-0.61,+0.00,+0.00,+0.00,+0.00,+0.00,+0.00)(0,0)
PT=(+1.23,+0.01,+0.00,+0.00,+0.00,-90.00,+0.00,+0.00)(7,0)
PTL2=(+0.00,+0.00,+0.00,+0.00,+0.00,+0.00,+0.00,+0.00)(,)
(End of TLUP.prg)
◇ Program TLUP.prg
......
18 PTool=P_Tool
19 ' Align hand. Align the tip of the needle in the hand with the tip of the needle on the platform.
20 '
21 P0=P_Fbc
22 P91=P0*(+0.00,+0.00,+0.00,+0.00,+0.00,+90.00)
23 Mov P91 ' Rotates the hand 90° (tips misaligned).
24 ' Move the robot using the teaching pendant's XYZ jog mode in the X and Y axes only to realign the tip of the calibration tool held in the robot hand with the tip of the object on the platform.
25 P90=P_Fbc
26 PTL=P_Zero
27 PT=Inv(P90)*P0
28 PTL.X=(PT.X+PT.Y)/2
29 PTL.Y=(-PT.X+PT.Y)/2
30 P_TLUP=PTool*PTL
31 Tool P_TLUP
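The following is a sketch of the geometry behind lines 27 to 29, written in the document's notation. Let (x, y) be the unknown offset of the calibration tool tip from the flange center in the tool coordinates.
Because the tip is aligned with the same fixed point at P0 and at P90, and P90 is P0 rotated +90° about the C axis, the relative position PT = Inv(P90) * P0 works out to
PT.X = x - y, PT.Y = x + y.
Solving these two equations gives
x = (PT.X + PT.Y) / 2, y = (-PT.X + PT.Y) / 2,
which is exactly what lines 28 and 29 assign to PTL.X and PTL.Y. The sample values listed with TLUP.prg confirm this: PT = (+1.23, +0.01) gives PTL = (+0.62, -0.61), matching P_TLUP.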
Add a user base program in RT ToolBox3. This will set up variables which can be used in other programs.
1) Go to Online → Parameter → Parameter List in the project tree.
2) Read (search for) parameter "PRGUSR". Then double click PRGUSR and enter UBP in the field shown
below.
3) Click Write to write the parameter to the robot controller.
4) Restart the robot controller.
[Teaching pendant screenshots: the hand screen is opened with the HAND key; the Electric Hand Operation screen is operated with the function keys (F1 to F3).]
1) Place the pointed object on the platform parallel to the surface the robot is on.
[Figure: enlarged view of the tips (A, B) with the tips aligned.]
3) Align the tips using Tool jog.
The calibration tool should rotate
around its center point.
The tip should not wobble about.
The calibration tool is required for calibration so do not remove it from the
robot hand. Make sure not to accidentally knock, dislodge, or interfere with
the calibration tool.
[Figure: calibration points 1 to 5. For each point, enter the teaching pendant value (robot coordinates) into World X and World Y.]
(6) Carry out N point calibration.
1) Select Live Video from Acquire/Load Image, or click the Repeating trigger button on the tool bar.
2) Adjust the focus and aperture to get a clear image of the workpiece suction point.
Refer to 2.5 Adjusting the lens (focus and aperture) for information on how to adjust the lens.
[Screenshots: the created user-defined points are displayed in the Palette.]
Create an equal number of user-defined points and calibration marks. Five points will be used in this
seminar.
1) Create the five points using steps 3 and 4 in section 4.1.
The created user-defined points will highlight in green on the camera image and added to the Palette.
(In the Palette, the user-defined points will be listed in the order they were created.)
Place the five points in easily recognizable positions on the workpieces. (at the apex of each triangle)
* It is helpful to remember where each point (1 to 5) has been placed.
1) Select Point_1 from the Palette. The corresponding point will highlight in green on the camera image.
2) Drag the highlighted point with the mouse to an easily identifiable point on a workpiece (the apex of the
triangle or a point where two lines intersect). * With the image enlarged, align the highlighted point with
the feature as accurately as possible.
3) Repeat the same process for points 2 to 4.
[Screenshots: drag the highlighted point with the mouse to an easily identifiable point; selected user-defined points turn pink.]
Note: Clicking the user-defined points will create N points.
If anything other than a user-defined point is clicked, or a user-defined point does not turn
pink when selected, click Cancel and repeat the process from Step 3 in section 5.1.
[Screenshot: the created N points and the calibration name in the Palette.]
Note:
If the number of listed N points differs from the number of N points you want to create, something other than a user-defined point may have been selected in Step 5.1.
(Click user-defined points to create N points.)
In this case, recreate N points in the following steps.
3) Select the calibration name from Palette, then right-click and select Delete to delete the data.
4) Create N points by repeating the process from Step 3 in section 5.1 onwards.
1) Move the calibration tool in the robot hand to the position that corresponds with the on-screen
position of Point 0.
2) Once the robot has moved to the position of Point 0, move the selector in the bottom right of In-Sight
Explorer to Point 0.
3) Enter the X and Y robot coordinates displayed on the teaching pendant in the World X and World Y
fields.
4) Repeat the process for points 1 to 4. (Five points in total)
1) Select the Settings tab under Edit Tool in the bottom center of In-Sight Explorer.
2) If the Repeating trigger is enabled, disable it.
3) Enter a file name and press Export.
4) Pass: Export Complete will be shown in the Results tab of the Palette if the data has been exported
successfully.
Additional
Reason for exporting data
Allowed programs
[Screenshots: change the settings — set Trigger to Manual, specify the calibration file name, then set the model region and the search region around the feature to be identified.]
・ The Accept Threshold value is the degree (%) to which the identified workpiece must match the
registered image.
・ The Rotation Tolerance value is the angle range (± deg) within which the workpiece can still be identified
even if it is rotated.
Communication settings
Field / Setting value:
・ Device: Robot
・ Manufacturer: Mitsubishi
・ Protocol: Ethernet Native String
[Select Output Data window]
4) Click the blue arrow while holding down the Ctrl key.
5) Select each piece of data of Pattern_1 in order: Pattern_1.Number_Found, Pattern_1.Fixture.X, Pattern_1.Fixture.Y, Pattern_1.Fixture.Angle.
(8.1) Checking calibration without moving the workpiece (master workpiece check)
Complete the following steps without moving the workpiece that is used for calibration.
Create a new job and identify the workpiece. Move the robot arm in Jog mode to the coordinates displayed
on the teaching pendant, then move the robot arm up and down along the Z axis.
Calibration has been successful if the point of the calibration tool (control point) lines up with the TCP on
the image in In-Sight Explorer.
1 '----------------------------------
2 ' UVS1.prg: Fixed downward-facing camera adjustment program
3 '----------------------------------
4 Def Pos P_WRK01 ' Fixed downward-facing camera: Master workpiece grasp position
5 Def Pos P_PH01 ' Fixed downward-facing camera: Coefficient for calculating the handling position from the position of the identified workpiece
6 Def Pos P_TLUP ' Fixed downward-facing camera: Control point data used for calibration
7 Def Pos P_PVS01 ' Fixed downward-facing camera: Master workpiece identification point
8 Def Char C_J01 ' Fixed downward-facing camera: Job name
9 Def Char C_C01 ' Fixed downward-facing camera: COM name
10 '
11 Loadset 1,1
12 OAdl On
13 Ovrd M_NOvrd
14 Spd M_NSpd
15 Tool P_TLUP
16 Servo On
17 Wait M_Svo=1
18 CCOM$="COM3:" ' COM port
19 CPRG$="sample.job" ' Vision sensor job name
20 '----- HAND INIT -----
21 Wait M_Svo=1
22 If M_EHOrg=0 Then ' If hand has not returned to origin:
23 EHOrg 1 ' returns Hand 1 to origin.
24 Wait M_EHOrg=1 ' waits for Hand 1 to return to origin.
25 EndIf
26 EHOpen 1,100,100 ' Opens Hand 1 (speed = 100%, Force = 20%)
27 Wait M_EHBusy=0 ' Checks if operation is complete
28 '---------------------
29 ' 1. Teach safe position "PHOME".
30 ' 2. Set "1" in the robot parameter NVTRGTMG and restart the robot.
31 ' 3. Close Hand 1 with the teaching pendant and lower the suction pad.
32 ' 4. Teach the robot the place on the workpiece it should pick it up by in PWK and then send the robot arm to the safe position.
33 ' 5. With the teaching pendant, change OVRD to 3% and start automatic operation.
34 Hlt
35 ' ----- Automatic operation -----
36 Mov PHOME
37 Dly 0.5
38 If M_NvOpen(1)<>1 Then
1 '----------------------------------------------
2 ' UVS2.prg: Fixed downward-facing camera operation execution program
3 '----------------------------------------------
4 '
5 GoSub *INIT
6 '
7 Mov PHOME
8 Dly 0.5
9 If M_NvOpen(1) <> 1 Then
10 NVOpen CCOM$ As #1
11 EndIf
12 *VSCHK
13 NVRun #1,CPRG$
14 EBRead #1,,MNUM,PVS
15 If MNUM>=1 Then *VSOK
16 GoTo *VSCHK
17 *VSOK
18 PVS.FL1=PWK.FL1
19 PVS.FL2=PWK.FL2
20 PTRG=PVS*PH '--- Calculation of the workpiece grasp position ----
21 Mov PTRG,-50 ' (RH series: +100) *1
22 Dly 0.2
23 HClose 1
24 Fine 0.1,P
25 Mvs PTRG
26 Cnt 0
27 Dly 0.2
28 M_Out(10129)=1
29 Dly 0.2
30 Mvs PTRG,-50 ' (RH series: +100) *1
31 Hlt
32 '
33 ' Teach the destination position "PPUT" here.
34 '
35 Mov PPUT,-50 ' (RH series: +100) *1
36 Fine 0.1,P
37 Mvs PPUT
38 Fine 0
39 Dly 0.2
40 M_Out(10129)=0
41 M_Out(10128)=1 Dly 0.1
42 Dly 0.2
43 HOpen 1
44 Dly 0.2
45 Mvs PPUT,-50 ' (RH series: +100) *1
46 ' ***** Return to safe position *****
47 Mov PHOME
48 Hlt
49 End
(End of UVS2.prg)
(2) Teach the suction area on the master workpiece (PWK). (4) Teach the destination position (PPUT).
1) Remove the calibration workpiece and pointed object, then place a workpiece on the platform.
This workpiece will now be used as the master workpiece for adjustment.
2) Check that the robot parameter "NVTRGTMG" is set to "1". If not, set "1" and restart the robot
controller.
(2) Teaching the safe position (PHOME) and suction area on the master workpiece (PWK)
(2.1) Run the robot program "UVS1".
1) Run the program up to line 28 using step operation.
◇ Specify a job file in line 19 (CPRG$="sample.job").
(The file is the one saved in Step 7 of Section 3.4.)
◇ In line 23, press and hold the F1 key until the hand operation completes. (Initialization of the
electric hand)
◇ Program UVS1.prg
......
18 CCOM$="COM3:" ' COM port
19 CPRG$="sample.job" ' Vision sensor job name   ← Specify an identification job file.
(2.2) Teach the safe position (PHOME) and suction area on the master workpiece (PWK).
1) Teach the safe position (PHOME). PHOME: Safe position or position during imaging.
2) Close Hand 1 with the teaching pendant and lower the suction pad.
In the pneumatic hand (standard hand) window, press -C.
3) Move the robot to a place where it can apply suction to the master workpiece used for adjustment
(set in Step 1), and teach the position as PWK. PWK: The suction area on the master workpiece.
Used for finding PH (correction value for workpiece suction).
4) Move the robot arm to a place where it will not interfere with anything.
Make sure the workpiece does not move after teaching PWK.
*If the workpiece moves at this point, its position will deviate when
suction is applied to it.
[Figure: operation after the suction pad has been lowered; PWK on the master workpiece.]
1) Move the robot to the safe position "PHOME" using Jog operation.
2) Set Ovrd to 3%.
3) Check that Ovrd is set to 3% and run UVS1.prg automatically.
4) The program will stop at line 34. Run the program again until it stops at line 55.
5) Check the variable values of PVS in the Variables monitor of RT ToolBox3. Check that the X, Y,
and C components of PVS are the same as the X, Y, and C components of the vision sensor.
The component values of the vision sensor can be checked in In-Sight Explorer in the following
area: the window on the right of the Format Output String window in Communications settings.
◇ Program UVS1.prg
......
44 PVS.FL1=PWK.FL1
45 PVS.FL2=PWK.FL2
46 PH=Inv(PVS)*PWK '---- Calculation for correcting the workpiece grasp position ----
47 '--------------------
48 P_PH01=PH
49 P_PHome=PHOME
50 P_WRK01=PWK
51 P_PVS01=PVS
52 C_C01$=CCOM$
53 C_J01$=CPRG$
54 '--------------------
55 Hlt
[Figure: robot origin, master workpiece, PWK (master workpiece picking position), PVS (position of the master workpiece output from the vision sensor), PH.]
Formula for correcting the workpiece grasp position: PH = Inv(PVS) × PWK
PVS: Position of the master workpiece output from the vision sensor
PWK: Grasp (picking) position taught on the master workpiece
◇ Program UVS2.prg
5 GoSub *INIT
6 '
7 Mov PHOME
8 Dly 0.5
9 If M_NvOpen(1) <> 1 Then
10 NVOpen CCOM$ As #1
11 EndIf
12 *VSCHK
13 NVRun #1,CPRG$ ' Vision sensor operation
14 EBRead #1,,MNUM,PVS
15 If MNUM>=1 Then *VSOK
16 GoTo *VSCHK
17 *VSOK
18 PVS.FL1=PWK.FL1
19 PVS.FL2=PWK.FL2
20 PTRG=PVS*PH '--- Calculation of the workpiece grasp position ----
21 Mov PTRG,-50 ' (RH series: +100)
22 Dly 0.2
23 HClose 1
24 Fine 0.1,P
25 Mvs PTRG
......
30 Mvs PTRG,-50 ' (RH series: +100) *1
31 Hlt
32 '
33 ' Teach the destination position "PPUT" here.
......

◇ Program UVS2.prg (*INIT subroutine)
......
51 *INIT
......
62 Tool P_TLUP
......
68 PH=P_PH01
69 PHOME=P_PHome
70 PWK=P_WRK01 ' PH and the other data set in UVS1.prg are loaded.
71 CCOM$=C_C01$
72 CPRG$=C_J01$
......
76 If P_Fbc.Z < PHOME.Z Then
77 PNow=P_Fbc
78 PNow.Z=PHOME.Z
79 Mvs PNow
......

[Figure: robot origin, PVS (position of the workpiece output from the vision sensor), PH, workpiece picking position (PTRG), safe position (PHOME), workpiece destination position (PPUT). Teach the destination position "PPUT".]
Formula for calculating the workpiece picking position: PTRG = PVS × PH
Fig. 4.1: Standard application   Fig. 4.1a: Workpiece seen from the hand eye shown in Fig. 4.1
However, calibration cannot be accomplished in the same manner as when a fixed downward-facing
camera is used.
This is because the vision sensor is attached to the flange of the robot. If the robot moves, the vision
sensor's FOV changes accordingly. Therefore, it is not easy to obtain coordinates based on the robot origin from the
vision sensor's FOV.
The images taken by the vision sensor shown in Fig. 4.1, Fig. 4.2, and Fig. 4.3 are those shown in Fig. 4.1a,
Fig. 4.2a, and Fig. 4.3a respectively. This makes it difficult to find where the workpiece is.
Fig. 4.3: Imaging position 2   Fig. 4.3a: Workpiece seen from the hand eye shown in Fig. 4.3
Workpiece position (4) = imaging position (1) × vision sensor center (2) seen from the flange center × output position (3) seen from the vision sensor center
・ The robot always knows where the center of the flange is. (1 in Fig. 4.4)
・ The vision sensor is attached to the flange. Therefore, even if the robot moves, the distance between
the flange center and the vision sensor center is always the same. In the tool coordinates, the coordinate
values of the vision sensor center change according to those of the flange center. (2 in Fig. 4.4)
・ The robot can find workpieces only when they are within the vision sensor's FOV. The position of the
workpieces can be calculated with reference to the center of the vision sensor. (3 in Fig. 4.4)
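Expressed in MELFA BASIC position arithmetic, the composition of (1), (2), and (3) above is a single multiplication. The fragment below is only an illustrative sketch; PVSP and PTRG0 follow the names used in the figures of this chapter, P_CMTL is the camera center offset set by HND1.prg, and recording the imaging position with P_Fbc is an assumption made for the example.

◇ Sketch
PVSP=P_Fbc ' (1) Flange (imaging) position at the moment the image is taken (assumption: recorded just before the trigger)
' P_CMTL: (2) vision sensor center seen from the flange center, set by HND1.prg
' PVS: (3) workpiece position output by the vision sensor, relative to the vision sensor center
PTRG0=PVSP*P_CMTL*PVS ' (4) workpiece position seen from the robot origin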
Fig. 4.4: Principle of how the position of the workpiece can be calculated using the hand eye
[Figure labels: robot center, flange center, camera center, tool center point.]
Fig. 4.6: Calibration 1   Fig. 4.7: Calibration screen 1
Then, move the robot in the direction of the X axis in the tool coordinates from (0, 0) as shown in Fig. 4.8,
and take an image again. (The robot is moved only in the direction of the X axis, and therefore the Y
component is always 0.)
After the robot has moved, the TCP is located as shown in Fig. 4.9.
A second calibration point is registered using the sign-reversed ("reciprocal") value of the travel in the tool
coordinates and the tool center point in the pixel coordinates of the vision sensor.
When the robot is moved in the +X, -X, +Y, and -Y directions, the pixel coordinates of the vision sensor can
be converted into the tool coordinates.
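The sign reversal behind the word "reciprocal" can be reasoned out as follows (a sketch using the example distances that appear later in this chapter).
When the hand, and therefore the camera, moves by +30 mm in the tool X direction, the stationary feature appears to move by -30 mm relative to the camera center. The world coordinate entered for each N point therefore carries the opposite sign of the robot's travel:
travel -X 30 mm → World X = +30, travel +X 30 mm → World X = -30,
travel +Y 45 mm → World Y = -45, travel -Y 45 mm → World Y = +45,
which is the mapping used in the N point table later in this chapter.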
[Figure labels: PVSP, workpiece position (PVSP*P_CMTL*PVS), robot origin, workpiece position (PVS), PWK, PH.]
*1) The COM port communication settings used in this seminar differ depending on the camera.
Refer to "* Cameras used in this seminar and COM port communication settings".
4.2.1 Program for setting the control point used for calibration
The programs used in this chapter are UBP.prg and HND1.prg.
・UBP.prg: Used for defining user external variables.
・HND1.prg: Used for aligning the crosshairs in the center of the screen with the feature of the
workpiece to set the control point for robot calibration.
Calculate the distance between the center of the camera and the center of the robot.
◇ Program UBP.prg
For details on the program, refer to 3.2.1 Program for setting a control point used for calibration.
◇ Program HND1.prg
1 '-------------------------------------
2 ' Step 1. Program to adjust the hand camera
3 ' Camera center calculation program
4 '-------------------------------------
5 Def Pos P_CMTL ' Hand camera: Camera center position data used for calibration
6 Def Pos P_CLPos ' Hand camera: Reference point to start calibration
7 Loadset 1,1
8 OAdl On
9 Servo On
10 Wait M_Svo=1
11 '----- HAND INIT -----
12 Wait M_Svo=1
13 If M_EHOrg=0 Then ' If hand has not returned to origin:
14 EHOrg 1 ' returns Hand 1 to origin.
15 Wait M_EHOrg=1 ' waits for Hand 1 to return to origin.
16 EndIf
17 EHOpen 1,100,100 ' Opens Hand 1 (speed = 100%, force = 20%).
18 Wait M_EHBusy=0 ' Checks if operation is complete.
19 '---------------------
20 PTool=P_Tool
21 ' Align the hand. Place the crosshairs drawn in the center of the screen over
22 ' the feature while checking the screen.
23 P0=P_Fbc
24 P91=P0*(+0.00,+0.00,+0.00,+0.00,+0.00,+90.00)
25 Mov P91
26 ' Align the crosshairs using the XYZ jog mode in the X and Y axes only.
27 P90=P_Fbc
28 PTL=P_Zero
29 PT=Inv(P90)*P0
30 PTL.X=(PT.X+PT.Y)/2
31 PTL.Y=(-PT.X+PT.Y)/2
32 P_CMTL=PTool*PTL
33 Tool P_CMTL
(continued on the next page→)
(End of HND1.prg)
(3) Position the workpiece used for calibration on the workpiece destination platform or the workpiece suction platform. (5) Draw two intersecting lines in In-Sight Explorer.
[Figure: P0 and P91. Align the feature with the crosshairs when the hand is positioned at 0° and 90°.]
Add a user base program in RT ToolBox3. This will set up variables which can be used in other programs.
1) Go to Online → Parameter → Parameter List in the project tree.
2) Read (search for) parameter "PRGUSR". Then double click PRGUSR and enter UBP in the field shown
below.
3) Click Write to write the parameter to the robot controller.
4) Restart the robot controller.
1) Select Get Connected from the Application Steps window. Select the hand eye camera and click
Connect.
(3.2) Locate the hand camera above where the workpiece is picked.
1) Select Live Video from Acquire/Load Image, or click the Repeating trigger button on the tool bar.
2) Adjust the focus and aperture to get a clear image of the workpiece suction point.
Refer to 2.5 Adjusting the lens (focus and aperture) for information on how to adjust the lens.
[Figure: image pixel coordinates (pixel column, pixel row), 480 px.]
Additional
Drawing lines
Selecting Line from Plot Tools also allows you to draw lines on EasyBuilder View.
1) Select Inspect Part from the Application Steps window.
2) Select Line from Plot Tools in the bottom left Add Tool window.
Drag and move the end point with the mouse, or move the end point one pixel at a time with the arrow keys. (The line is shown in pink.)
7) Click OK. The line will turn green, and be added to the Palette shown on the right.
4) Align the crosshairs with the feature. 5) The hand rotates 90°. 6) Misaligned (P91). 7) Align the crosshairs with the feature (green crosshairs).
The crosshairs should align with the feature at all times in EasyBuilder View while the C axis rotates.
3) After checking that the alignment is correct, run the program up to line 40 using step operation.
◇ In line 35 "Tool PTool", the robot tool is reset.
◇ In line 36 "Mov P0", the robot returns to the original position.
1 '---------------------------------------------------------
2 ' Step 2. Program to adjust the hand camera
3 ' Calibration assistance program HND2.prg
4 '---------------------------------------------------------
5 Def Pos P_CMTL ' Hand camera: Camera center position data used for calibration
6 Def Pos P_CLPos ' Hand camera: Reference point to start calibration
7 Def Pos P_CMH ' Hand camera: Offset from the workpiece suction point to the imaging point
8 Loadset 1,1
9 OAdl On
10 Servo On
11 Wait M_Svo=1
12 '----- HAND INIT -----
13 Wait M_Svo=1
14 If M_EHOrg=0 Then ' If hand has not returned to origin:
15 EHOrg 1 ' returns Hand 1 to origin.
16 Wait M_EHOrg=1 ' waits for Hand 1 to return to origin.
17 EndIf
18 EHOpen 1,100,100 ' Opens Hand 1 (speed = 100%, Force = 20%).
19 Wait M_EHBusy=0 ' Checks if operation is complete.
20 '---------------------
21 PTool=P_CMTL
22 Tool P_NTool
23 P0=P_CLPos
24 Mov P0
25 ' Place the crosshair in the center of the camera over the crosshair on the jig within the camera's FOV.
26 ' Align the user-defined points with the crosshairs.
27 ' Put the camera in "Live mode". Then after moving the robot in Tool jog mode,
28 ' find the points in the +X, -X, +Y, and -Y axes where the tool can move to without the crosshair on the jig leaving the camera's FOV.
29 ' Set these points in the fields for travel amount below. Set the first point as the center of the image and set the XY world coordinates to 0.
30 ' Enter the reciprocal of the points as world coordinates into the camera calibration tool.
31 ' (*Convert the camera coordinates into tool coordinates)
32 Mov P0*(-30.00,+0.00,+0.00,+0.00,+0.00,+0.00)
(End of HND2.prg)
[Figure: overview of the procedure — put a ruler beside the feature and measure how far the arm can move (example: direction +X, distance 30 mm; direction -Y, distance 45 mm). (2) Set user-defined points. (3) Enter the size of the camera's FOV. (5) Create N points. The points correspond to the moves Mov P0, Mov P0(-30,0,0,0,0,0,0), Mov P0(+30,0,0,0,0,0,0), Mov P0(0,+45,0,0,0,0,0), and Mov P0(0,-45,0,0,0,0,0).]
(1.2) Measure the distances in the X or Y axis in which the arm can move without the feature leaving the
camera's FOV.
* Select Continuous from the Trigger drop-down box under Edit Acquisition Settings.
1) Put the ruler in the direction of the X or Y axis of the robot tool coordinates beside the feature set in
Step 4.2.2 (6).
2) Move the robot in Tool jog mode along the ruler in the +X or +Y direction.
The hand and camera move together, so the camera image in In-Sight Explorer changes
according to the movement.
3) Stop the robot at a point where the feature placed in the center of the crosshairs does not leave the
camera's FOV by checking EasyBuilder View.
4) Write down the distance and in which direction the feature has moved in EasyBuilder View.
5) Move the robot in Tool jog mode in the opposite direction and write down the distance in which the
feature does not leave the camera's FOV.
[Figure: ruler placed beside the feature on the workpiece. Example) direction: +X, distance: 30 mm; direction: -X, distance: 30 mm.]
1) Put the ruler in the opposite direction. Repeat Step 1.2 and write down the distance in the Y direction
(X direction) in which the arm can move without the feature leaving the camera's FOV. Steps 1.2 and
1.3 allow you to measure four distances (+X, -X, +Y, -Y).
[Figure: Example) direction: +Y, distance: 45 mm; direction: -Y, distance: 45 mm.]
[Screenshots: the point appears pink on the camera image and turns green once set; created user-defined points and their properties are displayed.]
With the image enlarged, align the highlighted point with the feature as accurately as possible.
(This improves the accuracy of the calibration.)
1) Open HND2.prg.
2) Enter the -X value measured in "(1) Find the camera's FOV" in the X component of line 32.
3) Do the same for +X of line 35, -Y of line 38, and +Y of line 41.
4) Save HND2.prg in the robot.
Distances measured in Steps 1.2 and 1.3 (example) and the corresponding lines in the program (tool coordinates):
・ Line 32: Mov P0(-30,0,0,0,0,0,0) — -X direction, 30 mm
・ Line 35: Mov P0(+30,0,0,0,0,0,0) — +X direction, 30 mm
・ Line 38: Mov P0(0,-45,0,0,0,0,0) — -Y direction, 45 mm
・ Line 41: Mov P0(0,+45,0,0,0,0,0) — +Y direction, 45 mm
(4.2) Take an image of the jig and fine-tune the positions of the user-defined points.
1) In In-Sight Explorer, press the F5 key (trigger) on the keyboard to take an image. The camera image
will be updated.
2) Align the user-defined point near the crosshairs with the feature of the workpiece. (Fine-tune the
position of the points with the mouse or arrow keys.)
3) Run the program up to lines 33, 36, 39, and 42 using step operation, and perform Steps 1 and 2 in
each line. Align the user-defined points with the feature (five points in total).
* Select Manual from the Trigger drop-down box under Edit Acquisition Settings.
1) Select Inspect Part from the Application Steps window.
2) Select N Point from Calibration Tools in the bottom left Add Tool window.
3) Click Add. All the user-defined points on the camera image will highlight in green.
* Clicking Add may highlight some parts in blue. Ignore them. (Blue circles may appear depending on
the image.)
4) Left-click the five user-defined points.
Selected user-defined points will turn pink. (Refer to the following image for how to select user-defined
points.)
* It is helpful to remember in what order the user-defined points have been selected.
5) After all the user-defined points have been selected, click OK. N points will be listed in the bottom right
window.
N points are named in the order in which the points were selected in Step 5.1. For example, the point
selected first is "Point 0", the point selected second is "Point 1", and so on.
Note
If the number of listed N points differs from the number of N points you want to create...
Something other than a user-defined point may have been clicked in Step 5.1.
(Click user-defined points to create N points.)
In this case, recreate N points in the following steps.
3) Select the calibration name from Palette, then right-click and select Delete to delete the data.
4) Create N points by repeating the process from Step 3 in Section 5.1 onwards.
N point | Direction in tool coordinates | Measured value | Program | World X | World Y
Point 0 | - | 0 | P0*(0,0,0,0,0,0) | 0 | 0
Point 1 | -X | 30 | P0*(-30,0,0,0,0,0) | 30 | 0
Point 2 | +Y | 45 | P0*(0,+45,0,0,0,0) | 0 | -45
Point 3 | +X | 30 | P0*(+30,0,0,0,0,0) | -30 | 0
Point 4 | -Y | 45 | P0*(0,-45,0,0,0,0) | 0 | +45
1) Select the Settings tab from the Edit Tool in the bottom center.
2) Enter a file name and press Export.
3) Pass: Export Complete will be shown in the Results tab of the Palette if the data has been exported
successfully.
(8.2) Move the robot and make the hand touch the surface of the identified workpiece.
1) Move the robot using jog operation and make the suction pad touch the surface of the identified
workpiece.
2) Once the robot has touched the surface, run the program to END using step operation.
◇ In lines 47 and 48, an offset from the workpiece suction point to the imaging point is set to P_CMH.
3) After the program goes to END, move up the robot to the area where no interference occurs.
◇ Program HND2.prg
......
21 PTool=P_CMTL
22 Tool P_NTool ' ...Resets the tool to the flange position.
......
46 ' Touch the hand onto the surface of the workpiece to be identified.
47 PSur=P_Fbc
48 P_CMH=Inv(PSur)*P0
[Figure label: workpiece identification point (P0).]
1 '-------------------------------------------------
2 ' Step 3. Program to adjust the hand camera
3 ' Calibration assistance program HND3.prg
4 '-------------------------------------------------
5 Def Pos P_CMTL ' Hand camera: Camera center position data
6 Def Pos P_CLPos ' Hand camera: Reference point to start calibration
7 Def Pos P_CMH ' Hand camera: Offset from the workpiece suction surface to the imaging point
8 Def Pos P_HVSP1 ' Hand camera: Default imaging point
9 Def Pos P_WRK03 ' Hand camera: Master workpiece grasp position
10 Def Pos P_PVS03 ' Hand camera: Master workpiece identification point
11 Def Pos P_PH03 ' Hand camera: Calculated coefficient of the handling position from the position of the identified workpiece
12 Def Char C_C03 ' Hand camera: COM name
13 Def Char C_J03 ' Hand camera: Job name
14 Loadset 1,1
15 OAdl On
16 Servo On
17 Wait M_Svo=1
18 '----- HAND INIT -----
19 Wait M_Svo=1
20 If M_EHOrg=0 Then ' If hand has not returned to origin:
21 EHOrg 1 ' Returns Hand 1 to origin.
22 Wait M_EHOrg=1 ' Waits for Hand 1 to return to origin.
23 EndIf
24 EHOpen 1,100,100 ' Opens Hand 1 (speed = 100%, Force = 20%)
25 Wait M_EHBusy=0 ' Checks if operation is complete
26 '---------------------
27 Tool P_NTool
28 ' Move the robot to the grasp/suction position.
29 ' Open and close the hand several times and check to see if the workpiece deviates from its position.
30 If M_Mode=1 Then P_WRK03=P_Fbc ' The grasp position of the workpiece to be registered.
31 ' Touches the hand on the surface of the workpiece.
32 If M_Mode=1 Then PWork=P_Fbc ' Job position data
33 If M_Mode=1 Then P_HVSP1=PWork*P_CMH ' The position in which the touched workpiece will be identified. (Fine adjustments can be made in the XY axes using the Jog mode.)
34 Mov P_HVSP1
35 ' Create an identification job.
(End of HND3.prg)
(1) Adjusting the suction and identification points of the master workpiece
(1.1) Place the workpiece to be sucked.
1) Place the workpiece on the platform. (This workpiece is handled as the master workpiece.)
1) Master workpiece
1) Master workpiece
◇ Program HND3.prg
......
27 Tool P_NTool ' P_NTool = (0,0,0,0,0,0,0,0)(0,0): the tool position is set back to the flange position.
28 ' Move the robot to the grasp/suction position.
(1.3) Move the robot to the position where it sucks the master workpiece.
1) Move the robot using jog operation to the position where it sucks the master workpiece.
* If grasping the workpiece, adjust the opening and closing mechanism of the tool so that the workpiece does not deviate from its position.
* If sucking the workpiece, close Hand 1 with the teaching pendant and lower the suction pad. (In the pneumatic hand (standard hand) window, press -C.)
2) Move the robot using jog operation and make it touch the surface of the workpiece. (If the hand is already on the workpiece, go to Step 3.)
3) Run HND3.prg up to line 33 using step operation.
◇ In lines 32 and 33, the workpiece detection point (P_HVSP1) is set.
4) Move the robot hand up slightly so that the workpiece does not shift.
5) Run HND3.prg up to line 35 using step operation. The robot will move to the workpiece detection point.
◇ Program HND3.prg
......
30 If M_Mode=1 Then P_WRK03=P_Fbc ' The grasp position of the workpiece to be registered.
31 ' Touches the hand on the surface of the workpiece.
32 If M_Mode=1 Then PWork=P_Fbc ' Job position data
33 If M_Mode=1 Then P_HVSP1=PWork*P_CMH ' The position in which the touched workpiece will be identified. (Fine adjustments can be made in the XY axes using jog mode.)
34 Mov P_HVSP1
35 ' Create an identification job.
The formula for calculating the position in which the touched workpiece will be identified:
P_HVSP1=PWork*P_CMH
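Putting the two fragments side by side may make the relationship clearer. The sketch below only restates lines 47 and 48 of HND2.prg and lines 32 to 34 of HND3.prg with explanatory comments; no new values or commands are introduced.
' Deriving P_CMH (HND2.prg, lines 47 to 48): with the suction pad touching the
' surface of the identified workpiece, P_Fbc is the surface pose and P0 is the
' imaging point at which that workpiece was identified.
PSur=P_Fbc ' Current position: pad on the workpiece surface.
P_CMH=Inv(PSur)*P0 ' Relative transform from the workpiece surface to the imaging point.
' Applying P_CMH (HND3.prg, lines 32 to 34): after touching the master workpiece,
' the same relative transform gives the imaging point above it.
PWork=P_Fbc ' Current position: pad on the master workpiece surface.
P_HVSP1=PWork*P_CMH ' Imaging point for the master workpiece.
Mov P_HVSP1 ' Move to the imaging point.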
(Screenshot callouts: Model region; Workpiece to be identified; Search.)
2) The registered image of the workpiece to be identified (Model region) will be displayed in the bottom right.
・ Accept Threshold value: the degree (%) to which the identified workpiece must match the registered image.
・ Rotation tolerance value: the angle (±deg) within which the actual workpiece can still be identified even if it is rotated.
(Screenshot callouts: Communication settings; Online.)
4.5.2 Setting up the relative position between the workpiece and the robot
(1) Running HND3.prg automatically
1) Change the robot's speed to 3%. (* Set the robot's speed to 3% during automatic operation.)
2) Switch the operation mode from "Manual" to "Automatic" with the key switch.
Caution: If the operation mode is set to "Manual", automatic operation will not run.
3) Check that Ovrd is set to 3% and run HND3.prg automatically.
4) Wait for the program to stop at line 52.
(Figure: positional relationship among P_CMTL, P_HVSP1, P_WRK03, PTRG0, P_PH03, PVS, and the flange center at the workpiece suction point.)
1 '-------------------------------------
2 ' Step 3. Program to adjust the hand camera
3 ' Calibration assistance program HND4.prg
4 '-------------------------------------
5 Def Pos P_CMTL ' Hand camera: Camera center position data
6 Def Pos P_CLPos ' Hand camera: Reference point to start calibration
7 Def Pos P_CMH ' Hand camera: Offset from the surface of the identified workpiece to the imaging point
8 Def Pos P_HVSP1 ' Hand camera: Default imaging point
9 Def Pos P_WRK03 ' Hand camera: Master workpiece grasp position
10 Def Pos P_PVS03 ' Hand camera: Master workpiece identification point
11 Def Pos P_PH03 ' Hand camera: Coefficient for calculating the handling position from the position of the identified workpiece
12 Def Char C_C03 ' Hand camera: COM name
13 Def Char C_J03 ' Hand camera: Job name
14 Def Pos P_PHome ' Safe position
15 Loadset 1,1
16 OAdl On
17 Servo On
18 Wait M_Svo=1
19 '----- HAND INIT -----
20 Wait M_Svo=1
21 If M_EHOrg=0 Then ' If hand has not returned to origin:
22 EHOrg 1 ' returns Hand 1 to origin.
23 Wait M_EHOrg=1 ' waits for Hand 1 to return to origin.
24 EndIf
25 EHOpen 1,100,100 ' Opens Hand 1 (speed = 100%, force = 100%)
26 Wait M_EHBusy=0 ' Checks if operation is complete.
27 '---------------------
28 Tool P_NTool
29 MCNT=1
30 Mov P_PHome
31 HOpen 1
32 M_Out(10129)=0
33 M_Out(10128)=1 Dly 0.1
34 Dly 0.5
(End of HND4.prg)
◇ Program HND4.prg
......
42 NVRun #3,C_J03$
43 EBRead #3,,MNUM,PVS ' --- Workpiece output position information from the vision sensor
......
52 PTRG=P_HVSP1*P_CMTL*PVS*P_PH03 ' --- Calculate the workpiece picking point.
......
60 Mov PTRG,-100
61 HClose 1
(Figure: PTRG is obtained by combining P_HVSP1, P_CMTL, PVS, and P_PH03.)
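Reading line 52 from left to right, each factor converts from one coordinate frame to the next. The annotated restatement below adds only explanatory comments; the readings of the variables are taken from the definitions at the top of HND4.prg.
' P_HVSP1 : default imaging point (robot position when the picture is taken)
' P_CMTL  : camera center position data (the camera seen from the flange)
' PVS     : workpiece position output by the vision sensor (X, Y, C components)
' P_PH03  : coefficient from the identified workpiece position to the handling
'           position, taught with the master workpiece
PTRG=P_HVSP1*P_CMTL*PVS*P_PH03 ' Picking point for the identified workpiece.
Mov PTRG,-100 ' Approach the picking point from 100 mm above.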
1) Move the robot using jog operation to a place where the hand releases the workpiece.
2) Open HND4.prg and teach the release position PPUT.
3) After the teaching process, move the robot with the workpiece to a place where no interference
occurs.
(Fig. 1.1: the robot coordinate system, showing the robot origin, point P1, and the +X, +Y, and +Z axes.)
(Fig. 1.2 and Fig. 1.3: vectors expressed in the +X/+Y plane.)
Using inverse functions, the following three points are the basic principles behind integrating vision sensors and robots.
1) Teach the robot the grasp position (PWK) of the master workpiece, and make the vision sensor identify the same workpiece (PVS).
2) Calculate the vector PH from PWK and PVS. (PH is the vector from where the workpiece is identified to where it is grasped by the robot.)
3) In actual operation, the grasp position PTRG is obtained by multiplying the output position (PVS) by PH.
(Fig. 1.4 and Fig. 1.5: the relationship among PWK, PVS, Inv(PVS), and PH in the +X/+Y plane.)
Regarding Fig. 1.4, the relationship between the vectors is expressed by the following formula:
PWK = PVS*PH
To find PH, multiply both sides from the left by the inverse (PVS)^-1:
(PVS)^-1*PWK = (PVS)^-1*PVS*PH
(PVS)^-1*PVS on the right side becomes the identity, so the equation becomes
(PVS)^-1*PWK = PH
Reversing the two sides gives
PH = (PVS)^-1*PWK
In the robot language, (PVS)^-1 is written Inv(PVS). Therefore, PH becomes:
PH = Inv(PVS)*PWK
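As a compact summary of points 1) to 3) above, the teaching phase and the operating phase can be sketched as follows. The COM port, the job name "SAMPLE", and the 50 mm approach distance are assumptions used only for illustration; PWK, PVS, PH, and PTRG are the symbols defined above.
' --- Teaching phase (performed once with the master workpiece) ---
If M_NvOpen(1)<>1 Then
 NVOpen "COM2:" As #1 ' Hypothetical COM port and vision sensor number.
EndIf
Wait M_NvOpen(1)=1
NVRun #1,"SAMPLE" ' Hypothetical identification job.
EBRead #1,,MNUM,PVS ' PVS: position where the master workpiece was identified.
' (The robot is then jogged to the grasp position of the master workpiece.)
PWK=P_Fbc ' Taught grasp position of the master workpiece.
PH=Inv(PVS)*PWK ' Vector from the identified position to the grasp position.
' --- Operating phase (every cycle) ---
NVRun #1,"SAMPLE"
EBRead #1,,MNUM,PVS ' PVS: position of the workpiece found in this cycle.
PTRG=PVS*PH ' Grasp position for the found workpiece.
Mov PTRG,-50 ' Approach from 50 mm above (hypothetical clearance).
Mvs PTRG
HClose 1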
Command name  Function
NVOpen        Connects to and logs on to the vision sensor
NVClose       Disconnects from the vision sensor
NVLoad        Changes the state of a specified program so that it can be run
NVRun         Runs the specified program
NVTrg         Requests an image from the vision sensor, and acquires encoder values after a specified period of time
EBRead        Specifies a vision sensor tag and reads the tag's data
EBWrite       Specifies a vision sensor tag and writes data to it
Two or more ports cannot be opened in a setup consisting of one robot controller and one vision sensor. When configuring the parameter "NETHSTIP", the error "Ethernet parameter NETHSTIP setting error" will occur if the IP address set in NETHSTIP is the same as the IP address specified by NVOpen.
4) A username and password are required to log on to the vision sensor. The same username and password set for the vision sensor need to be set in the robot controller's parameters "NVUSER" (username) and "NVPSWD" (password).
The username and password must be within 15 characters and may contain uppercase letters A to Z, numbers 0 to 9, hyphens ( - ), and underscores ( _ ). (Do not use lowercase characters when creating a new vision sensor user, as the teaching pendant does not support them.)
The vision sensor's default administrator username is "admin" and the default password is "" (blank).
By default, "NVUSER" and "NVPSWD" hold this administrator username and password.
The administrator password can be changed in MELFA-Vision. When changing the administrator password or adding a new user, be sure to change "NVUSER" and "NVPSWD" accordingly. "****" is displayed for "NVPSWD" when the username and password are changed.
Once the vision sensor password has been changed, open parameter "NVPSWD", change the password there, and then restart the robot controller.
Caution
When connecting multiple vision sensors to a robot controller, ensure that all the usernames and passwords of the vision sensors are the same.
5) The communication status of vision sensors can be checked with M_NvOpen once NVOpen has been
executed. For further details, refer to M_NvOpen.
6) Communication stops immediately if the program is aborted while executing this command. To log on to the vision sensor again, reset the robot program and restart MELFA-Vision.
7) The following limitations apply when using this command for multi-tasking.
Different tasks cannot have the same <COM No.> and <Vision sensor No.> when multi-tasking.
(1) The error "COM file already open" will occur if the same COM number is used for a different
task.
SLOT 2:
10 NVOpen "COM2:" As #1
20 ……..
SLOT 3:
10 NVOpen "COM2:" As #2
20 ……..
(2) The error "Attempt was made to open an already open communication file" will occur if the
same vision sensor number is used for a different task.
SLOT 2:
10 NVOpen "COM2:" As #1
20 ……..
SLOT 3:
10 NVOpen "COM2:" As #1
20 ……..
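For reference, the sketch below shows a combination that raises neither error: each task slot uses its own COM number and its own vision sensor number. The COM assignments are illustrative, and two vision sensors are assumed to be connected.
' SLOT 2
10 If M_NvOpen(1)<>1 Then NVOpen "COM2:" As #1 ' COM2, vision sensor No. 1
20 Wait M_NvOpen(1)=1
' SLOT 3
10 If M_NvOpen(2)<>1 Then NVOpen "COM3:" As #2 ' COM3, vision sensor No. 2
20 Wait M_NvOpen(2)=1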
8) Not supported if the program's startup settings are set to "ALWAYS" or the continuity function
(CTN) is enabled.
9) Up to three robots can control the same vision sensor simultaneously. If a fourth robot logs on to the same vision sensor, one of the four robots will be disconnected. Take this into account during the system design process.
10) The program's End command called by the CallP command does not close the port.
However, the main program's End command does close the port. The port also closes when
the program is reset.
11) If interrupt conditions are satisfied while this command is executing, interrupt processing is
performed immediately even if the processing of the command is in progress.
Errors
1) If the data types of arguments are different, the error "Syntax error in input command" will occur.
2) If there is a discrepancy with the number of arguments (too many or too few), the error "Incorrect
argument count" will occur.
3) Specifying a COM number outside the range of COM2 to COM8 for <COM No.> will result in the error
"Argument out of range".
4) Specifying a value outside the range of 1 to 8 for <Vision sensor No.> will result in the error
"Argument out of range".
Appendix 2. Vision sensor commands and status variables 139
5) If a COM port <COM No.> which is already connected is specified, the error "COM file already open"
will occur. (This also applies to files <File No.> already opened by the Open command.)
6) If a COM port is opened before a vision sensor is connected, the error "Vision sensor not connected"
will occur. (The same manufacturer parameter "COMTIMER" as in the Ethernet specifications is used.
Currently set at "1s".)
7) If the same COM number or vision sensor number is used for a different task, the error "COM file
already open" will occur.
8) If the username and password specified by parameters "NVUSER" (username) and "NVPSWD"
(password) are different, the error "Incorrect password" will occur.
9) If communication is disconnected while this command is being executed, the error "The
communication is abnormal" will occur and the robot controller's port will close.
10) If the program's start conditions are set to "ALWAYS", the error "The command cannot be used when
the start conditions are ERR or ALW." will occur.
Comments
1) Disconnects from the vision sensor to which a connection was established with the NVOpen
command.
2) If the <Vision sensor No.> is omitted, all connections are closed.
3) If a connection has been already closed, the process proceeds to the next step.
4) It is possible to connect to up to seven vision sensors simultaneously. Therefore, the <Vision sensor
No.> is used to identify which vision sensor is to be disconnected.
5) If the program is aborted while executing this command, execution is continued until processing of
this command has completed.
6) If this command is used for multi-tasking, execute the command "NVOpen" for the relevant task and
close only the connection that is open. Use the number specified with the NVOpen command for the
vision sensor number that is to be used.
7) Not supported if the program's startup settings are set to "ALWAYS" or the continuity function (CTN)
is enabled.
8) If the End command is used, all connections established by the NVOpen or Open command are closed. However, connections are not closed by an End command inside a program called with the CallP command.
Furthermore, connections are also closed when resetting the program. Therefore, if the End
command is specified or the program is reset, there is no need to close connections using this
command.
9) The continuity function is not supported.
10) If interrupt conditions are satisfied while this command is executing, interrupt processing is
performed after processing of this command has completed.
Errors
1) If the value specified for the <Vision sensor No.> is anything other than 1 to 8, the error "Argument
out of range" will occur.
2) If there are more than eight arguments in the command, the error "Argument out of range" will occur.
Term
<Vision sensor No.> (cannot be omitted)
Specify the number of the vision sensor you want to control. Setting range: 1 to 8
<Delay Time> (cannot be omitted)
Specify the delay time (ms) from when the image capture request is output to the vision sensor
until the encoder value is obtained.
Setting range: 0 to 150 ms
<Encoder n value read-out variable> (can be omitted from the second one on)
Specify the double-precision numeric variable into which the read value of external encoder n is set. (n = 1 to 8)
Examples
1 If M_NvOpen(1)<>1 Then ' If vision sensor number 1 is not logged on,
2 NVOpen "COM2:" As #1 ' connect to the vision sensor connected to COM2, and set the number to 1.
3 EndIf
4 Wait M_NvOpen(1)=1 ' Connect to vision sensor No. 1 and wait until it has logged on.
5 NVLoad #1,"TEST" ' Load program "TEST".
6 NVTrg #1,15,M1#,M2# ' Request the vision sensor to capture an image and acquire encoders 1 and 2 after 15 ms.
7 EBRead #1,,MNUM,PVS1,PVS2 ' Read the tag data of "Job.Robot.FormatString" and save it in the variables MNUM, PVS1, and PVS2.
8 ...
30 NVClose #1 ' Disconnect from the vision sensor connected to COM2.
Comments
1) Outputs the image capture request to the specified vision sensor and acquires the encoder value
after the specified period of time. The acquired encoder value is stored in the specified numeric
variable.
2) The timing of when this command finishes processing differs depending on the setting of the parameter NVTRGTMG. If the parameter NVTRGTMG is set to the factory setting, the next command is executed after communication with the image processing command (image request) has finished.
3) This command will suspend immediately if the program is aborted while it is being executed.
4) Use the EBRead command to get data from the vision sensor.
5) If this command is used for multi-tasking, it is necessary to use the NVOpen command for each task.
Use the vision sensor number specified with the NVOpen command.
6) Not supported if the program's startup settings are set to "ALWAYS" or the continuity function (CTN)
is enabled.
7) Set the trigger of EasyBuilder's image capture settings to "External trigger", "Manual trigger" or
"Network" ("Camera" can be used when the setting value of NVTRGTMG is "0".)
8) Up to three robots can control the same vision sensor at the same time, but this command cannot be
used by more than one robot at the same time. Use this command for only one of the robots.
9) If interrupt conditions are satisfied while this command is executing, interruption processing will be
executed immediately.
Syntax
EBRead #<Vision sensor No.>, [<Tag name>] , <Variable name 1> [,<Variable name 2>]...
[,<Time out>]
Term
<Vision sensor No.> (Cannot be omitted)
Specify the number of the vision sensor you want to control. Setting range: 1 to 8
<Tag name> (Can be omitted)
Specify the name of the symbolic tag where data read by the vision sensor is to be stored.
When omitting the tag name, the value of parameter EBRDTAG (initial value is the custom format tag
name "Job.Robot.FormatString") is specified.
<Variable name> (Cannot be omitted):
Specify the variable read from the vision sensor to send to the robot. Multiple variables can be delimited
by a comma.
Numeric value variables, position variables, and string variables can be specified.
When the position variable is specified, the components that have vision data set in them are X, Y, and
C. The values of components that do not have any data set in them are set to "0".
<Time out> (If omitted, 10)
Specify the time-out time in seconds.
Setting range: 1 to 32767 (integers)
Examples
1 If M_NvOpen(1)<>1 Then ' If vision sensor number 1 is not logged on,
2 NVOpen "COM2:" As #1 ' connect to the vision sensor connected to COM2, and set the number to 1.
3 EndIf
4 Wait M_NvOpen(1)=1 ' Connect to vision sensor No. 1 and wait until it has logged on.
5 NVLoad #1,"TEST" ' Load program "TEST".
6 NVRun #1,"TEST" ' Run program "TEST".
7 EBRead #1,,MNUM,PVS1,PVS2 ' Read the tag data of "Job.Robot.FormatString" and save it in the variables MNUM, PVS1, and PVS2.
20 ...
21 NVClose #1 ' Disconnect from the vision sensor connected to COM2.
Comments
1) Reads data by specifying a tag name from an active vision program of a specified vision sensor.
2) Stores the data read by the vision sensor in a specified variable.
3) In cases where vision data consists of multiple values delimited by commas, the data is stored in the
order that the specified variable names were enumerated and delimited in. In such cases, the data
and variables should be of the same type.
4) When a position variable is specified, the components that have vision data set in them are X, Y, and C. The values of components that do not have any data set in them are set to "0". The value set in the C component is converted into radians.
5) When the number of specified variables is less than the amount of data received, values are set only in the specified variables.
6) When the number of specified variables exceeds the amount of data received, variables exceeding the amount of data are not updated.
7) If the tag name is omitted, the setting value of parameter EBRDTAG is set instead. (The factory
setting is "Job.Robot.FormatString".)
8) The time-out period can be specified with numeric values. Within the time-out period, the program will
not move to the next step until it has received data from the vision sensor. However, this command
will be suspended if the program is aborted. The program will resume once it has been restarted.
9) When this command is used with multi-tasking, it is necessary to execute the NVOpen command and the NVRun command in the relevant task. Use the number specified with the NVOpen command for the vision sensor number that is to be used.
10) Not supported if the program's startup settings are set to "ALWAYS" or the continuity function (CTN)
is enabled.
11) If interruption conditions have been satisfied during this command, interrupt processing will be executed immediately. Processing of this command resumes after the interrupt processing has completed.
12) In order to reduce takt time, other work can be done after executing the NVRun command, and EBRead can be executed only when the result is required.
13) Set parameter NVTRGTMG to "1" if the EBRead command is on the line immediately after the NVRun command in the program.
If parameter NVTRGTMG is set to the factory setting (NVTRGTMG = "2"), the NVRun command
starts processing the next command without waiting for completion of vision identification processing.
Therefore, the results of previous identification data may be read if the EBRead command is still
being executed.
14) Note that if the program stops between NVRun and EBRead, the results when NVRun is executed
and the results when EBRead is executed may be different.
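A minimal sketch of the pattern described in items 12) to 14): identification is started with NVRun, other motion is executed while the vision sensor works, and EBRead is executed only when the result is needed. The job name "TEST", the positions PPUT and PHOME, and the assumption that MNUM holds the number of identified workpieces are used for illustration only.
NVRun #1,"TEST" ' Start the identification job on the vision sensor.
Mov PPUT ' Do other work, e.g. place the previous workpiece.
HOpen 1
Mov PHOME ' Return toward the pick-up area.
EBRead #1,,MNUM,PVS ' Read the result only when it is actually needed.
If MNUM>=1 Then
 PTRG=PVS*PH ' Calculate the grasp position from PVS and the taught PH.
 Mov PTRG,-50
 Mvs PTRG
 HClose 1
EndIf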
Syntax
EBWrite #<Vision sensor No.>, [<Tag name>], <Writing data> [, <Time out>]
Term
<Vision sensor No.> (Cannot be omitted)
Specify the number of the vision sensor you want to control with a numeric constant. Setting range: 1 to
8
<Tag name> (Can be omitted)
Specify the name of the symbolic tag for the cell to which data is written.
If the name is not specified, the value of parameter EBWRTAG is set.
<Writing data> (Cannot be omitted)
Specify the data to be written to the vision sensor.
Numeric constants, numeric variables, string constants, string variables, position component data,
joint component data, numerical expressions, and character string expressions can be used.
<Time out> (Can be omitted)
Specify the time-out time (in seconds) with a numeric constant. If the time is not specified, a
timeout of 10 seconds is set.
Setting range: 1 to 32767 (integers)
Examples
1 If M_NvOpen(1)<>1 Then ' If vision sensor No. 1 is not logged on,
2 NVOpen "COM2:" As #1 ' connect to the vision sensor connected to COM2, and set the number to 1.
3 Wait M_NvOpen(1)=1 ' Wait until vision sensor No. 1 has logged on.
4 EndIf
5 NVLoad #1,"TEST" ' Load program (job) "TEST".
6 EBWrite #1,"Sample.Float",5 ' Rewrite "Sample.Float" tag data as 5.
7 EBWrite #1,"Sample.String","Test" ' Rewrite "Sample.String" tag data as "Test".
8 NVRun #1,"TEST" ' Run program (job) "TEST".
:
:
20 End
Comments
1) Writes data to the cell specified with the tag name in the active vision program (job) of the specified
vision sensor.
2) The error (L.3141) occurs if no NVOpen command is executed for the vision sensor specified with
<Vision sensor No.>.
3) If <Tag name> is not specified, the value of parameter EBWRTAG is set. (The factory setting is
""(NULL).)
4) The error (L.8637) occurs if the active vision program does not have the specified <Tag name>.
6) Processes are performed according to the combination of the type of <Writing data> and the cell value type of the vision program cell specified with <Tag name>, as shown below.
<Writing data>                    Cell value type                        Process
Numeric value type (integer)      Boolean value editing control          Cell value update (integer)
                                  Integer editing control                Cell value update (integer)
                                  Floating-point number editing control  Cell value update (single-precision real number)
                                  Text editing control                   Execution error (L.8637)
Numeric value type (real number)  Boolean value editing control          Cell value update (integer, decimals rounded down)
                                  Integer editing control                Cell value update (integer, decimals rounded down)
                                  Floating-point number editing control  Cell value update (single-precision real number)
                                  Text editing control                   Execution error (L.8637)
Character string type             Boolean value editing control          Execution error (L.8637)
                                  Integer editing control                Execution error (L.8637)
                                  Floating-point number editing control  Execution error (L.8637)
                                  Text editing control                   Cell value update (character string)
7) You can specify the time-out time with a numeric constant. During the time-out time, the next step is
not performed until data containing writing results is received from the vision sensor.
8) When the execution of a robot program is stopped, the processing of this command is interrupted.
The interrupted processing restarts by executing the program again.
9) When this command is used with multi-tasking, it is necessary to execute the NVOpen command in
the task slot you use. Use the number specified with NVOpen command for <Vision sensor number>.
10) This command cannot be used if ALWAYS is specified in the start conditions of the task slot.
11) If interruption conditions have been satisfied during this command, interruption processing will be
executed immediately.
Array meaning
Array elements 1 to 8: Vision sensor numbers
Usage
M_NvOpen indicates, after the NVOpen command has been executed, whether the port is connected and whether the vision sensor has been logged on to.
Examples
1 If M_NvOpen(1)<>1 Then ' If vision sensor number 1 is not logged on
2 NVOpen "COM2:" As #1 ' connect to the vision sensor connected to COM2, and set the
number to 1.
3 EndIf
4 Wait M_NvOpen(1)=1 ' Connect to vision sensor No. 1 and wait until it has logged on.
5 …
10 NVClose #1 ' Disconnect from the vision sensor connected to COM2.
Comments
1) Indicates the connection status of the port connected to the vision sensor when it was opened with the
NVOpen command.
2) The initial value is "-1". At the point in time when the NVOpen command is executed and the port is
connected, the value changes to "0" (connecting to port). At the point in time when the vision sensor
has successfully been logged on to, the value changes to "1" (logon complete).
3) This variable strongly resembles the status variable M_Open, but whereas M_Open changes to "1" when the connection is verified, M_NvOpen becomes "1" when the vision sensor has successfully been logged on to.
Errors
1) If the type of data specified as an array element is incorrect, the error "Syntax error in input command"
will occur.
2) If there is a discrepancy with the number of array elements (too many or too few), the error "Incorrect
array element" will occur.
3) If an array element other than 1 to 8 is specified, the error "Array element mistake" will occur.
A 4-digit error number also appears on the teaching pendant's LCD screen. (The letter at the beginning of the error number is not displayed.) For example, when the error C0010 occurs, "0010" and an error message are displayed.
An alarm will also sound in 0.5 second intervals. When power is restored or when the time between power
OFF and ON is too short, an alarm will sound in 0.1 second intervals (constantly).
Error codes, messages, causes and solutions can be found in Section 4.2.2 "2D vision sensor error code
list".
Detailed information on the error number is displayed in the Error history screen of the teaching box. Check
the Error history screen after any errors have been reset.
0000 *
Error codes with asterisks attached to them require the power to be reset.
Refer to the error code list for information on how to solve the error.
A 4 digit number indicates the type of error.
There are three error levels.
H: High-level error (servos turn OFF)
L: Low-level error (operation stops)
C: Caution (operation continues)
L3130
Error message: Attempt was made to open an already open communication file.
Solution: Check the COM number and vision sensor number, and re-enter them if necessary, or check the communication parameters.
Cause: The port for communication with the vision sensor cannot be opened.
L3287
Error message: This command cannot be used if the start condition is ERR or ALW.
Cause: This command cannot be used if the start condition is ERR or ALW.
Cause: The type of data received by the EBRead command is different from the type of variable specified.
Solution: 1. Edit the array elements so that they are within the array element limit. 2. Do not specify that variable.
Solution: Check the vision sensor program number and the settings of parameters such as COMDEV.
Cause: The communication port was opened, but there is no response from the vision sensor.
Cause: The password for the user set in the parameter "NVUSER" is not set in the parameter "NVPSWD".
Cause: Communication with the vision sensor was cut off before or during command execution.
Solution: Check the communication cable connecting the robot and the vision sensor.
Cause: The specified vision sensor number is not defined with an NVOpen command.
Solution: Check whether the vision sensor number is correct, and check that the number is also defined by an NVOpen command.
Solution: Specify a vision program name that does not exceed 15 characters.
Solution: Check whether the program has been loaded to the vision sensor, and check whether the program name is correct.
Cause: There was no response from the vision sensor within the specified time limit.
Cause: The symbolic tag name does not exist in the active vision program.
Solution: Check that the symbolic tag name in EasyBuilder is the same as the tag name specified by the robot program. If it is not the same, correct it.
Cause: The image capture settings are set to something other than "Camera", "External", or "Manual".
Solution: Turn the vision sensor online and enable external control.
Cause: The NVUSER and NVPSWD parameters set for logging on to the vision sensor do not have full access rights to log on to the vision sensor.
Solution: Check the list of registered users for the vision sensor and specify users that are allowed full access in parameters NVUSER and NVPSWD.
Cause: The program was started without being reset after it stopped.
This textbook was published in May 2019. Note that the specifications are subject to change without notice. Issued: May 2019 (1905)MEE