Design and Implementation of Robotic
By
MAGISTER TECHNOLOGIAE:
ENGINEERING : ELECTRICAL
In the
Faculty of Engineering
January 2004
Promoters
Mr Frank Adlam
Table of Contents
Glossary ………………………………………………………………………..…….… xv
CHAPTER 1 INTRODUCTION
1.1 Objectives……………………….………………………………………………2
1.2 Hypothesis…………………….………………………………………….……..4
1.4 Assumption……………………………………………………………….……..5
2.1.3. Robot Motion…………………………………………………………….………20
2.3.1 Real-Time………………………………………………………………………..31
2.4.1.1 Background………………..…………………………………………….43
2.4.1.6 Image Acquisition……………………………………………………….55
2.5 Conclusion……..…………………………………………………………………….57
3.4.3.2 S4 DDE Server………………………………………………………………78
3.5 Conclusion……………………………………………………………………….82
4.9 Remote RAPID DDE Robot Programming Environment………………….….145
5.3.3 Averaging………………………………………………………………………160.
5.6 Conclusion…………………………………………………….….………….…167
CHAPTER 6 CONCLUSION
References ……………………………………..……………………………………..175
List of Figures
CHAPTER 1 INTRODUCTION
Figure 1.1: System architecture for an industrial application……...…………………….3
Figure 2.8: Illustration of a Rotation 3x3 Matrix for the 3 axes [x,y,z] ………………17
Figure 2.9: A simple diagram indicating the relationship between direct and inverse
robot kinematics……………………..…………………………………….18
Figure 2.14: Industrial LAN Network Architecture………………………………….….34
Figure 2.23: This diagram simplifies the relationship between the three functions of
machine vision…………………………………………………………..…45
Figure 3.2a: 1400 ABB Robot, Vision Feedback Camera & Bridge PC – system setup..63
Figure 3.2b: 1400 ABB Robot Controller & Bridge PC – system setup…………….….63
Figure 3.2c: 1400 ABB Robot Controller & Bridge PC Hardware – system setup….…64
Figure 3.4: Experimental Ethernet TCP/IP Configuration Robot & LAN Connection..68
CHAPTER 4 PC-BASED ROBOT TRAJECTORY PATH
CONTROL SYSTEM
Figure 4.13: Link parameters for ABB IRB 1400 Industrial Robot. …………………..103
Figure 4.14: ABB Industrial Robot link co-ordinate transformation matrices. ….……106
Figure 4.15: Planar 3-R Manipulator with the three reference joint angles. …………109
Figure 4.17: Program Modules and architecture for motion control. …..………….…117
Figure 4.20: RobComm Active Server establishing communication with ABB Robot
Controller. …………………………………………………………………………...…128
Figure 4.22: software environment for manual robot control commands. ……………130
Figure 4.24: software environment for manual robot control commands. ……………135
Figure 4.27: Microsoft Excel DDE data simulation with DDE RobComm Server. …..146
Figure 5.3: Memory allocation for the 3D Flash Point Frame Grabber………………155
Figure 5.4: Mechanism utilized to transfer the image into the allocated memory……155
Figure 5.12: 3x3 matrix convolution filter implemented as a software call function….160
Abbreviations
PC Personal Computer
RAP Rapid Application Protocol
RGB Red, Green, and Blue
S/N Signal/Noise
TCP Tool Centre Point
TCP/IP Transmission Control Protocol / Internet Protocol
UDP User Datagram Protocol
VGA Video Graphics Array
VT Virtual Terminal
WAN Wide Area Network
3D Three Dimensional
VPR Vision Pixel Ratio
Glossary
application.
Application Layer: The highest layer of the 7-Layer OSI model structure,
black.
image.
or similarities.
Closed-Loop Control: The use of a feedback loop to measure and compare
adjustment.
measured value.
accordingly.
Coordinate Transformation: In robotics, a 4×4 matrix used to describe the positions
Degree of Freedom: The number of independent ways the end effectors can
completely defined.
End Effector: Also known as end-of-arm tooling or, more simply,
parts.
shapes of interest.
Frame: A full video image comprising two fields. A PAL
lines).
memory.
can have more than two values which are typically 128
image.
system.
Handshaking: Exchange of predefined signals between two devices when
establishing a connection.
device.
K
systems.
Modular Programming: A software design methodology which requires
attitude in space.
picture-cell.
Protocols: A formal set of conventions governing the formatting
communicating systems.
object.
Robot Calibration (for vision): The act of determining the relative orientation of the
coordinate system.
Tool Centre Point (TCP): A tool-related reference point that lies along the last
Translation: A movement such that all axes remain parallel to what
CHAPTER 1 INTRODUCTION
“One machine can do the work of a hundred ordinary men, but no machine can do the work of one extraordinary man.”
- Elbert Hubbard
With the pressing need for increased productivity and delivery of end products of uniform quality, industry is turning more and more towards computer-based automation.
At the present time, most industrial automated manufacturing is carried out by special-purpose machines designed to perform predetermined functions in a manufacturing process.
The inflexibility and generally high cost of these machines, often referred to as hard
automation systems, have led to a broad-based interest in the use of robots capable of
performing various assembly tasks. A robot may possess intelligence, which is normally due
to computer algorithms associated with its controls and sensing systems. Industrial robots
Most of today’s industrial robots, though controlled by mini- and microcomputers, are
basically simple positional machines. They execute a given task by playing back a
prerecorded or preprogrammed sequence of motion that has been previously guided or
taught by the hand-held control teach box. Moreover, these robots are equipped with little
or no external sensing for obtaining the information vital to their working environment.
As a result, robots are used mainly for relatively simple, repetitive tasks. More research
effort has been directed towards sensory feedback systems, which has resulted in improvements
such as the charge-coupled device (CCD) vision system. This can be utilized to manipulate the robot position dependent on the
surrounding robot environment (various object profile sizes). This vision system can only
1.1 Objectives
Due to the rapid changes in the manufacturing environment, there is a growing
need for integrated vision-based systems and automated remote robot trajectory motion
control. Figure 1.1 illustrates the architecture proposed to fulfill the overall objective for
an industrial application.
Figure 1.1: System architecture for an industrial application
Study the fundamentals of an ABB robot (ABB 1400 Series), its control
robotic communication.
Study the RAPID program language of the industrial robot in order to realize
Build an interface for robot communication based on Ethernet hardware for local
and remote access to industrial robots. The remote access will be based on TCP/IP
protocols.
Study the kinematics of the industrial robot and its possible operations
(movement) for industrial applications such as material handling, sealing, etc.
to operational tasks. Interpolation and dynamic control may be taken into account
in the path control. The control mechanism may be achieved via an industrial
SCADA (MMI).
Develop algorithms to extract the object profile from a dynamic image, which can
be utilized to provide the industrial robot with position feedback. This creates an
1.2 Hypothesis
A vision-based CCD sensory system can be integrated with an industrial robot to sense an
object’s profile and manipulate the robot’s position online, according to the object’s profile and orientation.
1.3 Delimitations of research
The remote RAPID program language environment will not include a fully
No Ethernet hardware will be designed and manufactured for either the Bridge PC
or the Robot controller, nor will the study include the TCP/IP protocol software which is
used to communicate with the robot via the network software layers. The Ethernet
software protocol will be handled via the RobComm software platform, which has
been developed by the manufacturer, ABB. The use of this platform reduces the scope of the
study, so more time can be spent developing the user interface. RobComm uses
standard software packages such as Visual C and Visual Basic and the industrial
The user interface will be developed using only Visual C and Visual Basic. This
will focus only on the visual control of the industrial robot, such as system status
1.4 Assumption
The necessary hardware and software tools required to do the research will be available,
1.5 Significance of the study
Currently, industrial robots are beginning to take over repetitive tasks which were
previously performed by humans. Industrial robots significantly improve the quality
of the end product. They also result in improved efficiency, as high product volumes are
produced.
This research will attempt to illustrate the fact that robotic equipment should possess
some form of sensory feedback, which would give the robot the ability to automatically
feedback devices, as well as remote automated robot control in terms of literature survey.
form a platform for profile recognition and integrated robot control via a PC-Based
system. Chapter 4 involves the architecture of PC-Based robot trajectory path planning
detailed insight into all the components required to achieve the overall system objective.
Chapter 6 provides the conclusion to the research, introducing possible future extensions
CHAPTER 2
INDUSTRIAL ROBOT CONTROL
This chapter will serve as a background to topics and mathematical fundamentals related to
this dissertation. This includes robotic manufacturing systems and robotic interfacing
software. New technologies and trends related to these areas will also be discussed. In
order to understand the project as a whole and its relevance to manufacturing, industrial
Background to Robotics
With a pressing need for increased productivity and the delivery of end products of
uniform quality, industry is turning more and more towards computer-based automation.
Most automated manufacturing tasks, at the present time, are carried out by special-purpose
machines designed to perform predetermined functions in a manufacturing process. The inflexibility and generally high cost of these machines, often called hard
automation systems, have led to a broad-based interest in the use of robots capable of
The word ROBOT originated from the Czech word “robota”, meaning – WORK.
Webster’s dictionary defines a robot as: “ an automatic device that performs functions
A definition used by the Robot Institute of America gives a more precise description of
to move materials, parts, tools or specialized devices, through variable programmed
perform assembly tasks. With this definition, a robot must possess intelligence, which is
normally due to computer algorithms associated with its control and sensing systems.
An industrial robot is a general-purpose, computer-controlled manipulator consisting of several rigid links connected in series by revolute or prismatic joints. One end of the
chain is attached to a supporting base, while the other end is free and equipped with a tool
to manipulate objects or perform assembly tasks. The motion of the joints results in
relative motion of the links. Mechanically, a robot is composed of an arm, wrist and tool.
The work volume is the sphere of influence of a robot whose arm can deliver the wrist
subassembly unit to any point within the sphere. The arm subassembly generally can move with three degrees of freedom, while the wrist subassembly unit usually consists of three rotary motions:
pitch,
yaw, and
roll.
Hence, for a six-jointed robot the arm subassembly is the positioning mechanism, while the wrist subassembly is the orientation mechanism.
Figure 2.1: Illustration of a Cincinnati Milacron T3 robot arm [1]
Many commercially available industrial robots are widely used in manufacturing and
assembly tasks, such as material handling, spot / arc welding, parts assembly and spray painting.
Robots are divided into four basic motion-defining categories, illustrated in Figure 2.2:
a. Cartesian Co-ordinates
b. Cylindrical Co-ordinates
c. Spherical Co-ordinates
d. Revolute (Articulated) Co-ordinates
Figure 2.2: Illustration of various robot arm categories [1]
Most of today’s industrial robots are controlled by mini- and micro-computers and are
basically simple positional machines. They execute a given task by playing back a
prerecorded or preprogrammed sequence of motion that has been previously guided or
taught by the user with a hand-held control-teach pendant. Moreover, these robots are
equipped with little or no external sensors for obtaining the information vital to their
tasks. More research effort is being directed towards improving the overall performance
of the manipulator system. Automation and Robotics are two closely related technologies.
AUTOMATION is defined as: “a technology that is concerned with the use of
production.”
(i) Fixed Automation – is used when the volume of production is very high and
components.
(ii) Programmable Automation – is used when the volume of production is relatively low and there are a variety of products to be made. In this case the
(iii) Flexible Automation – other terms used include FMS and Computer-
for the mid-volume production range. Flexible automated systems typically
handling and storage system. A central computer is used to control the various
Of the three types of automation, robotics coincides most closely with programmable automation. The most
humanlike characteristic of existing robots is their movable arms. The robot can be
programmed to move its arm through a sequence of motions in order to perform some
useful task. It will repeat that motion pattern over and over until reprogrammed to
perform some other task. Hence, the programming feature allows robots to be used for a
variety of different industrial operations, many of which involve the robot working
2.1.1 Robot Structure
The manipulator is the part of the robot that consists of links connected by revolute or prismatic joints.
The controller contains the electronics required to control the manipulator, external axes
Figure 2.5: ABB 1400 Robot Controller [22]
Robot arm kinematics deals with the analytical study of the geometry of motion of a
robot arm with respect to a fixed reference co-ordinate system, without regard to the
forces / moments that cause the motion. Thus, kinematics deals with the spatial displacement
of the robot arm as a function of time, in particular the relations between the joint-variable space and the position and orientation
of the end-effector. The two fundamental concepts with respect to robot arm kinematics are direct (forward) kinematics and
inverse kinematics.
Since independent variables in a robot arm are the joint variables, and a task is usually
stated in terms of the reference co-ordinate frame, the inverse kinematics problem is used
more frequently.
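To make the distinction concrete, a standard formulation is sketched below; the symbols used here are the usual textbook notation and are an assumption rather than terms defined elsewhere in this document. Each joint variable contributes one link transformation, the direct problem composes them, and the inverse problem inverts that mapping.

\text{Direct kinematics:}\quad T_6 = A_1(q_1)\,A_2(q_2)\cdots A_6(q_6)

\text{Inverse kinematics:}\quad \text{given } T_6,\ \text{solve for } (q_1,\dots,q_6)

where each A_i is the 4×4 homogeneous transformation of link i and T_6 relates the tool frame to the base frame.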
A systematic and generalized approach is to utilize matrix algebra to describe and
represent the spatial geometry of the links of a robot arm by systematically establishing a
co-ordinate system (body-attached frame) on each link of an articulated chain [42]. This
approach describes the spatial relationship between two adjacent mechanical links and reduces the direct kinematic
problem to finding a transformation matrix by which the hand co-ordinates can be transformed and expressed in the “base co-ordinates”, which make up the inertial co-ordinate frame.
A rotation matrix operates on a position vector in a three-dimensional Euclidean space and maps its coordinates, expressed in a rotated co-ordinate system OUVW (body-attached frame), into a reference co-ordinate system OXYZ, as shown in Figure 2.6 [22][3][10].
[Figure 2.6: the reference frame OXYZ and the rotated, body-attached frame OUVW with a position vector P; only the axis labels survive from the original figure.]
Figure 2.7 shows the OUVW coordinate system rotated by an angle α about the OX axis, then by an angle φ about the OY axis, and then by an angle θ about the OZ axis.
[Figure 2.7: the three successive rotations of the OUVW frame; only the axis labels and the angle symbols α, φ and θ survive from the original figure.]
The corresponding basic rotation matrices are:
R_{x,\alpha} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix}, \qquad
R_{y,\phi} = \begin{bmatrix} \cos\phi & 0 & \sin\phi \\ 0 & 1 & 0 \\ -\sin\phi & 0 & \cos\phi \end{bmatrix}, \qquad
R_{z,\theta} = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
Figure 2.8: Illustration of a Rotation 3x3 Matrix for the 3 axes [x,y,z] [1][3]
Composite rotation matrices are formed from the basic rotation matrices, which can be multiplied
together to represent a sequence of finite rotations about the principal axes of the OXYZ
coordinate system. Since matrix multiplications do not commute, the order or sequence of
performing the rotations is important. For example, the resultant rotation matrix representing a rotation of α angle
about the OX axis (yaw), followed by a rotation of θ angle about the OZ axis (roll), followed by
a rotation of φ angle about the OY axis (pitch), is:

R = R_{x,\alpha} R_{z,\theta} R_{y,\phi} = \begin{bmatrix} C\theta C\phi & -S\theta & C\theta S\phi \\ C\alpha S\theta C\phi + S\alpha S\phi & C\alpha C\theta & C\alpha S\theta S\phi - S\alpha C\phi \\ S\alpha S\theta C\phi - C\alpha S\phi & S\alpha C\theta & S\alpha S\theta S\phi + C\alpha C\phi \end{bmatrix}

where C and S denote the cosine and sine of the corresponding angle. Performing the same rotations in a different order (for example a rotation of θ angle about the OZ axis followed by a rotation of α angle about the OX axis) yields a different resultant matrix.
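Because matrix multiplication does not commute, a composite rotation is easy to check numerically. The short C++ sketch below multiplies the three basic rotation matrices in the order stated above; it is an illustrative check only and is not part of the project software.

#include <array>
#include <cmath>
#include <cstdio>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Basic rotation matrices about the OX, OY and OZ axes.
Mat3 rotX(double a) { return {{{1, 0, 0}, {0, std::cos(a), -std::sin(a)}, {0, std::sin(a), std::cos(a)}}}; }
Mat3 rotY(double f) { return {{{std::cos(f), 0, std::sin(f)}, {0, 1, 0}, {-std::sin(f), 0, std::cos(f)}}}; }
Mat3 rotZ(double t) { return {{{std::cos(t), -std::sin(t), 0}, {std::sin(t), std::cos(t), 0}, {0, 0, 1}}}; }

// Matrix product C = A * B; the order of the factors matters.
Mat3 mul(const Mat3& A, const Mat3& B) {
    Mat3 C{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

int main() {
    const double alpha = 0.3, theta = 0.5, phi = 0.7;          // example angles in radians
    // Composite rotation R = R(x,alpha) * R(z,theta) * R(y,phi), as in the text.
    Mat3 R = mul(rotX(alpha), mul(rotZ(theta), rotY(phi)));
    for (int i = 0; i < 3; ++i)
        std::printf("%8.4f %8.4f %8.4f\n", R[i][0], R[i][1], R[i][2]);
    return 0;
}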
Figure 2.9: A simple diagram indicating the relationship between direct and
inverse robot kinematics [1]
Since the links of a robot arm may rotate and / or translate with respect to a reference
co-ordinate frame, the total spatial displacement of the end-effector is due to the angular rotations and linear translations of the links.
Figure 2.10: A PUMA robot arm illustrating joints and links [1]
Each joint-link pair constitutes one degree of freedom (DOF). For an N degree of
freedom manipulator, there are N joint-link pairs with link 0 (not considered part of
the robot) attached to a supporting base where an inertial co-ordinate frame is usually
established for this dynamic system and the last link is attached with a tool. The joints
and links are numbered outwardly from the base. Thus, joint 1 is the point of
connection between link 1 and the supporting base. A joint axis (for joint i) is
established at the connection of two links as illustrated in Figure 2.10 and 2.11.
Robot motion is sub-divided into the following co-ordinate frames, as listed below:
(i) The World Co-ordinate System – the common reference frame for the robot cell.
(ii) The Base Co-ordinate System – is attached to the base mounting surface of
the robot.
(iii) The Tool Co-ordinate System – specifies the tool’s centre point and
orientation.
(iv) The User Co-ordinate System – specifies the position of a fixture or work
piece.
(v) The Object Co-ordinate System – specifies how a work piece is positioned in a fixture or on a work surface.
With the knowledge of kinematics and dynamics of a serial link manipulator, one would
like to servo the manipulator’s joint actuators to accomplish a desired task by controlling
the manipulator to follow a desired path. Before moving the robot arm it is of interest to
know whether there are any obstacles present in the path that the robot arm has to
traverse (obstacle constraints) and whether the manipulator hand needs to traverse along a
The space curve that the manipulator hand moves along from an initial location (position
and orientation) to the final location is called the robot path. The trajectory planning
interpolates and / or approximates the desired path by a class of polynomial functions and
generates a sequence of time based “control set points” for the control of the manipulator
Control Analysis – the movement of the robot arm is usually performed in two distinct
control phases:
1. Main motion control is to move the arm from the initial position / orientation to
the vicinity of the desired target position / orientation along a planned trajectory.
2. Fine motion control is when the end-effector of the arm dynamically interacts
with the object using sensory feedback information from the sensors in order to
The current industrial approach to robot arm control is to treat each joint of the robot arm as a simple joint servomechanism.
The servo mechanism approach models the varying dynamics of a manipulator
inadequately because it neglects the motion and configuration of the whole arm
mechanism. Robot arm control requires the consideration of more efficient dynamic
models, sophisticated control approaches, the use of dedicated computer architectures and
lack of suitable and efficient communication between the user and the robotic system so
that the user can direct the manipulator to accomplish a given task. There are several
ways to communicate with a robot, such as: Discrete word recognition, teach and
playback and a high-level programming language. The most general approach used in
programming. Robots are commonly used in areas such as arc welding, spot welding and
paint spraying.
imprecise; and
sensory information has to be monitored, manipulated and properly utilized.
of robot motions. The robot is guided and controlled by the program throughout the entire
task with each statement of the program roughly corresponding to one action of the robot.
the object rather than the motion of the robot needed to achieve these goals and hence no
existing high-level language to meet the requirements of robot programming. Its design
interfaces may be built. It has a rich set of primitives for robot operations and allows the
users to design high-level commands according to their particular needs. The following
In robot assembly the robot and the parts are generally confined to a well-defined
workspace. The parts are usually restricted by fixtures and feeders to minimize
positional uncertainties. Assembly from a set of randomly placed parts requires
The most common approach used to describe the orientation and the position of the
(specifying the orientation) and a vector (specifying the position) which are defined
The most common operation in robot assembly is the pick and place operation. It
However, only specifying the initial and final configurations is not sufficient. Both
constraints must be considered, such as obstacles in the present planned path [1].
The location and the dimension of the object in the workspace can be identified
only to a certain degree of accuracy. For a robot to perform tasks in the presence of
also acts as a feedback from the environment enabling the robot to examine and
Sensing in robot programming can be classified into three types:
1. Position Sensing – usually achieved by encoders that measure the joint angle and compute the
2. Force and Tactile Sensing – used to detect the presence of objects in the
workspace.
user to support it. Complex robot programs are difficult to develop and can be
assembly task can best be described in terms of the objects being manipulated rather than
A task-level programming system allows the user to describe the task in a high level
language (task specification). A task planner will then consult a database (world models)
and transform the task specification into a robot-level program (robot program synthesis)
Architecture for a robot task planner is displayed in Figure 2.13. From the task specification, information is
extracted, such as: initial state, final state, grasping position, operand specifications and
attachment relations. The subtasks then pass through the subtask planner which generates
The concept of task planning is quite similar to the idea of automatic program generation
program, and the program generator then generates a program that will produce the
Task-level programming, like automatic program generation, is in the research stage with
many problems still unsolved. The problems encountered in task planning and some of
This modeling is required to describe the geometric and physical properties of the
object (including the robot) and to represent the state of the assembly of the objects
in the workspace.
Geometric and Physical Models. For the task planner to generate a robot program
that performs a given task, it must possess information about the objects and the
robot itself.
In the AUTOPASS system [38], objects are modeled by utilizing a modeling system
representation to describe objects. Within this procedure, the shape of the object is
GDP provides a set of primitive objects (all of them are polyhedra) which can be
cuboid, cylinder, wedge, cone, hemisphere, laminum and revolute. These primitives
are internally represented as a list of surfaces, edges and points, which are defined
will invoke the procedure SOLID to define a rectangular box called Block with
dimensions xlen, ylen, and zlen. More complicated objects can then be defined by
In this research project, the same software coding approach is utilized, as illustrated by the automatically generated RAPID declaration below (the orientation quaternion components q1 to q4 accompany the x, y, z position):

PERS robtarget HOME := [[x, y, z], [q1, q2, q3, q4], [0, 0, 0, 0], [9E9, 9E9, 9E9, 9E9, 9E9, 9E9]];
RAPID robot target variables can be created on the fly when the robot trajectory
program is automatically generated. The call ensures that a variable
named “HOME”, declared as PERS (persistent) and holding the required x, y, z co-ordinate
position, will be created in the master robot sub-routine RAPID program when
requested. [22]
This is done with a high-level language. At the highest level one would like to have
natural languages as the input, without having to give the assembly steps. An entire
task like building a water pump could then be specified by the command “build
water pump”. However, this level of input is still quite far away. The current
The synthesis of a robot program from a task specification is one of the most
grasping,
planning,
plan checking.
Before the task planner can perform the planning, it must first convert the symbolic
constraints from the symbolic relationships. The RAPT Interpreter extracts the
symbolic relationships and forms a set of matrix equations with the constraint
using a set of rewrite rules to simplify them. The result obtained is a set of
constraints on the configurations of each object that must be satisfied to perform the
operation.
2.3 Communication Controls
The advent of the microprocessor created a new class of manufacturing devices – the
digital controller. Digital devices are now not only commonplace in manufacturing but in
many cases are essential to the manufacturing process. The management, maintenance
and fully optimized use of these devices are greatly enhanced by communications
between the control devices and supervisory computer systems. While in theory this
The evolution of modern industrial control has centered around the microprocessor.
It is the microprocessor that has created the need to communicate and integrate
Short of storing the program variations locally, the only viable alternative is to store them
remotely, then communicate them to the device. This is also the only practical way to
process and archive the data remotely. To obtain the benefits from the flexibility and the
2.3.1 Real-Time
The definition of real time is not precise; it is very situational. Within the banking
industry, applications that deal with ATMs are considered “real time”. However, during
the four to seven seconds it often takes for an ATM transaction, hundreds of rands of
product may be mismanufactured or serious safety problems may arise in a typical
process.
Yet in the 500 milliseconds that a DCS may take to measure a variable, calculate a
response and execute a control action, a racing car can travel over ten times its own
length. Real time is then relative to the environment. In batch and continuous processing
operations, overall system response times are generally measured in the tens to hundreds
of milliseconds or, at the worst, in seconds. Thus, the data communications networks
within these systems must have performance characteristics that are consistent with this
range.
Most industrial robot controllers have to be configured and programmed via a hand-held
teach pendant, making programming very tedious and time consuming [3]. Providing
the system with an Ethernet communication link between a bridge PC and the robot
controller opens the system to true real-time control applications [6]. This ensures a fast response
when the robot path position is manipulated by sensory feedback devices, such as a
vision system.
developed that allowed these devices to communicate with each other in a simpler
fashion than with point-to-point technologies. These networks reflect either the shared
A variety of systems are available, each of which is unique in operation, hardware and
capabilities.
LAN’s are essentially transparent to users and can provide a variety of benefits,
depending on the configuration and usage. Some of the benefits could include -
Local area networking is a critical step in wiring either the office or the factory. By
properly planning the network, one can create a system that links a group of
In its simplest form, a local area network utilizes standard cabling, which acts as an
electronic highway for transporting data and other information to and from different
DEC had their own model for communications, the DNA. DNA (Figure 2.15) is rich
in peer-to-peer services. It first lacked terminal connectivity. DEC remedied that with
the addition of LAT protocols, which were optimized for the terminal environment in
a network-based system.
Figure 2.15: DNA by DEC and Internet model
(ii) ISO Model – the seven functional communication layers are: [32]
message and how well it serves the user. This is where application
2. Presentation Layer : This layer is used to prepare the information for the
application.
between units and gradually feeds or buffers the information to the devices
or the program that performs the Presentation function. The Session layer
also provides the critical identification and authentication functions. It
the other higher layers might have into something the network can
the expedited delivery of priority messages. It checks the data, puts it into
connected into the circuit and the paths are defined by the network
topology.
6. Data Link Layer : This layer does the accounting and traffic control
(characters). The Data Link layer puts every piece of information into the
right place and checks it out before releasing it. Similarly, incoming
device.
7. Physical Layer : This layer describes the electrical and physical
There are ISO Standards for all seven ISO layers. At the lower two layers one of the
common ISO/IEEE standards that are in place and use the ISO model as a reference is
collection of protocols employed by the Internet. The Internet evolved from the old
key protocols had to do with end-to-end message integrity and routing information
over the wide area network (WAN). This became the TCP/IP portion of the network.
While TCP/IP plays a significant part in the Internet, there are other protocols, for
packet size varies between intermediate segments and the two end networks.
Figure 2.18: IP Frame
IP Addresses – IP packets use a 4 byte address field for both the source and
destination addresses. Within this 32 bit field is network and station address
information.
When sending a packet, the IP layer determines if the destination is part of the local network. If it is, the packet is sent
directly to the destination. If the destination is not on the local network, the IP layer determines
which gateway device to send the packet to. The gateway is then responsible for the onward delivery of the packet.
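A hedged illustration of that routing decision: with the sub-net mask quoted later for this project (255.255.255.0), two addresses belong to the same local network when their masked network portions match. The helper below is a sketch only; the 100.100.100.x addresses echo the experimental setup, and the robot address shown is illustrative.

#include <cstdint>
#include <cstdio>

// Pack a dotted-decimal IPv4 address into a 32-bit value.
std::uint32_t ip(std::uint8_t a, std::uint8_t b, std::uint8_t c, std::uint8_t d) {
    return (std::uint32_t(a) << 24) | (std::uint32_t(b) << 16) | (std::uint32_t(c) << 8) | d;
}

// Two hosts are on the same local network when the masked (network) parts are equal.
bool sameNetwork(std::uint32_t host1, std::uint32_t host2, std::uint32_t mask) {
    return (host1 & mask) == (host2 & mask);
}

int main() {
    std::uint32_t bridgePc = ip(100, 100, 100, 1);   // Bridge PC address used in this project
    std::uint32_t robot    = ip(100, 100, 100, 2);   // example robot controller address (illustrative)
    std::uint32_t mask     = ip(255, 255, 255, 0);   // sub-net mask quoted for the robot controller
    std::printf("local delivery: %s\n", sameNetwork(bridgePc, robot, mask) ? "yes" : "no (use gateway)");
    return 0;
}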
TCP – Transmission Control Protocol – Because the underlying IP layer does not
The use of ports, as they are called in a TCP environment, facilitates multiple
sessions. The ports serve as a means to establish a virtual connection at this level,
ordering as the packets are received. The WINDOW parameter is used for flow
control. If interfacing through TCP, it is advisable to use this full TCP packet as
opposed to the lower-overhead user datagram protocol (UDP) packets. The UDP protocol provides no
error recovery and, as such, is of little value as an interfacing protocol for real-time
networks. If the information is important enough to burden the real-time system with
its handling, it should certainly warrant the additional integrity that the full TCP protocol provides.
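The project itself relies on ABB's FactoryWare/RobComm software for robot communication, but the TCP argument above can be illustrated with a generic BSD-sockets client (POSIX style shown; Winsock differs mainly in initialization). The address, port and payload below are placeholders, not values taken from this document.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
    // SOCK_STREAM selects TCP, which provides ordering, flow control and retransmission;
    // SOCK_DGRAM (UDP) provides none of these.
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { std::perror("socket"); return 1; }

    sockaddr_in server{};
    server.sin_family = AF_INET;
    server.sin_port = htons(5000);                            // placeholder port
    inet_pton(AF_INET, "100.100.100.2", &server.sin_addr);    // placeholder controller address

    if (connect(fd, reinterpret_cast<sockaddr*>(&server), sizeof(server)) < 0) {
        std::perror("connect");
        close(fd);
        return 1;
    }

    const char msg[] = "status request";                      // illustrative payload only
    send(fd, msg, std::strlen(msg), 0);                       // delivery and ordering handled by TCP

    char buf[256];
    ssize_t n = recv(fd, buf, sizeof(buf) - 1, 0);
    if (n > 0) { buf[n] = '\0'; std::printf("reply: %s\n", buf); }

    close(fd);
    return 0;
}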
As shown in Figure 2.21, a Client Server approach that uses TCP/IP is popular.
Generally, these systems have dedicated serial interfaces to the process control
systems from which they extract data. However, as more TCP/IP-based systems are
available. This means, as with so many other developments in networking, that
The use of an external sensing mechanism allows a robot to interact with its environment in a flexible manner. This is in contrast to preprogrammed operation, in which a robot is
taught to perform repetitive tasks via a set of preprogrammed functions. Although the
latter is by far the most predominant form of operation of current industrial robots, the use of sensing technology to endow machines with a greater degree of intelligence in
dealing with the environment is indeed an active topic of research and development in the
robotic field.
Robot sensing is divided into two functional areas, internal and external states:
1. Internal State Sensors – deal with the detection of variables such as arm joint
2. External State Sensors – deal with the detection of variables, such as range,
proximity and touch. Although proximity, touch and force sensing play a
Machine vision (other names include computer vision and artificial vision) is an
anticipated that vision technology will play an increasingly significant role in the future
of robotics.
Vision systems designed to be utilized with robot or manufacturing systems must meet
two important criteria which currently limit the influx of vision systems to the
manufacturing community. The first of these criteria is the need for a relatively low-cost
vision system. The second criterion is the need for relatively rapid response time needed
Nevertheless, there has been a significant influx of vision systems into the manufacturing
world. The systems are used to perform tasks, which include selecting parts that are
randomly orientated from a bin or conveyer, parts identification and limited inspection.
These capabilities are selectively used in traditional applications to reduce the cost of part
and tool fixturing, to allow the robot program to test for and adapt to limited variations in
the environment.
Advances in vision technology for robotics are expected to broaden the capabilities to
allow for vision-based guidance of the robot arm, complex inspection for close
dimensional tolerances, improved recognition and part location capabilities. These will
result from the constantly reducing cost of computational capability, increased speed and
The field of computer vision was one of the fastest growing commercial areas in the latter
part of the twentieth century. Computer vision is a complex and multidisciplinary field
Advances in vision technology and related disciplines are expected within the next
decade, which will permit applications not only in manufacturing, but also in photo
2.4.1.1 Background
information from images of a two dimensional world. This process, also commonly
areas, the operation of the vision system consists of six functions as illustrated in
1. Sensing – Is the process that yields a visual image.
enhancement of details.
interest.
Machine Vision is concerned with the sensing of vision data and its interpretation by
a computer. The typical vision system consists of the camera and digitizing hardware,
a digital computer, and hardware and software necessary to interface them. The
operation of the vision system consists of three functions as illustrated in Figure 2.23:
1. Sensing and digitizing of the image data
2. Image processing and analysis
3. Application
Figure 2.23: This diagram simplifies the relationship between the three functions of machine vision
The sensing and digitizing functions involve the input of vision data by means of a
camera focused on the scene of interest. Special lighting techniques are frequently
used to obtain an image of sufficient contrast for later processing [12][13]. The image
The digital image is called a frame of vision data and is frequently captured by a hardware device called a frame grabber.
The frames consist of a matrix of data representing projections of the scene sensed by
the camera. The elements of the matrix are called picture elements, or pixels. The number of pixels determines the resolution of the
frame. A single pixel is a projection of a small portion of the scene which reduces that
portion to a single value. The value is the measure of the light intensity for that
element of the scene. Each pixel intensity is converted into a digital value.
The digitized image matrix for each frame is stored and then subjected to image
processing and analysis functions for data reduction and interpretation of the image.
These steps are required in order to permit real-time application of vision analysis
Typically an image frame will be thresholded to produce a binary image, and then
various feature measurements will further reduce the data representation of the image.
This data reduction can change the representation of a frame from several hundred
thousand bytes of raw image data to several hundred bytes of feature value data. The
resultant feature data can be analyzed in the available time for action by the robot
system.
Various techniques to compute the feature values can be programmed into the
computer to obtain feature descriptors of the image which are matched against
previously computed values stored in the computer. These descriptors include shape
and size characteristics that can be readily calculated from the thresholded image
matrix.
To accomplish image processing and analysis, the vision system must be trained
computer models.
The information gathered during training consists of features such as the area of the
object, its perimeter length, major and minor diameters and similar features. During
operation, the objects viewed by the camera are compared with the computer models to determine if a match
has occurred.
The final function of a machine vision system is the applications function. The
Many two-dimensional vision systems can operate on a binary image, which is the result of a simple thresholding technique based on sufficient contrast
between the object(s) and the background. Image contrast can be manipulated by
Another way of classifying vision systems is according to the number of gray levels
used to characterize the image. In a binary image the gray level values are divided
into either of two categories, black or white. Other systems permit the classification
of each pixel’s gray level into various levels, the range of which is called a gray scale.
cognition.
Plenty of image preprocessing techniques are available in the field of robot vision.
The methods used for image preprocessing range from the spatial domain to the frequency
domain. Only a subset of them is suited to real-time image processing if the
The convolution technique is one of the spatial-domain techniques used most frequently. It makes use of convolution masks
(also referred to as templates, windows, or filters) [32]. The desired image f(x, y) can
be obtained by convolving the original image i(x, y) with a convolution mask h(x, y).
In robot vision systems, the convolution masks are usually 3×3, 5×5 or 7×7 matrices.
For computational reasons the 3×3 matrix is widely utilized in real-time systems
f(x, y) = \sum_{j=1}^{3} \sum_{k=1}^{3} i(x + j - 2,\; y + k - 2) \cdot h(j, k)
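As a concrete illustration of the equation above, the routine below applies a 3×3 mask to an 8-bit grayscale image stored row-major. It is a C++ sketch rather than the project's actual Visual C code, and the border handling (border pixels are simply copied) is an assumption.

#include <algorithm>
#include <cstdint>
#include <vector>

// Apply a 3x3 convolution mask h to the grayscale image src (width x height, row-major),
// implementing f(x,y) = sum_{j=1..3} sum_{k=1..3} i(x+j-2, y+k-2) * h(j,k).
std::vector<std::uint8_t> convolve3x3(const std::vector<std::uint8_t>& src,
                                      int width, int height,
                                      const double h[3][3], double scale = 1.0)
{
    std::vector<std::uint8_t> dst(src);                 // border pixels keep their original values
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            double acc = 0.0;
            for (int j = 0; j < 3; ++j)                  // 0-based offsets -1..+1 around (x, y)
                for (int k = 0; k < 3; ++k)
                    acc += src[(y + k - 1) * width + (x + j - 1)] * h[j][k];
            acc *= scale;                                // e.g. 1/9 for a simple smoothing mask
            dst[y * width + x] = static_cast<std::uint8_t>(std::clamp(acc, 0.0, 255.0));
        }
    }
    return dst;
}

// Example: the high-pass (sharpening) mask quoted in the text.
// const double highPass[3][3] = { {-1,-1,-1}, {-1, 9,-1}, {-1,-1,-1} };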
The image preprocessing techniques, which are also employed in this project, are the
following:
Smoothing
Smoothing operations are used for reducing noise and other spurious effects that may
High Pass Filtering
High pass filtering is utilized to sharpen images that are out of focus or fuzzy. Its
convolution mask is
H_{\text{high-pass}} = \begin{bmatrix} -1 & -1 & -1 \\ -1 & 9 & -1 \\ -1 & -1 & -1 \end{bmatrix}
Median Filtering
Median filtering ranks the current set of nine pixel intensities in order of magnitude
and places the median intensity value into the destination image at the central point.
The whole image is processed in turn by sliding the window over the entire image.
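A minimal C++ sketch of that ranking step is given below; it is illustrative only and not the project's Visual C implementation.

#include <algorithm>
#include <cstdint>
#include <vector>

// Replace each interior pixel by the median of its 3x3 neighbourhood.
std::vector<std::uint8_t> medianFilter3x3(const std::vector<std::uint8_t>& src,
                                          int width, int height)
{
    std::vector<std::uint8_t> dst(src);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            std::uint8_t window[9];
            int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    window[n++] = src[(y + dy) * width + (x + dx)];
            std::nth_element(window, window + 4, window + 9);  // rank the nine intensities
            dst[y * width + x] = window[4];                    // median goes to the central point
        }
    }
    return dst;
}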
Low pass filtering is exploited to smooth out a sharp image. Its convolution mask is
given as:
H_{\text{low-pass}} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 1 \end{bmatrix}
Noise Cleaning
Noise cleaning is employed to remove random noise spikes on the captured image. Its convolution mask is:

H_{\text{noise-cleaning}} = \begin{bmatrix} 1 & 2 & 1 \\ 2 & 4 & 2 \\ 1 & 2 & 1 \end{bmatrix}
Averaging
Averaging can be used to remove random noise spikes and clean edge features in the image. Its convolution mask is:

H_{\text{averaging}} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 1 \end{bmatrix}
Thresholding
Digital image thresholding is a crucial process in a robot vision system, which is used
to manipulate the captured image. To separate and extract the object from the background,
a grayscale threshold value T is selected. The thresholding technique is not limited to a fixed value T. A thresholded image can
be acquired by

g(x, y) = \begin{cases} 1 & \text{if } f(x, y) > T \\ 0 & \text{otherwise} \end{cases}
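A direct implementation of g(x, y) for a fixed threshold T could look as follows; this is a sketch only, and the manual and adaptive selection of T is discussed in the surrounding text.

#include <cstddef>
#include <cstdint>
#include <vector>

// Produce a binary image: 1 where the intensity exceeds the threshold T, 0 otherwise.
std::vector<std::uint8_t> threshold(const std::vector<std::uint8_t>& src, std::uint8_t T)
{
    std::vector<std::uint8_t> bin(src.size());
    for (std::size_t i = 0; i < src.size(); ++i)
        bin[i] = (src[i] > T) ? 1 : 0;   // logic 1 / logic 0 as defined by g(x, y)
    return bin;
}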
In the case of dark objects on a light background, thresholding takes the selected
grayscale value T and compares each pixel intensity in the image. If the intensity at
pixel f(x, y) < T that pixel is replaced by a logic 0 value. If the intensity f(x, y) > T
that pixel is replaced by a logic 1 value. A typical image intensity histogram with two
In general, thresholding falls into two categories: manual thresholding and automatic thresholding. An automatic thresholding algorithm examines the histogram of pixel
intensities in the image, detects the pixel intensity most frequent in the image and
follows the histogram curve down to identify the minimum. An adaptive thresholding
Contour Detection
Contour detection plays a central role in robot vision. Using the information from the contours simplifies
image analysis. A further advantage of using the contours obtained from the image is their
relative stability under fluctuations in the lighting of the scene. The standard
approaches to contour detection are implicitly based on a very simple model in which object boundaries appear as
step edges. The classical approach [3] to contour detection makes use of digital approximations of derivative operators, such as the
gradient or Laplacian.
The first derivative of a contour model is zero in all regions of constant intensity. The
second derivative is zero in all locations, except at the onset and termination of an
intensity transition.
The Laplacian is a scalar second derivative operator for functions of two dimensions,
given by:
\nabla^2 f(x, y) = \frac{\partial^2}{\partial x^2} f(x, y) + \frac{\partial^2}{\partial y^2} f(x, y)
2.4.1.3 Industrial Machine Vision
Researchers in the field of Industrial Machine Vision (IMV) concentrate their efforts
may be able to control the background, the lighting, the camera position, or other
parameters. Such control may allow the use of techniques that would be inappropriate
charge-coupled devices (CCDs), and others [40][41]. These devices differ both in the ways in which
they form images and in the properties of the images so formed. However, all the
Vision systems are occasionally supplied by the robot manufacturer and integrated
with the controller, but usually are separate, with an interface to the robot controller
Figure 2.26: The components of a robot vision system
The principal devices used for robotic vision are television cameras, consisting either
As far as a color CCD camera is concerned, the captured image, which is made of a pixel matrix of size m × n, contains the colour information and intensity of three colour channels.
The representation of a pixel varies with the purpose of the image processing. The RGB (red,
green, and blue) method, the HSB (hue, saturation, and brightness) method, and the CMYK
(cyan, magenta, yellow, and black) method are commonly used. In the field of
machine vision, the RGB method is generally employed. In this case, each pixel is
represented by three channels which denote red, green, and blue intensity
respectively [32].
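As a small illustration of the three-channel RGB representation, the sketch below defines a pixel structure and collapses it to a single grey intensity; the luminance weights are the commonly used values and are not taken from this document.

#include <cstdint>

// One pixel in the RGB method: three channels holding red, green and blue intensities.
struct RgbPixel {
    std::uint8_t r;
    std::uint8_t g;
    std::uint8_t b;
};

// Collapse an RGB pixel to a single grey intensity using the usual luminance weights.
inline std::uint8_t toGray(const RgbPixel& p)
{
    return static_cast<std::uint8_t>(0.299 * p.r + 0.587 * p.g + 0.114 * p.b);
}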
7. Object counting
8. Inspection, e.g. of printed circuit boards to detect incorrectly inserted
components
2.5 Conclusion
Industrial robotics has been discussed broadly: how robots are categorized according to their arm movement configuration, how they are utilized in
automation applications, robot manipulator axis structure, robot kinematics and dynamics
for both direct and inverse kinematics, and the parameters of robot arm links and joints that define the
location of each link with respect to the fixed co-ordinate frame. Other areas covered were communication
methods, robot sensing and flow control methods, which will play an important role
in later chapters.
Communication protocols were discussed, focusing on the real-time issues as well as the Ethernet
communication protocols and protocol layers, which will ensure that the real-time requirements of the system can be met.
Robot movement flexibility can be enhanced by providing the robot with a sensing
The fundamental concepts covered in this chapter will be used extensively in the chapters
to follow, for deriving the equations of motion of an industrial ABB robot manipulator
CHAPTER 3
SYSTEM SETUP :
HARDWARE AND SOFTWARE ARCHITECTURE
This chapter provides a brief overview of the system architecture in terms of hardware
and software components required and developed to achieve the integrated vision-based
trajectory control system for an industrial ABB Robot. The system architecture provides
the system platform overview of how the components were implemented in this research
project.
3.1 Introduction
In this research project, to perform profile recognition and integrated robot control for
industrial applications, the following principal modules, as illustrated in Figure 3.1, were
established:
Robot Vision, which consists of a CCD camera [23] and an AGP bus frame
grabber [24]. The camera is fixed to the robot arm and captures an image of the
object. It is interfaced to the frame grabber hardware, which provides the live
images. This is controlled through the 2D FlashPoint APIs. The components are
Bridge PC-Based control system, which provides the system with an on-line robot
RobComm Server provides a status monitoring display, takes
care of the communication with the robot and presents the robot
An ABB industrial robot IRB 1400 and controller, which can be accessed through
remote Ethernet devices utilizing standard TCP/IP Ethernet protocols. This
hardware enables the Ethernet integration between the Bridge Computer and the
hardware rank.
The aim of this PC-based experimental setup is to provide a platform for the
environment has a major effect on system setup. This will reduce the loss of production.
The robot with vision sensory feedback is used to manipulate the robot trajectory path
when there is a change in object profile and orientation. The robot is given this ability by
means of a PC-Based Control system with vision and path manipulation software. A
CCD camera provides a video image to a video frame grabber, which captures a 2D
image of the environment of the robot. The Bridge PC software will process the image
data and generate an image map which contains the object profile co-ordinates. The co-
ordinates are then used to generate the RAPID Robot trajectory path. The RAPID
program is downloaded to the robot controller via an Ethernet serial communication link
(TCP/IP Protocol). All robot control command events are managed via the Ethernet link.
The Ethernet communication link is set up for peer-to-peer communication for this
project setup.
The Bridge PC is set up as the backbone and provides the link between the camera and
robot controller. It has been set up with a host name “S4” and the Ethernet IP address
100.100.100.1. The RobComm Server manages the Ethernet protocol between the Bridge
PC and robot controller. The second Ethernet card manages the communication with the
When the system software starts up, so does the RobComm Server, and a dedicated
communication link with the robot controller is established. This allows the ActiveX
component to access all the relevant components needed for the research project.
Before the installation of the relevant software, the ABB IRB 1400 Industrial Robot at PE
Technikon did not have the capability of remote access from a remote server via an
Ethernet link. The Ethernet hardware and all relevant Ethernet software protocols were
The Bridge PC was equipped with 2D-Video Frame Grabber hardware, which provides
hardware sub-systems
In this project the three major hardware sub-systems, namely: Industrial Robot Controller
and the vision sensory system were integrated with a PC-Based control system to
Figure 3.1 illustrates the hardware components and the software interface modules that
were developed and integrated. The following sub-sections will give a brief outline on the
Figure 3.2a: 1400 ABB Robot, Vision Feedback Camera & Bridge PC -System Setup
Figure 3.2b: 1400 ABB Robot Controller & Bridge PC – System Setup
Figure 3.2c: 1400 ABB Robot Controller & Bridge PC Hardware – System Setup
The robot being utilized in this research is an ABB IRB 1400 Industrial Robot, which
Technikon. The robot consists of two major components, namely the manipulator and the
controller.
The robot manipulator has six axes, with spherically-jointed geometry. This provides the
robot with six degrees of freedom (6DOF). The manipulator has three axes for
positioning and three axes for orientation. The robot was designed for a manufacturing
The controller has a variety of flexible hardware enabling it to communicate with remote
hardware using one of the following methods, e.g. I/O, Analog, Serial RS232, Serial
Ethernet, etc. All of the methods use standard industrial interface mediums. The robot has
a work envelope of approximately 1.444 m radius with a repeatable position accuracy of
0.05 mm.
The S4C Controller is a standard controller and is used as the platform for twenty-two
different ABB robots. The robot controller consists of three onboard computers to
achieve control of the robot, its axes and to perform I/O. A dedicated Ethernet hardware
controller manages all data transfer between the remote peripherals (under the TCP/IP
protocol). The robot utilizes Baseware as the operating kernel system. This kernel
communication and motion. Application programs run on top of the operating kernel
system. The ABB Robot programming language RAPID is used to program the robot for
A module contains all data and functions. A program may consist of a number of
modules, which include user-defined modules and system modules. Only one module will
A teach pendant is connected to the controller and is used as the control and
programming user interface. Pull-down menus, dialogues and windows are used to
display information to the user. Touch keys and a joystick are provided as input devices.
The controller is provided with Ethernet hardware, which allows the remote user to
access robot information and status at high speed. This opens the window to the true
sense of real-time control. The Ethernet link can be connected directly to the PC via
peer-to-peer interface or via a network hub. The ABB Ethernet hardware and software
require four IP addresses for remote access, which increases the speed of access. The IP
address of the remote device must not overlap the IP address of the robot.
The network platform of the robot controller is managed via Factoryware software that is
loaded on top of the baseware software and it utilizes the RAP protocol. The RAP
protocol manages all data flow between the robot controller and the remote device.
Figure 3.3 illustrates the robot controller Ethernet configuration sequence that is required
in order for the Bridge-PC to establish a communication link via the FactoryWare
software. Without this setup sequence the communication link would be unavailable.
The following robot controller component entities are required, which will provide the
Sub-Net of 255.255.255.0
handle critical data information about the robot controller and manipulator such
as, controller status, manipulator TCP position, controller program position, etc
Figure 3.3: Robot Controller Ethernet Communication Configuration
3.2.2 Bridge PC
The PC utilized for this research is a standard office PC with an AMD processor and
512MB of onboard memory, which enhances the overall processing speed of the
simulation software. Windows 98 operating system was selected due to the constraints of
the software components needed for the robot communication. The PC was equipped
with two standard 10 Base / 100 Ethernet hardware cards, which are used for robot
communication and LAN. An enhanced video frame grabber 3D graphic hardware card
The Ethernet hardware utilized for the peer-to-peer robot communication was set up with a static IP address, while the LAN connection was set up as
dynamic and configured via the remote server, which is illustrated in Figure 3.4.
Figure 3.4: Experimental Ethernet TCP/IP Configuration Robot & LAN Connection
The robot vision system consists of a CCD camera and a Flash Point 3D frame grabber
The CCD camera is the hardware core of the vision system. An SRC-503HP CCD camera is used; its resolution is
up to 752 (H) × 582 (V) (440,000 pixels) and its scanning system provides 15.625 kHz (H)
and 50 Hz (V). In addition, this camera model supports auto backlight compensation,
grabber. The FlashPoint 3D video image brightness, contrast, saturation, etc. are adjustable when
manipulating the image processing software to ensure that the best possible image
will be processed correctly. A FlashPoint 3D frame grabber with PCI bus is employed in this project. It is
able to capture and display full-frame color video in real time to VGA display
memory. It supports pixel formats of 8/16/32 bits per pixel. It supports non-destructive
Figure 3.5 shows the software architecture indicating all the software components
developed and how these components were integrated to establish an integrated robot
vision control system. The software components were logically divided into the following
modules:
S4 Robot RAPID Program Structure
components developed:
3.3.1 Vision Recognition Classes and Image API’s
The FP3D header is the primary library header, which contains many vital type definition constants,
structure definitions, and prototypes used when calling the FlashPoint 3D API.
In general, four FlashPoint 3D library functions must be called to put a live video
FPV_Int function initializes FlashPoint VGA to the current loaded configuration
values.
configuration.
FPV_SetVideoWindow function sets the size and location of the video window on
Grabbing the video image is performed by using the following API functions [24]:
FPV_Savefile, saves an image bitmap from memory to disk. The file type is
The default mode of grabbing the Flash Point 3D image assumes that the video is always
on top. This means that if the video is partially covered by a window or graphics all of the
video image is still copied. Once the image is grabbed into video memory, the image is
processed and analyzed via algorithms. These algorithms were created in a Microsoft
Visual C environment. The profile image map is extracted and processed to create a robot
Figure 3.7: Vision Profile Extraction Architecture
Object profile image - captured into the frame grabber’s video memory, via the
Image Noise filtering - performs the image pre-processing module, which makes
use of different filters, such as: Low pass, high pass, median, noise cleaning,
Profile Extraction - performs edge thinning to produce a one pixel thick boundary
Image profile co-ordinate map is uploaded to an access database, which will be
unpredictable. The path trajectory must be planned online; the engine must receive
a continuous flow of information about occurring events and generate new controls while
previously controlled motions are being executed. All relevant path trajectory co-ordinate
data are handled by the engine. The architecture of the trajectory program generation engine is shown in Figure 3.8. For a robot
handling system, the relationship between the robot co-ordinate system (tool co-ordinate
system) and the object co-ordinate system must be established. Transformation matrices are
Robot trajectory generation engine architecture is composed of the following
components:
Initialize Robot Path trajectory engine, which will automatically generate the
The image map is downloaded from the Access Database, which contains the
image data such as, profile position with respect to the object co-ordinate frame,
The image map is used to generate a kinematic trajectory path RAPID robot
program for the current image profile, which will be mimicked by the robot (a sketch of this step follows the list below).
The trajectory profile RAPID program is uploaded into the ABB controller via an
Ethernet link, with a handshaking mechanism to ensure that the robot controller does not react erratically during
data transfer.
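A sketch of how such a RAPID trajectory program can be generated from the extracted profile co-ordinates is given below. The module and routine names, the fixed orientation quaternion, speed and zone data are illustrative assumptions; only the general idea of emitting one move instruction per profile point is taken from the text.

#include <cstdio>
#include <string>
#include <vector>

// One point of the extracted object profile, expressed in the object co-ordinate frame (mm).
struct ProfilePoint { double x, y, z; };

// Emit a RAPID module that moves the tool centre point through the profile with MoveL instructions.
// Orientation, speed and zone data are fixed illustrative values.
std::string generateRapidModule(const std::vector<ProfilePoint>& profile)
{
    std::string rapid = "MODULE ProfilePath\n";
    char line[160];
    for (std::size_t i = 0; i < profile.size(); ++i) {
        std::snprintf(line, sizeof(line),
                      "  PERS robtarget p%zu := [[%.2f,%.2f,%.2f],[1,0,0,0],[0,0,0,0],"
                      "[9E9,9E9,9E9,9E9,9E9,9E9]];\n",
                      i, profile[i].x, profile[i].y, profile[i].z);
        rapid += line;
    }
    rapid += "  PROC TracePath()\n";
    for (std::size_t i = 0; i < profile.size(); ++i) {
        std::snprintf(line, sizeof(line), "    MoveL p%zu, v100, z1, tool0;\n", i);
        rapid += line;
    }
    rapid += "  ENDPROC\nENDMODULE\n";
    return rapid;
}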
The Factoryware Interface option enables the robot system to communicate with a PC
whereby a unique alias name and IP address have to be created and mapped to
the Bridge PC’s static IP address. This will provide the crucial link for the ActiveX
RobComm requires RAP communication services, which are uploaded to the robot
configured via the control properties. The OCX components provide a flexible,
76
RobComm is designed to run multiple applications, including multi-threaded
Applications developed with RobComm work via an Ethernet link to multiple robots.
The Factoryware Interface includes RAP, based on MMS functionality. RAP is used
Read logs
As illustrated in Figure 3.10, the Ethernet network configuration has been utilized for this
experimental setup, which adds an advantage to the real-time processing for this
application [22].
Figure 3.10: S4 Robot Controller communication protocols
ABB S4 DDE Server is designed to utilise the robot communication protocols and
make you more productive by combining the power of PCs with that of the robots.
The ABB S4 DDE Server is a software building block that takes care of the
communication with the robot and presents the robot data in the standard DDE
format. Any application that can “talk” the DDE “language” can communicate with
the ABB robots via the ABB S4 DDE Server. Examples of applications that do DDE are Microsoft Excel and Wonderware InTouch. With
InTouch the user can build his own custom user interface, visualizing his production
process. InTouch then needs the ABB S4 DDE Server in order to communicate with
the robots. The DDE Server communicates with the robots using the RAP protocol. It
maintains a database of the relevant variables in the robot and makes sure that these
If new RAPID variables are introduced in the robot program, the DDE Server will
create corresponding DDE variables “on-the-fly”. The application using the DDE
Server can therefore concentrate on the user interface and rely on the updated DDE
variables. The ABB S4 DDE Server provides reading and writing of I/O, RAPID
variables and robot system variables. It supports spontaneous messages from the
robot (SCWrite), as well as file operations. DDE stands for Dynamic Data Exchange, a standard mechanism that allows
Windows applications to send and receive data from each other. It is implemented as
a client/server mechanism. The server application (like the ABB S4 DDE Server)
provides the data and accepts requests from any other application that is interested in
its data. Requesting applications (like InTouch) are called clients. To obtain data from
another application, the client program opens a channel to the server application by
specifying three things: the application (service) name, the topic name and the item name. Figure 3.12 illustrates the ABB S4 DDE Server environment.
The topic name identifies the individual robot; the topic names are defined when you
configure the robots in the DDE server. Figure 3.11 illustrates the format for
79
Figure 3.12: DDE Server Engine
A RAPID program consists of instructions and data. The program is usually made up of
A main routine
Several subroutines
Program data
The program memory contains system modules. The main routine is the routine from
which program execution starts. Subroutines are used to divide the program up into
smaller parts in order to obtain a modular program that is easy to read and maintain. Data
is used to define positions, numeric values (registers, counters) and co-ordinate systems.
Figure 3.13: S4 ABB Controller RAPID program structure
System modules are programs that are always present in the memory. Routines and data related to the installation rather than the program, such as tools and service routines, are placed in system modules.
The ABB robot RAPID trajectory path structure is composed of the following components. This program structure will be utilized to form the main robot motion components in the experimental setup. The RAPID program structure utilized in the experimental setup includes, among other components:
Static Sub-Routines - these handle the static robot target co-ordinate routines.
Figure 3.14: ABB Robot RAPID trajectory path structure
3.5 Conclusion
This chapter focused on the integrated hardware and software architecture for the
communication and software development modules that will enable real-time robotic
displacement and orientation, which would in turn possibly provide the system with
additional flexibility. This can be achieved by a robot vision sensory feedback system and
software development. PC-bus interface cards (frame grabber, PMAC card, data
acquisition card, and Ethernet card) are utilized, and ActiveX techniques are widely
exploited to build up a configurable system. The proposed architecture in Figure 3.1 puts
forward a generic framework for a remote PC-Based robot control system, which
organizes the system hardware and software components and reveals their relationships.
Chapter 4 will describe the development of the software modules required to control the
robot motion based on the visual information, as well as the software components required to perform this control.
CHAPTER 4
PC-BASED ROBOT TRAJECTORY PATH CONTROL SYSTEM
To achieve this control, software algorithms were developed using the robotic RAPID programming structure, inverse kinematics path trajectory control fundamentals, RAP communication and robot motion control.
Figure 4.1 illustrates the control architecture that was constructed to achieve the PC-based robot control mechanism, as well as the interface to the vision sensory system described in Chapter 5. In order to achieve this control mechanism, robot motion fundamentals were developed into algorithms, which are discussed in detail below.
Figure 4.1: PC-Based Robot Trajectory Path Control Architecture
The PC-Based Robot Trajectory control system receives a profile map from the vision sensory feedback system. Its main components are described below.
Frame Grabber
The CCD camera, interfaced to the frame grabber, provides the vision sensory feedback for this research.
Image Processing
Image Processing Class - A raw image is captured into dynamic memory where algorithms analyze the binary map of the image by filtering and cleaning random noise. Thresholding is utilized to provide a grey-scale segmentation of the image so that the object can be isolated from the background.
Profile Extraction
Profile Extraction Class - Utilizes closed chain vectors and standard mathematical algorithms. This ensures that the chain profile segments describe the object boundary accurately. The extracted profile is used to automatically create a robot trajectory path that the robot manipulator can trace.
Image Database
This provides a storage medium that handles the image profile co-ordinate map.
The PC-Based RAPID program engine was developed to provide a platform for the trajectory path control. This engine provides an environment for basic RAPID motion program development and control. The environment utilizes direct and inverse kinematics path modeling algorithms to control the TCP position of the robot, and forms one of the building blocks for the research platform, which provides the tools for the system
integration. The vision feedback system is integrated via a PC-based robot control application.
Communication Control
Motion Control
The DDE server engine provides additional programming flexibility for industrial applications.
The following sub-sections provide the fundamentals for the RAPID programming
environment, which are used to construct software algorithms that will transform the
image profile to actual trajectory motion commands in order for the system to trace the object profile.
Path motion forms the main component of any industrial robot. Figure 4.2 illustrates the
move command structure for an ABB robot controller, which will servo the robot
manipulator to specific path co-ordinates that have been predefined or calculated from robot kinematics algorithms. The industrial robot MOVE command is structured, for example, as MoveL p1, v100, z10, tool0, where:
z10 = zone size (accuracy), i.e. how close the robot must be to the destination before the next instruction is executed.
Motion MOVE Instruction
The robot manipulator position control managed in this research project is illustrated in
Figure 4.3.
This JOINT motion was not utilized in the research project, but is available to the
robot system.
Motion SPEED and ZONE size specification
The speed and zone size refer to different data fields, which include the desired speed in
mm/s, zone size in mm, etc. These data fields can be created and named by the user, although predefined fields are also available. One must then specify the tool, its dimensions and weight, in the tool data. The TCP of the tool is moved to the specified destination position when the instruction is executed.
4.2 RAPID Program Data Types
Figure 4.5 displays the variable declaration for a robot target position, which will be utilized in the generated RAPID program.
The robot target variable consists of the following structure: it contains an (x, y, z) co-ordinate frame together with a quaternion (q1, q2, q3, q4), which indicates the robot orientation. The data type "robtarget" is used for the robot's position, which includes the orientation of the tool. The data type "orient" is used for orientation (such as the orientation of a tool).
The orientation must be normalized, i.e. the sum of the squares of the quaternion components must equal one:

$q_1^2 + q_2^2 + q_3^2 + q_4^2 = 1$
A quaternion describes the rotational matrix. Quaternions are calculated from the elements of the rotational matrix, where x1, y2 and z3 denote its diagonal elements. Figure 4.6 illustrates the orientation algorithms.

$q_1 = \frac{\sqrt{x_1 + y_2 + z_3 + 1}}{2}$

$q_2 = \frac{\sqrt{x_1 - y_2 - z_3 + 1}}{2}, \quad \mathrm{sign}\,q_2 = \mathrm{sign}(y_3 - z_2)$

$q_3 = \frac{\sqrt{y_2 - x_1 - z_3 + 1}}{2}, \quad \mathrm{sign}\,q_3 = \mathrm{sign}(z_1 - x_3)$

$q_4 = \frac{\sqrt{z_3 - x_1 - y_2 + 1}}{2}, \quad \mathrm{sign}\,q_4 = \mathrm{sign}(x_2 - y_1)$
In this research project the robot motion is controlled from the RAPID program, which has been automatically generated by the PC-based trajectory application. There are numerous programming structures that can be utilized; these structures are selected depending on the required motion tasks. The programming structure that was developed creates a flexible approach to achieve the end goal of the research project. The program has been segmented into a main program with a dynamic sub-routine and static sub-routines.
The static components are position co-ordinates that are utilized for calibration, object viewing, and object co-ordinate marker positions. The dynamic component manipulates the robot manipulator with respect to the object profile position co-ordinate map. Figure 4.7 illustrates the program components architecture (static and dynamic).
This co-ordinate map contains the orientation and position of the object profile. The CCD
camera sensory feedback provides a closed loop system for the robot.
The object image is viewed via a CCD camera. Image processing software manipulates
the captured object profile, which extracts the image co-ordinate map. This co-ordinate
map is fed back to the robot controller, which servos the robot manipulator to trace the
object profile.
The RAPID program consists of the following components:
The main RAPID program routine handler ensures that the correct sub-routines are called. A software handshake has been developed and configured such that the variable "INDEX_MARKER" provides an index. This index ensures that the correct sub-routine is called at the appropriate time. Due to the constraints of the RobComm interface, the INDEX_MARKER variable is used to trigger the routines: when it is set to 4, for example, the program jumps to the "MIMIC_PROFILE" sub-routine, which in turn traces the image profile; once the routine has been completed, INDEX_MARKER is reset to 0 and control returns to the main loop.
//****************************************************************
PROC main()
WHILE TRUE DO
IF INDEX_MARKER=1 THEN
HOME;
ENDIF
IF INDEX_MARKER=2 THEN
CAMERA_BIRD_VW;
ENDIF
IF INDEX_MARKER=3 THEN
CAL_OBJ;
ENDIF
IF INDEX_MARKER=4 THEN
MIMIC_PROFILE;
ENDIF
IF INDEX_MARKER=5 THEN
OBJ_CFRAME;
ENDIF
ENDWHILE
ENDPROC
//****************************************************************
Robot Home Position Sub-routine
The robot HOME position routine forms part of the static movement commands.
Once the command has been called, the system will request the robot controller to
rotate and move all robot manipulator axes to their HOME position, which is the
zero degree state. Figure 4.8 graphically illustrates the RAPID sub-routine call
procedure.
//****************************************************************
Robot Camera view position Sub-routine
The robot camera view position routine forms part of the static movement commands. The robot controller will move the robot manipulator to the relevant position where the test object can be viewed correctly. A software BOOL flag "CAPTURE" is utilized so that the system can synchronize on the moment the image has been correctly captured. The position is maintained until the system requests the robot controller to return the manipulator to the "HOME" position. Figure 4.9 graphically illustrates the RAPID sub-routine call procedure.
MoveJ CBV4,v100,z50,tool0;
WaitUntil CAPTURE=1;
MoveAbsJ [[0,0,0,0,0,0],
[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]],
v200,z50,tool0;
INDEX_MARKER:=0;
ENDPROC
//****************************************************************
The robot "CAL_OBJ" position routine forms part of the static movement commands. This routine is utilized for system calibration: the robot manipulator is commanded to mark off a specific pre-programmed calibration length, which the vision system views as a reference calibration marker so that the object image pixel ratio can be calculated.
MoveJ CO1,v150,z50,tool0;
MoveJ CO2,v150,z50,tool0;
MoveL Offs(CO2,0,200,-30),v50,z1,tool0;
MoveAbsJ[[0,0,0,0,0,0],
[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]],
v200,z50,tool0;
INDEX_MARKER:=0;
ENDPROC
//****************************************************************
The robot "MIMIC_PROFILE" position routine forms the main part of the dynamic movement routines. This routine is utilized to trace the captured object profile, manipulating the robot manipulator to the downloaded profile co-ordinates. Figure 4.11 graphically illustrates the RAPID dynamic sub-routine call procedure which traces the object profile.
Figure 4.11b: Mimic profile object sequence
PROC MIMIC_PROFILE()
MoveJ PRP1,v150,z50,tool0;
MoveJ PRP2,v150,z50,tool0;
START_POS:=PRP2;
MoveJ START_POS,v150,z50,tool0;
MoveL Offs(START_POS,0,0,Z_OFFSET),v150,z50,tool0;
MoveL Offs(START_POS,X_POS1,Y_POS1,Z_OFFSET),v150,z50,tool0;
MoveL Offs(START_POS,X_POS2,Y_POS2,Z_OFFSET),v150,z50,tool0;
MoveL Offs(START_POS,X_POS3,Y_POS3,Z_OFFSET),v150,z50,tool0;
MoveL Offs(START_POS,X_POS4,Y_POS4,Z_OFFSET),v150,z50,tool0;
MoveJ PRP3,v150,z50,tool0;
MoveAbsJ
[[0,0,0,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]],v200,z50,tool0;
ENDPROC
//****************************************************************
In order for the remote system to servo the robot manipulator to a required target position, the robot target variables are declared as persistent (PERS). This ensures that the RobComm handler can re-configure the target positions on the fly.
//*********************************************************************
PERS robtarget PRP4:=[[955.01,0,1195],[0.707106,2E-06,0.707108,2E-06],
[0,0,-1,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]];
A robot manipulator consists of a chain of rigid bodies (links) connected in series by either revolute or prismatic joints driven by actuators. One end of the chain is attached to a supporting base while the other end is free and fitted with a tool (end-effector) to manipulate objects or perform assembly tasks. The relative motion of the joints results in the motion of the links that positions the hand in a desired orientation. Robot kinematics is concerned with the analytical description of the end-effector of the robot manipulator with respect to a fixed reference co-ordinate system.
Robot arm kinematics deals with the analytical study of the geometry of the motion of a
robot arm with respect to a fixed reference co-ordinate system. In this research project
two modeling methods are dealt with which form the basis in which the ABB IRB 1400
robot is controlled.
Figure 4.12: Direct and inverse kinematics relationship
Since the independent variables in a robot arm are the joint variables and a task is usually
stated in terms of a reference co-ordinate frame, the inverse kinematics problem is used
more frequently. Figure 4.12 illustrates the relationship between these two modeling
methods.
There are numerous methods and algorithms used to determine and display the tool center
position value of the robot. The method used in this project, the Denavit-Hartenberg (D-
H) theory, states that a 4x4 homogeneous transformation matrix represents each link co-
ordinate system at the joint with respect to the previous link co-ordinate system. Thus, through successive transformations, the end-effector position and orientation can be transformed and expressed in the "base co-ordinates", which make up the inertial co-ordinate frame.
4.4.2 Direct Kinematic Computation for an IRB 1400 Robot
An orthonormal Cartesian co-ordinate system (xi, yi, zi) can be established for each link at
its joint axis, where i = 1, 2, …, n (n = number of degrees of freedom) plus the base co-
ordinate frame. Since a rotary joint has only one degree of freedom, each (xi, yi, zi) co-
ordinate frame of the robot arm corresponds to joint i + 1 and is fixed to link i.
The base co-ordinates are defined as the 0th co-ordinate frame (xo, yo, zo), which is also
the inertial co-ordinate frame. With respect to the ABB IRB 1400 robot, the tool co-ordinate frame is attached to the end-effector. The link co-ordinate frames are assigned such that:
The zi-1 axis lies along the axis of motion of the i'th joint.
The xi axis is normal to the zi-1 axis, and points away from it.
Figure 4.13: Link parameters and joint angle range for the ABB 1400 Industrial
Robot
These link robot parameters, which are illustrated in Figure 4.13, are the true offset values for an ABB robot and are utilized in the forward kinematics algorithms.
The homogeneous matrix $^{R}T_H$, which provides the robot tool centre position and orientation with respect to the base co-ordinate system, is calculated as a chain product of successive co-ordinate transformation matrices $^{i-1}A_i$, and is expressed as

$^{0}T_i = {}^{0}A_1\,{}^{1}A_2\,{}^{2}A_3\,{}^{3}A_4\,{}^{4}A_5 \cdots {}^{i-1}A_i = \begin{bmatrix} x_i & y_i & z_i & p_i \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad \text{for } i = 1,2,\ldots,n$

where $x_i$, $y_i$, $z_i$ are the column vectors describing the orientation of the $i$th co-ordinate frame, and $p_i$ is the position vector which points from the origin of the base co-ordinate system to the origin of the $i$th co-ordinate system; it is the upper right $3 \times 1$ partitioned matrix of $^{0}T_i$.
For the first two links the transformation matrices are

$^{0}A_1 = \begin{bmatrix} \cos\theta_1 & 0 & -\sin\theta_1 & a_1\cos\theta_1 \\ \sin\theta_1 & 0 & \cos\theta_1 & a_1\sin\theta_1 \\ 0 & -1 & 0 & d_1 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad ^{1}A_2 = \begin{bmatrix} \cos\theta_2 & -\sin\theta_2 & 0 & a_2\cos\theta_2 \\ \sin\theta_2 & \cos\theta_2 & 0 & a_2\sin\theta_2 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$

with the remaining matrices $^{2}A_3$ to $^{5}A_6$ constructed in the same manner from the link parameters in Figure 4.13; the complete set is given in Figure 4.14.
Figure 4.14: ABB Industrial Robot link co-ordinate transformation matrices
Specifically, for $i = 6$ we obtain the matrix $T = {}^{0}A_6$, which specifies the position and orientation of the endpoint of the manipulator with respect to the base co-ordinate system. The final robot arm matrix T for an ABB robot manipulator is given below, together with the equations for each of the matrix elements. These equations were utilized to implement the direct kinematics in software.
$T = \begin{bmatrix} x_i & y_i & z_i & p_i \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} n_x & s_x & a_x & p_x \\ n_y & s_y & a_y & p_y \\ n_z & s_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$
where n, s and a are the normal, sliding and approach vectors of the end-effector and p is its position vector.
The direct kinematics solution of the six-link ABB robot manipulator is simply a matter of calculating $T = {}^{0}A_6$ by chain multiplying the six $^{i-1}A_i$ matrices and evaluating each element of the T matrix.
$T = {}^{0}A_6 = {}^{0}A_1\,{}^{1}A_2\,{}^{2}A_3\,{}^{3}A_4\,{}^{4}A_5\,{}^{5}A_6 = \begin{bmatrix} n_x & s_x & a_x & p_x \\ n_y & s_y & a_y & p_y \\ n_z & s_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$
All the calculations were implemented in the RAPID program generation engine for an industrial ABB IRB 1400 manipulator, as shown in section 4.4.4. The robot manipulator was positioned in specific locations and the robot controller TCP was correlated with the computed TCP values in order to verify the algorithms.
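As an illustration of this chain multiplication, the following Visual Basic sketch (illustrative only, not the thesis code; BuildLinkMatrix is an assumed helper that fills a 4x4 link matrix from the link parameters of Figure 4.13 and the current joint angle) evaluates the arm matrix T numerically:
//****************************************************************
' Illustrative sketch: multiply two 4x4 homogeneous matrices, C = A * B
Sub MatMul4(A() As Double, B() As Double, C() As Double)
    Dim i As Integer, j As Integer, k As Integer
    For i = 0 To 3
        For j = 0 To 3
            C(i, j) = 0
            For k = 0 To 3
                C(i, j) = C(i, j) + A(i, k) * B(k, j)
            Next k
        Next j
    Next i
End Sub

' Chain-multiply the six link matrices to obtain T = 0A6
Sub DirectKinematicsSketch(T() As Double)
    Dim n As Integer, i As Integer, j As Integer
    Dim A(3, 3) As Double, Tmp(3, 3) As Double
    BuildLinkMatrix 1, T              ' T = 0A1 (assumed helper)
    For n = 2 To 6
        BuildLinkMatrix n, A          ' A = (n-1)An (assumed helper)
        MatMul4 T, A, Tmp             ' Tmp = T * A
        For i = 0 To 3                ' copy the product back into T
            For j = 0 To 3
                T(i, j) = Tmp(i, j)
            Next j
        Next i
    Next n
End Sub
//****************************************************************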
4.4.3 Analytical computation of the inverse kinematic model
Computing the inverse kinematic position model (IKPM) of a robot arm is explained in
terms of the basic trigonometric method for simple planar robot arms [27][28]. This
approach has been utilized on the ABB IRB 1400 robot with respect to the plane joint
axis.
Figure 4.15: Planar 3-R Manipulator with the three reference joint angles
The joint co-ordinates are first transformed to the end-effector co-ordinates, after which the resulting nonlinear trigonometric equations are solved using the 'atan2' function. The derivation proceeds in four steps, calculating the manipulator joint angles in turn; the final step is the calculation of Angle 2.
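For illustration, the sketch below shows an atan2-based solution in Visual Basic, simplified here to a planar two-link arm (the link lengths L1 and L2 and the routine names are assumptions, not the thesis derivation):
//****************************************************************
' Illustrative planar two-link inverse kinematics using an atan2 helper
Function Atan2(ByVal y As Double, ByVal x As Double) As Double
    Const PI As Double = 3.14159265358979
    If x > 0 Then
        Atan2 = Atn(y / x)
    ElseIf x < 0 Then
        If y = 0 Then
            Atan2 = PI
        Else
            Atan2 = Atn(y / x) + Sgn(y) * PI
        End If
    Else
        Atan2 = Sgn(y) * PI / 2
    End If
End Function

Sub PlanarIK(ByVal x As Double, ByVal y As Double, _
             ByVal L1 As Double, ByVal L2 As Double, _
             Angle1 As Double, Angle2 As Double)
    Dim c2 As Double, s2 As Double
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2) ' law of cosines
    s2 = Sqr(1 - c2 * c2)                                    ' elbow-down solution
    Angle2 = Atan2(s2, c2)
    Angle1 = Atan2(y, x) - Atan2(L2 * s2, L1 + L2 * c2)
End Sub
//****************************************************************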
4.4.4 Software Implementation of Robot Kinematics
During the operational setup of this research project all software implemented for
robot kinematics was constructed using Microsoft Visual Basic 6, instead of Visual C.
This was due to the limitations of the RobComm ActiveX components. A kinematic
function was created, which managed and manipulated the robot co-ordinate frame.
This was dependent on the axis orientation angles. This function provides the full TCP position (x, y, z) and the orientation quaternion (q1 - q4).
The quaternion value represents the rotational matrix of the tool co-ordinate system with respect to the robot base co-ordinate system. As seen in the previous section, the ABB robot positional structure is divided into the tool centre position (x, y, z) and the orientation of this TCP expressed as the quaternion (q1 - q4). This is the reason why the robot orientation is expressed as a quaternion instead of as axis angle positions.
The final arm matrix $^{R}T_H$ expresses the relationship between the base and the tool, as illustrated below.
Rotational Matrix
The columns of the rotational matrix correspond to the rotated x, y, z axes with respect to the tool co-ordinate system; the value of nx, for example, is the x component of the x vector. The quaternion values can be calculated from these matrix elements as illustrated by the following equations:
$q_1 = \frac{\sqrt{x_1 + y_2 + z_3 + 1}}{2}$

$q_2 = \frac{\sqrt{x_1 - y_2 - z_3 + 1}}{2}, \quad \mathrm{sign}\,q_2 = \mathrm{sign}(y_3 - z_2)$

$q_3 = \frac{\sqrt{y_2 - x_1 - z_3 + 1}}{2}, \quad \mathrm{sign}\,q_3 = \mathrm{sign}(z_1 - x_3)$

$q_4 = \frac{\sqrt{z_3 - x_1 - y_2 + 1}}{2}, \quad \mathrm{sign}\,q_4 = \mathrm{sign}(x_2 - y_1)$
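A minimal Visual Basic sketch of these equations is given below (illustrative only; x1..z3 are the rotation matrix elements defined above, and the routine name is an assumption):
//****************************************************************
' Illustrative computation of the quaternion (q1..q4) from the rotation matrix
Sub RotMatToQuat(ByVal x1 As Double, ByVal x2 As Double, ByVal x3 As Double, _
                 ByVal y1 As Double, ByVal y2 As Double, ByVal y3 As Double, _
                 ByVal z1 As Double, ByVal z2 As Double, ByVal z3 As Double, _
                 q1 As Double, q2 As Double, q3 As Double, q4 As Double)
    q1 = Sqr(x1 + y2 + z3 + 1) / 2
    q2 = Sqr(x1 - y2 - z3 + 1) / 2
    q3 = Sqr(y2 - x1 - z3 + 1) / 2
    q4 = Sqr(z3 - x1 - y2 + 1) / 2
    ' assign the signs from the off-diagonal elements
    If (y3 - z2) < 0 Then q2 = -q2
    If (z1 - x3) < 0 Then q3 = -q3
    If (x2 - y1) < 0 Then q4 = -q4
End Sub
//****************************************************************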
Figure 4.16 illustrates the ABB robot remote kinematic programming environment.
Figure 4.16: Robot Path Engine environment
The Visual Basic source code for the direct kinematics computation is illustrated below.
//*******************************************************************
' Convert the angle degree values to radians for the computation
A1 = AD1 * (PI / 180)
A2 = AD2 * (PI / 180)
A3 = AD3 * (PI / 180)
A4 = AD4 * (PI / 180)
A5 = AD5 * (PI / 180)
A6 = AD6 * (PI / 180)
ROBOT_ANGLE1 = AD1
ROBOT_ANGLE2 = AD2
ROBOT_ANGLE3 = AD3
ROBOT_ANGLE4 = AD4
ROBOT_ANGLE5 = AD5
ROBOT_ANGLE6 = AD6
'Link parameters (D-H offset angles)
ROBOT_OFFSET_ANGLE1 = Val(AO1) ' 0
ROBOT_OFFSET_ANGLE2 = Val(AO2) ' -90
ROBOT_OFFSET_ANGLE3 = Val(AO3) ' 0
ROBOT_OFFSET_ANGLE4 = Val(AO4) ' 0
ROBOT_OFFSET_ANGLE5 = Val(AO5) ' 0
ROBOT_OFFSET_ANGLE6 = Val(AO6) ' 180
Sy = -SN1 * (CS2 * CS3 - SN2 * SN3) * (CS4 * CS5 * SN6 + SN4 * CS6) - CS1 *
(SN4 * CS5 * SN6 - CS4 * CS6) + SN1 * SN5 * SN6 * (CS2 * SN3 + SN2 * CS3)
Syy = Val(Sy)
Sz = (SN2 * CS3 + CS2 * SN3) * (CS4 * CS5 * SN6 + SN4 * CS6) - SN5 * SN6 *
(SN2 * SN3 - CS2 * CS3)
Szz = Val(Sz)
Ax = -CS1 * CS4 * SN5 * (CS2 * CS3 - SN2 * SN3) - SN1 * SN4 * SN5 - CS1 *
CS5 * (CS2 * SN3 + SN2 * CS3)
Axx = Val(Ax)
Ay = -SN1 * CS4 * SN5 * (CS2 * CS3 - SN2 * SN3) - CS1 * SN4 * SN5 - SN1 *
CS5 * (CS2 * SN3 + SN2 * CS3)
Ayy = Val(Ay)
Az = CS4 * SN5 * (SN2 * CS3 + CS2 * SN3) + CS5 * (SN2 * SN3 - CS2 * CS3)
Azz = Val(Az)
Px = (-CS1 * CS4 * SN5 * d6 * (CS2 * CS3 - SN2 * SN3)) - (SN1 * SN4 * SN5 *
d6) - (CS1 * (CS2 * SN3 + SN2 * CS3) * (d6 * CS5 + d4)) + (CS1 * (aa3 * CS2 *
CS3 - aa3 * SN2 * SN3 + aa2 * CS1 * CS2 + aa1))
Py = -SN1 * CS4 * SN5 * d6 * (CS2 * CS3 - SN2 * SN3) + CS1 * SN4 * SN5 * d6 -
SN1 * (CS2 * SN3 + SN2 * CS3) * (d6 * CS5 + d4) + aa3 * SN1 * SN2 * SN3 - aa3
* SN1 * SN2 * SN3 + aa2 * SN1 * CS2 + aa1 * aa1 * SN1
Pz = CS4 * SN5 * d6 * (SN2 * CS3 + CS2 * SN3) + (SN2 * SN3 - CS2 * CS3) * (d6
* CS5 + d4) - (aa3 * SN2 * CS3 + aa3 * CS2 * SN3 + aa2 * SN2) + d1
//*******************************************************************
It was discovered during the research that programming a robot with a teach pendant was
very time consuming. A Visual Basic programming platform was developed to minimize this programming time.
Figure 4.17 illustrates the remote RAPID program architecture for motion control.
Program Header
To ensure that the ABB robot controller will respond correctly to the uploaded program, a program header is generated via the remote robot program engine as illustrated in Figure 4.17. The source code below generates this header: the "RobotPrgMAINHeader" function structures the robot program with the correct main procedure declaration.
//***************************************************************
// Robot Program HEADER
Sub RobotPrgMAINHeader(FileName_X As String)
Dim FileName As String
Dim FileString As String
FileName = FileName_X
'Robot File *.prg
Open "c:\" + FileName + ".prg" For Append As #1 ' Open file for output.
FileString = ""
Print #1, FileString
List1.AddItem FileString
FileString = " PROC main()"
Print #1, FileString
List1.AddItem FileString
Close #1 ' Close file.
End Sub
//****************************************************************
Program Footer
//****************************************************************
Sub RobotPrgFooter(FileName_X As String, PRG_Name As String)
Dim FileName As String
Dim FileString As String
FileName = FileName_X
'Robot File *.prg
Open "c:\" + FileName + ".prg" For Append As #1 ' Open file for output.
FileString = "ENDMODULE"
List1.AddItem FileString
Print #1, FileString
Close #1 ' Close file.
End Sub
//****************************************************************
Once the user has generated the robot trajectory header structure, the manipulator robot target (robtarget) variables can be declared via the software function discussed above. The software source code is illustrated below:
//****************************************************************
Sub Creat_Robot_robtargetVar(FileName_X As String, VAR_Type As String,
VAR_Name As String, VAR_Name_ArrayCNT As String, x As Variant, y As
Variant, z As Variant, q1 As Variant, q2 As Variant, q3 As Variant, q4 As
Variant)
' format
' PERS robtarget START_POS:=[[1054.66,-320.13,571.33],[0.002583,-
0.053022,0.998589,0.001637],[-1,0,-
1,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]];
q1 = Format(q1, "0.000000")
q2 = Format(q2, "0.000000")
q3 = Format(q3, "0.000000")
q4 = Format(q4, "0.000000")
FileName = FileName_X
'Robot File *.prg
Open "c:\" + FileName + ".prg" For Append As #1 ' Open file for output.
FileString = " " + VAR_Type + " robtarget " + VAR_Name + ":=[[" + x + "," + y + "," + z + "],[" + q1 + "," + q2 + "," + q3 + "," + q4 + "],[-1,0,-1,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]];"
List1.AddItem FileString
Print #1, FileString
Close #1 ' Close file.
End Sub
//****************************************************************
Robot Number Variable declaration
//****************************************************************
Sub Creat_Robot_NUMVar(FileName_X As String, VAR_Type As String,
VAR_Name As String, VAR_Name_ArrayCNT As String, NUM As Variant)
' format
' PERS num Z_POS8:=0;
FileName = FileName_X
'Robot File *.prg
Open "c:\" + FileName + ".prg" For Append As #1 ' Open file for output.
FileString = " " + VAR_Type + " num " + VAR_Name + ":=" + NUM + ";"
Print #1, FileString
List1.AddItem FileString
Close #1 ' Close file.
End Sub
//****************************************************************
Robot MOVE instructions are generated via the software function as discussed above. The software source code is illustrated below:
//****************************************************************
Sub Create_InstructionMOVE(FileName_X As String, INSTRU As String,
Pos_Var As String, SPEED As String, ACC As String, x As Variant, y As
Variant, z As Variant, q1 As Variant, q2 As Variant, q3 As Variant, q4 As
Variant, cf1 As Variant, cf4 As Variant, cf8 As Variant, cfx As Variant)
INST = INSTRU
FileName = FileName_X
'Robot File *.prg
Open "c:\" + FileName + ".prg" For Append As #1 ' Open file for output.
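' The MOVE instruction string is assembled here from INST, Pos_Var, SPEED, ACC and
' the target data, then written with Print #1 and the file closed with Close #1
' (these statements are not shown in this listing)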
End Sub
//****************************************************************
Robot Static sub-routines are generated via the HOME_POS function. The type
structure items can be requested via the software function as discussed above. The source code is illustrated below:
//****************************************************************
Sub HOME_POS(FileName_X As String)
' MoveAbsJ
[[0,0,0,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]],v200,z50,tool0;
End Sub
//****************************************************************
4.6 Off-line Development Environment to Program the Robot
Figure 4.18 illustrates the ABB robot remote programming environment, which provides an off-line RAPID program development platform.
The ABB RAPID programming environment consists of the following component functionality:
This provides a selection of motion instructions available for the ABB robot controller move instruction set, such as MoveJ and MoveL.
GetROB_Pos Button allows the user to get the current robot manipulator co-
ordinate position, which can be utilized with the development of a new RAPID
robot program. This is a useful tool to reduce RAPID robot programming time.
Should the base program variables be configured correctly, certain variables may
be updated on the fly. The GetRobPos position source code is illustrated below.
lblPosDatWobj(Index) = Format$(robpos.WObj)
lblPosDatEx1(Index) = Format$(robpos.eaxA)
lblPosDatEx2(Index) = Format$(robpos.eaxB)
lblPosDatEx3(Index) = Format$(robpos.eaxC)
lblPosDatEx4(Index) = Format$(robpos.eaxD)
lblPosDatEx5(Index) = Format$(robpos.eaxE)
End If
End Sub
Speed ComboBOX
This provides a selection of speed settings available for the ABB robot controller move instruction set, such as v100, v150, etc.
Zone ComboBOX
This provides a selection of zone size settings available for the ABB robot controller move instruction set, such as fine, 10, 20, etc.
Tool ComboBOX
This provides a selection of tool settings available for the ABB robot controller move instruction set, such as tool0.
Once all the motion instruction variables have been selected, the Add command button will update the program window with the correct MOVE instruction, which is appended to the RAPID program listing.
Save PRG Command Button - this will save all the robot motion sequences to a RAPID program file with the correct file format that has been illustrated in
Chapter 2.
While developing the system application, one of the main objectives was to establish how much flexibility the remote server would allow the remote user in accessing the robot controller functionality, and at what update rate. This would establish whether real-time control was feasible.
It was established that, in order to control the position of the robot, there were two possible approaches:
1. This first approach was to generate a complete RAPID program on the PC after
which this was uploaded via the remote Ethernet link. Thus for every new
position change the robot requires the system to regenerate an entirely new
RAPID program structure. The new program must then be uploaded to the robot.
A large amount of processing time was wasted uploading information to the robot.
This method certainly does not create a true real-time system and should only be utilized when a completely new program structure is required.
2. The second approach was to create a base RAPID program with persistent
variables. These variables are utilized for robot target positions. The persistent
variables can be updated during the auto-processing cycle on the fly. This
provides the robot with the capability of real-time position versatility within its working envelope.
To ensure that the remote Ethernet programming environment achieves a stable and reliable connection, the network link was first verified with a standard ping test:
C:\>ping 100.100.100.102
Pinging 100.100.100.102 with 32 bytes of data:
The remote device replies with the following packet information, if it is available:
RobComm Server Configuration
Once the Ethernet network has been correctly established as demonstrated above, the RobComm server can be configured to provide the vital link between the Bridge PC and the Robot Controller. Figure 4.19 illustrates the RobComm Ethernet server setup environment. In this environment an alias name such as 'S4' is established, which provides the link for the RobComm ActiveX components. For this alias name, a network address has to be manually configured and mapped to the network settings of the robot controller.
RobComm Status Monitor
The RobComm server provides a monitoring status window that allows the user to
confirm that the server has correctly established a connection with the robot controller.
Figure 4.20 illustrates the RobComm connectivity status environment, which confirms the connection to the robot controller.
4.8 Software Modules to Initiate Motion Control
This user environment provides the trajectory path manipulation and control mechanism
software. The robot system integration is handled via the RobComm ActiveX components.
Figure 4.21 illustrates the ABB robot remote trajectory application, which provides an integrated environment for the motion control modules described below.
RobComm Object Configuration
In order to ensure that the RobComm ActiveX components are initialized correctly the
following software object has to be initialized and a software alias created as follows:
//****************************************************************
'Create a RobotHelper object and assign the robot alias
robotHelper(0).Robot = "S4" 'change this to your robot alias
setUpRobots
//****************************************************************
This user environment provides manual functionality to manipulate and control the robot manipulator to pre-defined co-ordinate positions. Figure 4.22 illustrates the ABB robot remote manual motion environment for the dynamic and static RAPID sub-routines.
The following Visual Basic source code demonstrates the manual system control, which was utilized while testing the remote communication link. This code provides the sub-routine selection mechanism:
//*********************************************************************
Private Sub ManRoutines1_Click(Index As Integer)
Select Case Index
        ' the individual Case statements setting INDEX_MARKER are not shown in this listing
    End Select
End Sub
//*********************************************************************
The OptionButton "ManRoutines1(x)" triggers the manual sub-routine via the appropriate INDEX_MARKER value; this will call the correct robot controller sub-routine that has been pre-configured in the RAPID robot program. Figure 4.23 illustrates the case where the HOME_ROBOT manual function has been selected. This causes the application to set the robot controller RAPID program variable so that the HOME sub-routine is called, which in turn servos the robot manipulator to the HOME position. The robot
HOME position forces all the arm linkages to a zero degree state.
//*********************************************************************
Sub MANUAL_S4_HOME()
Dim S4_VarNum_Name As String
Dim S4_VarNum As Single
Dim S4_VarNumResult As Integer
Dim ResultSpec As Integer
Dim resultid As Long
'iNDEX MARKER = 1
S4_VarNum_Name = "INDEX_MARKER"
S4_VarNum = 1
S4_VarNumResult = Helper3.S4ProgramNumVarWrite(S4_VarNum_Name, 0,
S4_VarNum, ResultSpec, resultid)
End Sub
//*********************************************************************
In most industrial applications the robot vision system is located in a fixed viewing
position, which limits the object to a specific viewing position. In this research
environment the CCD camera has been attached to the end of the robot arm, providing
the system with additional viewing flexibility. This means that a number of objects can be viewed in different positions within the robot workcell. For this application only one viewing position has been utilized: a pre-defined position has been programmed, and the 'z' co-ordinate can be manipulated via the remote application. This 'z' co-ordinate displacement references the camera viewing height. This software sub-routine is triggered in the same manner as the HOME_ROBOT sub-routine; the only difference is that a CAPTURE software flag has been added to provide a viewing wait period, which prevents the robot controller from returning to the HOME position before the image has been captured correctly. These software algorithms provide the control mechanism platform for automatic robot control. The software source code is illustrated below.
//*********************************************************************
Sub MANUAL_S4_CBV()
'CAPTURE FLAG = 0
S4_VarNum_Name = "CAPTURE"
S4_VarNum = 0
S4_VarNumResult = Helper3.S4ProgramNumVarWrite(S4_VarNum_Name, 0,
S4_VarNum, ResultSpec, resultid)
'iNDEX MARKER = 2
S4_VarNum_Name = "INDEX_MARKER"
S4_VarNum = 2
S4_VarNumResult = Helper3.S4ProgramNumVarWrite(S4_VarNum_Name, 0,
S4_VarNum, ResultSpec, resultid)
End Sub
//*********************************************************************
The following Visual Basic source code is the vital component that handles the download of the object profile co-ordinate map to the relevant persistent robot variables, which are utilized to track the captured object profile. Figure 4.24 illustrates the software environment for this object profile tracking.
Figure 4.24: Software environment for object profile tracking
The download routine is synchronized via the data handshake routine that is discussed in
later chapters. The software source code which manages the robot trajectory position download is shown below:
Sub S4_PERS_VARIABLE_DOWNLOAD()
ROB_INDEX_MARKER = 0
S4RobotAlias = "S4"
S4_VarNum_Name = "INDEX_MARKER"
For CNT = 1 To 4
S4_VarNum_Name = "Y_POS" + Trim(Str(CNT))
S4_VarNum = Y_POS(CNT)
S4_VarNumResult = Helper3.S4ProgramNumVarWrite(S4_VarNum_Name, 0,
S4_VarNum, ResultSpec, resultid)
Next CNT
S4_VarNum_Name = "INDEX_MARKER"
S4_VarNum = 1
S4_VarNumResult = Helper3.S4ProgramNumVarWrite(S4_VarNum_Name, 0,
S4_VarNum, ResultSpec, resultid)
End Sub
//*********************************************************************
The following Visual Basic source code is a vital link for the automated robot control. This code provides the mechanism to control the robot controller via the Ethernet link, establishing and manipulating the control status of the robot controller.
//*********************************************************************
Dim status As Integer 'return status variable
Dim i As Integer
Dim resultid As Long
For i = 0 To (numRobots - 1)
Select Case button.Index:
Case 1: 'motors off button
status = robotHelper(i).S4Standby(NOTIFY_IF_ERROR, resultid)
Case 2: 'motors on button
status = robotHelper(i).S4Run(NOTIFY_IF_ERROR, resultid)
Case 4: 'Stop program cycle
status = robotHelper(i).S4Stop(0, 3, NOTIFY_IF_ERROR, resultid)
Case 5: 'run program 1 cycle
status = robotHelper(i).S4Start(0, "", 1, 1, NOTIFY_IF_ERROR, resultid)
Case 6: 'run program continuous
status = robotHelper(i).S4Start(0, "", -1, 1, NOTIFY_IF_ERROR, resultid)
Case 8: 'move to start of main, could be start of routine by emptying quotes
status = robotHelper(i).S4ProgramPrep(0, "main", -1, 1, NOTIFY_IF_ERROR,
resultid)
Case 9: 'load program
status = robotHelper(i).S4Standby(0, resultid)
Case 11: 'file manager
robotExplorer.Show
End Select
Next i
This code monitors the current robot controller event status, ensuring that the control mechanism has manipulated the correct system functionality during operation:
Private Sub robotHelper_StatusChanged(Index As Integer, OprState As Integer, CtlState
As Integer, PgmCtlState As Integer, PgmState As Integer)
Dim msg As String
If OprState <> prevOprState(Index) Then
'set up the info software versions data whenever opr state changes
lblS4Boot(Index).Caption = robotHelper(Index).BootVersion
lblS4Sys(Index).Caption = robotHelper(Index).SysVersion
lblRC(Index).Caption = robotHelper(Index).RAPVersion
lblApp(Index).Caption = robotHelper(Index).ControlId
Select Case OprState
Case 0
'msg = "CommLink Down"
S4OPERSTATE = "CommLink Down"
OprSTATEZ = "CommLink Down"
Case 1
'msg = "Initialization"
S4OPERSTATE = "Initialization"
OprSTATEZ = "Initialization"
Case 2
'msg = "Test < 250 mm/s"
S4OPERSTATE = "Test < 250 mm/s"
OprSTATEZ = "Test < 250 mm/s"
Case 3
'msg = "Going to Auto"
S4OPERSTATE = "Going to Auto"
OprSTATEZ = "Going to Auto"
Case 4
'msg = "Auto"
S4OPERSTATE = "Auto"
OprSTATEZ = "Auto"
Case 5
'msg = "Going Test 100%"
S4OPERSTATE = "Going Test 100%"
OprSTATEZ = "Going Test 100%"
Case 6
'msg = "Test 100%"
S4OPERSTATE = "Test 100%"
OprSTATEZ = "Test 100%"
Case Else
'msg = "Unknown "
S4OPERSTATE = "Unknown "
OprSTATEZ = "Unknown "
End Select
lblOprState(Index) = msg
prevOprState(Index) = OprState
End If
If CtlState <> prevCtrlState(Index) Then
Select Case CtlState
Case 1
'msg = "Initialization"
CtlSTATEZ = "Initialization"
Case 2
'msg = "Stand-By"
CtlSTATEZ = "Stand-By"
Case 3
'msg = "Power On"
CtlSTATEZ = "Power On"
Case 4
'msg = "Run"
CtlSTATEZ = "Run"
Case 5
'msg = "Power Off"
CtlSTATEZ = "Power Off"
Case 6
'msg = "Guard Stop"
CtlSTATEZ = "Guard Stop"
Case 7
'msg = "Emergency Stop"
CtlSTATEZ = "Emergency Stop"
Case 8
'msg = "Guard E-Stop"
CtlSTATEZ = "Guard E-Stop"
Case 9
'msg = "Stand-By E-Rst"
CtlSTATEZ = "Stand-By E-Rst"
Case Else
'msg = "Unknown"
CtlSTATEZ = "Unknown"
End Select
lblCtrlState(Index) = msg
prevCtrlState(Index) = CtlState
End If
msg = "Stopped"
Case 5
msg = "Full"
Case Else
msg = "Unknown"
End Select
lblPgmCtrlState(Index).Caption = msg
prevPgmCtrlState(Index) = PgmCtlState
End If
End Sub
//*********************************************************************
An Access database stores the object co-ordinate frame system. This is utilized to map the captured object profile co-ordinate system to the correct object with respect to the world co-ordinate system. Figure 4.26 illustrates the database structure.
Figure 4.26: Co-ordinate Access Database Structure
//*********************************************************************
'Setup DataBase File Structure
AccessFile = "c:\" + "Co_OrdinateFrameSet" + ".mdb"
FileCheck = Dir(AccessFile)
Set WBFlds(1) = WB.CreateField("x", dbLong)
WBFlds(1).Size = 50
//*********************************************************************
A file handling routine is utilized to synchronize the file information transfer to the robot controller. This file transfer is only
possible when the robot controller has been stopped and in a manual control state. The
source code below illustrates the file transfer mechanism, which handles the crucial file
information.
//*********************************************************************
Public Sub copyFiles()
'this routine copies files between devices
'the from list is the selection in the file list window
'the to device and directory are found in the treeview highlight fullpath
Dim status As Integer
Dim resultid As Long
Dim RobotName As String
Dim fromEquip As String
Dim fromDev As String
Dim fromName As String
Dim toEquip As String
Dim toDev As String
Dim toName As String
Dim tag As String
Dim strg As String
Dim i As Integer, j As Integer
Dim flist As ListItem
If InStr(1, tvwPlant.DropHighlight.FullPath, ":") > 0 Then 'legal drop point
For i = 1 To FileList.ListItems.Count
Set flist = FileList.ListItems(i)
If flist.Selected Then
'Text1.Text = Text1.Text + flist.Text + ", "
tag = flist.tag
'first on the from side...
strg = Mid$(tag, 13)
fromEquip = Left$(strg, InStr(1, strg, "\", 1) - 1)
strg = Mid$(tag, InStr(1, tag, fromEquip, 1) + Len(fromEquip) + 1)
j = InStr(1, strg, "\", 1)
If j > 1 Then
fromDev = Left$(strg, j - 1)
fromName = Mid$(strg, j) + "\" + flist.Text
Else
fromDev = strg
fromName = flist.Text
End If
'now the to side
strg = Mid$(tvwPlant.DropHighlight.FullPath, 13)
toEquip = Left$(strg, InStr(1, strg, "\", 1) - 1)
strg = Mid$(tvwPlant.DropHighlight.FullPath, InStr(1,
tvwPlant.DropHighlight.FullPath, toEquip, 1) + Len(toEquip) + 1)
j = InStr(1, strg, "\", 1)
If j > 1 Then
toDev = Left$(strg, j - 1)
toName = Mid$(strg, j) + "\" + flist.Text
Else
toDev = strg
toName = flist.Text
End If
If Not ((fromEquip = "This PC") Or (toEquip = "This PC")) Then
j = MsgBox("Can't copy from ramdisk to ramdisk (yet)", vbCritical, "Copy
Error")
Else
'set up the helper to work with the proper robot
If (fromEquip <> "This PC") Then
FMHelper.Robot = fromEquip
ElseIf (toEquip <> "This PC") Then
FMHelper.Robot = toEquip
End If
j = vbYes
If mnu_confirm.Checked Then
j = MsgBox("Do you want to copy: " + flist.Text + " to directory: " +
tvwPlant.DropHighlight.FullPath, vbYesNo + vbQuestion, "Confirmation")
End If
If j = vbYes Then
StatusBar.SimpleText = "Copying: " + flist.Text + " to directory: " +
tvwPlant.DropHighlight.FullPath
status = FMHelper.S4FileCopy(fromDev, fromName, toDev, toName, 3,
resultid)
If status <> 0 Then
j = MsgBox("Copy failed, status=" + Str(status), vbCritical, "Copy
Error")
StatusBar.SimpleText = ""
Else
StatusBar.SimpleText = "Copied: " + flist.Text + " to directory: " +
tvwPlant.DropHighlight.FullPath
End If
End If
End If
End If
Next i
Set tvwPlant.DropHighlight = Nothing
indrag = False
End If
End Sub
//*********************************************************************
DDE items are utilized as a placeholder for the different variables in the S4 robot
controller. To address those variables, the item naming must follow certain rules.
To connect a cell in an MS Excel worksheet to a digital output (ex: do1) in the S4 robot controller, a DDE formula referencing the server application name, the robot topic name and the item name is entered into the cell, as shown below.
Figure 4.27: Microsoft Excel DDE data simulation with DDE RobComm Server
To connect to a digital output (ex: do1) from Visual Basic, the LinkTopic property of a control is set to the server and topic names, and the LinkItem property is set to the item name (for example, A_DIGIO_RAPLONG_do1).
Connection to variables in the DDE Server is achieved by specifying the name of
the variable. There are pre-defined variables and variables defined by the user. The
variable names used to connect to the S4 robots are built up using the following
system:
Some variables can only be read, some can only be written to, and others can be
both read and written to. The name of the variable will indicate this in the first
character:
r_ read only
w_ write only
a_ read and write (automatic update variables)
The variables have different types that must be specified. The two most used types are:
num_ number (single float)
string_ string (text)
Other variable types used in the DDE Server are:
raplong_ number (long integer)
bool_ boolean (0 or 1) (i.e. true or false)
as well as complex types like:
wobjdata_ work object data
pos_ position data
speeddata_ speed data
tooldata_ tool data
There are many more data types supported by the DDE Server. Although you have
to address complex variables with their correct type, they are reported back as a
string with each field separated by a comma. The only RAPID variable types that can be reached from the DDE Server are those that are declared as persistent in the RAPID program.
The last field in a complete variable name is the name of the variable as it appears in the S4 controller: an I/O name, a RAPID variable name, a system variable name, etc.
a_digio_raplong_di1
a_rpvar_string_Message
a_rpvar_num_Counter
r_sys_raplong_pgmstate.PgmState
Access method: A_ or W_
Functional group: DIGIO_
Variable type: RAPLONG
Variable name: User defined
A_DIGIO_RAPLONG_di1
A_DIGIO_RAPLONG_do1
A_DIGIO_RAPLONG_ingroup1
A_DIGIO_RAPLONG_outgroup1
W_DIGIO_RAPLONG_do1
Persistent (PERS) Rapid variables defined and declared in your program modules.
Access method: A_ or W_
Functional group: RPVAR_
Variable type: STRING or NUM, as well as: wobjdata, pos, speeddata, tooldata, etc.
A_RPVAR_STRING_Message
A_RPVAR_NUM_Counter
W_RPVAR_STRING_Message
W_RPVAR_NUM_Counter
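As an illustration of how such an item can be used from Visual Basic, the sketch below sets up an automatic DDE link on a Label control. The server application name "S4DDE" and the topic name "ROBOT1" are placeholders and must match the names configured in the DDE Server; only the item name a_rpvar_num_Counter is taken from the examples above.
//****************************************************************
' Illustrative DDE link from a VB form; lblCounter is a Label control
Private Sub Form_Load()
    lblCounter.LinkMode = vbLinkNone
    lblCounter.LinkTopic = "S4DDE|ROBOT1"        ' application|topic (placeholder names)
    lblCounter.LinkItem = "a_rpvar_num_Counter"  ' persistent RAPID num variable
    lblCounter.LinkMode = vbLinkAutomatic        ' update whenever the robot value changes
End Sub
//****************************************************************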
4.10 Conclusion
The discussion in this chapter emphasized the robot motion programming structure, the different approaches to robot motion and the formulation of the robot arm kinematics. These software components ensure that the trajectory application is capable of generating on-line robot RAPID programs. Trajectory control generates robot RAPID programs in terms of data from the object profile, linking the robot co-ordinate system and the vision co-ordinate system. Practical equations that build up the relationship between the robot tool frame (represented by the TCP of the three-fingered gripper) and the object frame have been derived from the transformation matrices. An approach has thus been established for the automatic generation of robot programs, and on this basis modular RAPID programs are successfully generated.
Software I/O variable flags are employed as handshaking signals between the execution of the robot program and the manipulation of the TCP, so as to coordinate these two relatively independent processes.
A robot planner attempts to find a path from an initial robot world to a final robot world. The path consists of a sequence of operations that are considered primitive to the robot system.
CHAPTER 5
The advances in vision technology for robotics are expected to broaden the capabilities of
robotic vision systems to allow for vision-based guidance of the robot arm, complex
inspection for dimensional tolerances, and improved recognition and part location
capabilities. These will result from the constantly reducing cost of computational
capability, increased speed and the new and better algorithms currently being developed.
Robot vision plays a critical role in robot intelligence. In robot vision systems, geometric
feature extraction and representation are the two most important issues for which solutions must be found in terms of the application requirements. To make present robot vision systems suitable for various eye-hand applications, further research and improvement is still needed. To implement a profile-oriented robot vision system, the following specifications and sub-systems are required.
This research focused on developing a profile extraction system whose output is used by the automated robot control application. With an automated PC-based robot control system, the object position and orientation, as well as the object's profile, must be identified and represented accurately. A high resolution CCD camera is utilized to acquire the visual information of the object profile to be extracted by the vision system. To represent the objects efficiently and effectively, edge vector representation is employed. The vision task is divided into two sub-sections.
The first sub-section is the low-level vision. This is where the image acquisition and preprocessing are performed.
The second sub-section is the high-level vision. This includes extraction, modeling and profile recognition.
In this research project, the objective of the vision feedback system is to produce an image data co-ordinate map of the captured image profile. The software components that achieve this are described below.
In this research project, high-level vision provides an image profile co-ordinate map for
the robot trajectory application, which will in turn be utilized by the robot manipulation
and trajectory program generation engine to direct the robot to its final positions.
The raw captured image needs to be processed to ensure that an accurate profile can be
extracted. The mechanism utilized for the preprocessing of the image uses various filters
and templates. These are used to eliminate distortion of the original image and to ensure that an accurate binary representation of the object is obtained.
To ensure that a high quality image is captured, a high resolution CCD camera is utilized
to view the digitized image. The CCD camera is interfaced to a high-speed frame grabber
card, which processes the digital image. The 3D flash point frame grabber card is
equipped with programmable hardware in order to improve the viewed image. The
following are examples of programmable features: brightness, contrast, sharpness, etc. To facilitate image processing and to increase precision, a clear contrast
between the object and the background is needed. In this research project, a dark object
was placed on a white background. The vision system views a flat object from a vertical
position in order to eliminate shadows, non-uniform illumination and all distortions that
cannot be compensated for by adjusting the parameters of the frame grabber. Figure 5.2 illustrates this viewing arrangement.
The digital image captured is stored in the VGA system memory of the computer. The
RGB colour image is transferred into the program buffer for further processing. The
source code declaration illustrates the memory allocation for the captured image.
LPBYTE AllocateMemory() {
Figure 5.3: Memory allocation for the 3D Flash Point Frame Grabber [23]
Figure 5.4: Mechanism utilized to transfer the image into the allocated memory
Bitmaps that contain a color table are device-independent. A color table describes how
pixel values correspond to RGB color values. RGB is a model for describing colors that
are produced by emitting light. A DIB contains the following color and dimension
information:
The color format of the device on which the rectangular image was created
The resolution of the device on which the rectangular image was created
The palette for the device on which the image was created
An array of bits that maps red, green, blue (RGB) triplets to pixels in the
rectangular image
A data-compression identifier that indicates the data compression scheme (if any)
BITMAPINFOHEADER
typedef struct tagBITMAPINFOHEADER{ // bmih
DWORD biSize;
LONG biWidth;
LONG biHeight;
WORD biPlanes;
WORD biBitCount;
DWORD biCompression;
DWORD biSizeImage;
LONG biXPelsPerMeter;
LONG biYPelsPerMeter;
DWORD biClrUsed;
DWORD biClrImportant;
} BITMAPINFOHEADER;
In order for the object profile to be captured correctly, the image needs to be
preprocessed by applying standard matrix filter algorithms, which extract unwanted noise
and image distortion. The algorithms utilized sequentially on the captured image are -
image filtering,
noise cleaning,
averaging; and
image thresholding.
The spatial convolution technique applied to the digital image forms the basis of the filtering software [26].
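As an illustration of this spatial convolution, the following sketch applies a 3x3 template to a greyscale image held in a two-dimensional array. It is a simplified Visual Basic rendering of the approach (the research software implements it in C++); the array layout, routine name and Divisor argument (1 for the high-pass template) are assumptions.
//****************************************************************
' Illustrative 3x3 spatial convolution over a greyscale image Img(0..W-1, 0..H-1)
Sub Convolve3x3(Img() As Integer, ByVal W As Integer, ByVal H As Integer, _
                Template() As Integer, ByVal Divisor As Integer)
    Dim Out() As Integer
    Dim x As Integer, y As Integer, i As Integer, j As Integer
    Dim Acc As Long
    ReDim Out(W - 1, H - 1)
    For y = 1 To H - 2                              ' skip the one-pixel border
        For x = 1 To W - 2
            Acc = 0
            For j = -1 To 1
                For i = -1 To 1
                    Acc = Acc + CLng(Img(x + i, y + j)) * Template(i + 1, j + 1)
                Next i
            Next j
            Acc = Acc \ Divisor
            If Acc < 0 Then Acc = 0                 ' clamp to the 0-255 greylevel range
            If Acc > 255 Then Acc = 255
            Out(x, y) = CInt(Acc)
        Next x
    Next y
    For y = 1 To H - 2                              ' copy the filtered result back
        For x = 1 To W - 2
            Img(x, y) = Out(x, y)
        Next x
    Next y
End Sub
//****************************************************************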
The convolution filters utilized are:
high-pass filters,
low-pass filters,
median filters.
These filters are combined to ensure that the ideal image is achieved.
The high-pass filter is utilized to sharpen the image that is out of focus or fuzzy. The
function uses a 3x3 convolution matrix filter, which emphasizes differences in grey pixel
levels in the 3x3 neighborhood about the central pixel window. Figure 5.5 illustrates the high-pass convolution template:

$H_{\text{high-pass}} = \begin{bmatrix} -1 & -1 & -1 \\ -1 & 9 & -1 \\ -1 & -1 & -1 \end{bmatrix}$
const int nConV [3][3] = {{-1, -1, -1}, {-1, 9, -1}, {-1, -1, -1}};
ConvolutionMech (nConV);
Once the filtering process has been completed, the image is sharpened and the low
frequency component in the image is removed. The template matrix used in the function
is described in chapter 3.
Random noise appears as spikes on a noisy image. The noise is removed by utilizing a median filter, which ranks the pixel intensity levels in a 3x3 matrix window. This is used to smooth out 'salt and
pepper’ noise effects. Random noise becomes less apparent after a median filter has been
applied to the image. This creates a smoother image. The implementation of median
filtering selects windows of pixel data from a 3x3 array of pixels. The pixel set consists
of nine pixels. The intensity of the pixel set is analyzed in order of grey scale magnitude.
The new value of the central matrix point is the median of these intensities. The 3x3 matrix window is moved over the entire video memory image grid so that the whole image is analyzed.
The low-pass filter is utilized to attenuate fine detail as well as smooth or soften sharp images. A 3x3 matrix convolution filter is utilized to provide a filtering mechanism that eliminates the high frequency components from the image:
$H_{\text{low-pass}} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 1 \end{bmatrix}$
const int nConV [3][3] = {{1, 1, 1}, {1, 2, 1}, {1, 1, 1}};
ConvolutionMech (nConV, GREYSCALE, 10);
Random noise signals that occur in the captured image are reduced or eliminated with
noise cleaning techniques. A 3x3 matrix convolution is utilized to smooth out this noise.
$H_{\text{noise-cleaning}} = \begin{bmatrix} 1 & 2 & 1 \\ 2 & 4 & 2 \\ 1 & 2 & 1 \end{bmatrix}$
const int nConV [3][3] = {{1, 2, 1}, {2, 4, 2}, {1, 2, 1}};
ConvolutionMech (nConV, GREYSCALE, 12);
5.3.3 Averaging
The averaging technique is utilized to eliminate noise spikes and clean edge features in
the image. The averaging function uses a 3x3 matrix sliding convolution window to average the neighboring pixel values:

$H_{\text{averaging}} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 1 \end{bmatrix}$
const int nConV [3][3] = {{1, 1, 1}, {1, 0, 1}, {1, 1, 1}};
ConvolutionMech (nConV, GREYSCALE, 8);
Figure 5.12: 3x3 matrix convolution filter implemented as a software call function
This thresholding technique is the main mechanism that provides the binary image building block for profile extraction. A greylevel intensity range is established. The entire pixel image is analyzed and each pixel is altered
according to its level of intensity. Should the current pixel fall below the greylevel range,
the pixel is assigned to the background, while the other pixels that are equal or above the
greylevel range are considered to be the object and are assigned to an object reference
pixel. The greylevel ranges from 0 – 255. The manual threshold value is determined by
calculating the difference of the intensity level between the object and the background.
Figure 5.13 illustrates an accurate binary image achieved through filtering, noise cleaning and thresholding. The threshold routine is declared as follows:
void CImageProfileProcessing::TH()
{
int i, j;
Thresholding is one of the most commonly used preprocessing tools utilized in image processing.
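A minimal Visual Basic sketch of this manual thresholding step is shown below (illustrative only; the research implementation is the C++ routine declared above, and the array layout and routine name are assumptions):
//****************************************************************
' Illustrative manual thresholding: pixels below TH become background (0),
' the remaining pixels become object reference pixels (255)
Sub ThresholdImage(Img() As Integer, ByVal W As Integer, ByVal H As Integer, _
                   ByVal TH As Integer)
    Dim x As Integer, y As Integer
    For y = 0 To H - 1
        For x = 0 To W - 1
            If Img(x, y) < TH Then
                Img(x, y) = 0
            Else
                Img(x, y) = 255
            End If
        Next x
    Next y
End Sub
//****************************************************************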
5.4 Boundary Detection
The boundary of the object features a sharp greylevel transition. If the edges are reliably
strong, and the noise level is low, one can establish the edge magnitude of the image with
a close chain mechanism. Close chain discontinuities occur at boundaries, which result
from noise and other interferences. Edge linking and edge thinning must be applied after
edge detection.
Edge detection is implemented by analyzing each pixel neighborhood and quantifying the
slope and direction of the object boundary. In this research project, edge detection utilizes gradient-based techniques.
This technique analyses the video pixel binary image and manipulates any pixels which
lie on the boundary between the object and the background. In this application, linear
interpolation is employed to create a continuous boundary. Figure 5.14 (a) shows the
discontinuity from point A to point B on the boundary, while figure 5.14 (b) shows the
Figure 5.14: (a) Discontinuity between points A and B on the object boundary; (b) line segment OP, with components Xe and Ye, used for linear interpolation
The discontinuity from A to B in figure 5.14 (a) is expressed by the line segment OP in figure 5.14 (b). Assume the distance from origin O to P is Xe pixels along the x axis and Ye pixels along the y axis. The linear interpolation method is employed to join point O and point P by incrementing an interpolation variable. X, Y, Xr and Yr stand for registers that store the interpolation variables in the x and y directions.
Figure 5.15: Block diagram of interpolation principle
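A minimal Visual Basic sketch of this linear interpolation is given below (illustrative only; MarkPixel is an assumed helper that sets a boundary pixel in the binary image, and Xe and Ye are the pixel distances from O to P):
//****************************************************************
' Illustrative linear interpolation of the boundary gap from O towards P
Sub InterpolateGap(ByVal x0 As Integer, ByVal y0 As Integer, _
                   ByVal Xe As Integer, ByVal Ye As Integer)
    Dim Steps As Integer, n As Integer
    Steps = Abs(Xe)
    If Abs(Ye) > Steps Then Steps = Abs(Ye)          ' step along the longer axis
    If Steps = 0 Then Exit Sub
    For n = 0 To Steps
        MarkPixel x0 + CInt(CDbl(Xe) * n / Steps), y0 + CInt(CDbl(Ye) * n / Steps)
    Next n
End Sub
//****************************************************************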
This method allows the profile to be traced and thinned to a single pixel width.
The chain code mechanism is utilized to represent the object profile. The chain codes that
are utilized in this research project comprise sets of straight-line segments of specified
length and direction, which correlate to the object boundary sides. Edge thinning is
performed while edge following is being executed. A single pixel-wide profile is then
generated. The chain code contains the start pixel address followed by a string of code
words. Figure 5.16 shows the profile of the object after edge linking, edge following, and
edge thinning.
Extraction of the profile from the object becomes a crucial mechanism for robot path
manipulation. Closed chain vectors represent the object profile, which is illustrated in
Figure 5.17. A continuous vector tracing method was developed, which utilizes standard
mathematical algorithms. This ensures that the chain profile segments describe the object boundary accurately.
Figure 5.17: Object Profile
This mechanism was developed to represent the profile of the object by means of a closed
chain of vectors. In the experimental work the profile was a square object of various sizes. This vector
tracing method locates the starting point of the vector chain within a given tolerance.
The magnitude of a vector $v = ix + jy$ is

$|v| = \sqrt{x^2 + y^2}$

The unit vector $v_0$ of vector $v$ is defined as

$v_0 = \frac{v}{|v|} = \frac{ix + jy}{\sqrt{x^2 + y^2}}$

The angle $\theta$ between two vectors $v_1$ and $v_2$ can be calculated by

$\theta = \cos^{-1}(v_{10} \cdot v_{20})$

where $v_{10} \cdot v_{20}$ stands for the dot product (or inner product) of $v_{10}$ and $v_{20}$, which are the unit vectors of $v_1$ and $v_2$ respectively. The discriminator of two perpendicular vectors $v_1$ and $v_2$ is defined as

$v_{10} \cdot v_{20} = 0$

The discriminator of two parallel vectors $v_1$ and $v_2$ sharing the same direction is defined as

$v_{10} \cdot v_{20} = 1$

The discriminator of two parallel vectors $v_1$ and $v_2$ having opposite directions is defined as

$v_{10} \cdot v_{20} = -1$
This algorithm is used to locate each corner (or node) of the profile within a given tolerance.
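As an illustration, the sketch below uses the unit-vector dot product defined above to decide whether two successive profile vectors form a corner (node) within a given angular tolerance; the routine name and the tolerance argument are assumptions:
//****************************************************************
' Illustrative corner test for two successive profile vectors (x1,y1) and (x2,y2)
Function IsCorner(ByVal x1 As Double, ByVal y1 As Double, _
                  ByVal x2 As Double, ByVal y2 As Double, _
                  ByVal TolDeg As Double) As Boolean
    Const PI As Double = 3.14159265358979
    Dim m1 As Double, m2 As Double, dot As Double
    m1 = Sqr(x1 * x1 + y1 * y1)
    m2 = Sqr(x2 * x2 + y2 * y2)
    If m1 = 0 Or m2 = 0 Then Exit Function        ' degenerate vector: no corner
    dot = (x1 * x2 + y1 * y2) / (m1 * m2)         ' v10 . v20 = cos(theta)
    ' a corner is flagged when the angle between the vectors exceeds the tolerance
    IsCorner = (dot < Cos(TolDeg * PI / 180))
End Function
//****************************************************************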
Figure 5.18 illustrates the profile image co-ordinate map, which is utilized to generate the robot trajectory path.
5.6 Conclusion
Vision feedback systems play a crucial role in robot trajectory manipulation. This system
is by far the most complex feedback system that can be utilized by a robot controller.
This complex system ensures that the robot controller gains maximum movement
flexibility. Real-time image processing is highly dependent on the relevant algorithms.
The overhead of the algorithm processing time influences the real-time closed-loop feedback system. The real-time time frame can be analyzed by calculating the time from the moment the vision system views the object to the moment the robot controller reacts to the visual information. In this research project it was shown that real-time robot manipulation based on vision feedback can be achieved.
CHAPTER 6
CONCLUSION
This chapter discusses the final project results. It will also illustrate the accomplishments
and contributions of this study with regards to the objectives set out in Chapter 1.
Problems that were encountered during the project's development, as well as future development possibilities, are also discussed.
Profile recognition and an automated robot trajectory system were created to meet these objectives.
A standard industrial ABB robot and bridge PC were used to test the project results. The
two entities were linked together with a standard industrial LAN (Ethernet protocol). The
system network link was configured in a peer-to-peer format. The bridge PC was
equipped with two Ethernet cards as discussed in Chapter 3. One of the network cards
was used for the network link between the robot controller and the bridge PC, the other
network card was used for the LAN which enabled remote communication from the
internet. An integrated software application was divided into two main areas, the vision sensory system and the robot trajectory path control system, as illustrated in Chapter 4. All the image processing and extraction is handled by the first component, while the second component manages the motion control. Software was also developed to enable communication between these two components.
The following was performed and illustrated:
The object was extracted and managed via the CCD camera and frame grabber. The square flat black object was placed on a white background to create a high-contrast image. The image was grabbed into the vision extraction application, which analyzed it and extracted the object profile.
The object profile was manipulated so that all the crucial co-ordinates were extracted, such as the marker positions of all the corners of the square object. The
co-ordinates were utilized to create a robot trajectory path for the ABB industrial
robot controller.
The ABB industrial robot was utilized to provide position manipulation from the extracted results.
The Ethernet cards were installed into both devices (bridge PC and robot
controller).
The Ethernet protocol was successfully established via the configuration of the
ABB controller and bridge PC. This was done via the RobComm server.
The remote robot position manipulation was successfully achieved by creating a remote RAPID DDE robot programming environment. The DDE engine ensures that a standard SCADA application can be integrated with the system. The profile recognition system was successfully integrated with the robot trajectory path motion control.
The experimental validation of the profile recognition and automated PC-based robot control system was performed on a variety of distinct objects. The experiments were completed successfully. The generic algorithms developed in the robot vision system give practical results and generate RAPID programs automatically with respect to the object profile and orientation.
Seamless integration was achieved between the individual modules; the serial data flow supports each module effectively.
Problems that were encountered during this research project were of a software nature.
All hardware was available, but the software limited full control of the hardware. The
software problems that were encountered were overcome in time, which ensured that the project could be completed. The first bridge PC Ethernet card was used for the remote communication between the robot controller and the bridge PC. The second bridge PC Ethernet card was connected to a second remote PC to simulate remote LAN communication. The IP address utilized was of a static nature and had to be configured manually.
The current Ethernet hardware for the ABB industrial robot controller required specific
firmware and system services to be installed in order for the Ethernet service to be
activated. Initially these were not available. A large amount of time was wasted sourcing
the correct version of firmware for the robot baseware software. This problem was later
solved.
The FactoryWare software that was available was utilized to bridge the Ethernet communication gap. This software was found to be unstable at times due to software compatibility problems within the Visual C environment. Therefore a Visual Basic platform had to be developed and was used to manage the remote communication between the robot controller and the bridge PC. This second platform had an impact on the real-time environment, as it increased the overall response time.
There are many possible extensions from this basic setup. This research project proved to be a sound platform for further development. An additional camera or depth sensor could be incorporated to provide the profile depth. This will provide a 3D virtual image, which can be integrated with the robot RAPID programming environment. After the entire robot environment has been simulated around the product tooling, a base robot software program can be generated ahead of time. Only fine tuning of position need be corrected during project system commissioning. The Ethernet communication environment also provides the ideal platform for real-time remote control: with such a system the robot can automatically follow a moving object and make intelligent path decisions based on the vision feedback.
The aim of this research project was to provide a platform for vision feedback and an automated remote robot environment. During the research it was proved that a robot is able to react to vision feedback sensory information. This platform can prove to be a valuable foundation for future industrial applications.