
Predictive “Seam Tracking”

Robotic Welding
Optimization

GRACO
Automation and Control Group
University of Brasília - UnB
[email protected]
www.graco.unb.br
Bauchspiess, A. & Absi Alfaro, S.C.
Objective (1)

• Develop an algorithm for a sensor-guided welding manipulator, combining:
 - sensor information;
 - manipulator dynamics;
 - a path generation model.
• Minimise the tracking error.
Objective (2)

• Development and implementation of a vision system for “seam tracking” in robotic welding cells with dynamic tracking error:
 “Determine a reference trajectory (welding path) with respect to the welding joint using the ‘Model Following Algorithm’ and send this information directly to the robot controller.”
Contents

• Introduction
• Material Resources
• Methodology
• Results
• Conclusions
Introduction (1)

• The welding robot should follow the welding joint path precisely, taking into account:
 - robot mechanical inertia;
 - CCD camera look-ahead information;
 - computer supervisor;
 - vision system: guidance & weld pool watching;
 - teaching and recording of the welding path.
• Model Following = predictive servo-controlled algorithm:
 - dynamic model of the robot +
 - captured future trajectory +
 - artificial intelligence.
Introduction (2)

• Integration of the welding cell:
 - welding power source: communication;
 - welding fitting;
 - welding robot;
 - computer supervisor;
 - vision system: guidance & weld pool watching.
• Welding procedure:
 - definition of the welding task;
 - definition of the welding path;
 - teaching and recording of the welding path;
 - off-line programming.
Material Resources
• Welding power source: Commander BDH320
 - Electrical characteristics: 5 A to 320 A, 80 V open-circuit voltage; wire feed 1 to 24 m/min.
 - Control: welding parameters.
 - Robot interface: link to the robot & computer.
• Industrial robot: IRB2000
 - Type: 6 articulated axes.
 - Sensors & actuators: synchronous motors & resolvers.
 - Input/output: teaching box, floppy drive, serial communication.
• Computer: PC Pentium, 200 MHz
 - Roboweld: communication verification & documentation.
 - Infoweld: documentation & power source control.
• Vision system
 - Diode laser: 40 mW, 693.9 nm, cylindrical lens.
 - CCD camera: 725 (H) x 582 (V) pixels, 625 lines.
 - PC video frame grabber card: 8 MB, 16 bits.

[Figure: welding cell layout - torch, workpieces, robot, robot controller, PC with frame grabber, CCD camera, composite video and RS-232 links.]
Methodology
• Problem definition of dynamic system tracking:
 - the tracking problem;
 - the servo problem;
 - the model following problem.
[Figure: laser light stripe photograph.]

• Image feature definition (see the sketch below):
 - the processing algorithms rely on analysis of the reflected light (laser stripe);
 - image digitisation: matrix of picture elements (pixels);
 - l(x,y): 8-bit word (0 = black, 255 = white);
 - laser stripe modelled as the straight line y = a.x + b.
Predictive Path Tracking
• Basic ideas:
 - path error minimisation within the sensor (vision) field;
 - internal path generator model.
• Goal: find the stacked control sequence
  $u_s(k) = [\, u_s(k) \;\; u_s(k+1) \;\; \cdots \;\; u_s(k+m-1) \,]^T$
  such that the quadratic cost over the prediction horizon is minimised (sketch below):
  $J = \sum_{i=0}^{n-1} \left( \varepsilon_i^{T} Q\, \varepsilon_i + u_{s,i}^{T} R\, u_{s,i} \right) \rightarrow \min$
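The sketch below illustrates how such a quadratic cost can be minimised over a finite horizon using look-ahead reference data, assuming a generic linear model x(k+1) = A x(k) + B u(k), y = C x. The matrices A, B, C, the weights Q, R and the toy reference are illustrative assumptions, not the identified dynamics of the IRB2000 or the controller reported here.

```python
import numpy as np

def predictive_control(A, B, C, x0, y_ref, Q, R):
    """Minimise J = sum eps_i^T Q eps_i + u_i^T R u_i over a finite horizon
    for x(k+1) = A x(k) + B u(k), y = C x (unconstrained, toy sketch)."""
    n = len(y_ref)                        # horizon given by look-ahead camera data
    ny, nu = C.shape[0], B.shape[1]
    # Prediction matrices: stacked output y = F x0 + G u
    F = np.vstack([C @ np.linalg.matrix_power(A, i + 1) for i in range(n)])
    G = np.zeros((n * ny, n * nu))
    for i in range(n):
        for j in range(i + 1):
            Gi = C @ np.linalg.matrix_power(A, i - j) @ B
            G[i*ny:(i+1)*ny, j*nu:(j+1)*nu] = Gi
    Qb = np.kron(np.eye(n), Q)
    Rb = np.kron(np.eye(n), R)
    # Setting dJ/du = 0 gives a linear system for the optimal sequence
    H = G.T @ Qb @ G + Rb
    f = G.T @ Qb @ (np.asarray(y_ref).reshape(-1, 1) - F @ x0)
    u = np.linalg.solve(H, f)
    return u[:nu]                         # apply only the first control move

# Toy double-integrator axis tracking a seam offset seen ahead by the camera
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]])
u0 = predictive_control(A, B, C, x0=np.zeros((2, 1)),
                        y_ref=[0.2, 0.4, 0.6, 0.8, 1.0],
                        Q=np.eye(1), R=0.01 * np.eye(1))
print(u0)
```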
[Figure: predictive CCD camera guided robot.]
Image Processing Algorithms 1
According to Figure 2, the distance of the camera (and torch) from the workpieces can be estimated by trigonometric analysis (triangulation): depending on this distance, the light stripe position in the image shifts from top to bottom, as in the sketch below.
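A minimal triangulation sketch, assuming a simple parallax geometry (laser offset from the optical axis by a fixed baseline); the parameter names and values are illustrative and do not reproduce the angled arrangement of Figure 2.

```python
import math

def stripe_distance(pixel_offset, focal_length_mm, baseline_mm, pixel_pitch_mm):
    """Estimate the workpiece distance from the vertical image position of the
    laser stripe, assuming the laser runs parallel to the optical axis at a
    fixed baseline (parallax model; parameters are illustrative only)."""
    offset_mm = pixel_offset * pixel_pitch_mm       # stripe displacement on the sensor
    if offset_mm == 0:
        return math.inf                             # stripe at the optical centre
    return focal_length_mm * baseline_mm / offset_mm

# Stripe found 40 pixels below the image centre
print(stripe_distance(pixel_offset=40, focal_length_mm=16,
                      baseline_mm=50, pixel_pitch_mm=0.011))   # ~1818 mm
```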
Two windows are assigned on the image to be analysed, as seen in Figure 3, where a general situation is illustrated (two plates to be joined that do not necessarily lie in the same plane). Inside the left and right windows the light stripes are characterised, and in the region between the windows the gap position is determined.
The image (matrix of pixels) is scanned within each window along vertical straight lines, searching for the upper and lower edges of the light stripe; a middle point is assigned to each pair of edges. The length of the vertical scanning lines is adjusted according to the light stripe width, and their initial and final co-ordinates are automatically re-centred near the previously determined middle point, so that the algorithm runs faster (see the sketch below).
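A sketch of the window scanning described above: vertical lines are scanned for the upper and lower stripe edges, a midpoint is recorded for each pair, and the scan band is re-centred on the previous midpoint. Threshold, step and band width are assumptions chosen only for illustration.

```python
import numpy as np

def stripe_midpoints(image, x_start, x_stop, y_guess, half_span=20,
                     edge_threshold=128, step=4):
    """Scan vertical lines inside the window [x_start, x_stop) and return
    (x, y_mid) points of the laser stripe.  The scan is restricted to a band
    of +/- half_span rows around the previous midpoint."""
    points = []
    y_prev = y_guess
    for x in range(x_start, x_stop, step):
        y0 = max(0, int(y_prev) - half_span)
        y1 = min(image.shape[0], int(y_prev) + half_span)
        column = image[y0:y1, x]
        bright = np.flatnonzero(column >= edge_threshold)
        if bright.size:
            up_edge, down_edge = bright[0], bright[-1]   # upper and lower stripe edges
            y_mid = y0 + (up_edge + down_edge) / 2.0     # midpoint between the edges
            points.append((x, y_mid))
            y_prev = y_mid                               # re-centre the next scan line
    return points

# Usage (img is the 8-bit frame grabber image, window limits are assumed):
# left_pts  = stripe_midpoints(img, 50, 300, y_guess=290)
# right_pts = stripe_midpoints(img, 420, 670, y_guess=290)
```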
Image Processing Algorithms 2
For each light stripe (left and right) a set of representative middle point co-ordinates is determined, and for each set a straight-line equation is fitted by the least-squares method.
The image is then scanned along the fitted straight lines to find the gap when the plates lie in the same plane; if they lie in non-parallel planes, the intersection of the two lines indicates the gap middle point (see the sketch below).
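Assuming the two least-squares lines y = a_left·x + b_left and y = a_right·x + b_right obtained as above, a sketch of the intersection test for non-coplanar plates:

```python
def gap_from_intersection(a_left, b_left, a_right, b_right):
    """Gap middle point as the intersection of the two fitted stripe lines.
    Returns None when the lines are (nearly) parallel, i.e. the plates lie in
    the same plane and the gap must be found by grey-level scanning instead."""
    if abs(a_left - a_right) < 1e-6:
        return None
    x = (b_right - b_left) / (a_left - a_right)
    y = a_left * x + b_left
    return x, y
```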
At the borders of the gap there is a decay of the pixel grey level; the borders are located and the gap width is determined, together with the gap position in the image matrix, as sketched below.
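A sketch of the grey-level scan along a fitted line; the decay threshold is an assumed value chosen only for illustration.

```python
import numpy as np

def gap_borders(image, a, b, x_range, decay_threshold=60):
    """Walk along the fitted stripe line y = a*x + b and report the columns
    where the grey level falls below decay_threshold, i.e. the left and right
    borders of the gap, plus the gap width in pixels."""
    xs = list(range(*x_range))
    levels = np.array([image[int(round(a * x + b)), x] for x in xs])
    dark = np.flatnonzero(levels < decay_threshold)
    if dark.size == 0:
        return None                          # no gap found inside this range
    left, right = xs[dark[0]], xs[dark[-1]]
    return left, right, right - left
```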
As a consequence, the position of the gap relative to the torch is determined, and if a correction is needed a proper command is sent to the robot controller.
If the plates to be joined do not lie in parallel planes, the slopes of the fitted straight lines can be used to determine the inclination between them and the torch, so the torch orientation can be changed to satisfy the welding requirements (Figure 4).
The width of the windows is changed dynamically to track the movement of the gap across the image.
Figure 1. Torch-laser-camera assembly (photograph).
Figure 2. Laser-camera assembly for triangulation (CCD sensor, laser, focal length f, angle a, distances d and D).
Figure 3. Windowed scanning approach of the algorithms.
Figure 4. Geometrical characterisation of the angular plates to be welded.

• The intersection of the two laser stripe segments lies at the welding groove:

  $\gamma = \tan^{-1}\left[ (\tan\alpha)\,(\tan\beta) \right]$
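A direct evaluation of the relation above, as a sketch; the symbol names α and β (the measured stripe inclinations of Figure 4) are reconstructed, and the mapping from the fitted image slopes to these angles (camera calibration) is not shown.

```python
import math

def groove_angle(alpha_deg, beta_deg):
    """Evaluate gamma = atan(tan(alpha) * tan(beta)) in degrees, where alpha
    and beta are the inclinations of the two laser stripe segments."""
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    return math.degrees(math.atan(math.tan(alpha) * math.tan(beta)))

print(groove_angle(30.0, 45.0))   # ~30 degrees for this illustrative pair
```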
Results:
• Development of a vision system for “seam tracking” welding with dynamic tracking error.
• Development of image processing algorithms:
 - parameter identification of image characteristics;
 - identification of the reference trajectory.
• Use of parallel processing (Transputer) for the implementation of real-time algorithms.

Conclusion:
• Application of a computer vision system to welding.
• Welding with automatic “seam tracking” and error correction.
• Implementation of a full visual servo control.
