
2018 4th International Conference on Control, Automation and Robotics

Safe Human Robot Collaboration — Operation Area Segmentation for Dynamic Adjustable Distance Monitoring

Martin J. Rosenstrauch, Jörg Krüger
Institute for Machine Tools and Factory Management, Technische Universität Berlin
Institute for Production Systems and Design Technology, Fraunhofer IPK
Berlin, Germany
e-mail: [email protected]

Abstract—A core challenge of human robot collaboration is to ensure safety. By renouncing guards, potential hazards arise. In this paper an approach is presented in which the actually used operating area of an industrial robot is segmented out of its whole workspace. For this purpose an industrial robot is monitored with a time of flight sensor. Subsequently the recorded 3D data is processed and the operating area is segmented by applying an octree algorithm. The information thus determined about the hazard zone can be used for an application specific adjustment of safety periphery systems. This is exemplarily realized by ultrasound distance sensor modules which are individually parametrized depending on the pre-segmented operating area.

Keywords-safety; industrial robot safety; human robot collaboration; time of flight; robot operating area; distance monitoring

Figure 1. Schematic representation of an operating industrial robot - left side, red: robot workspace; right side, red: actual operating area.

I. INTRODUCTION

The trend of human robot collaboration in recent years has presented new challenges regarding safety. Lightweight designs [1]-[3], force-torque sensor based control concepts, pressure sensitive skins [4]-[6] and other approaches have led to the development of diverse new robots, specifically for safe applications. They enable operation modes in consideration of the latest safety standards [7] for collaboration. But in 2015 the International Federation of Robotics estimated a worldwide stock of 1.6 million industrial robots [8], of which only 3,000 units out of 240,000 sold industrial robots were dedicated collaborative robots [9], [10]. Therefore further research is required to integrate existing industrial robots without the inherent safety features mentioned above into new collaborative applications.

In most industrial use cases the executed robot program operates repetitively, which means it has a recurring sequence of movements. Within that sequence of movements it also very often uses only a small part of the fully reachable workspace, called the operating area, which is schematically illustrated in fig. 1. However, with the emerging trend of highly flexible collaboration scenarios, including a high degree of modification of the robot's tasks, solutions are needed for a quick and efficient adjustment of safety requirements.

Knowing the operating area in any manner leads to new opportunities, generally in human robot safety. Risk analyses can be supported or even simplified by easier localization and visualization of hazard zones, in particular for already existing or ad hoc taught-in robot programs. Besides that, a high technical safety effort can be reduced by the additional information about the operating area. For example, joint angle thresholds for operating area limitations can be detected more easily and integrated into the robot control. Another advantage of knowing the operating area is the identification of adequate positions for safety monitoring camera systems.

II. APPROACH

Figure 2. Schematic representation of the overall system.

The complete approach introduced in this paper consists of two parts. The main part is made up of the segmentation of a robot operation area. To demonstrate the benefit, an exemplary ultrasound distance sensor system consisting of modules identical in construction is presented second. These modules can be individually parametrized in compliance with the separation distance described in ISO/TS 15066 [7], based on the results of the previous segmentation. The interaction of these two separable elements is shown in fig. 2.
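For context, these thresholds relate to the protective separation distance that ISO/TS 15066 defines for speed and separation monitoring. In the notation of the standard (reproduced here from the standard itself, not from this paper) it reads:

$$ S_p(t_0) = S_h + S_r + S_s + C + Z_d + Z_r $$

where $S_h$ is the contribution of the operator's change in location, $S_r$ the contribution of the robot system's reaction time, $S_s$ the robot system's stopping distance, $C$ the intrusion distance, and $Z_d$ and $Z_r$ the position uncertainties of the operator and the robot. The individually parametrized client distances described below can be read as direction-specific instantiations of this quantity.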
A. Operation Area Segmentation

To segment an industrial robot's operating area out of its whole workspace, a sensor system is needed to externally record an arbitrary robot in operation mode. In this approach we intend to combine an image acquisition system and a depth sensor. The 2D image data serves to visualize the recorded data, and together with the corresponding depth information it is possible to reconstruct the operating area and transform it into real world coordinates. Therefore the sensor system streams its data into a computer, where it is processed with the following program sequence. First of all, the large incoming dataset is preprocessed by a downsampling algorithm and a noise reduction. This is done with a voxel grid filter followed by a statistical outlier removal filter. Then the operating robot is segmented with a modified octree filter [11]. The separated operating area is completed by continuously monitoring the robot program. In addition, a sphere representing the robot's total workspace is projected into the visualization of the algorithm output. All data processing is intended to be done with an existing open-source library for 2D/3D point clouds.
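To make the preprocessing stage concrete, the following minimal sketch chains the described voxel grid and statistical outlier removal filters with the Point Cloud Library; the leaf size and the outlier removal parameters are illustrative assumptions, since the paper does not state concrete values.

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/statistical_outlier_removal.h>

// Downsample and denoise an incoming cloud as described in Sec. II.A.
// Leaf size and outlier parameters are illustrative assumptions.
pcl::PointCloud<pcl::PointXYZ>::Ptr
preprocess(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& input)
{
    // 1) Voxel grid filter: reduce the large incoming dataset.
    pcl::PointCloud<pcl::PointXYZ>::Ptr downsampled(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::VoxelGrid<pcl::PointXYZ> voxel;
    voxel.setInputCloud(input);
    voxel.setLeafSize(0.01f, 0.01f, 0.01f);  // 1 cm voxels (assumed)
    voxel.filter(*downsampled);

    // 2) Statistical outlier removal: suppress depth sensor noise.
    pcl::PointCloud<pcl::PointXYZ>::Ptr denoised(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
    sor.setInputCloud(downsampled);
    sor.setMeanK(50);              // neighbors considered per point (assumed)
    sor.setStddevMulThresh(1.0);   // threshold in standard deviations (assumed)
    sor.filter(*denoised);

    return denoised;
}
```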
An object segmentation within a 3D point cloud is more processor-intensive than a 2D segmentation algorithm followed by an assignment of the resulting segments to their corresponding 3D coordinates. Nevertheless there are plausible reasons for using a segmentation algorithm that operates on three dimensional data instead of two dimensional monochrome or colour sensor data. The major advantage is the greater robustness against brightness fluctuations, but the complete independence from monochrome or colour value based segmentation methods can also be considered valuable. Furthermore, possible object patterns depending on the surface of the robot are less disturbing. A total overview of the operation area segmentation in the form of a program flow chart can be seen in fig. 3.

Figure 3. Program flow chart of a robot operating area monitoring.
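The robot itself is separated with a modified octree filter [11]; the exact modification is not spelled out in the paper. As a sketch of the underlying idea, PCL's stock octree-based change detector can extract the points that occupy voxels not present in the previous frame, i.e. candidates for the moving robot (the resolution value is an assumption):

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/octree/octree.h>

#include <vector>

// Return indices of points in `current` that fall into octree voxels
// not occupied in `previous` -- i.e. candidates for the moving robot.
std::vector<int> movingPointIndices(
    const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& previous,
    const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& current,
    float resolution = 0.02f)  // octree voxel edge length in m (assumed)
{
    pcl::octree::OctreePointCloudChangeDetector<pcl::PointXYZ> octree(resolution);

    // Build the octree for the previous frame.
    octree.setInputCloud(previous);
    octree.addPointsFromInputCloud();

    // Swap buffers and add the current frame.
    octree.switchBuffers();
    octree.setInputCloud(current);
    octree.addPointsFromInputCloud();

    // Collect points falling into voxels that are new in this frame.
    std::vector<int> newPointIdx;
    octree.getPointIndicesFromNewVoxels(newPointIdx);
    return newPointIdx;
}
```

Accumulating these new-voxel points frame by frame corresponds to the continuously growing operating area described above.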
B. Ultrasonic Distance Monitoring

The presented segmentation of a robot's operation area can be used for different applications. To emphasize that, a simple safety system was developed which realizes a direction dependent distance monitoring of humans approaching an operating robot. This distance monitoring system is composed of several client modules identical in construction, which mainly contain three ultrasound sensors and a microcontroller. These clients are designed to be connected to a master module containing a central microcontroller. Here the individual minimum distances for all clients are centrally set. If any moving object or human worker comes closer than this minimum distance, the client sends a signal to the master for triggering an emergency or safeguard stop. The individual minimum distance for each client results from the previous operation area segmentation.

III. EXPERIMENTAL SETUP

In this approach we chose a Microsoft Kinect V2, which provides standard 2D RGB image acquisition and furthermore generates 3D information for each 2D pixel by using an additional time of flight depth sensor. The suitability of this sensor and its accuracy have already been confirmed in different publications, such as [12]. But previous approaches using the Kinect V2 in the context of industrial robots are usually based on human motion tracking [13], [14]. In this approach we instead segment the robot to determine its operation area. For monitoring exemplary robot programs a Universal Robots UR5 is used. With the help of the libfreenect2 driver [15] the acquired image and depth data is transferred into the Robot Operating System (ROS) [16]. There it is processed with the Point Cloud Library (PCL) [17].
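A minimal sketch of this data path, assuming a ROS1 node that receives the streamed point clouds and hands them to the PCL-based processing, could look as follows; the topic name is an assumption, as the paper does not state it.

```cpp
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

// Callback: convert the ROS message to a PCL cloud for further processing.
void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
{
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::fromROSMsg(*msg, *cloud);
    // ... preprocess() and the octree segmentation would be called here ...
    ROS_INFO("received cloud with %zu points", cloud->size());
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "operating_area_monitor");
    ros::NodeHandle nh;
    // Topic name is an assumption; it depends on the driver configuration.
    ros::Subscriber sub = nh.subscribe("/kinect2/sd/points", 1, cloudCallback);
    ros::spin();
    return 0;
}
```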

The exemplary distance monitoring system is realized by one master and up to 16 clients, all built from low cost parts. The master module mainly consists of an ESP32 microcontroller. It also has a display for indicating system values, such as the minimum distances that are set, and a relay to connect the master to the robot's emergency or safeguard stop input. The client modules are likewise composed of an ESP32 microcontroller, plus three HC-SR04 ultrasonic sensors. In addition, all modules are equipped with an LED for displaying their operational state and a capacitive button for resetting them. All clients are connected to the master via wi-fi. A user or a stand-alone algorithm also connects via wi-fi to set the specific minimum distances of the clients. The essential structure of the ultrasonic distance monitoring system is shown in figure 4.

Figure 4. Sensor system setup: master (left) and client (right).
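The paper does not include client firmware; a sketch of the distance check for a single HC-SR04 channel on the ESP32, in Arduino-style C++, might look as follows. Pin numbers, the default threshold and the serial notification are purely illustrative assumptions; in the real system the threshold is set by the master over wi-fi and a violation is reported back to it.

```cpp
// Arduino-style sketch for an ESP32 client reading one HC-SR04 channel.
// Pin assignments and the threshold are illustrative, not from the paper.
const int TRIG_PIN = 5;
const int ECHO_PIN = 18;
float minDistanceM = 0.20f;  // would be set remotely by the master

float readDistanceM() {
  // Trigger a 10 us pulse and time the returning echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  // Echo time in us; speed of sound ~343 m/s, halved for the round trip.
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
  if (duration == 0) return -1.0f;                          // no echo received
  return duration * 343.0f / 2.0f / 1e6f;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(115200);
}

void loop() {
  float d = readDistanceM();
  if (d > 0 && d < minDistanceM) {
    // In the real system: notify the master over wi-fi so it can
    // trigger the robot's safeguard or emergency stop input.
    Serial.println("minimum distance violated");
  }
  delay(60);  // let echoes decay between measurements
}
```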


IV. RESULTS

A. Segmentation

The experimental setup of a static scene (no movement of the robot) and the corresponding point cloud is represented in fig. 5. Also projected is a workspace sphere with its center point in the robot base.

Figure 5. Industrial robot and corresponding point cloud for a static case.

For monitoring a possible operation use case a robot program was generated. It is a simple pick, move and place algorithm with three different positions on a worktop, as presented in fig. 6a. The drawn cube is moved by the robot between these positions in an infinite loop. The exemplary robot program was recorded successfully. In each frame the acquired data was processed and a point cloud of the stationary operating area was segmented. Thereby the density, meaning the number of segmented single 3D points of the operating area, grows continuously. Fig. 6b, c shows the segmented point cloud after three minutes of monitoring the operating UR5.

Figure 6. a - exemplary infinite task; b, c - monitored and segmented operating area.

For a better understanding, the step by step growth of the complete operating area by merging the particular point clouds can be seen in fig. 7 a-c. Even though it is a 2D visualization of 3D data, the figure also shows that only a limited area within the whole workspace (surrounding sphere) gets passed through by the robot.

Figure 7. Visualization of the progressive growth of a monitored operating area.

B. Segmentation Accuracy

To evaluate the significance of the determined 3D data, measurements were done to identify the accuracy. For this purpose the robot's tool center was moved to a position with a distance of 0.5 m each in the x-, y- and z-direction referring to the robot base. This precise position was reached by reading out the exact coordinates from the robot control itself. In this approached position the robot was segmented with the implemented algorithm and the distance between tool center and robot base was measured. This procedure was repeated ten times, each time with a different angle of view. The results can be seen in tab. I.

TABLE I. ACCURACY MEASUREMENT (REFERENCE VALUE = 0.5 M)

measurement          x [m]     y [m]     z [m]
1                    0.510     0.473     0.518
2                    0.508     0.476     0.518
3                    0.508     0.474     0.518
4                    0.508     0.475     0.519
5                    0.510     0.476     0.518
6                    0.509     0.475     0.519
7                    0.510     0.477     0.519
8                    0.509     0.475     0.518
9                    0.510     0.476     0.518
10                   0.509     0.475     0.519
mean value           0.5091    0.4752    0.5184
standard deviation   0.0008    0.0011    0.0005
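As a self-contained plausibility check of these table values and of the error figure derived below, the following snippet recomputes the per-axis means, the (population) standard deviations and the worst-case deviations from the ten measurements. Note that the paper rounds $\Delta_{dist}$ to 0.034 m before forming the percentage, which yields 6.8% rather than the unrounded 6.9%.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // The ten measurements from Table I: {x, y, z} in meters.
    std::vector<std::array<double, 3>> m = {
        {0.510, 0.473, 0.518}, {0.508, 0.476, 0.518}, {0.508, 0.474, 0.518},
        {0.508, 0.475, 0.519}, {0.510, 0.476, 0.518}, {0.509, 0.475, 0.519},
        {0.510, 0.477, 0.519}, {0.509, 0.475, 0.518}, {0.510, 0.476, 0.518},
        {0.509, 0.475, 0.519}};
    const double ref = 0.5;  // reference value in meters

    double maxDev[3] = {0, 0, 0};
    for (int a = 0; a < 3; ++a) {
        double mean = 0;
        for (const auto& row : m) mean += row[a];
        mean /= m.size();
        double var = 0;
        for (const auto& row : m) {
            var += (row[a] - mean) * (row[a] - mean);
            maxDev[a] = std::max(maxDev[a], std::fabs(row[a] - ref));
        }
        var /= m.size();  // population variance, matching the table values
        std::printf("axis %d: mean %.4f, std %.4f, max |dev| %.3f\n",
                    a, mean, std::sqrt(var), maxDev[a]);
    }
    // Propagated worst-case distance error (cf. the equation below).
    double d = std::sqrt(maxDev[0] * maxDev[0] + maxDev[1] * maxDev[1] +
                         maxDev[2] * maxDev[2]);
    std::printf("delta_dist = %.3f m (%.1f%% of %.1f m)\n", d, 100 * d / ref, ref);
    return 0;
}
```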
The maximum amounts of the x-/y-/z-deviation result in a measurement error of:

$\Delta_{dist} = \sqrt{\Delta x_{abs}^{2} + \Delta y_{abs}^{2} + \Delta z_{abs}^{2}}$

$\Delta_{dist} = \sqrt{(0.010\,\mathrm{m})^{2} + (0.027\,\mathrm{m})^{2} + (0.019\,\mathrm{m})^{2}}$

$\Delta_{dist} = 0.034\,\mathrm{m} \Rightarrow 6.8\%$

The test measurements showed unintentional deviations. Reasons for this are, among others, sensor irritations that partly occurred during image acquisition, caused by the reflective surface of the Universal Robot, and the downsampled data.

C. Distance Monitoring Accuracy

In a safety context it is particularly important to ensure that a sensed human is never closer to the hazardous zone than the distance the safety system has measured. To get a statement about the accuracy of the developed distance monitoring system, measurements were carried out. This was done by placing an object at different distances and angles starting from the client module's center, represented as blue points in fig. 8. The measured distances were interpolated into the red lines.

Figure 8. Distance monitoring accuracy.

V. CONCLUSION AND FUTURE WORK

In this paper a dynamically adjustable safety system approach for flexible human robot collaboration has been presented. With the help of a robot operating area segmentation it is possible to identify the actually used operating area without knowing anything about the operating program a priori. In a further step a safety system consisting of individually configurable ultrasound modules was integrated. In this way it is possible to realize safeguard or emergency stops with different distance thresholds depending on the direction from which a human worker approaches. The major advantage of the operation area segmentation presented in this paper is that there is no need to get into the robot's control. Thus the system developed is completely robot independent and can be used for any robot without any robot-specific knowledge.

A further application in progress uses the generated data of a segmented operating area and combines it with new approaches in human machine interaction, e.g. a live visualization in mixed reality (Microsoft HoloLens). The highlighting of hazard zones within the glasses might increase the level of acceptance for human robot collaboration by removing concerns about safety in a very intuitive way. Apart from that, the inadequate performance in accuracy is to be improved by approaches like [18], [19].

REFERENCES

[1] R. Bischoff, J. Kurth, G. Schreiber, R. Koeppe, A. Albu-Schaeffer, Beyer, O. Eiberger, S. Haddadin, A. Stemmer, G. Grunwald, and G. Hirzinger, "The kuka-dlr lightweight robot arm - a new reference platform for robotics research and manufacturing," in ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics), June 2010, pp. 1–8.

[2] T. Lens and O. von Stryk, "Design and dynamics model of a lightweight series elastic tendon-driven robot arm," in 2013 IEEE International Conference on Robotics and Automation, May 2013, pp. 4512–4518.

[3] S. Parusel, S. Haddadin, and A. Albu-Schäffer, "Modular state-based behavior control for safe human-robot interaction: A lightweight control architecture for a lightweight robot," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 4298–4305.

[4] V. Duchaine, N. Lauzier, M. Baril, M. A. Lacasse, and C. Gosselin, "A flexible robot skin for safe physical human robot interaction," in Robotics and Automation, 2009. ICRA '09. IEEE International Conference on, May 2009, pp. 3676–3681.

[5] J. O'Neill, J. Lu, R. Dockter, and T. Kowalewski, "Practical, stretchable smart skin sensors for contact-aware robots in safe and collaborative interactions," in 2015 IEEE International Conference on Robotics and Automation (ICRA), May 2015, pp. 624–629.

[6] T. Mazzocchi, A. Diodato, G. Ciuti, D. M. D. Micheli, and A. Menciassi, "Smart sensorized polymeric skin for safe robot collision and environmental interaction," in Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, Sept 2015, pp. 837–843.

[7] ISO/TS 15066:2016, "Robots and robotic devices - collaborative robots," International Organization for Standardization, Standard ISO/TS 15066:2016, Feb. 2016.

[8] IFR - International Federation of Robotics. Worldwide estimated operational stock of industrial robots. [Online]. Available: www.ifr.org/index.php?id=59&df=Presentation_market_overviewWorld_Robotics_29_9_2016_01.pdf

[9] ——. Worldwide sales of industrial robots from 2004 to 2015 (in 1,000 units), (accessed March 16, 2017). [Online]. Available: https://www.statista.com/statistics/264084/worldwide-sales-of-industrial-robots/

[10] ABIresearch. Collaborative robots: Total units, world market, forecast: 2015 to 2020, (accessed March 16, 2017). [Online]. Available: https://www.abiresearch.com/market-research/product/1020016-collaborative-robotics/

[11] S. Blumenthal, E. Prassler, J. Fischer, and W. Nowak, "Towards identification of best practice algorithms in 3d perception and modeling," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 3554–3561.

[12] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, and A. E. Saddik, "Evaluating and improving the depth accuracy of kinect for windows v2," IEEE Sensors Journal, vol. 15, no. 8, pp. 4275–4285, Aug 2015.

[13] H. Alabbasi, A. Gradinaru, F. Moldoveanu, and A. Moldoveanu, "Human motion tracking evaluation using kinect v2 sensor," in 2015 EHealth and Bioengineering Conference (EHB), Nov 2015, pp. 1–4.

[14] R. A. El-laithy, J. Huang, and M. Yeh, "Study on the use of microsoft kinect for robotics applications," in Proceedings of the 2012 IEEE/ION Position, Location and Navigation Symposium, April 2012, pp. 1280–1288.

[15] L. Xiang, F. Echtler, C. Kerl, T. Wiedemeyer, Lars, hanyazou, R. Gordon, F. Facioni, laborer2008, R. Wareham, M. Goldhoorn, alberth, gaborpapp, S. Fuchs, jmtatsch, J. Blake, Federico, H. Jungkurth, Y. Mingze, vinouz, D. Coleman, B. Burns, R. Rawat, S. Mokhov, P. Reynolds, P. Viau, M. Fraissinet-Tachet, Ludique, J. Billingham, and Alistair, "libfreenect2: Release 0.2," Apr. 2016. [Online]. Available: https://doi.org/10.5281/zenodo.50641

[16] M. Quigley, K. Conley, B. P. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, "Ros: an open-source robot operating system," in ICRA Workshop on Open Source Software, 2009.

[17] R. B. Rusu and S. Cousins, "3d is here: Point cloud library (pcl)," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 1–4.

[18] M. Miknis, R. Davies, P. Plassmann, and A. Ware, "Near real-time point cloud processing using the pcl," in 2015 International Conference on Systems, Signals and Image Processing (IWSSIP), Sept 2015, pp. 153–156.

[19] C. Moreno, Y. Chen, and M. Li, "A dynamic compression technique for streaming kinect-based point cloud data," in 2017 International Conference on Computing, Networking and Communications (ICNC), Jan 2017, pp. 550–555.
