Safe human robot collaboration: Operation area segmentation for dynamic adjustable distance monitoring
Authorized licensed use limited to: Ahsanullah University of Science & Technology. Downloaded on November 17,2024 at 09:52:13 UTC from IEEE Xplore. Restrictions apply.
Wi-Fi to set the specific minimum distances to the clients. The essential structure of the ultrasonic distance monitoring system is shown in figure 4.

For monitoring a possible operation use case, a robot program was generated. It is a simple pick, move and place algorithm with three different positions on a worktop, as presented in fig. 6a. The drawn cube is moved by the robot between these positions in an infinite loop. The exemplary robot program was recorded successfully. In each frame the acquired data was processed and a point cloud of the stationary operating area was segmented. Thereby the density, i.e. the number of segmented single 3D points of the operating area, grows continuously. Fig. 6b, c shows the segmented point cloud after three minutes of monitoring the operating robot UR5.

For a better understanding, the step-by-step growth of the complete operating area by merging the particular point clouds can be seen in fig. 7a–c. Even though it is a 2D visualization of 3D data, the figure also shows that only a limited area within the whole workspace (surrounding sphere) is passed through by the robot.

Figure 5. Industrial robot and corresponding point cloud for a static case.

measurement            x [m]     y [m]     z [m]
1                      0.510     0.473     0.518
2                      0.508     0.476     0.518
3                      0.508     0.474     0.518
4                      0.508     0.475     0.519
5                      0.510     0.476     0.518
6                      0.509     0.475     0.519
7                      0.510     0.477     0.519
8                      0.509     0.475     0.518
9                      0.510     0.476     0.518
10                     0.509     0.475     0.519
mean value             0.5091    0.4752    0.5184
standard deviation     0.0008    0.0011    0.0005

The maximum absolute x-/y-/z-deviations yield the following overall measurement error.
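The table statistics and the combined distance error can be cross-checked with a short sketch. The measurement values are transcribed from the table; the per-axis maximum deviations of 0.010 m, 0.027 m and 0.019 m are the values used in the error calculation in the text.

```python
import math

# Repeated x/y/z position measurements of the static case (in metres),
# transcribed from the table above.
xs = [0.510, 0.508, 0.508, 0.508, 0.510, 0.509, 0.510, 0.509, 0.510, 0.509]
ys = [0.473, 0.476, 0.474, 0.475, 0.476, 0.475, 0.477, 0.475, 0.476, 0.475]
zs = [0.518, 0.518, 0.518, 0.519, 0.518, 0.519, 0.519, 0.518, 0.518, 0.519]

def mean(v):
    return sum(v) / len(v)

def std(v):
    # Population standard deviation (divisor n); this choice reproduces
    # the table's values exactly.
    m = mean(v)
    return math.sqrt(sum((x - m) ** 2 for x in v) / len(v))

print(round(mean(xs), 4), round(mean(ys), 4), round(mean(zs), 4))  # 0.5091 0.4752 0.5184
print(round(std(xs), 4), round(std(ys), 4), round(std(zs), 4))     # 0.0008 0.0011 0.0005

# Combined distance error from the maximum absolute per-axis deviations
# stated in the text (0.010 m, 0.027 m, 0.019 m):
err = math.sqrt(0.010 ** 2 + 0.027 ** 2 + 0.019 ** 2)
print(round(err, 3))  # 0.034
```

Relative to a nominal measuring distance of 0.5 m (an assumption here), 0.034 m corresponds to the 6.8 % quoted in the text.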
∆dist = √(∆x_abs² + ∆y_abs² + ∆z_abs²)

∆dist = √((0.010 m)² + (0.027 m)² + (0.019 m)²)

∆dist = 0.034 m ⇒ 6.8 %

The test measurements showed unintentional deviations. Reasons for this are, among others, occasional sensor irritations during image acquisition caused by the reflective surface of the Universal Robot, as well as the downsampled data.

C. Distance Monitoring Accuracy

In a safety context it is particularly important to ensure that a sensed human is never closer to the hazardous zone than the safety system has measured. To assess the accuracy of the developed distance monitoring system, measurements were carried out by placing an object at different distances and angles from the center of the client module, represented as blue points in fig. 8. The measured distances were interpolated into the red lines.

Figure 8. Distance monitoring accuracy.

V. CONCLUSION AND FUTURE WORK

In this paper a dynamically adjustable safety system approach for flexible human robot collaboration has been presented. With the help of a robot operating area segmentation it is possible to identify the actually used operating area without any a priori knowledge of the operating program. In a further step a safety system consisting of individually configurable ultrasound modules was integrated. In this way it is possible to realize safeguard or emergency stops with different distance thresholds depending on the direction from which a human worker approaches. The major advantage of the operation area segmentation presented in this paper is that there is no need to access the robot's control. Thus the developed system is completely robot independent and can be used for any robot without any robot-specific knowledge.

A further application in progress uses the generated data of a segmented operating area and combines it with new approaches in human machine interaction, e.g. a live visualization in mixed reality (Microsoft HoloLens). The highlighting of hazard zones within the glasses might increase the level of acceptance for human robot collaboration by removing concerns about safety in a very intuitive way. Apart from that, the inadequate accuracy is to be improved by approaches like [18], [19].

REFERENCES

[1] R. Bischoff, J. Kurth, G. Schreiber, R. Koeppe, A. Albu-Schaeffer, A. Beyer, O. Eiberger, S. Haddadin, A. Stemmer, G. Grunwald, and G. Hirzinger, "The KUKA-DLR lightweight robot arm - a new reference platform for robotics research and manufacturing," in ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics), June 2010, pp. 1–8.
[2] T. Lens and O. von Stryk, "Design and dynamics model of a lightweight series elastic tendon-driven robot arm," in 2013 IEEE International Conference on Robotics and Automation, May 2013, pp. 4512–4518.
[3] S. Parusel, S. Haddadin, and A. Albu-Schaeffer, "Modular state-based behavior control for safe human-robot interaction: A lightweight control architecture for a lightweight robot," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 4298–4305.
[4] V. Duchaine, N. Lauzier, M. Baril, M. A. Lacasse, and C. Gosselin, "A flexible robot skin for safe physical human robot interaction," in Robotics and Automation, 2009. ICRA '09. IEEE International Conference on, May 2009, pp. 3676–3681.
[5] J. O'Neill, J. Lu, R. Dockter, and T. Kowalewski, "Practical, stretchable smart skin sensors for contact-aware robots in safe and collaborative interactions," in 2015 IEEE International Conference on Robotics and Automation (ICRA), May 2015, pp. 624–629.
[6] T. Mazzocchi, A. Diodato, G. Ciuti, D. M. D. Micheli, and A. Menciassi, "Smart sensorized polymeric skin for safe robot collision and environmental interaction," in Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, Sept 2015, pp. 837–843.
[7] ISO/TS 15066:2016, "Robots and robotic devices - collaborative robots," International Organization for Standardization, Standard ISO/TS 15066:2016, Feb. 2016.
[8] IFR - International Federation of Robotics. Worldwide estimated operational stock of industrial robots. [Online]. Available: www.ifr.org/index.php?id=59&df=Presentation_market_overviewWorld_Robotics_29_9_2016_01.pdf
[9] ——. Worldwide sales of industrial robots from 2004 to 2015 (in 1,000 units), (accessed March 16, 2017). [Online]. Available: https://www.statista.com/statistics/264084/worldwide-sales-of-industrial-robots/
[10] ABIresearch. Collaborative robots: Total units, world market, forecast: 2015 to 2020, (accessed March 16, 2017). [Online]. Available: https://www.abiresearch.com/market-research/product/1020016-collaborative-robotics/
[11] S. Blumenthal, E. Prassler, J. Fischer, and W. Nowak, "Towards identification of best practice algorithms in 3D perception and modeling," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 3554–3561.
[12] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, and A. E. Saddik, "Evaluating and improving the depth accuracy of Kinect for Windows v2," IEEE Sensors Journal, vol. 15, no. 8, pp. 4275–4285, Aug 2015.
[13] H. Alabbasi, A. Gradinaru, F. Moldoveanu, and A. Moldoveanu, "Human motion tracking evaluation using Kinect v2 sensor," in 2015 E-Health and Bioengineering Conference (EHB), Nov 2015, pp. 1–4.
[14] R. A. El-laithy, J. Huang, and M. Yeh, "Study on the use of Microsoft Kinect for robotics applications," in Proceedings of the 2012 IEEE/ION Position, Location and Navigation Symposium, April 2012, pp. 1280–1288.
[15] L. Xiang, F. Echtler, C. Kerl, T. Wiedemeyer, Lars, hanyazou, R. Gordon, F. Facioni, laborer2008, R. Wareham, M. Goldhoorn, alberth, gaborpapp, S. Fuchs, jmtatsch, J. Blake, Federico, H. Jungkurth, Y. Mingze, vinouz, D. Coleman, B. Burns, R. Rawat, S. Mokhov, P. Reynolds, P. Viau, M. Fraissinet-Tachet, Ludique, J. Billingham, and Alistair, "libfreenect2: Release 0.2," Apr. 2016. [Online]. Available: https://doi.org/10.5281/zenodo.50641
[16] M. Quigley, K. Conley, B. P. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, "ROS: an open-source robot operating system," in ICRA Workshop on Open Source Software, 2009.
[17] R. B. Rusu and S. Cousins, "3D is here: Point Cloud Library (PCL)," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 1–4.
[18] M. Miknis, R. Davies, P. Plassmann, and A. Ware, "Near real-time point cloud processing using the PCL," in 2015 International Conference on Systems, Signals and Image Processing (IWSSIP), Sept 2015, pp. 153–156.
[19] C. Moreno, Y. Chen, and M. Li, "A dynamic compression technique for streaming Kinect-based point cloud data," in 2017 International Conference on Computing, Networking and Communications (ICNC), Jan 2017, pp. 550–555.