Motion Control of Robot by Using Kinect Sensor
Abstract: In the presented work, a remote robot-control system is implemented that uses Kinect-based gesture recognition as the human-robot interface. The movement of the human arm in 3D space is captured, processed, and replicated by the robotic arm. The joint angles are transmitted to an Arduino microcontroller, which controls the robot arm. The accuracy of control by the human's hand motion was tested experimentally.
Corresponding Author: Mohammed A. Hussein, Department of Electronics Technology, Faculty of Industrial Education, Beni-Suef University, Egypt
Res. J. App. Sci. Eng. Technol., 8(11): 1384-1388, 2014
[Figure: block diagram of the control system — the PC-based programming/coordinator layer issues movement commands, and desired data and command packets are exchanged between the components]
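The paper does not specify the layout of the command packet exchanged with the robot; as a purely hypothetical illustration, the joint angles could be serialized into a small framed packet (start byte, one byte per angle, modulo-256 checksum) before being written to the Arduino's serial port. The frame layout below is an assumption, not the authors' protocol:

```python
def encode_packet(joint_angles):
    """Pack a list of joint angles (degrees, 0-180) into a command
    packet: a start byte (0xFF), one byte per angle, and a simple
    modulo-256 checksum byte. Hypothetical format for illustration."""
    if any(not 0 <= a <= 180 for a in joint_angles):
        raise ValueError("joint angles must be within 0-180 degrees")
    payload = bytes(int(round(a)) for a in joint_angles)
    checksum = sum(payload) & 0xFF  # modulo-256 checksum over the payload
    return bytes([0xFF]) + payload + bytes([checksum])

# Four joint angles packed into a 6-byte packet:
packet = encode_packet([90, 45, 120, 10])
```

Keeping angles below 0xFF leaves the start byte unambiguous, which makes resynchronization on the microcontroller side straightforward.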
Fig. 2: Kinect sensor
The official release of Kinect in North America was on November 4th, 2010, and in Europe on November 10th, 2010.

The skeleton data consists of a set of joints; these joints are shown in Fig. 3. The coordinate system for the skeleton data is a full 3D system with values in meters, as shown in the diagram of Fig. 1. There are operations for converting between Skeleton Space and Depth Image space. The Kinect for Windows SDK supports up to two players (skeletons) being tracked at the same time. A player index is inserted into the lower 3 bits of the depth data so that you can tell which depth pixels belong to which player. You must take these 3 bits into account when you want to examine the depth values (http://msdn.microsoft.com/en-us/library/hh438998.aspx).

STRUCTURE OF THE CONTROL SYSTEM

The structure of the system is shown in Fig. 4. It consists of the user, the Kinect sensor, a computer, an Arduino microcontroller, and the robot arm. The system performs two tasks: angle determination and data transfer to the robot arm.

User: anyone standing in front of the Kinect at a certain distance and complying with the Kinect's operating rules.
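The bit layout described above (player index in the lower 3 bits, depth in the upper bits of each 16-bit value) can be unpacked with two bit operations. A minimal sketch, assuming the Kinect v1 SDK convention that the upper 13 bits hold the distance in millimetres:

```python
def split_depth_pixel(raw):
    """Split a 16-bit Kinect depth value into (depth_mm, player_index).
    Lower 3 bits: player index (0 = no player, 1-6 = tracked player).
    Upper 13 bits: depth in millimetres (Kinect v1 SDK convention)."""
    player_index = raw & 0b111  # mask off the lower 3 bits
    depth_mm = raw >> 3         # shift the depth down into place
    return depth_mm, player_index

# e.g. a pixel 1.5 m away belonging to player 1:
raw = (1500 << 3) | 1
```

Forgetting the 3-bit shift makes every depth reading eight times too large, which is the pitfall the SDK documentation warns about.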
Fig. 7: Calculating angles

θ = cos⁻¹( (A · B) / (|A| |B|) )

where,
θ = the angle between the three joints (the angle between the two vectors)
A = vector from the hand joint to the elbow joint
B = vector from the shoulder joint to the elbow joint
CONCLUSION