

Research Journal of Applied Sciences, Engineering and Technology 8(11): 1384-1388, 2014
ISSN: 2040-7459; e-ISSN: 2040-7467
© Maxwell Scientific Organization, 2014
Submitted: July 24, 2014    Accepted: October 11, 2014    Published: September 20, 2014

Motion Control of Robot by using Kinect Sensor


1Mohammed A. Hussein, 2Ahmed S. Ali, 1F.A. Elmisery and 3R. Mostafa
1Department of Electronics Technology, Faculty of Industrial Education, Beni-Suef University, Egypt
2Department of Mechanical Engineering, Faculty of Engineering, Assiut University, Egypt
3Department of Automatic Control, Beni-Suef University, Egypt

Abstract: In the presented work, a remote robot control system is implemented that utilizes Kinect-based gesture recognition as the human-robot interface. The movement of the human arm in 3D space is captured, processed and replicated by the robotic arm. The joint angles are transmitted to an Arduino microcontroller, which receives them and controls the robot arm. The accuracy of control by the motion of the human hand was tested experimentally.

Keywords: Arduino, gesture, human-robot interaction, Kinect, robotic arm

INTRODUCTION

In recent years, the development of human-robot interaction in service robots has attracted the attention of many researchers. The most important applications of industrial robots are material handling, welding, assembling, dispensing and processing, where the robotic arm manipulator needs to perform pick-and-place operations incessantly. One such industrial standard robot is the generic serial arm, which consists of a base, a link or series of links connected at joints and an end effector. Generally, the end effector is a gripper at the end of the last link, and the base is the first link of the serial arm.

In this study, a Microsoft Kinect sensor is applied in a remote robot control system to recognize different body gestures and provide a visual human-robot interaction interface that makes the robot arm follow the posture of the human arm without calculating complex inverse kinematics. This kind of system aims to enrich the ways in which humans and robots interact, helping non-expert users to control the robot freely and making human-robot interaction much easier (Goodrich and Schultz, 2007; Cheng et al., 2012).

LITERATURE REVIEW

The concept of gesture control to manipulate robots has been used in much earlier research. Thanh et al. (2011) construct a system through which a human user can interact with a robot by body language, with the Kinect camera used as the visual device. Luo et al. (2013) use the Kinect sensor developed by Microsoft as their motion capture device. Waldherr et al. (2000) describe a gesture interface for the control of a mobile robot equipped with a manipulator; the interface uses a camera to track a person and recognize gestures involving arm motion. The basic technique of the depth sensor is to project structured infrared light continuously and to calculate depth from the reflection of the light at different positions. By processing the depth image, the user's skeleton joints can be captured and provided in real time, so that human arm movement is directly reflected in the action of the robot arm. Much research has been proposed on human motion capture through different sensor devices, e.g., cameras, depth sensors, inertial sensors or marker-based vision systems (Ott et al., 2008; Cole et al., 2007; Pons-Moll et al., 2010).

OBJECTIVES AND PROBLEM STATEMENT

The aim of this study is to develop a human-machine interface for controlling a robot arm, as shown in Fig. 1. The presented work describes a remote robot control system that utilizes Kinect-based gesture recognition as the human-robot interface and evaluates the control of the robot arm through that interface. The movement of the human arm in 3D space is captured, processed and replicated by the robotic arm. The joint angles are transmitted to the Arduino microcontroller, which receives them and controls the robot arm.

Kinect sensor: In the presented work we use the Kinect sensor, developed by Microsoft and PrimeSense and shown in Fig. 2. It is a hardware device used to control the Microsoft XBOX-360 game console without any kind of controller that the user has to hold or wear.

Corresponding Author: Mohammed A. Hussein, Department of Electronics Technology, Faculty of Industrial Education, Beni-Suef University, Egypt

Fig. 1: General architecture of the system (the PC-based coordinator of the master unit, the human, sends movement commands and command packets through a transducer, interface and communication layer to the slave unit, the robot, which returns position data)

Fig. 2: Kinect sensor

Fig. 3: Skeleton tracking

The official release of the Kinect in North America was on November 4, 2010, and in Europe on November 10, 2010.

The skeleton data consist of a set of joints, shown in Fig. 3. The coordinate system for the skeleton data is a full 3D system with values in meters. Operations are provided for converting between skeleton space and depth image space. The Kinect for Windows SDK supports up to two players (skeletons) being tracked at the same time. A player index is inserted into the lower 3 bits of the depth data so that you can tell which depth pixels belong to which player; these 3 bits must be taken into account when examining the depth values (https://msdn.microsoft.com/en-us/library/hh438998.aspx), as in the sketch below.
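A minimal sketch of handling this bit layout, following the lower-3-bit convention just described (the 13-bit depth mask and the helper names are our assumptions, not taken from the paper):

```csharp
// A Kinect for Windows SDK v1 depth pixel packs the player index into the
// lower 3 bits and the distance in millimetres into the bits above them.
public static class DepthPixel
{
    private const int PlayerIndexBitmask = 0x07;   // lower 3 bits
    private const int PlayerIndexBitmaskWidth = 3;

    // Which tracked player (0 = none) this depth pixel belongs to.
    public static int PlayerIndex(short rawDepth) =>
        rawDepth & PlayerIndexBitmask;

    // The actual distance in millimetres, with the index bits stripped.
    public static int DepthMillimetres(short rawDepth) =>
        (rawDepth >> PlayerIndexBitmaskWidth) & 0x1FFF;
}
```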

Fig. 4: The structure of the control system

STRUCTURE OF THE CONTROL SYSTEM

The structure of the system, shown in Fig. 4, consists of the user, the Kinect sensor, a computer, an Arduino microcontroller and the robot arm. The system performs two tasks: angle determination and transfer of the data to the robot arm.

User: Anyone standing in front of the Kinect at a suitable distance and following the rules for interacting with the sensor.

Kinect sensor: Used as the input device. It captures the movement of the human arm in real time and sends the skeleton joint data to the computer via USB for processing.

Computer: Processes the information from the Kinect sensor, converts it into a skeletal image, calculates the angles between joints and sends them to the Arduino microcontroller via USB.

Arduino microcontroller: A logic board that allows electronic devices to be interfaced to the computer quickly and easily. The board we employ is the Arduino Uno. Depending on the received data (angles), the Arduino generates PWM signals that move each servo motor to a specific angle.

Robot arm (R/C servo): The Edubot100 robotic arm is a five-axis articulated robotic arm designed to teach industrial robot technology in the simplest way. On receiving the PWM signals from the Arduino, the servo motors turn their axles to the commanded angles.

SOFTWARE

In the presented work we used C# to write the programs for our system. We employed the Microsoft Kinect as the depth sensor, using the OpenNI APIs to interface with it and the NITE framework for depth image analysis and skeleton extraction.

The OpenNI framework is an open source SDK, released by PrimeSense, that is used for the development of 3D sensing middleware libraries and applications (http://www.openni.org). It is intended to make the new opportunities offered by sensors like the Kinect available to a larger community and to accelerate new developments in natural interaction. OpenNI provides a driver for the Kinect and an Application Programming Interface (API), and it offers a great deal of basic functionality for analysing the scene watched by the Kinect. The functionality used in this work consists of the following:

Depth generator: The depth generator provides a depth map of the scene as an array of floats, even though the actual depth values are always natural numbers in millimetres. OpenNI returns positions in "real-world" coordinates (http://kinectcar.ronsper.com/docs/openni/groupdepthgen.html), which are designed to correspond to the physical position of your body in the room.
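The conversion from a depth-map pixel to such real-world coordinates is essentially a pinhole-camera back-projection. The sketch below shows the underlying math; the focal length and principal point are typical published Kinect depth-camera values and are our illustrative assumptions, not figures from the paper:

```csharp
// Back-projects a depth pixel (u, v, depthMm) into metric X, Y, Z with a
// pinhole-camera model, the kind of mapping OpenNI's real-world conversion
// performs for a 640 x 480 depth map.
public static class RealWorld
{
    private const double FocalLengthPx = 594.21;  // assumed focal length in pixels
    private const double CentreU = 320.0;         // assumed principal point (u)
    private const double CentreV = 240.0;         // assumed principal point (v)

    public static (double X, double Y, double Z) FromDepthPixel(int u, int v, int depthMm)
    {
        double z = depthMm / 1000.0;                   // metres, like the skeleton data
        double x = (u - CentreU) * z / FocalLengthPx;
        double y = (CentreV - v) * z / FocalLengthPx;  // image v grows downward
        return (x, y, z);
    }
}
```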
User detection (calibration): To start, the user must stand in front of the sensor assuming the calibration position, that is, with the upper arms parallel to the ground and the forearms perpendicular to it (Fig. 5). This process may take 10 sec or a little more, depending on the position of the Kinect sensor. Once the calibration is done, the Kinect tracks the joint and limb positions. However, if the person stays out of the frame for too long, the Kinect treats that person as a new user once he/she comes back, and the calibration needs to be done again. Fig. 6 shows the steps of calibration using OpenNI + NITE (PrimeSense).

Fig. 5: Calibration position

Arduino programming: The microcontroller is programmed using the Arduino programming language and the Arduino development environment (http://www.arduino.cc). The brain of the robotic arm is an Arduino microcontroller, a small computer that can control simple electronics and sensors. The PC program then communicates with the Arduino so that the robotic arm can be positioned based on the data captured from the Kinect, as sketched below.
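A minimal sketch of this PC-to-Arduino link on the C# side; the port name, baud rate and one-byte-per-angle framing are our assumptions, not details given in the paper:

```csharp
using System;
using System.IO.Ports;

// Opens the Arduino's USB serial port and sends the two joint angles,
// clamped to the 0-180 degree servo range, as one byte each.
public class ArduinoLink : IDisposable
{
    private readonly SerialPort port;

    public ArduinoLink(string portName = "COM3", int baudRate = 9600)
    {
        port = new SerialPort(portName, baudRate);
        port.Open();
    }

    public void SendAngles(double shoulderAngle, double elbowAngle)
    {
        byte[] packet =
        {
            (byte)Math.Max(0, Math.Min(180, shoulderAngle)),
            (byte)Math.Max(0, Math.Min(180, elbowAngle))
        };
        port.Write(packet, 0, packet.Length);
    }

    public void Dispose() => port.Dispose();
}
```

On the Arduino side, each received byte can be passed directly to a servo as a target angle, which matches the PWM behaviour described above.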

Fig. 6: Process of calibration

Calculating angles: The robot arm servos reproduce the angles of the user's shoulder and elbow. The shoulder angle (angle1) is formed between the joints torso, shoulder and elbow; the elbow angle (angle2) is formed between the joints shoulder, elbow and hand, as shown in Fig. 7.

Fig. 7: Calculating angles

The Kinect sensor provides the x, y, z coordinates of each joint of the human body. The angle between three joints (two vectors) is defined by:

θ = cos⁻¹( (A · B) / (‖A‖ ‖B‖) )

where,
θ = The angle between the three joints (two vectors)
A = Vector from joint hand to joint elbow
B = Vector from joint shoulder to joint elbow
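A direct implementation of this formula is straightforward; the sketch below assumes the joint coordinates have already been read from the skeleton data (the type and method names are ours, not from the paper):

```csharp
using System;

// Computes the angle formula above for one joint triple.
public struct Joint3
{
    public double X, Y, Z;
    public Joint3(double x, double y, double z) { X = x; Y = y; Z = z; }
}

public static class JointAngles
{
    // Angle in degrees at 'middle', formed by the triple outer-middle-inner,
    // e.g. hand-elbow-shoulder for angle2 or elbow-shoulder-torso for angle1.
    public static double Between(Joint3 outer, Joint3 middle, Joint3 inner)
    {
        // A = vector from the outer joint to the middle joint,
        // B = vector from the inner joint to the middle joint (as in the text).
        double ax = middle.X - outer.X, ay = middle.Y - outer.Y, az = middle.Z - outer.Z;
        double bx = middle.X - inner.X, by = middle.Y - inner.Y, bz = middle.Z - inner.Z;

        double dot = ax * bx + ay * by + az * bz;
        double normA = Math.Sqrt(ax * ax + ay * ay + az * az);
        double normB = Math.Sqrt(bx * bx + by * by + bz * bz);

        double radians = Math.Acos(dot / (normA * normB));
        return radians * 180.0 / Math.PI;   // the servos expect degrees
    }
}
```

With the Kinect joint positions in hand, angle2 is, for example, JointAngles.Between(hand, elbow, shoulder).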

Testing the algorithm: The human motion imitation system was tested with conductor motion; a sequence of photos is shown in Fig. 8. It demonstrates that the robot arm can be controlled by human demonstration in real time using the Kinect sensor.

Fig. 8: The robot is controlled by human demonstration in real time using the Kinect sensor

CONCLUSION

This study presented a system that enriches the interaction between humans and robots and helps non-expert users to control the robot freely, making human-robot interaction much easier.

The presented work provides an evaluation of controlling a robot arm using the Kinect sensor, from which the joint angles are obtained. The joint angles are transmitted to the Arduino controller, which receives them and controls the robot arm. The performance of the system was characterized using human input in different situations, and the results show that the system is able to control the robot using the Kinect sensor.

REFERENCES

Cheng, L., Q. Sun, H. Su, Y. Cong and S. Zhao, 2012. Design and implementation of human-robot interactive demonstration system based on Kinect. Proceeding of 24th Chinese Control and Decision Conference (CCDC, 2012), May 23-25, pp: 971-975.

Cole, J.B., D.B. Grimes and R.P.N. Rao, 2007. Learning full-body motions from monocular vision: Dynamic imitation in a humanoid robot. Proceeding of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS, 2007), pp: 240-246.

Goodrich, M.A. and A.C. Schultz, 2007. Human-robot interaction: A survey. Foundations Trends Hum. Comput. Interaction, 1(3): 203-275.

Luo, R.C., B.H. Shih and T.W. Lin, 2013. Real time human motion imitation of anthropomorphic dual arm robot based on Cartesian impedance control. Proceeding of IEEE International Symposium on Robotic and Sensors Environments (ROSE, 2013), pp: 25-30.

Ott, C., D. Lee and Y. Nakamura, 2008. Motion capture based human motion recognition and imitation by direct marker control. Proceeding of 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids, 2008), pp: 399-405.

Pons-Moll, G., A. Baak, T. Helten, M. Müller, H.P. Seidel and B. Rosenhahn, 2010. Multisensor-fusion for 3D full-body human motion capture. Proceeding of IEEE Conference on Computer Vision and Pattern Recognition (CVPR, 2010), pp: 663-670.

Thanh, N.N.D., D. Stonier, S.Y. Lee and D.H. Kim, 2011. A new approach for human-robot interaction using human body language. Proceeding of the 5th International Conference on Convergence and Hybrid Information Technology (ICHIT, 2011), September 22-24, pp: 762-769.

Waldherr, S., R. Romero and S. Thrun, 2000. A gesture based interface for human-robot interaction. Auton. Robot., 9(2): 151-173.
