Bipedal Robot Center of Pressure Feedback Simulation For Center of Mass Learning
Samuel Oluyemi Owoeye, Folasade Durodola, Peace Oluwafeyidabira Adeniyi, Idris Tolulope
Abdullahi, Adesanya Boluwatito Hector
Department of Mechatronics Engineering, Federal University of Agriculture, Abeokuta, Nigeria
Corresponding Author:
Samuel Oluyemi Owoeye
Department of Mechatronics Engineering, Federal University of Agriculture
Abeokuta, Nigeria
Email: [email protected]
1. INTRODUCTION
The Robot Institute of America described a robot as a multifunctional, reprogrammable manipulator
that can move materials, parts, tools, or specialized equipment through a range of programmed motions to
accomplish several tasks [1]. A robot is an autonomous device that can sense its surroundings, process
information to make decisions, and act in the real world. It can also be described as a kind of automated
device that can quickly and precisely carry out particular activities with little or no assistance from humans [2].
Robots are generally classified in two ways: by application and by locomotion. Application-specific robots include industrial robots and service robots. Industrial robots are used for manufacturing and for the logistics of materials in the manufacturing process [3]. Service robots are defined by the International Organization for Standardization as robots for personal or professional use that perform useful tasks in different environments [4]. Robots can also be classified by their locomotion, that is, the method by which they move through different environments; they are either stationary or mobile. Stationary robots conduct their tasks in a fixed location and are not mobile in any way [5], [7]. Mobile robots can move around their environment; they are heavily structured by software and use sensors and other technologies, such as artificial intelligence (AI), to identify and navigate their surroundings [6], [8].
Home invasion has emerged as a major social vice all across the world. Inadequate security, poverty,
substance addiction, peer group pressure, and many more factors are some of the main causes of
burglaries [9]. According to burglary statistics, 65% of victims personally know who their burglars are, and there is a very high likelihood that a friend or neighbor may attempt the robbery. Because burglaries happen quickly and it can be difficult for both homeowners and the police to identify the perpetrator, only 13% of reported burglary cases are resolved by the police. To combat the increase in home invasions, we proposed
the development of a home security robot that can warn homeowners of an attempted break-in or burglary on
their property. This will aid in preventing house invasions and make it easier to identify and apprehend those
responsible. The four-legged robot moves around its environment, detects intruders, takes a picture of any intruder, and sends it along with a text message to the owner. If the intruder's presence persists, the robot makes canine sounds to alert the neighborhood. This quadruped home security robot has several essential components that allow it to detect intruders and capture images, and it is powered by rechargeable batteries.
Legged robots are mobile robots that make use of mechanical limbs for movement. They are similar
to other mobile robots, but their locomotion is more complicated than that of their wheeled counterparts
[10]. Quadruped robots have been around for some time now. There has been a lot of interest in mobile
robots over the last three decades because they can explore complex environments, perform rescue
operations, and complete tasks without the need for human intervention [11]. In 1870, the first legged robot
system design was made by Chebyshev with dual-axis leg motion; it was a simple 4-link system.
A researcher named Rygg developed the mechanical horse in 1893, which led to further advances in legged
robotic systems. This entirely mechanical design included a pedal. Ralph Mosher and General Electric later created the Walking Truck in 1965, a massive electromechanical model that
could track mechanical movements using electrical inputs [12]. Further advances in quadruped robots were
made possible by the integration of biological inspiration into robotic design. Notable examples of these
robots include the Massachusetts Institute of Technology (MIT) Cheetah [13], Boston Dynamics' Spot Mini, and Tekken [14], all of which demonstrated stable and fluid gait patterns.
Low-cost quadruped robots have been designed using either wholly three-dimensional (3D)-printed parts or a combination of 3D-printed parts and carbon fiber. Low-cost servo motors are used in place
of hydraulics, and proportional integral derivative (PID) controlled brushless direct current (DC) motors are
used to move the legs. The Stanford Pupper is a quadruped robot constructed of carbon fiber and 3D-printed
parts that uses 12 high-voltage servo motors and a Raspberry Pi to regulate the gait logic. It was created by
robotics students at Stanford University. The SpotMicro robot is another inexpensive quadruped, constructed primarily from 3D-printed components and powered by an Arduino, a NodeMCU, a Teensy board, and a Raspberry Pi.
Efe and Ogunlere [15] developed a highly customizable and inexpensive mobile home security
system by connecting sensors to an Arduino ATmega2560 microcontroller to enable communication with the
signal input from the sensors. When the passive infrared (PIR) sensor detects an intrusion, the system notifies the administrator application via short message service (SMS). The system also included a smartphone application that communicates with the home security system, enabling homeowners to secure their residences from their smartphones. However, the system could only detect intruders using the PIR sensor, which can be set off by anything emitting infrared (IR) radiation, and it could not capture the faces of intruders.
Dhakolia et al. [16] designed a robot capable of operating in rough terrain for surveillance and
monitoring. The methodology used included a robotic arm controlled by an ESP32 development board, an
Arduino Mega serving as the robot's brain, an ultrasonic sensor for obstacle avoidance, and a night vision
camera unit with a radio receiver. Aluminium was used to create the robot's body. The study concluded that a
four-legged walking robot is an efficient surveillance device with a wide range of applications. Also,
an internet of things (IoT)-based door security system for home automation was developed in [17].
A door lock system was built with face detection and recognition, and an email alert system was developed
using a web camera that captures an image and sends it to the Raspberry Pi when motion is detected by a PIR
sensor device. The system compares the captured image with the image stored in the database. If the image is
in the database, the door will automatically open; otherwise, an SMS warning will be sent to the user with the
aid of a global system for mobile communication (GSM) module, and the door will remain locked.
Al-Obaidi et al. [18] developed a wireless-controlled mobile robot with low cost and low power
consumption for surveillance applications. Arduino and Raspberry Pi (low-cost open-source hardware) were
used for motion control and the main processing unit, respectively. The mobile robot uses sensors to track
physical events in its immediate surroundings while wirelessly communicating with a control station.
Without needing to recharge its battery, the robot can operate continuously at 25 rpm for 6.5 hours.
Kim et al. [19] designed and developed an open-source quadruped robot using a single-board
computer (SBC) with a graphical processing unit (GPU), an onboard depth sensor, and off-the-shelf quasi-
direct drive actuators. Two independent single-board computers (SBCs) were primarily utilized to manage
the motion control and perception tasks. Motion is managed by a Lattepanda Alpha SBC, while vision is
handled by an NVIDIA Jetson Xavier NX SBC. RMD-X8 and RMD-X8 Pro actuators were used. Separate three-cell LiPo batteries were used to power the single-board computers and the actuators. To make the frame, polylactic acid (PLA) filament was utilized. The 12.7 kg quadruped robot was designed with a forward walking velocity of 1.0 m/s and an average power usage of 81.6 W, enabling steady dynamic trot-walking. Shi et al. [20]
also designed, simulated, and constructed a quadruped robot dog as a vehicle. An investigation of the forward and inverse kinematics solutions based on the Denavit-Hartenberg (D-H) approach laid the foundation for the gait algorithm.
The VL53L1X time-of-flight sensor measures distance by timing how long it takes for infrared laser light pulses to be emitted, reach the closest body or object, and be reflected back to a detector. This ensures accurate distance measurement regardless of the target's color, texture, reflectivity, and other characteristics.
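As a simple illustration of this time-of-flight principle, the distance follows from half the round-trip time of the light pulse. The sketch below is only illustrative; the VL53L1X performs this computation internally, and the round-trip time used is a made-up value.

```python
# Illustrative time-of-flight distance calculation; the VL53L1X performs this
# internally, and the round-trip time below is a made-up example value.
SPEED_OF_LIGHT = 299_792_458  # m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the target from the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 2 ns round trip corresponds to roughly 0.30 m to the nearest object.
print(f"{tof_distance_m(2e-9):.3f} m")
```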
2.2. Method
The system consists of an ESP32 microcontroller and a Raspberry Pi single-board computer (SBC). The Raspberry Pi acts as the main brain of the system, while the ESP32 is in charge of the motion of the legs. The system is powered by a battery with a nominal voltage of 7.4 V (i.e., a 2S battery). A 2S battery indicator is connected in parallel with the power source to indicate the approximate voltage range, and a switch allows power to the circuit to be turned on and off. The 7.4 V from the power source goes directly to the PCA9685 16-channel PWM driver and to an LM2596 DC-DC buck converter, which steps the 7.4 V down to 5 V for the ESP32 and the Raspberry Pi.
The servo motors are connected to a PCA9685 16-channel PWM driver. The right front tibia, right front femur, and right front coxa servo motors are connected to PWM channels 1 to 3, respectively. The left front tibia, left front femur, and left front coxa servo motors are connected to PWM channels 5 to 7, respectively. The left rear coxa, left rear tibia, and left rear femur servo motors are connected to PWM channels 9 to 11, respectively. The right rear coxa, right rear tibia, and right rear femur servo motors are connected to PWM channels 13 to 15, respectively. The PCA9685 16-channel PWM driver is connected to the ESP32 board via its I2C pins, namely the serial data (SDA) and serial clock (SCL) pins. The MPU6050 gyroscope sensor is connected to the same I2C bus and is used by the ESP32 for stabilization of the gait. The PCA9685 16-channel PWM driver and the MPU6050 gyroscope sensor are powered by 3.3 V from the ESP32 board.
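The wiring above can be summarised as a joint-to-channel map together with the 12-bit on-time value the PCA9685 expects for a given servo angle. In the actual build this logic lives in the ESP32 firmware; the Python sketch below only illustrates the mapping and the arithmetic, and the 500-2500 µs pulse range over a 50 Hz frame is an assumption for typical hobby servos rather than a measured value.

```python
# Sketch of the joint-to-channel mapping described above and the 12-bit
# duty-cycle count the PCA9685 needs for a given servo angle. Pulse widths
# (500-2500 us over 0-180 degrees) and the 50 Hz frame are assumed values
# for typical hobby servos, not measurements from this build.
PWM_FREQ_HZ = 50
PERIOD_US = 1_000_000 // PWM_FREQ_HZ          # 20 000 us frame
MIN_PULSE_US, MAX_PULSE_US = 500, 2500        # assumed servo end points

CHANNELS = {
    "right_front": {"tibia": 1, "femur": 2, "coxa": 3},
    "left_front":  {"tibia": 5, "femur": 6, "coxa": 7},
    "left_rear":   {"coxa": 9, "tibia": 10, "femur": 11},
    "right_rear":  {"coxa": 13, "tibia": 14, "femur": 15},
}

def angle_to_counts(angle_deg: float) -> int:
    """Convert a 0-180 degree joint angle to a PCA9685 12-bit on-time count."""
    pulse_us = MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle_deg / 180.0
    return round(pulse_us * 4096 / PERIOD_US)

# Example: centre the right-front femur servo at 90 degrees.
print(CHANNELS["right_front"]["femur"], angle_to_counts(90.0))  # channel 2, count 307
```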
Three VL53L1X time-of-flight sensors are connected to the Raspberry Pi's I2C bus through a TCA9548 I2C multiplexer. The VL53L1X time-of-flight sensors enable the robot to detect and avoid obstacles while moving. The TCA9548 I2C multiplexer and the VL53L1X time-of-flight sensors are powered by 3.3 V from the Raspberry Pi or 5 V from an LM2596 DC-DC buck converter, depending on the final current draw. An infrared camera is connected to the Raspberry Pi board to capture images. The Raspberry Pi and the ESP32 board communicate with each other through their transmit (TX) and receive (RX) pins. The circuit diagram is illustrated in Figure 7, and the block diagram of all the segments is shown in Figure 8. A printed circuit board (PCB), shown in Figure 9, was designed using Altium Designer to place the Raspberry Pi, the ESP32 board, the MPU6050, and the LM2596 DC-DC buck converter on one board. Because the PCB could not be fabricated, the design was built on a double-sided 9 cm × 15 cm Vero board.
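A minimal sketch of how the three sensors could be polled from the Raspberry Pi through the multiplexer is shown below. It assumes the Adafruit CircuitPython drivers for the VL53L1X and the TCA9548A are installed; the multiplexer channel assignments and the 30 cm stop threshold are illustrative only, not values taken from the build.

```python
# Poll three VL53L1X sensors behind a TCA9548A multiplexer from the Raspberry
# Pi, assuming the adafruit-circuitpython-vl53l1x and
# adafruit-circuitpython-tca9548a packages are installed. Channel numbers and
# the 30 cm threshold are assumptions for illustration.
import board
import adafruit_tca9548a
import adafruit_vl53l1x

OBSTACLE_THRESHOLD_CM = 30          # assumed stop distance
MUX_CHANNELS = (0, 1, 2)            # assumed: left, centre, right sensors

i2c = board.I2C()                   # Raspberry Pi SDA/SCL pins
mux = adafruit_tca9548a.TCA9548A(i2c)

sensors = [adafruit_vl53l1x.VL53L1X(mux[ch]) for ch in MUX_CHANNELS]
for s in sensors:
    s.start_ranging()

def obstacle_ahead() -> bool:
    """True if any sensor sees something closer than the threshold."""
    for s in sensors:
        if s.data_ready:
            distance_cm = s.distance        # None if nothing is in range
            s.clear_interrupt()
            if distance_cm is not None and distance_cm < OBSTACLE_THRESHOLD_CM:
                return True
    return False
```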
2.2.1. Flowchart
The flowchart demonstrating the operation flow of the quadruped home surveillance robot is shown in Figure 10.
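The condensed Python sketch below mirrors the operation flow described for Figure 10. Every helper in it is a stub standing in for behaviour described in the text (gait commands, PIR and camera checks, SMS alerts, canine sounds); none of it is the actual firmware.

```python
# Condensed sketch of the operation flow of Figure 10; all helpers are stubs.
import random
import time

def patrol_step():       print("take one gait cycle")            # ESP32 gait command
def turn_away():         print("turn toward a clear direction")
def obstacle_ahead():    return random.random() < 0.2            # stand-in for ToF check
def intruder_detected(): return random.random() < 0.1            # stand-in for PIR + camera
def capture_image():     return b"jpeg bytes"                    # stand-in for IR camera frame
def send_alert(image):   print("picture and text message sent to owner")
def play_canine_sound(): print("barking to alert the neighborhood")

def surveillance_loop(cycles: int = 50):
    for _ in range(cycles):
        if obstacle_ahead():
            turn_away()
        else:
            patrol_step()
        if intruder_detected():
            send_alert(capture_image())
            if intruder_detected():       # intruder still present on re-check
                play_canine_sound()
        time.sleep(0.1)

surveillance_loop()
```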
Determining $\theta_1$:

$\theta_1 = \alpha_3 - \alpha_1$   (1)

In $\triangle ADC$, $\square ABCE$, and $\triangle ABC$,

$\alpha_2 = 90^{\circ} - \alpha_4$   (4)

$\alpha_4 = \arctan\left(\frac{-y_4}{x_4}\right)$   (7)

Determining $\theta_2$: from $\square ABFE$ and $\triangle ABC$,

$\alpha_3 = \arctan\left(\frac{-z_4}{\sqrt{x_4^2 + y_4^2 - L_1^2}}\right)$   (11)

From $\triangle ACF$,

$\alpha_2 = \arctan\left(\frac{CF}{AF}\right)$   (12)

Similarly,

$\alpha_2 = \arctan\left(\frac{L_3 \sin\theta_3}{L_2 + L_3 \cos\theta_3}\right)$   (14)

$\alpha_1 = 90^{\circ} - \arctan\left(\frac{L_3 \sin\theta_3}{L_2 + L_3 \cos\theta_3}\right) - \arctan\left(\frac{-z_4}{\sqrt{x_4^2 + y_4^2 - L_1^2}}\right)$   (15)

From (13),

$\theta_2 = -\arctan\left(\frac{L_3 \sin\theta_3}{L_2 + L_3 \cos\theta_3}\right) - \arctan\left(\frac{-z_4}{\sqrt{x_4^2 + y_4^2 - L_1^2}}\right)$   (16)

Determining $\theta_3$: from $\triangle ACD$ and $\triangle ABC$,

$\theta_3 = 180^{\circ} - \alpha$   (17)
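The recovered relations above can be collected into a short inverse-kinematics routine. The sketch below follows equation (16) for $\theta_2$; the coxa angle $\theta_1$ and the law-of-cosines step behind $\theta_3$ are filled in from standard three-degree-of-freedom leg geometry because those parts of the derivation are not fully legible here, so the sign conventions should be treated as assumptions to be checked against the real robot.

```python
# Sketch of the leg inverse kinematics implied by equations (1)-(17). The
# theta_2 expression follows (16) directly; theta_1 and the law-of-cosines
# step for theta_3 are filled in from standard 3-DoF leg geometry and are
# assumptions, as are the sign conventions.
import math

L1, L2, L3 = 60.0, 115.0, 135.0   # coxa, femur, tibia lengths in mm

def leg_ik(x4: float, y4: float, z4: float) -> tuple[float, float, float]:
    """Joint angles (degrees) that place the foot at (x4, y4, z4) in the hip frame."""
    # Coxa yaw: rotate the leg plane toward the foot (assumed form).
    theta1 = math.degrees(math.atan2(y4, x4))

    # Effective reach of the femur-tibia pair, as in equation (11).
    d = math.sqrt(x4**2 + y4**2 - L1**2)
    g = math.hypot(d, z4)                       # femur joint to foot

    # Knee angle from the law of cosines, then theta_3 = 180 - alpha as in (17).
    cos_alpha = (L2**2 + L3**2 - g**2) / (2 * L2 * L3)
    alpha = math.degrees(math.acos(max(-1.0, min(1.0, cos_alpha))))
    theta3 = 180.0 - alpha

    # Femur pitch from equation (16).
    t3 = math.radians(theta3)
    theta2 = -math.degrees(math.atan2(L3 * math.sin(t3), L2 + L3 * math.cos(t3))) \
             - math.degrees(math.atan2(-z4, d))
    return theta1, theta2, theta3
```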
The Denavit-Hartenberg (D-H) tables detailing the position of the leg in the resting and standing positions are shown in Tables 1 and 2, respectively. The lengths of the leg segments are:
L1 = 60 mm [length of the coxa]
L2 = 115 mm [length of the femur]
L3 = 135 mm [length of the tibia]
Table 1. D-H table for resting position
Frame   a_{i-1}   α_{i-1}   d_i   θ_i
0–1     60        0         0     0°
1–2     0         -90°      0     90°
2–3     115       0         0     -81.5°
3–4     135       0         0     -27.9°

Table 2. D-H table for standing position
Frame   a_{i-1}   α_{i-1}   d_i   θ_i
0–1     60        0         0     0°
1–2     0         -90°      0     90°
2–3     115       0         0     -59.0°
3–4     135       0         0     2.3°
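As a numerical cross-check on Tables 1 and 2, the link transforms can be chained to give the foot position. The sketch below assumes the modified (Craig) D-H convention implied by the a_{i-1} and α_{i-1} column headings; it is offered as an illustration, not as the authors' code.

```python
# Forward kinematics from the D-H parameters in Table 2, assuming the modified
# (Craig) convention implied by the a_{i-1}, alpha_{i-1} column headings.
import numpy as np

def dh_transform(a_prev: float, alpha_prev_deg: float, d: float, theta_deg: float) -> np.ndarray:
    """Homogeneous transform for one link in the modified D-H convention."""
    al, th = np.radians(alpha_prev_deg), np.radians(theta_deg)
    return np.array([
        [np.cos(th),              -np.sin(th),               0.0,         a_prev],
        [np.sin(th) * np.cos(al),  np.cos(th) * np.cos(al), -np.sin(al), -np.sin(al) * d],
        [np.sin(th) * np.sin(al),  np.cos(th) * np.sin(al),  np.cos(al),  np.cos(al) * d],
        [0.0,                      0.0,                       0.0,         1.0],
    ])

# Rows of Table 2 (standing position): (a_{i-1}, alpha_{i-1}, d_i, theta_i).
STANDING = [(60, 0, 0, 0), (0, -90, 0, 90), (115, 0, 0, -59.0), (135, 0, 0, 2.3)]

T = np.eye(4)
for row in STANDING:
    T = T @ dh_transform(*row)
print("Foot position in the hip frame (mm):", np.round(T[:3, 3], 1))
```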
Figure 14. Top view of completely assembled quadruped robot
Figure 15. Side view of completely assembled quadruped robot
3.2. Camera
The camera module used a 5 MP image sensor with its infrared (IR) filter removed, enabling it to capture infrared light. This allowed the camera to capture more detail even where visible light was low. When the camera was paired with an IR light source, it provided illumination in dark conditions without revealing the light source, or the fact that video was being captured, to a potential intruder. Video was captured at 60 fps at 1080p, providing image quality good enough for further processing.
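As an illustration of the capture-then-detect pattern applied to these frames, the sketch below runs OpenCV's bundled Haar cascade over a saved frame. The actual system uses a TensorFlow person detector and a separate face-recognition step; this stand-in only shows the general pattern, and the file name is hypothetical.

```python
# Illustrative face-detection pass over a captured frame using OpenCV's
# bundled Haar cascade; "capture.jpg" is a hypothetical file name.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("capture.jpg")
if frame is None:
    raise SystemExit("capture.jpg not found")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:                       # box every detected face
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("capture_annotated.jpg", frame)
print(f"{len(faces)} face(s) detected")
```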
Figure 16. Face recognition program identifying the face of the group members
Figure 17. Object detection program detecting a person, a chair, and a laptop
4. CONCLUSION
Home surveillance is a growing field of interest for many people who want to protect their property
and privacy from intruders, thieves, or other threats. The use of a quadruped robot for home surveillance
ensures that more ground is covered when monitoring the home. A quadruped robot using high-voltage servo motors, time-of-flight sensors, a 5 MP camera, and a PIR sensor was successfully developed and tested. The system was able to identify a human being using the TensorFlow library, recognize the person's face, detect obstacles in its path using the time-of-flight sensors, send a message to the homeowner when an intruder is detected, and alert nearby people using a canine-sound alert system. The quadruped robot was designed and developed using low-cost and easy-to-access materials, making the overall build cost-efficient. Testing showed that the system can operate within a specified space for about 20 minutes.
In summary, the quadruped home surveillance robot addresses the need for an advanced and interactive home security solution that overcomes the limitations of traditional surveillance systems. The robot was designed to be reliable and efficient, making it suitable for a variety of home security operations.
REFERENCES
[1] J. J. Craig, Introduction to Robotics: Mechanics and Control, 3rd ed., 2004.
[2] D. W. Marhefka, D. E. Orin, J. P. Schmiedeler, and K. J. Waldron, “Intelligent control of quadruped gallops,” IEEE/ASME
Transactions on Mechatronics, vol. 8, no. 4, pp. 446–456, Dec. 2003, doi: 10.1109/TMECH.2003.820001.
[3] M. Makulavičius, S. Petkevičius, J. Rožėnė, A. Dzedzickis, and V. Bučinskas, “Industrial robots in mechanical machining:
perspectives and limitations,” Robotics, vol. 12, no. 6, Nov. 2023, doi: 10.3390/robotics12060160.
[4] I. Lee, “Service robots: a systematic literature review,” Electronics, vol. 10, no. 21, Oct. 2021, doi: 10.3390/electronics10212658.
[5] M. Ben-Ari and F. Mondada, “Robots and their applications,” in Elements of Robotics, Cham: Springer International Publishing,
2018, pp. 1–20. doi: 10.1007/978-3-319-62533-1_1.
[6] Z. Gan, T. Wiestner, M. A. Weishaupt, N. M. Waldern, and C. David Remy, “Passive dynamics explain quadrupedal walking,
trotting, and tölting,” Journal of Computational and Nonlinear Dynamics, vol. 11, no. 2, Mar. 2016, doi: 10.1115/1.4030622.
[7] M. Fairchild, “Types of industrial robots and their different uses,” HowToRobot: Connecting the World of Robots, 2021.
[8] F. Rubio, F. Valero, and C. Llopis-Albert, “A review of mobile robots: concepts, methods, theoretical framework, and
applications,” International Journal of Advanced Robotic Systems, vol. 16, no. 2, Mar. 2019, doi: 10.1177/1729881419839596.
[9] D. Hockey, “Burglary crime scene rationality of a select group of non-apprehend burglars,” SAGE Open, vol. 6, no. 2, Apr. 2016,
doi: 10.1177/2158244016640589.
[10] P. Biswal and P. K. Mohanty, “Development of quadruped walking robots: a review,” Ain Shams Engineering Journal, vol. 12,
no. 2, pp. 2017–2031, Jun. 2021, doi: 10.1016/j.asej.2020.11.005.
[11] P. Gonzalez de Santos, E. Garcia, and J. Estremera, Quadrupedal Locomotion. London: Springer London, 2006. doi: 10.1007/1-
84628-307-8.
[12] B. Sandeep and P. Tamil Selvan, “Design and development of an autonomous quadruped robot,” IOP Conference Series:
Materials Science and Engineering, vol. 1012, no. 1, Jan. 2021, doi: 10.1088/1757-899X/1012/1/012016.
[13] G. Bledt, M. J. Powell, B. Katz, J. Di Carlo, P. M. Wensing, and S. Kim, “MIT Cheetah 3: design and control of a robust,
dynamic quadruped robot,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 2018, pp.
2245–2252. doi: 10.1109/IROS.2018.8593885.
[14] Y. Fukuoka, H. Kimura, Y. Hada, and K. Takase, “Adaptive dynamic walking of a quadruped robot ‘Tekken’ on irregular terrain
using a neural system model,” in 2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422), pp.
2037–2042. doi: 10.1109/ROBOT.2003.1241893.
[15] E. E. Efe and S. Ogunlere, “Design and implementation of a mobile-based home security system,” American Scientific Research
Journal for Engineering, Technology, and Sciences (ASRJETS), vol. 72, no. 1, pp. 101–112, 2020.
[16] M. Dhakolia, P. Chalke, S. Baniya, and A. Desai, “Four legged walking robot for surveillance,” International Research Journal of
Engineering and Technology (IRJET), vol. 8, no. 05, pp. 1457–1463, 2021.
[17] K. N. L, B. K. G, D. V D, H. K. G. R, and J. J, “Home automation system with security using Raspberry-Pi,” International
Research Journal of Engineering and Technology (IRJET), vol. 7, no. 6, pp. 1525–1530, 2020.
[18] A. S. M. Al-Obaidi, A. Al-Qassa, A. R. Nasser, A. Alkhayyat, A. J. Humaidi, and I. K. Ibraheem, “Embedded design and
implementation of mobile robot for surveillance applications,” Indonesian Journal of Science and Technology, vol. 6, no. 2, pp.
427–440, 2021.
[19] J. Kim, T. Kang, D. Song, and S.-J. Yi, “Design and control of an open-source, low cost, 3D printed dynamic quadruped robot,”
Applied Sciences, vol. 11, no. 9, Apr. 2021, doi: 10.3390/app11093762.
[20] Y. Shi, S. Li, M. Guo, Y. Yang, D. Xia, and X. Luo, “Structural design, simulation and experiment of quadruped robot,” Applied
Sciences, vol. 11, no. 22, Nov. 2021, doi: 10.3390/app112210705.
[21] M. Babiuch, P. Foltynek, and P. Smutny, “Using the ESP32 microcontroller for data processing,” in 2019 20th International
Carpathian Control Conference (ICCC), May 2019, pp. 1–6. doi: 10.1109/CarpathianCC.2019.8765944.
[22] “Raspberry Pi 3 Model A+.” https://www.raspberrypi.org/app/uploads/2018/11/Raspberry_Pi_3A_product_brief.pdf (accessed
Oct. 22, 2023).
[23] S. Oluyemi Owoeye, F. Durodola, and J. Odeyemi, “Object detection and tracking: exploring a deep learning approach and other
techniques,” in Video Data Analytics for Smart City Applications: Methods and Trends, BENTHAM SCIENCE PUBLISHERS,
2023, pp. 37–53. doi: 10.2174/9789815123708123010006.
[24] M. H. Rahman, S. B. Alam, T. Das Mou, M. F. Uddin, and M. Hasan, “Dynamic approach to low-cost design, development, and
computational simulation of a 12DoF quadruped robot,” Robotics, vol. 12, no. 1, Feb. 2023, doi: 10.3390/robotics12010028.
[25] K. Xu, P. Zi, and X. Ding, “Gait analysis of quadruped robot using the equivalent mechanism concept based on metamorphosis,”
Chinese Journal of Mechanical Engineering, vol. 32, no. 1, Dec. 2019, doi: 10.1186/s10033-019-0321-2.
[26] T. Srinivas et al., “Valkyrie—design and development of gaits for quadruped robot using particle swarm optimization,” Applied
Sciences, vol. 11, no. 16, Aug. 2021, doi: 10.3390/app11167458.
[27] D. Dholakiya et al., “Design, development and experimental realization of a quadrupedal research platform: stoch,” in 2019 5th
International Conference on Control, Automation and Robotics (ICCAR), Apr. 2019, pp. 229–234. doi:
10.1109/ICCAR.2019.8813480.
BIOGRAPHIES OF AUTHORS