
DEVELOPMENT OF GPS CONTROLLED SELF

DRIVING MOBILE ROBOT FOR INDUSTRIAL


PACKAGE DELIVERY

BY:
M.L.K. NILUPUL
(TG/2018/308)
G.L.S. JAGATHPRIYA
(TG/2018/333)
H.P. SANDARUWAN
(TG/2018/344)

BACHELOR OF ENGINEERING TECHNOLOGY (HONOURS) DEGREE
DEPARTMENT OF ENGINEERING TECHNOLOGY
FACULTY OF TECHNOLOGY
UNIVERSITY OF RUHUNA
SRI LANKA

DECEMBER 2023

DEVELOPMENT OF GPS CONTROLLED SELF DRIVING
MOBILE ROBOT FOR INDUSTRIAL PACKAGE DELIVERY

By
M.L.K. NILUPUL
G.L.S. JAGATHPRIYA
H.P. SANDARUWAN
Submitted to
Department of Engineering Technology
Faculty of Technology
In partial fulfillment of the requirements for the
Degree of Bachelor of Engineering Technology (Honours)

…………………………… ………………………………
Supervisor, Co-Supervisor,
Lecturer (Probationary), Temporary Lecturer,
Mr. Chanaka Weerarathne, Miss Lasini Wickramasinghe,
Department of Engineering Technology, Department of Engineering Technology,
Faculty of Technology, Faculty of Technology,
University of Ruhuna, University of Ruhuna,
Sri Lanka. Sri Lanka.

Date :………………………… Date :…………………………

DECLARATION

We honestly declare that this thesis, which we have written ourselves, does not
contain the work or parts of the work of other people, except where cited in
quotations and references, as a scientific paper should.

Signature Date

Nilupul M.L.K (TG/2018/308)

Jagathpriya G.L.S (TG/2018/333)

Sandaruwan H.P (TG/2018/344)

ACKNOWLEDGMENT

We would like to express our profound gratitude to Dr. Hasini Vitharana (Head of
the Department of Engineering Technology, Faculty of Technology, University of
Ruhuna) and Prof. Subash Jayasinghe (Dean of the Faculty of Technology,
University of Ruhuna) for their contributions to the completion of our project,
titled "Development of a GPS-Controlled Self-Driving Mobile Robot for Industrial
Outdoor Package Delivery".

We would like to express our special thanks to our supervisor, Mr. Chanaka
Weerarathne, Lecturer (Probationary), for the time and effort he provided
throughout the year. His useful advice and suggestions were helpful to us during
the project's completion, and in this aspect we are eternally grateful to him.

We are grateful to our co-supervisor, Miss Lasini Wickramasinghe, for giving
advice, encouragement, guidance, and all the support needed at every necessary
time to make this study successful.

We also thank Mr. Neel Karunasena for his advice, encouragement, guidance, and
support at every necessary time throughout this study.

We extend our sincere gratitude to all lecturers in the Department of Engineering
Technology and the Faculty of Technology for their encouragement, guidance,
advice, and support extended throughout the period of study.

Finally, we would also like to thank our friends of the Faculty of Technology
2018 batch.

We would like to acknowledge that this project was completed entirely by us and not
by someone else.

ABSTRACT

Nowadays, different methods are used around the world to transport packages from
one place to another. Among these, Auto Guided Vehicles (AGVs) take a major
place, performing tasks with minimal human interaction. Primarily employed in
indoor settings across various industries, AGVs falter outdoors: the dynamic
nature of outdoor environments makes it challenging for robots to sense and
respond quickly to changes. This study aims to bridge this critical gap by
presenting a novel solution for outdoor package transportation. The objective is
to develop a self-driving mobile robot for outdoor package delivery in an
efficient, secure, and low-cost manner. The robot was designed and simulated
using SolidWorks CAD software, and materials were selected to match the design.
It integrates a dual-controller architecture using a Raspberry Pi and an
STM32-based Pixhawk flight controller. Navigation and control were achieved
through a GPS-guided system, which enables effortless waypoint-based path
planning using Mission Planner software and a "Smart Return to Launch" mode for
autonomous return trips. The robot uses a sensor system to achieve obstacle
avoidance, integrating sonar sensors and a camera module with AI-powered
computer vision programmed in Python. Additionally, the OpenCV, Keras/TensorFlow,
NumPy, and PyTorch libraries were used, and the YOLOP algorithm was used to
detect objects and track the lane. Communication between the Raspberry Pi and
the flight controller was achieved via the UART protocol; the flight controller
operates the robot based on this communicated data and the GPS position. The
robot's six-motor rocker bogie mechanism allows smooth travel over uneven
terrain and lets it climb over bumps. An OTP-based security system secures the
packages until the robot reaches its destination. The robot's telemetry can
communicate with the ground station computer within a 2 km radius, enabling
package delivery in this range. Carrying a 10 kg payload, the robot can travel
more than 2.75 km per charge, tackling deliveries efficiently. Test results
showed successful navigation along predefined paths and effective obstacle
avoidance, with good overall performance in outdoor package transportation.
These findings highlight the potential of this low-cost, GPS-controlled
self-driving mobile robot to address outdoor package delivery in Sri Lankan
industries.

Keywords: Autonomous Mobile Robot, Unmanned Ground Vehicle, Package Delivery, GPS, Self-Driving, Computer Vision

TABLE OF CONTENT

DECLARATION.......................................................................................................... iii
ACKNOWLEDGMENT............................................................................................... iv
ABSTRACT................................................................................................................... v
LIST OF FIGURES ................................................................................................... viii
LIST OF ABBREVIATIONS.......................................................................................... x
CHAPTER 01 – INTRODUCTION .............................................................................. 1
1.1 Background ..................................................................................................... 1
1.2 Problem Statement .......................................................................................... 2
1.3 Justification ..................................................................................................... 2
1.4 Objectives ........................................................................................................ 3
CHAPTER 02 – LITERATURE REVIEW ..................................................................... 4
Existing Research and Solutions ................................................................................ 4
Gaps and Future Directions ........................................................................................ 9
CHAPTER 03 – MATERIALS AND METHODS ........................................................ 11
3.1 Materials ............................................................................................................. 11
3.2 Method ............................................................................................................... 23
3.2.1 Structure Design .......................................................................................... 23
3.2.2 Material Selection ........................................................................................ 24
3.2.3 Base Structure .............................................................................................. 24
3.2.4 Wiring Diagram ............................................................................................ 25
3.2.5 Firmware Installation ................................................................................... 25
3.2.7 Parameter Adjustment and Change.............................................................. 28
3.2.5 Basic Tuning of the Controller .................................................................... 29
3.2.6 Controller Calibration .................................................................................. 32
3.2.8 Motor Controlling Method .......................................................................... 38
3.2.9 Develop Obstacle avoiding method 01 (Sonar Range Finder) .................... 40
3.2.12 Develop a Rocker Bogie Mechanism ........................................................ 42
3.2.14 Develop a Security System ........................................................................ 44
3.2.15 Develop an Obstacle Avoiding method 02 (Machine vision) .................... 45
3.2.16 Finalize and Testing ................................................................................... 47
CHAPTER 04 – RESULTS AND DISCUSSION ........................................................ 48
4.1 Test Results of Package Delivery Capability ..................................................... 48
4.2 Test without Obstacle avoidance. ....................................................................... 48

4.3 Test with avoidance method 01 – Using sonar sensors. ..................................... 48
4.4 Test with avoidance method 02 – Using Machine Vision .................................. 49
4.5 Security System of the Rover ............................................................................. 51
4.6 Rocker Bogie Mechanism .................................................................................. 51
4.7 Discussion About the Overall System ................................................................ 52
CHAPTER 05 – CONCLUSION ................................................................................ 54
REFERENCES ............................................................................................................. 55

LIST OF FIGURES

Figure 1: Obstacles Identified Using the UGV................................................................ 5


Figure 2: Flight Controller ........................................................................................... 11
Figure 3: GPS Module ................................................................................................. 12
Figure 4: Ground Station Software Interface ............................................................... 13
Figure 5: V5 Radio Telemetry...................................................................................... 14
Figure 6: Raspberry Pi ................................................................................................. 14
Figure 7: OV5647 Camera Module ............................................................................. 15
Figure 8: GY-US042v2 Sonar sensor ........................................................................... 15
Figure 9 : Arduino Uno ................................................................................................ 16
Figure 10 : GSM Module ............................................................................................. 17
Figure 11: Solenoid Lock ............................................................................................. 17
Figure 12: Keypad........................................................................................................ 17
Figure 13: 16×2 I2C Display ....................................................................................... 18
Figure 15: Electronic Speed Controller (ESC) ............................................................ 18
Figure 16: DC Gear Motor ........................................................................................... 19
Figure 17: 130mm wheels ............................................................................................ 19
Figure 18: 12V Lead-acid Battery ............................................................................... 20
Figure 19: Power Module ............................................................................................ 20
Figure 20: RC Transmitter and receiver....................................................................... 21
Figure 21: Buck Converter........................................................................................... 21
Figure 22: SolidWorks Design of the Rover ................................................................ 23
Figure 23: Metal sheet & Aluminum Box Bar ............................................................. 24
Figure 24: Base structure of the Rover ........................................................................ 24
Figure 25: Wiring Diagram .......................................................................................... 25
Figure 26: Device Manager ......................................................................................... 26
Figure 27: Cube Programmer interface part 01 ........................................................... 27
Figure 28: Cube Programmer interface part 02 ........................................................... 27
Figure 29: Rover Firmware Installation tab ................................................................. 28
Figure 30: Parameter list .............................................................................................. 28
Figure 31: Basic Tuning tab ......................................................................................... 29
Figure 32: Tuning turn rate .......................................................................................... 30
Figure 33 : Accelerometer Calibration tab ................................................................... 32
Figure 34: Compass Calibration tab ............................................................................ 33
Figure 35: Radio Calibration tab.................................................................................. 34
Figure 36: Radio calibration results ............................................................................. 34
Figure 37 : ESC Calibration ......................................................................................... 35
Figure 38: Flight modes setup...................................................................................... 36
Figure 39: Mission planning ........................................................................................ 36
Figure 40: Way point upload ........................................................................................ 37
Figure 41: Condition Delay ......................................................................................... 37
Figure 42 : Illustration of the Radii between motor and axis ...................................... 39
Figure 43 : Flow diagram of obstacles avoidance using sonars................................... 41
Figure 44 : Roll values in Radio calibration tab .......................................................... 42
Figure 45 : Rocker bogie mechanism design ............................................................... 43
Figure 46 : Rocker bogie Length ................................................................................. 43
Figure 47 : Rocker bogie height................................................................................... 44
Figure 48: Circuit diagram of the security system ....................................................... 45
Figure 49: Flow diagram of obstacles avoidance using Machine vision. .................... 46
Figure 50: Boundary Lines .......................................................................................... 46
Figure 51 : Object detection of real environment ........................................................ 49
Figure 52 : Object detection of virtual environment .................................................... 49
Figure 53 : Object detection ......................................................................................... 50
Figure 54 : Object detection and avoidance at testing time ........................................ 50
Figure 55 : Security system.......................................................................................... 51
Figure 56: Rocker bogie mechanism ........................................................................... 52

LIST OF ABBRIVIATIONS

UAV - Unmanned Aerial Vehicle

UGV - Unmanned Ground Vehicle

AGV - Auto Guided Vehicles

AMR - Autonomous Mobile Robot

GPS - Global Positioning System

MAVLink - Micro Air Vehicle Link

GSM - Global System for Mobile Communications

YOLO - You Only Look Once

YOLOP - You Only Look Once for Panoptic Driving Perception

OTP - One Time Password

RTK - Real Time Kinematics

LiDAR - Light Detection and Ranging

CPU - Central Processing Unit

RAM - Random Access Memory

LAN - Local Area Network

USB - Universal Serial Bus

HDMI - High-Definition Multimedia Interface

LED - Light-Emitting Diode

IDE - Integrated Development Environment

CAD - Computer-Aided Design

CAE - Computer-Aided Engineering

NASA - National Aeronautics and Space Administration

SMS - Short Message Service

CHAPTER 01 – INTRODUCTION

1.1 Background

From steam engines to today's cyber-physical systems, the world is evolving rapidly
into what is referred to as the Fourth Industrial Revolution, or Industry 4.0. These
technologies facilitate advancements in urban areas through modelling and
prediction, where logistics plays an integral role. A distant future involving
robotic package deliveries is now very much a reality. Thanks to advances in
robotics, GPS tracking, automation, and navigation, you might no longer find a
delivery person at your door with your package. The primary example of delivery
robots in action comes from Starship Technologies, a company based in San
Francisco with engineering facilities in Estonia and Finland. Starship Technologies
is the brainchild of Skype co-founders Janus Friis and Ahti Heinla, and it is
currently the largest "last mile" delivery robot company around. Its robots have a
cargo capacity of around 9 kg, can travel at a maximum speed of 4 mph, weigh around
25 kg, and cost over $5,000 to manufacture (Phillips, 2020). A delivery robot uses
many of the same features as an autonomous car. The basic idea is that an
autonomous delivery rover delivers small items from one place to another within a
short period of time. Using people to transport packages is a rather difficult
matter today (Kasangottuwar, 2017): people are restricted from travelling to some
places, and package delivery by humans wastes time. Many countries around the
world are stepping into unmanned package delivery. Here we present a solution for
Sri Lanka using a small, wheeled delivery rover: a self-driving mobile robot for
package distribution with some specific capabilities. In the modern world of
Industry 4.0, customers require suppliers to make deliveries in a short time,
which demands efficient and flexible logistics.

1.2 Problem Statement

Nowadays in the world, different methods are used to transport packages from one place
to another. Among these, Auto Guided Vehicles (AGV) take a major place performing
tasks with minimal human interaction. AGVs have found significant application in
indoor settings across various industries. However, their efficacy diminishes when
confronted with the complexities of outdoor environments. The inherent dynamism of
outdoor settings poses challenges for robots to swiftly sense and adapt to changes in
their surroundings.

This study aims to develop a self-driving mobile robot for efficient, secure, and
cost-effective outdoor package transportation. The autonomous system navigates the
challenges posed by the dynamic nature of outdoor environments, providing a
reliable and effective solution for outdoor logistics. The goal is to overcome the
limitations of traditional methods and ensure a more efficient and secure outdoor
delivery experience.

1.3 Justification

Today, many countries around the world are adopting unmanned package delivery.
Following this trend, we are developing a self-driving mobile robot for efficient
and secure package distribution. Delivery robots (or bots) are another form of
autonomous delivery. This innovative technology can improve efficiency in terms of
cost savings and lower the negative impact on the environment (Dr. Vilas Ubale et
al., 2023).

Primarily, an STM32-based flight controller with GPS path planning is employed for
this purpose, providing the path from a designated location to the delivery
destinations. The initial step in the automated delivery robot's perception
process involves collecting environmental data, typically facilitated by sensors
such as sonar sensors and a camera module. In this particular setup, the robot is
equipped with a camera linked to a Raspberry Pi. The camera captures images of the
surroundings, which the Raspberry Pi processes, transmitting this data to the
flight controller via the UART protocol. The flight controller, utilizing this
communicated data and the GPS position, guides the robot's operations.
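The UART exchange described above can be illustrated with a toy framing scheme. This is a sketch only: the actual system carries MAVLink messages, and the start byte, field layout, and checksum below are simplified assumptions made for illustration.

```python
import struct

def frame_packet(obstacle_flag: int, distance_cm: int) -> bytes:
    """Frame a two-field payload (obstacle flag + distance in cm) with a
    start byte and an additive checksum. Illustrative only; the real
    MAVLink wire format is more elaborate."""
    payload = struct.pack("<BH", obstacle_flag, distance_cm)
    checksum = sum(payload) & 0xFF
    return b"\xfe" + payload + bytes([checksum])

def parse_packet(packet: bytes) -> tuple:
    """Validate and decode a packet produced by frame_packet."""
    if packet[0] != 0xFE:
        raise ValueError("bad start byte")
    payload, checksum = packet[1:-1], packet[-1]
    if sum(payload) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    return struct.unpack("<BH", payload)
```

On the rover, the framed bytes would be written to the Raspberry Pi's serial port and decoded on the flight controller side.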

The robot uses GPS for real-time position updates and user-provided end destinations,
with starting and end coordinates as input. It also employs two sonar sensors for
obstacle avoidance.
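The two-sonar avoidance idea above can be sketched as a simple decision rule. The 100 cm threshold and the command names are hypothetical placeholders, not the rover's actual tuned values.

```python
# Hypothetical safety threshold; the real value would be tuned on the rover.
SAFE_DISTANCE_CM = 100

def avoidance_command(left_cm: float, right_cm: float) -> str:
    """Choose a steering action from the left and right front sonar readings."""
    left_blocked = left_cm < SAFE_DISTANCE_CM
    right_blocked = right_cm < SAFE_DISTANCE_CM
    if left_blocked and right_blocked:
        return "stop"          # no clear path: halt until it clears
    if left_blocked:
        return "steer_right"   # obstacle on the left side
    if right_blocked:
        return "steer_left"    # obstacle on the right side
    return "forward"           # path is clear
```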

The rover features a security system that sends an OTP to the user upon reaching a
designated location, allowing access to the parcel. The rover's six-motor rocker bogie
mechanism ensures smooth travel over uneven terrain and allows it to climb over
bumps, enhancing its adaptability to various terrains.

1.4 Objectives

• Our primary objective is to develop a self-driving mobile robot that can be used
for package delivery applications in outdoor industrial settings.
• Our secondary objective is to improve the performance of the mobile robot
using GPS path planning, machine learning, and obstacle avoidance using machine
vision.
• To add a rocker bogie mechanism that allows the rover to climb over small
bumps.
• To implement a GSM-based security mechanism that notifies the user and unlocks
the package after the robot reaches the designated location.

CHAPTER 02 – LITERATURE REVIEW

The industrial landscape is undergoing a transformation due to automation and
efficiency demands. GPS-controlled self-driving mobile robots (AMRs) are
revolutionizing intralogistics in factories, warehouses, and distribution centers.
This literature review explores research on developing and implementing AMRs for
industrial package delivery applications.

Existing Research and Solutions

The literature reviewed provides valuable insights into various strategies to
tackle these challenges.

• ODS-Bot: Mobile Robot Navigation for Outdoor Delivery Services

Delivery robots are tasked with navigating through various obstacles and adapting
to diverse environmental conditions. While there are numerous effective
technological solutions for indoor applications, a multitude of challenges remain
unresolved in outdoor environments. This study focuses on three technological
obstacles that impede the advancement of campus delivery robots. The first
involves achieving reliable localization in diverse and ever-changing outdoor
settings. The localization results achieved through LiDAR and Global Navigation
Satellite System (GNSS) sensors exhibit complementary benefits and drawbacks, and
the suggested localization strategy effectively integrates data from the various
sensors. The second obstacle involves ensuring secure navigation by detecting the
area that can be safely traversed; the terrain traversability analysis presented
offers precise and up-to-date information on both positive and negative obstacles.
The third problem is implementing an effective path-planning technique to reduce
collisions and deadlocks: timelines of local moving obstacles were recorded on
maps, and the trajectory was determined to avoid densely inhabited areas.
Successful experimental verifications were conducted at Korea University. The
results show that the suggested methods are essential for safe, reliable
navigation in dynamic, real-world environments (Kang et al., 2022).

• A Path-Following Controller for a UAV-UGV Formation Performing the
Final Step of Last-Mile-Delivery

The study investigates how a formation of an unmanned aerial vehicle (UAV) and an
unmanned ground vehicle (UGV) can complete a path-following task. Managing the
virtual structure is crucial for synchronizing the motion of the two robots so
that the UAV lands on the UGV while the ground vehicle remains in motion, using
reactive local path planning to navigate obstacles. The UGV analyses its
environment to identify hazards and plan an alternative path around obstacles,
temporarily deviating from the basic course until it is safe to proceed. The work
demonstrates the feasibility of using the virtual structure paradigm to guide
UAVs to land on fixed or moving platforms, addressing the last-mile delivery
challenge. Path-following is a crucial motion control method that adjusts
formation velocity based on its surroundings, which is important for package
delivery applications affected by road, avenue, or street conditions.

Figure 1: Obstacles Identified Using the UGV

• Developing capabilities of accurate locating and object avoidance of an
autonomous rover

This research developed a prototype autonomous vehicle for acquiring and returning
throwing equipment during athletics contests. The research began using a platform from
the previous project, ArduRover autopilot software, and an STM32 flight control unit.

The prototype must be able to self-locate more precisely than standard GPS allows
if it is to be used at athletics competitions. It should also recognize and avoid
obstacles on the athletics pitch. RTK and LiDAR rangefinder studies were
undertaken to improve satellite positioning and target identification.

The research extends the ISO 12100 V-model for system development, prototyping,
and risk assessment. The study produced a prototype that can autonomously avoid
objects under ideal conditions. The rover partially achieved its goals, but
obstacle avoidance and reliability in difficult conditions need further
development: bright sunshine hindered LiDAR performance, and complex object
avoidance was limited by the ArduRover autopilot software (Ylimäki, 2021).

• An Autonomous Delivery Robot to Prevent the Spread of Coronavirus in Product
Delivery System

Amidst the global coronavirus pandemic, this work focused on the creation and
development of a cost-efficient prototype of an autonomous mobile robot capable
of securely delivering packages to a specified location using the Global
Positioning System (GPS). The robot guarantees safe, contactless delivery by
transporting the package in a password-protected container. It traverses
effectively to a predetermined destination by receiving precise geographical
coordinates from orbiting satellites and adjusting its orientation with the aid
of a digital compass. Upon reaching its destination, the robot remains in a state
of readiness until the customer unlocks the container. The customer uses a
password, which may be provided in the order confirmation, to unlock the
container and retrieve the ordered product. The delivery robot can then
self-navigate back to its starting point. The robot was tested for heading-angle
and trajectory-completion accuracy. Such a robot can transport products safely
and infection-free and reduce last-mile delivery costs by solving the last-mile
problem technologically (Abrar et al., 2020).

• Autonomous Delivery Robot

This automated delivery robot replaces human delivery personnel. It can
efficiently navigate a complex spatial environment from an origin to a
destination while avoiding impediments. To complete its mission, the delivery
robot's Raspberry Pi 3 was connected to an Arduino MKR1000 and an ultrasonic
sensor. The robot uses the ultrasonic sensor to detect the contents of its bin
and moves only when an object is detected. Bluetooth beacons denote the starting,
ending, and intermediate positions that the robot uses to distinguish the
designated locations. Unlike older robots that can only operate within a
designated black lane, this autonomous robot navigates along an unspecified
trajectory. The robot undergoes training and multiple simulations using the
Donkey Car library to learn to avoid obstacles and navigate its path efficiently,
resulting in fast and efficient delivery. It utilizes a Pi camera to correct its
course along an intended path that lacks markings (Devang Dave et al., 2020).

• GPS controlled autonomous bot for unmanned delivery.

The GPS-controlled autonomous BOT for unmanned delivery is a rover designed
specifically for delivering goods without the need for human intervention. The
project consists of three primary elements: a BOT, a server, and a mobile
application. To initiate the process, the user inputs the initial and final
locations and issues the start command through the mobile application. Upon
receiving the command from the server, the BOT carries out the delivery to the
designated destination, with no human interaction required in between. The
goods-carrying cart is secured with a lock that can only be operated through the
mobile application. The mobile application provides the user with an interactive
interface, featuring a customized map based on Google Maps. Sensors in specific
areas detect obstacles and hindrances to guarantee safety. A camera connected to
the BOT allows live streaming to be viewed through the mobile application. The
project was executed and validated within the premises of the B.S. Abdur Rahman
Crescent Institute of Science and Technology, where the robot effectively
navigated from the user-selected source location to the destination (Devang Dave
et al., 2020).

• Initial Development of Low-Cost Autonomous Rover for Pursuit of Moving Targets

This paper describes the hardware and software implementation, including an
Arduino-compatible circuit architecture, GPS-RTK modules, an improved compass,
telemetry modules, and data logging. It covers vehicle control modes,
moving-target communication, software library adaptation, and sensing and
steering recommendations. The study presents trajectory charts and time-dependent
kinematic characteristics from three outdoor autonomous rover trials. In the
first, the rover successfully navigated to fixed waypoints on its first attempt.
In the second, the rover independently tracked a mobile target with a complex
trajectory; despite larger distance and speed differences when the target made
unexpected turns or stops, the rover approached and followed it. In the most
recent test, the rover used a sonar instrument to detect impediments along its
track and avoid collisions. Computer vision modules will be integrated during
development to evaluate and improve advanced autonomous control algorithms on
this platform (James et al., 2021).

• OTP Based Authentication Model for Autonomous Delivery

This study presents a security model for autonomous delivery. One-Time Password
(OTP) authentication is used to validate delivery recipients. This reduces delivery errors
and security risks, improving the service. With this security layer, autonomous delivery
systems can eliminate manual verification and improve delivery security (Rani et al.,
2022).

• Design of Rocker Bogie Mechanism

Rocker bogies are essential for on-site scientific investigation of targets from many
meters to tens of kilometers apart; complex mobility designs use many wheels or legs.
The paper presents a rover with an effective and mobile suspension system for difficult
terrain. Rocker bogies use only two motors for mobility, simplifying the drive train, and
placing the two motors inside the body reduces thermal fluctuations, improving
dependability and efficiency. The rover uses four wheels because natural terrain has few
barriers that require both front wheels to climb. Finally, mobility studies on agricultural
land, bumpy roads, inclines, stairs, and obstacle-strewn surfaces showed that the
rocker-bogie mechanism can travel the required distances under field conditions
(Chinchkar et al., 2017).

• Implementation of vehicle detection algorithm for self-driving car on toll road using Python language

This study describes a toll-road vehicle recognition method for a self-driving car
system. Video was captured with a 1280x720 action camera mounted on the vehicle's
roof, with the vehicle travelling at 100 km/h. Python 3, a popular language for image
processing, was used. The image-processing pipeline combines object detection,
feature perception, color spaces, and HOG features. The results show that the algorithm
needs a way to dynamically adjust its settings between day and night; constant
parameters work only under consistent lighting. The Python implementation recognizes
vehicles with above 90% accuracy (Gwak et al., 2019).

• Simulation of Self-driving Car using Deep Learning

The rapid growth of AI has revolutionized autonomous vehicles by integrating complex
models and algorithms. This project builds a deep-learning model for autonomous
driving that adapts to real-time tracks without manual feature extraction. The work
presents computer-vision models that learn from video data, requiring image
processing, augmentation, behavioral cloning, and a convolutional neural network
model. The neural-network architecture identifies a video segment's trajectory, road
boundaries, and obstacle positions, while behavioral cloning lets the model learn from
the human actions in the video (Bhalla et al., 2020).

Gaps and Future Directions

Despite the promising research showcased, gaps still exist in the development of GPS-
controlled AMRs for industrial package delivery applications. Addressing these gaps
through future research efforts will be crucial for widespread and successful
implementation.

Limited Integration of Machine Vision and GPS Path Planning: While machine vision
and GPS path planning have been explored in other domains, their full potential for
industrial AMRs remains largely untapped. Future research should focus on:

• Developing machine vision algorithms for accurate obstacle detection and
classification in complex industrial environments, including handling diverse lighting
conditions, cluttered spaces, and moving objects.

• Integrating Machine vision data with ultrasonic sensor readings and GPS
information for enhanced localization and dynamic obstacle avoidance. The integration
of sensor data can provide a more comprehensive understanding of the robot's
surroundings.

• The project involves developing advanced GPS path planning that uses real-
time machine vision and sensor data to dynamically optimize delivery routes and
anticipate environmental changes.

Security and Accountability Concerns

• The AMR should be safeguarded against theft and hijacking during package delivery.

• Real-time tracking and monitoring of AMRs and packages is crucial for
accountability, identifying potential security breaches, and ensuring smooth delivery
processes.

Cost-Effectiveness and Scalability

• The widespread adoption of low-cost and energy-efficient AMR designs requires the
optimization of hardware components and software algorithms for efficient operation.

• Standardization of robot design and communication protocols can enhance scalability
and flexibility by facilitating interoperability between AMRs from different
manufacturers.

Adaptability to Diverse Industrial Environments

• Research is needed on the adaptability of AMRs to various industrial layouts and
floor types, including navigating uneven surfaces and handling potholes and bumps.

• The development of flexible robot designs capable of handling various package sizes
and weights is crucial for expanding the applications of AMRs in industrial settings.

Addressing these gaps and pursuing these future research directions will pave the way
for a paradigm shift in industrial logistics. By leveraging GPS-controlled self-driving
mobile robots equipped with machine vision, sonar sensor fusion, and GPS path
planning, we can build efficient, secure, and cost-effective intralogistics that
revolutionizes how packages are managed and delivered within factories, warehouses,
and distribution centers with outdoor environments.

CHAPTER 03 – MATERIALS AND METHODS

3.1 Materials

Many materials were needed to develop the rover. Each material is described below,
together with its function and the role it plays in the rover.

Flight Controller

The Pixhawk is an STM32-based flight controller development board created for UAV
producers, military applications, and some industrial uses. The microcontroller is
mainly used for path planning and for controlling UAVs, so we use it as the main
control unit of our package-delivery robot. The Pixhawk 2.4.8 was chosen because it
performs better than older versions, in which some specifications are not included.

Figure 2: Flight Controller

Specifications

• Processor
o 32-bit ARM Cortex M4 core with FPU
o 168 MHz / 256 KB RAM / 2 MB Flash
o 32-bit failsafe co-processor

• Sensors
o MPU6000 as main accel and gyro
o ST Micro 16-bit gyroscope
o ST Micro 14-bit accelerometer/compass (magnetometer)
o MEAS barometer
• Power
o Ideal diode controller with automatic failover
o Servo rail high-power (7 V) and high-current ready
o All peripheral outputs over-current protected, all inputs ESD protected
• Interfaces
o 5x UART serial ports, 1 high-power capable, 2 with HW flow control
o Spektrum DSM/DSM2/DSM-X Satellite input
o Futaba S.BUS input (output not yet implemented)
o PPM sum signal
o RSSI (PWM or voltage) input
o I2C, SPI, 2x CAN, USB
o 3.3V and 6.6V ADC inputs (ardupilot.org)

GPS Module
A GPS module is a device that determines its own position; it receives signals from
satellites and communicates the fix to the flight controller. We used the new-generation
NEO-M8N high-precision GPS module with built-in compass for Pixhawk and APM
flight controllers. With an onboard compass, low power consumption, and high
precision, its accuracy is 0.5 m, better than the 1.4 m of the previous-generation
NEO-7N, and it supports GPS/QZSS L1 C/A, GLONASS L1OF, BeiDou B1, and more
(robu.in). We use it to determine the position of the rover (ardupilot.org).

Figure 3: GPS Module
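Since the rover navigates between GPS fixes, the distance between two coordinates can be computed with the haversine formula. The sketch below is a minimal Python illustration for checking waypoint spacing; it is not part of the rover's firmware (ArduPilot performs this internally), and the Earth-radius constant is the usual mean-radius assumption.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6_371_000):
    """Great-circle distance in metres between two GPS fixes (degrees in)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)          # delta latitude
    dl = math.radians(lon2 - lon1)          # delta longitude
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))
```

One degree of longitude at the equator is roughly 111.2 km, which is a quick sanity check for the function.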

Mission Planner Ground Control Software

Mission Planner is a full-featured ground station application for the ArduPilot open-
source autopilot project, and it can be customized by the user. It serves as a ground
control station for Plane, Copter, and Rover, and can be used as a configuration utility
or as a dynamic control supplement for any autonomous vehicle. With this software,
one can see where the rover is, whether it is moving or stopped, and plan its path.
ArduPilot ground-station software is installed on the computers used to control and
guide unmanned systems. (ardupilot.org)

Figure 4: Ground Station Software Interface

Telemetry

V5 radio telemetry provides a wireless MAVLink connection between a ground control
station and a vehicle equipped with a flight controller. This allows parameters to be
adjusted while the vehicle is in operation, telemetry to be checked in real time, and
missions to be changed in flight (PX4 User Guide (Main), n.d.). Using this module, we
can monitor the rover from a ground-station computer.

• 915 MHz frequency band


• Transparent serial link
• Range of approx. 2km with supplied aerial
• Demonstrated range of several kilometers with a small omni aerial

• Can be used with a bi-directional amplifier for even more range.
• MAVLink protocol framing and status reporting.
• Frequency hopping spread spectrum (FHSS)
• Adaptive time division multiplexing (TDM)
(PX4 User Guide (Main), n.d.)

Figure 5: V5 Radio Telemetry

Raspberry Pi

Featuring a 64-bit quad-core Arm Cortex-A76 processor running at 2.4 GHz, the
Raspberry Pi 5 delivers a 2-3x increase in CPU performance over its predecessor. With
4 GB of RAM, a substantial uplift in graphics performance from an 800 MHz
VideoCore VII GPU, dual 4Kp60 display output over HDMI, and state-of-the-art
camera support from a re-architected Raspberry Pi Image Signal Processor, it provides
a smooth desktop experience (raspberrypi.com). On this device we run real-time video
processing, detect objects, and issue signals according to the detections.

Figure 6: Raspberry Pi

Raspberry Pi Camera Module

The Pi Camera module captures still pictures and high-definition video. The Raspberry
Pi board has a CSI (Camera Serial Interface) port to which the module attaches directly
using a 15-pin ribbon cable (electronicwings.com).

We use a 5 MP 2K HD FPC OV5647 (95-degree) camera module with 2592x1944
resolution. It captures the real-time video required for machine vision on this rover.

Figure 7: OV5647 Camera Module
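The core of a machine-vision obstacle check is comparing the current frame against a reference and deciding whether enough pixels have changed. The Python sketch below illustrates only that frame-differencing step on NumPy arrays; the threshold and area fraction are illustrative assumptions, and in the actual rover the frames would come from the OV5647 over the CSI interface rather than synthetic arrays.

```python
import numpy as np

def obstacle_fraction(frame, background=None, diff_threshold=30):
    """Fraction of pixels differing from the background beyond a threshold
    (a crude change/obstacle measure over a grayscale frame)."""
    if background is None:
        background = np.zeros_like(frame)  # assume a dark reference frame
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return float((diff > diff_threshold).mean())

def obstacle_detected(frame, background=None, area_fraction=0.2):
    """Flag an obstacle when a large enough share of the frame has changed."""
    return obstacle_fraction(frame, background) >= area_fraction
```

A real pipeline would add morphology and object classification on top of this step, but the thresholded-difference idea is the same.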

Ultrasonic Rangefinder

The GY-US042v2 sonar is an inexpensive, short-range (up to 4 m) rangefinder
primarily designed for indoor use, but it has been used successfully outdoors on RC
vehicles. It provides more consistent height control below 4 m than many barometers.
The sensor is similar to the Maxbotix I2C sonar rangefinder and can operate over I2C
(ardupilot.org). We use it for the rover's obstacle-avoidance system.

Figure 8: GY-US042v2 Sonar sensor
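The obstacle-avoidance logic that consumes these sonar readings reduces to a distance-to-action mapping. The Python sketch below is an illustration of that decision step; the stop and slow thresholds are assumptions for demonstration, not the values configured on the rover.

```python
def avoidance_action(distance_cm, stop_cm=50, slow_cm=150, max_cm=400):
    """Map a GY-US042v2 reading (in cm) to a drive command.
    Readings beyond the 4 m useful range are treated as 'no obstacle'."""
    if distance_cm is None or distance_cm > max_cm:
        return "proceed"
    if distance_cm <= stop_cm:
        return "stop"
    if distance_cm <= slow_cm:
        return "slow"
    return "proceed"
```

In the actual system, ArduPilot's rangefinder and avoidance parameters implement this behaviour; the function just makes the thresholding explicit.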

Arduino Uno Board

The Arduino Uno is an open-source microcontroller board based on the Microchip
ATmega328P microcontroller, developed by Arduino.cc and initially released in 2010
(Wikipedia).

• CPU: Microchip AVR (8-bit); at 16 MHz


• Memory: 2 KB SRAM
• Storage: 32 KB Flash; 1 KB EEPROM

Here, the Arduino Uno board was used to run the rover's security system.

Figure 9: Arduino Uno

Arduino IDE

Programming and code development for Arduino microcontroller boards is done
through the open-source Arduino Integrated Development Environment (IDE), which
provides an intuitive interface for creating, developing, and uploading code to Arduino
boards. Arduino is a popular platform for electronics projects. This software was used
to program the Arduino board for the security system (Richards et al., 2014).

GSM Module

The SIM800L GSM module can be used in a variety of IoT projects. A GSM or GPRS
module is a chip or circuit used to establish communication between a mobile device
and a GSM or GPRS network; a modem (modulator-demodulator) is essential here
(Umbarkar et al., 2017). We used the SIM800L GSM module in the security system to
send a password to the recipient's phone after the vehicle reaches the desired location.

Figure 10: GSM Module


Solenoid Lock

A solenoid is a small electromagnet capable of pushing or pulling a plunger to perform
an operation. In a solenoid door lock, the plunger either holds the lock closed or
retracts to allow it to open. Using this lock, the package can be delivered safely to the
required location, and only the recipient can retrieve it after unlocking the system.

Figure 11: Solenoid Lock

Keypad

Membrane keypads are a good starting point for adding key input to a project, as they
are affordable, durable, and waterproof. Knowing how to connect them to an Arduino is
also very useful when developing projects that require human input for menu selection,
password entry, or robot actions (Umbarkar et al., 2017). Here, a 4x4 16-key membrane
switch keypad module (MD0062) is used to enter the password that opens the solenoid
lock.

Figure 12: Keypad

LCD Display Module

An LCD cannot emit light by itself; it must use an external light source. An LCD
display module normally includes the LCD glass (or LCD panel), the driving circuitry,
and a backlight (orientdisplay.com). Here we use a 16x2 I2C LCD display module to
show the security-system outputs to the user.

Figure 13: 16×2 I2C Display
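The behaviour of the security chain described above (GSM-delivered password, keypad entry, solenoid release) can be summarized as plain logic. The sketch below is an illustrative Python model of what the Arduino program does, not the firmware itself; the code length, attempt limit, and function names are assumptions for demonstration.

```python
import random

def generate_otp(rng=random):
    """4-digit code of the kind the SIM800L would SMS to the recipient."""
    return f"{rng.randrange(10000):04d}"

def try_unlock(expected, entries, max_attempts=3):
    """Return True (i.e. energize the solenoid lock) if a correct code is
    keyed in within the allowed number of attempts."""
    for entry in entries[:max_attempts]:
        if entry == expected:
            return True
    return False
```

In the rover, `entries` would come from the membrane keypad one keystroke at a time, with prompts and results shown on the 16x2 LCD.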

ESC (Electronic Speed Controller)

An electronic speed controller (ESC) is an electrical circuit used to regulate the speed
of a DC motor; it can also provide dynamic braking and reverse. An ESC reacts faster
and more accurately than traditional motor controls.

An ESC changes the switching of a field-effect transistor (FET) network in response to
a speed-reference signal; the speed of the motor is varied by changing the duty cycle or
switching frequency of the transistors (Wikipedia contributors, 2023a). The ESCs are
used to control the speed of the vehicle's DC motors for speeding up, slowing down,
reversing, stopping, and turning.

Figure 15: Electronic Speed Controller (ESC)

DC Gear Motor

A DC gear motor combines a gearbox with a DC (direct current) motor. The gearbox
increases the motor's torque while reducing its speed, making DC gear motors suitable
for applications that require high torque output and precise speed control (Lee et al.,
2022).

Here, six 200 RPM 12 V 37GB high-torque gear motors were used to carry a high load
without slowing down.
Figure 16: DC Gear Motor

Wheels

High-performance RC tires let these wheels handle challenging terrain. With a
substantial 130 mm diameter and 60 mm width, the tires combine grippy rubber with
rugged tread patterns, delivering excellent traction. Lightweight yet durable, and
pre-mounted on sponge liners for immediate use, they improve the control of the RC
vehicle. They are easy to install and remove, and their powerful treads allow use on dirt
paths.

Figure 17: 130mm wheels


Battery

A battery is a device consisting of one or more electrochemical cells that convert stored
chemical energy into electrical energy. Many electrical devices, from small appliances
to electric cars, run on batteries. Because the rover's power consumption is high, a
powerful battery was required, so a 12 V, 2 Ah rechargeable lead-acid battery was used
to fulfil this requirement.
Figure 18: 12V Lead-acid Battery
Power Module

We use this analog power module that provides a stable power supply to the flight
controller and also supports measuring the battery voltage and current consumption.

Specifications

• Maximum input voltage of 18V


• Maximum of 90 Amps (but only capable of measuring up to 60 Amps)
• Provides 5.37V and 2.25Amp power supply to the flight controller.

The power module provides enough power for the autopilot, receiver, and a few
low-powered peripherals (like telemetry), but not enough for motors or high-current
devices like FPV transmitters. (ardupilot.org)

Figure 19: Power Module

RC Transmitter and Receiver

A radio transmitter is a device that produces and emits radio waves; its main purpose is
to wirelessly transmit data signals over the air to the receiver. The FS-i6 is a digital
proportional radio-control system operating on the global 2.4 GHz ISM band, making it
suitable for use across the globe. The transmitter is built on AFHDS 2A (Automatic
Frequency Hopping Digital System Second Generation) technology and is used to
manually operate the rover and provide positions to the controller (Richards et al.,
2014).

Figure 20: RC Transmitter and receiver

Power Converter (Buck Converter)

A DC-DC (direct current to direct current) power converter that efficiently steps a
higher input voltage down to a lower output voltage is called a buck converter. It is
widely used in power supplies and electronic devices to provide constant, controlled
power to various components or systems. Converters of this type were used to protect
the loads from damage originating on the supply side (Devos et al., 2018).

Figure 21: Buck Converter

SolidWorks software

SolidWorks is a computer-aided design (CAD) and computer-aided engineering (CAE)
program. It is widely used in the engineering and design sectors to produce 3D models,
simulations, and technical documents, and it provides a wide range of tools for
manufacturing, analyzing, and designing products. It was used to design and simulate
the structure of the robot.

Structure

After the SolidWorks design was completed, the structure was built. 3/4-inch aluminum
box bars were used for the frame, metal sheets were used to cover the vehicle, and
cladding board was used for the base plate.

CG Calculation

CG (from the front axle) = { [Weight of front section × (front track / 2)] + [Weight of rear section × (rear track / 2)] } ÷ Total weight

CG (from the front axle) = { [2.5 kg × (0.4 m / 2)] + [2.5 kg × (0.4 m / 2)] } ÷ 5 kg

CG (from the front axle) = 0.2 m
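The CG figure above can be checked, and alternative weight distributions explored, with a small Python helper implementing the same formula (the function name is ours, for illustration):

```python
def cg_from_front_axle(w_front_kg, w_rear_kg, front_track_m, rear_track_m):
    """Centre of gravity measured from the front axle, per the formula above."""
    moment = (w_front_kg * front_track_m / 2) + (w_rear_kg * rear_track_m / 2)
    return moment / (w_front_kg + w_rear_kg)

print(cg_from_front_axle(2.5, 2.5, 0.4, 0.4))  # -> 0.2
```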

Load Calculation

Load each motor can carry = 3 kg

Total load capacity = 3 × 6 = 18 kg

Weight of the rover without load = 5 kg

Safety margin = 3 kg

The carrying load of the rover is therefore assumed to be 10 kg.

Power Consumption Calculation

Rated current of one motor = 0.92 A

Number of motors = 6

Total current consumption of the rover, including other mechanisms = (0.92 × 6) + 0.5 ≈ 6 A

Battery output capacity = 8.2 Ah

Working time = 8.2 ÷ 6 ≈ 1.37 hours = 82 minutes

Motor maximum speed = 200 RPM

Average speed of all motors under all conditions = 100 RPM

Distance travelled per motor revolution = 35 cm

Distance travelled by the rover in one minute = 35 cm × 100 = 3500 cm = 35 m

Total distance the rover can travel = 35 m × 82 min = 2870 m = 2.87 km

The rover can therefore self-drive more than 2.75 km.
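These runtime and range figures can be reproduced with a short Python sketch; the 6 A value is the rounded total current from the calculation above.

```python
BATTERY_CAPACITY_AH = 8.2   # usable battery capacity
TOTAL_CURRENT_A = 6.0       # (0.92 A x 6 motors) + 0.5 A, rounded as in the text
TRAVEL_PER_REV_M = 0.35     # wheel travel per motor revolution
AVG_RPM = 100               # average motor speed under field conditions

runtime_min = BATTERY_CAPACITY_AH / TOTAL_CURRENT_A * 60
speed_m_per_min = TRAVEL_PER_REV_M * AVG_RPM
range_m = speed_m_per_min * runtime_min

print(f"runtime ~ {runtime_min:.0f} min, range ~ {range_m / 1000:.2f} km")
# -> runtime ~ 82 min, range ~ 2.87 km
```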

3.2 Method
3.2.1 Structure Design
The first step of the project was the design. Using SolidWorks software, the vehicle was
designed according to the requirements; given the scope, some parts were difficult to
design. The creation of an autonomous rover represents an important step in the field of
autonomous exploration, so the design elements of the vehicle were carefully crafted in
SolidWorks, a powerful computer-aided design (CAD) program. This application was
essential in designing the rover, as it allows the creation of new 3D models and the
assembly of several very complex systems. Using it requires good knowledge and a
good understanding of the structure of the new design. Our aim was to create an
attractive rover with high efficiency and precision, which supports the self-driving
capability and allows the rover to be controlled according to its configuration. The
figure below shows how the structure was designed in SolidWorks.

Figure 22: SolidWorks Design of the Rover

3.2.2 Material Selection

Material selection is one of the important parts of the project. It is especially important
in self-driving vehicles, because the efficiency of the rover depends on the materials
used. The designer's task in this field is to navigate the challenging process of selecting
materials that offer the right combination of strength, weight, and durability. Through
an understanding of mechanical properties, thermal behavior, and environmental
effects, we aimed to optimize the material composition for the specific difficulties a
small rover may face on its missions.

Figure 23: Metal sheet & Aluminum Box Bar

3.2.3 Base Structure


Here, an aluminum box bar was used to form the frame, since reducing the weight of
the rover was the main requirement; it also keeps the frame strong, and the cost
reduction was another advantage. As aluminum is resistant to corrosion, the rover's
lifetime could be extended. Metal sheets were planned for covering the rover and
proved ideal, being light and durable; a strong base plate was also used. The parts were
fixed to each other with rivets and, where necessary, nuts and bolts.

Figure 24: Base structure of the Rover


3.2.4 Wiring Diagram

Figure 25: Wiring Diagram

The rover was wired in accordance with the wiring diagram, which serves as a
comprehensive visual representation of the intricate network of electrical connections
and components within the rover system.

Each component, from sensors to actuators, is methodically integrated into the wiring
framework, considering factors such as the battery, power module, signal transmission,
and control interfaces. The diagram specifies not only the physical placement of these
components but also the precise wiring connections, ensuring an organized and
functional assembly.

3.2.5 Firmware Installation

Installing firmware on a Pixhawk autopilot involves several steps. The Pixhawk, a
popular open-source flight-control board, is applicable to various unmanned vehicles
such as robotic systems and drones. Below are general instructions for installing
firmware on the Pixhawk using Mission Planner.

The required DFU (device firmware upgrade) drivers were installed by
STM32CubeProgrammer, which can also flash firmware to autopilots in DFU mode.
The program was downloaded and installed; Java may be required to set it up.

The ArduPilot firmware for the flight-controller board was downloaded
from firmware.ardupilot.org. The appropriate firmware can normally be found as
follows: firmware.ardupilot.org was opened; the vehicle type was selected
(i.e. Plane, Copter, Rover, Sub, or Antenna Tracker); "beta" or "stable" was selected;
the directory whose name most closely matched the autopilot was located; and the
"arduXXX_with_bl.hex" file was downloaded by clicking on it.

Figure 26: Device Manager

The firmware was uploaded to the autopilot using the following steps: The board's DFU
button was held down, or its "BOOT" pins were temporarily bridged. A USB cable,
attached to the PC, was then plugged in. The button was released, or the pins were
unbridged once the board was powered. The Windows Device Manager was opened,
and "Universal Serial Bus devices" was checked for "STM32 BOOTLOADER" to
confirm that the board was in DFU mode (Loading Firmware | PX4 User Guide (Main),
n.d.).

STM32CubeProgrammer was started, and USB was selected as the connection method.
A USB port was verified to be displayed, indicating that the board was detected in DFU
mode. "Connect" was pressed, and the board's CPU details appeared. "Open file" was
then selected to choose the downloaded "arduXXX_with_bl.hex" file. The file name
became visible in the tab, and "Download" was pressed to flash the file to the board.
Subsequently, the board was rebooted, and a connection was established with the
ground-station software.

Figure 27: Cube Programmer interface part 01

Figure 28: Cube Programmer interface part 02

The Rover firmware was then installed on the flight controller with a simple click on
the "Rover V4.4.0 OFFICIAL" option. This ease of installation highlights the
user-friendly nature of the system.

Figure 29: Rover Firmware Installation tab

3.2.6 Parameter Adjustment and Change

Many parameters had to be changed after the firmware was switched to "Rover" mode;
the key ones are listed below.

After installing Rover V4.4.0 OFFICIAL, the parameters must be set up. First, the
flight controller must be connected to the ground-station software. Once connected,
click the CONFIG tab; the parameters can then be set from the full parameter list. The
flight controller receives data from the sensors and drives the actuators accordingly, so
these parameters are set as required. From this parameter list, the following parameters
of the robot were configured.

Figure 30: Parameter list

• SERVO1_FUNCTION = 73 (left motor control)
• SERVO3_FUNCTION = 74 (right motor control)
• MODE_CH = 5 (RC channel used to change flight modes)
• RC7_OPTION = 7 (RC channel used to save a waypoint in learning mode)
• RC8_OPTION = 8 (RC channel used to clear waypoints)
• RC6_OPTION = 41 (RC channel used to arm/disarm)
For the sonar sensors:
• For I2C mode, RNGFND1_TYPE was set to "2".
• For triggered-pulse mode, RNGFND1_TYPE was set to "30".
• For serial mode, RNGFND1_TYPE was set to "31", and the serial port used to
communicate with the sensor was configured with SERIALx_PROTOCOL = "9"
(Rangefinder) and SERIALx_BAUD = "9" (9600).
• The sonar sensor has a maximum useful range of 4 m, so
RNGFND1_MAX_CM = "400".
(Getting and Setting Parameters — Dev Documentation, n.d.)
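For reference, the same parameters can also be pushed over MAVLink instead of through Mission Planner's GUI. The sketch below is a hypothetical Python illustration: the parameter map mirrors the values above, the setter callable is injected so the logic can be exercised without a vehicle, and the commented pymavlink lines (including the connection string) are an untested example, not the procedure used in this project.

```python
# Parameter values from the configuration described above.
ROVER_PARAMS = {
    "SERVO1_FUNCTION": 73,   # left motor control
    "SERVO3_FUNCTION": 74,   # right motor control
    "MODE_CH": 5,
    "RC7_OPTION": 7,
    "RC8_OPTION": 8,
    "RC6_OPTION": 41,
    "RNGFND1_TYPE": 2,       # sonar in I2C mode
    "RNGFND1_MAX_CM": 400,
}

def apply_params(set_param, params=ROVER_PARAMS):
    """Push each parameter through a caller-supplied setter callable
    and return the names that were written."""
    written = []
    for name, value in params.items():
        set_param(name, float(value))
        written.append(name)
    return written

if __name__ == "__main__":
    # Hypothetical use with pymavlink (connection string is an example):
    # from pymavlink import mavutil
    # link = mavutil.mavlink_connection("udp:127.0.0.1:14550")
    # link.wait_heartbeat()
    # apply_params(lambda n, v: link.param_set_send(n, v))
    pass
```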

3.2.7 Basic Tuning of the Controller

Within the Basic Tuning section, attention turned to turn-rate PID tuning. The turn-rate
controller, employed in all modes except Hold and Manual, uses a PID controller to
achieve the desired turn rate set by either the pilot or the autopilot. The FF, P, I, and D
gains for this controller are stored in the ATC_STR_RAT_FF, ATC_STR_RAT_P,
ATC_STR_RAT_I, and ATC_STR_RAT_D parameters, respectively.

Figure 31: Basic Tuning tab


To achieve a more precise measurement of the vehicle's maximum turn rate, the
ACRO_TURN_RATE parameter was set to a value that roughly approximated it in
degrees per second. The following steps were then undertaken:

First, the "Tuning" checkbox, located in the bottom middle of Mission Planner's Flight
Data screen, was checked. The graph was double-clicked, and "gz" (Gyro Z-axis) was
selected for display.

Next, the vehicle was driven in Manual mode at a medium speed, making very sharp
turns. The highest values observed on the "gz" graph were noted, and the
ACRO_TURN_RATE parameter was set slightly lower than these values. It was
important to remember that the displayed value might be in centi-degrees per second,
so it was divided by 100 to align with the parameter's unit of degrees per second.

The GCS_PID_MASK parameter was then set to 1 (Steering). To visualize the relevant
PID data on the Flight Data screen, the "Tuning" checkbox was checked again, and both
"pidachieved" and "piddesired" were selected for display after double-clicking the
graph.

Figure 32: Tuning turn rate

The vehicle was driven in Acro mode at a medium speed, performing various wide and
tight turns. Close attention was paid to how closely "pidachieved" followed
"piddesired" on the Flight Data screen. The tuning process began with the "FF" gain,
responsible for directly converting the desired turn rate into steering servo or motor
output. If the vehicle's turn rate response appeared sluggish, this parameter was
increased. Conversely, if the vehicle consistently overshot its desired turn rate, the "FF"
gain was reduced.

Next, the "P" gain, which corrects for short-term error, was addressed. It was often
possible to set this to a low value, such as 20% of the "FF" value, if the "FF" value had
been well-tuned. However, if set too high, the "P" gain could lead to turn rate
oscillations. It was crucial to maintain the "P" gain below the "FF" gain.

The "I" gain, responsible for correcting long-term error, was often set to the same value
as the "P" gain. If the vehicle consistently failed to achieve the desired turn rate, this
parameter was increased. However, if the vehicle's turn rate exhibited slow oscillations,
the "I" gain was reduced. Like the "P" gain, the "I" gain was always kept lower than the
"FF" gain. The "D" gain, intended to stabilize output by counteracting short-term
changes in turn rate, was typically left at zero. In the final stage of tuning, the
"ATC_STR_RAT_MAX" parameter was set to the same value as the
"ACRO_TURN_RATE" parameter. Optionally, the "ACRO_TURN_RATE" parameter
could be reduced to make turns in Acro mode more manageable for the driver. It's
important to note the distinction between these two parameters:

• ACRO_TURN_RATE: Controls the conversion of pilot input into the desired


turn rate in Acro mode. This parameter can be adjusted to fine-tune the
responsiveness of turns in Acro mode.
• ATC_STR_RAT_MAX: Represents the absolute maximum turn rate the vehicle
will attempt in any mode. It's generally recommended to keep this value close
to the vehicle's performance limits to maintain agility.
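The structure of the feed-forward-plus-PID turn-rate control law being tuned here can be sketched in Python. This is an illustrative model of the control law, not ArduPilot's implementation; the gains and the loop period below are placeholders.

```python
class TurnRatePID:
    """Feed-forward plus PID controller analogous to ATC_STR_RAT_FF/P/I/D."""

    def __init__(self, ff, p, i, d, dt=0.02):
        self.ff, self.p, self.i, self.d, self.dt = ff, p, i, d, dt
        self._integral = 0.0
        self._prev_error = 0.0

    def update(self, desired_rate, achieved_rate):
        """One control step: turn rates in, steering output out."""
        error = desired_rate - achieved_rate
        self._integral += error * self.dt            # long-term correction (I)
        derivative = (error - self._prev_error) / self.dt  # damping term (D)
        self._prev_error = error
        return (self.ff * desired_rate + self.p * error
                + self.i * self._integral + self.d * derivative)
```

The model makes the tuning advice concrete: when achieved equals desired, the output is pure feed-forward, so a well-chosen FF does most of the work and P, I, and D only correct residual error.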

3.2.8 Controller Calibration

Accelerometer Calibration

Figure 33: Accelerometer Calibration tab


Upon completion of the basic tuning process, the SETUP tab was selected, followed by
Mandatory Hardware. Within this section, the accelerometer calibration was performed.
This step is essential to correct bias offsets in the flight controller's accelerometers
across all three axes: X, Y, and Z. By clicking the three buttons shown in the figure
("Calibrate Accel", "Calibrate Level", and "Simple Accel Cal") and following the
instructions Mission Planner then gives, the calibration can be completed easily.

Compass calibration
Compass calibration ensures accurate heading information for the flight controller,
which is crucial for functions such as navigation, obstacle avoidance, and automatic
flight modes. The Compass option was selected, and within the "Onboard Mag
Calibration" section the "Start" button was pressed. If the flight controller has a buzzer,
it emits a continuous tone accompanied by brief beeps at one-second intervals. The
vehicle was lifted, and sequential rotations were performed to briefly orient each side
(front, back, left, right, top, and bottom) downward towards the ground.
This process involved six complete 360-degree rotations, ensuring the vehicle faced a
different direction relative to the ground with each turn. Additional time and rotations
might have been required to verify or repeat the calibration if the initial attempt was
unsuccessful. As the vehicle rotated, the green bars on the screen progressively
expanded towards the right until calibration completion. Upon successful completion,
a reboot of the flight controller was necessary.

Figure 34: Compass Calibration tab

Radio calibration
Radio calibration ensures the flight controller accurately interprets the radio-control
inputs (sticks, switches, etc.).

The Radio Calibration screen was accessed within Mission Planner's INITIAL SETUP,
Mandatory Hardware section. The green "Calibrate Radio" button, positioned at the
bottom right of the screen, was selected. When a prompt appeared, the "OK" button
was pressed to confirm that the radio control equipment was powered on and
disconnected from the battery. The transmitter's control sticks, knobs, and switches
were then manipulated to their maximum positions.

Figure 35: Radio Calibration tab

Red lines appeared over the
calibration bars, indicating the lowest and highest values observed. Upon completion,
the "Select" button was clicked. A pop-up window emerged with the instruction to
verify centered sticks and a lowered throttle position, followed by an "OK" button press.
The throttle was adjusted to the middle position, and the "OK" button was pressed.
Mission Planner then displayed a concise summary of the calibration data. Typical
ranges for minimum values were around 1100, while maximum values hovered around
1900. Following successful calibration, a reboot of the flight controller was required.
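The recorded endpoints (around 1100 and 1900 microseconds) can be mapped onto a normalized stick range in software. The sketch below is for illustration only, since the flight controller performs this mapping internally; the default endpoint values are assumptions taken from the typical calibration summary described above.

```python
def normalize_pwm(pwm: int, pwm_min: int = 1100, pwm_max: int = 1900) -> float:
    """Map a calibrated RC channel value (microseconds) to [-1.0, 1.0].

    pwm_min / pwm_max are the endpoints recorded during radio calibration;
    the midpoint corresponds to a centered stick (0.0).
    """
    mid = (pwm_min + pwm_max) / 2        # e.g. 1500 us for a 1100..1900 range
    half = (pwm_max - pwm_min) / 2       # e.g. 400 us
    value = (pwm - mid) / half
    return max(-1.0, min(1.0, value))    # clamp out-of-range readings
```

With the assumed endpoints, a centered stick (1500 us) maps to 0.0 and full deflection maps to plus or minus 1.0.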

Figure 36: Radio calibration results

ESC Calibration

Figure 37: ESC Calibration


The ESC calibration process teaches the ESC the specific range of throttle inputs it
should respond to. This requires determining the precise PWM values on the throttle
channel that correspond to motor shut-off and maximum throttle; these values are
stored within the ESC itself.

Under the SETUP tab, the ESC calibration option was selected. The robot was lifted to
prevent its wheels from contacting the ground, and the "Calibrate ESCs" button was
pressed. Power was then disconnected, followed by a reconnection of power and
activation of the Safety Switch. A beep sound signaled successful ESC calibration.
Finally, the flight controller was rebooted.

Flight modes setup

The Flight Modes tab under SETUP was used to assign flight modes easily. After the
desired flight mode adjustments were made, the "Complete" button was pressed to save
the newly configured flight modes.

Figure 38: Flight modes setup

Mission Planning
Upon completion of the calibration process, the PLAN tab was employed for the
specification of the robot's desired path. Waypoints were established through simple
clicks on the map presented within the ground station software. Subsequently, the path
to be traversed by the robot was uploaded to the flight controller by inputting the
appropriate conditions on the right-hand side of the mission planner.

Figure 39: Mission planning

However, in this instance, the learning flight mode was engaged, and the robot was
manually guided along the necessary path via an RC transmitter. Waypoints were
meticulously recorded during this process and subsequently preserved using the "save
file" command. To furnish the robot with the required path, these saved waypoints were
imported into the ground control software using the "load file" function and then
transmitted to the flight controller for execution.

Figure 40: Way point upload

As shown in the image, the capability to temporarily halt the rover at a specific location
for a predetermined duration was achieved through the utilization of a
CONDITION_DELAY, serving as a substitute for a traditional waypoint.

Figure 41: Condition Delay
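Mission Planner saves and loads missions as plain-text files in the QGC WPL 110 format, so a mission containing a CONDITION_DELAY can also be generated programmatically. The sketch below is a hedged illustration: the coordinates are placeholders rather than the actual test-site waypoints, and the command IDs follow the MAVLink command set (16 for MAV_CMD_NAV_WAYPOINT, 112 for MAV_CMD_CONDITION_DELAY).

```python
# Sketch: write a minimal mission file in the QGC WPL 110 text format that
# Mission Planner's "load file" function can read. Each line is a tab-separated
# record: index, current, frame, command, p1..p4, lat, lon, alt, autocontinue.
# Command IDs: 16 = MAV_CMD_NAV_WAYPOINT, 112 = MAV_CMD_CONDITION_DELAY.
# The coordinates below are placeholders, not the thesis's actual waypoints.

def mission_line(idx, current, frame, cmd, p1, p2, p3, p4, lat, lon, alt):
    fields = [idx, current, frame, cmd, p1, p2, p3, p4, lat, lon, alt, 1]
    return "\t".join(str(f) for f in fields)

lines = ["QGC WPL 110"]
lines.append(mission_line(0, 1, 0, 16, 0, 0, 0, 0, 5.938, 80.576, 0))   # home
lines.append(mission_line(1, 0, 3, 16, 0, 0, 0, 0, 5.939, 80.577, 0))   # waypoint 1
lines.append(mission_line(2, 0, 3, 112, 10, 0, 0, 0, 0, 0, 0))          # pause 10 s
lines.append(mission_line(3, 0, 3, 16, 0, 0, 0, 0, 5.940, 80.578, 0))   # waypoint 2

with open("delivery_mission.waypoints", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Loading such a file into the ground control software and writing it to the flight controller produces the same pause-at-waypoint behavior described above.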


With the path transmitted to the flight controller, a simple mode switch to "auto"
triggered the robot's autonomous journey along the pre-programmed waypoints. The GPS
module tracked the robot's location throughout, ensuring precise adherence to the
planned route.

3.2.8 Motor Controlling Method

Robotics, RC (radio controlled) cars, electric vehicles, drones, and other applications
often use electronic speed controllers (ESCs) as integral parts of the control system for
electric motors. The main function of an ESC is to regulate the electrical energy delivered
to a motor, adjusting its speed and direction. This is more accurate than simple motor
driver boards, and accuracy is especially important for this kind of robot.

The Pixhawk flight controller sends control signals to the ESC. The ESC contains an
integrated microcontroller that handles the incoming control signals, which are usually
sent as PWM (Pulse Width Modulation). The microcontroller decodes the signal to
determine the desired speed and direction of the motor. These ESCs usually have a
power stage made up of MOSFETs (Metal Oxide Semiconductor Field Effect
Transistors) acting as switches.

To regulate the power supplied to the motor, the microcontroller modifies the PWM
signal. By altering the pulse width, the ESC adjusts the average voltage delivered to the
motor, which in turn controls the motor's speed. The ESC also regulates the motor's
direction, which is accomplished by reversing the polarity of the voltage supplied to the
motor. Additionally, the 12V battery is connected to the ESC, which regulates the
high-voltage input from the battery to supply the motor with the necessary power. The
ESC typically must be calibrated before it can be used; this entails setting the minimum
and maximum throttle values so that the ESC can interpret the whole range of input
signals from the controller.
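The duty-cycle relationship described above can be illustrated with a small idealized calculation; this sketch ignores switching losses and battery voltage sag, so it is a rough model rather than an exact description of the ESC's power stage.

```python
def average_motor_voltage(v_battery: float, duty_cycle: float) -> float:
    """Approximate average voltage the ESC power stage applies to the motor.

    duty_cycle is the fraction of each PWM period that the MOSFETs conduct
    (0..1). Switching losses and battery sag are ignored in this idealized model.
    """
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return v_battery * duty_cycle

# e.g. a 12 V pack driven at 50 % duty gives roughly a 6 V average
```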

The ESC calibration itself can be performed using the Mission Planner calibration
procedure described above.

Motor turn rate calculation

In Case 1, when the robot is driving straight ahead, it makes no distinction between
forward and backward motion. The data bytes containing speed and direction received
from the controller are simply passed through. Logically, these data bytes remain
identical between the left and right sides, as both sides have the same speed and
direction. This scenario is easily recognized and executed seamlessly.

Moving on to Case 2, which involves rotation, the flight controller is strategically
positioned at the intersection of the robot's base diagonals. Calculations reveal that the
motors on the left side should rotate in the opposite direction to those on the right side.
However, given that the direction is already specified in the command data bytes, this
case can be identified and managed accordingly. Notably, only the speed of the middle
wheels needs to be multiplied by 0.545, enabling rotation at the designated position.

Now, in Case 3, where the robot is turning curves, the complexity increases. If the
vehicle continues forward from Case 1 without stopping before turning left or right, the
controller maintains the average speed but introduces a speed difference in the direction
of travel. Essentially, this means an increase in speed on one side and a decrease on the
other during the driving process. This adjustment continues until either the maximum
or minimum speed is reached.

Figure 42: Illustration of the radii between motor and axis

Considering the variables:

• M: Motor
• R: Radius
• V: Velocity

Driving curves involves a different approach compared to turning in a stationary
position. The rotation axis is not at the center of the flight controller but is calculated at
the border for specific considerations. Assuming a right turn, the speeds of the left and
right sides are obtained from the controller, and the following calculations are uniform
regardless of the turning direction.

First, the rotating-axis radius is calculated. Since a circle's circumference grows with
its radius, the wheel farthest from the rotation axis must travel the farthest. In the
limiting case of a full turn, wheels M1 and M3 therefore receive the maximum speed.

VM3=VM1=Vleft

The maximum speed of M2 can then be computed. Since the slope of the curve depends
on the controlled speed difference, the formula is as follows:

VM2 = Vleft – (Vleft – Vright) (1 – r2/r3)

Due to its shorter distance to the rotation axis, the left-side commanded speed is reduced
by the speed difference of the two sides multiplied by a constant. The right side is
treated similarly:

VM6 = Vright + (Vleft – Vright) (r6/r3)

The speed differential determines the inclination. Since the wheels move perpendicular
to the axis, the front wheels move faster and the back wheels slower.

The remaining wheel formulas can be derived from this.

VM5 = Vright + (Vleft – Vright) (r5/r3)

VM4 = Vright - (Vleft – Vright) (r4/r3)
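The curve-driving formulas above can be collected into a single function. This is a sketch under stated assumptions: the radii r2 to r6 are geometry-dependent placeholders rather than measured values, r3 is taken as the outermost radius, and the final formula, which appears labelled VM6 in the text but is written with r4, is read here as the M4 wheel.

```python
def wheel_speeds(v_left, v_right, r, r_ref):
    """Per-wheel speeds for curve driving, following the formulas above.

    v_left / v_right: commanded side speeds from the controller.
    r: dict mapping "r2".."r6" to each wheel's distance from the rotation axis
       (placeholder values, geometry-dependent).
    r_ref: radius of the outermost wheels (r3 in the text).
    """
    diff = v_left - v_right
    v = {}
    v["M1"] = v["M3"] = v_left                        # outermost wheels, full speed
    v["M2"] = v_left - diff * (1 - r["r2"] / r_ref)   # middle wheel, left side
    v["M5"] = v_right + diff * (r["r5"] / r_ref)
    v["M6"] = v_right + diff * (r["r6"] / r_ref)
    v["M4"] = v_right - diff * (r["r4"] / r_ref)      # reading the last formula as M4
    return v
```

A quick sanity check: when v_left equals v_right the difference term vanishes and every wheel receives the same speed, which matches the straight-driving Case 1.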

3.2.9 Develop Obstacle Avoiding Method 01 (Sonar Range Finder)

Two methods were used to detect and avoid obstacles. The first method was obstacle
avoidance using sonar sensors. As the rover moves from one location to another, various
objects and obstacles appear in its path. Two sonar sensors were used, one fixed on the
left side and the other on the right side. As the rover moves forward, obstacles on either
side are detected along with the distance between the object and the rover. The
avoidance system then turns the rover according to that distance, with the turning
handled automatically by the flight controller. The amount of turning increases as the
distance between the object and the rover decreases. The rover starts turning when the
distance to the object is 400cm or less and stops when the distance is 100cm or less.
This behavior can be observed in the radio calibration tab of the Mission Planner
software, where the roll channel values change as the rover turns right or left. The flow
diagram developed for the obstacle avoidance system is shown below.

Figure 43: Flow diagram of obstacles avoidance using sonars.


Turning Calculations

When an object was detected by the right-side sensor and the distance between the
object and the rover was 400cm:

Normal Roll value was 1515

ROLL_LEFT = 1515+30+((400-RIGHT_SENSOR) ×5)

ROLL_LEFT = 1515+30+((400-400) ×5)

ROLL_LEFT = 1515+30+((0) ×5)

ROLL_LEFT = 1515+30+(0)

ROLL_LEFT = 1515+30

ROLL_LEFT = 1545

When the right sensor detected an object on the right side at a distance of 400cm, the
roll value was 1545. Motor control is performed according to these values.

When an object was detected by the left-side sensor and the distance between the object
and the rover was 350cm:
ROLL_RIGHT = 1515-30-((400-LEFT_SENSOR) ×5)

ROLL_RIGHT = 1515-30-((400-350) ×5)

ROLL_RIGHT = 1515-30-((50) ×5)

ROLL_RIGHT = 1515-30-(250)

ROLL_RIGHT = 1515-280

ROLL_RIGHT = 1235

When the left sensor detected an object on the left side at a distance of 350cm, the roll
value was 1235. Motor control is performed according to these values.
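The two worked examples above can be expressed as a single roll-computation function; the constants (1515 neutral roll, 30 base offset, gain of 5 per centimeter, 400cm trigger distance) are taken directly from the calculations shown.

```python
NEUTRAL_ROLL = 1515   # roll PWM value when driving straight
BASE_OFFSET = 30      # minimum steering offset once an obstacle is detected
GAIN = 5              # extra PWM per cm the obstacle is inside the trigger range
TRIGGER_CM = 400      # distance at which avoidance begins

def avoidance_roll(left_cm=None, right_cm=None):
    """Roll PWM from the sonar distances, per the worked examples above.

    An obstacle on the RIGHT steers left (roll above neutral); an obstacle
    on the LEFT steers right (roll below neutral). Returns the neutral value
    when no obstacle is inside the trigger distance.
    """
    if right_cm is not None and right_cm <= TRIGGER_CM:
        return NEUTRAL_ROLL + BASE_OFFSET + (TRIGGER_CM - right_cm) * GAIN
    if left_cm is not None and left_cm <= TRIGGER_CM:
        return NEUTRAL_ROLL - BASE_OFFSET - (TRIGGER_CM - left_cm) * GAIN
    return NEUTRAL_ROLL
```

Reproducing the examples: a right-side object at 400cm yields 1545, and a left-side object at 350cm yields 1235.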

Figure 44: Roll values in the Radio Calibration tab

3.2.12 Develop a Rocker Bogie Mechanism

The rocker bogie mechanism was developed so the rover could ride over small bumps
and run smoothly on rough roads. The mechanism introduced by NASA for its Mars
rovers was used. It was fabricated from 2mm-thick mild steel plate and 2mm L-shaped
mild steel sections. The front part, where the wheels are mounted on the 90-degree link,
is called the "bogie", and the part fixed to the chassis is called the "rocker". These parts
were joined using an arc welding machine, with bearings, nuts, and bolts used to
connect the rover assembly.

Figure 45: Rocker bogie mechanism design

Calculations of Rocker Bogie mechanism

Figure 46 : Rocker bogie Length

When designing the rocker bogie mechanism, the distance between the two wheels of
the "bogie" determines the height of the step the rover can climb. Here the maximum
height of the bump to be climbed was taken as 10cm, so the gap between the two bogie
wheels must be larger than that. It was therefore taken as 14cm, and the lengths of the
two bogie arms were found as follows.

AB² = AC² + BC²
14² = AC² + BC²
AC = BC
2AC² = 14² = 196
AC² = 98
AC = BC = √98 ≈ 10cm

According to this calculation, the bogie arm lengths (AC and BC) are found. The height
(h) of the rocker bogie mechanism can then be determined using the calculations below.

Figure 47: Rocker bogie height

h² + (14/2)² = 10²
h² = 10² − (14/2)²
h² = 100 − 49 = 51
h = √51 ≈ 7.1cm
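The same right-triangle geometry can be checked numerically. This is an illustrative check rather than part of the original design calculations; note that using the exact arm length √98 cm, instead of the rounded 10cm, gives a pivot height of exactly 7cm.

```python
import math

def bogie_geometry(wheel_gap_cm: float):
    """Arm length and pivot height for a right-angle bogie, per the triangle above.

    The two arms AC and BC are equal and meet at 90 degrees, so the wheel gap
    AB is the hypotenuse; the pivot height h follows from the same right
    triangle with half the wheel gap as its base.
    """
    arm = math.sqrt(wheel_gap_cm ** 2 / 2)               # AC = BC
    h = math.sqrt(arm ** 2 - (wheel_gap_cm / 2) ** 2)    # pivot height
    return arm, h

arm, h = bogie_geometry(14)   # arm is about 9.9 cm, h is 7.0 cm
```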

3.2.14 Develop a Security System

A system was implemented to control the opening and closing of the lid in an easy and
safe way, using an Arduino microcontroller and a SIM800L GSM module to protect the
package. OTP (One-Time Password) is a widely used mechanism for improving security
in various applications; in this case it is used to control access to the lid. The Arduino
acts as the brain of the system, controlling and coordinating the various components: it
reads inputs, generates the OTP, and manages the overall functionality of the locking
system. The SIM800L GSM module facilitates communication between the Arduino
and a mobile phone, allowing the system to send and receive messages for remote
control and interaction with the lock. The system generates a unique OTP each time
access is requested. The OTP is sent via SMS to a pre-specified mobile number,
ensuring that only authorized users with the correct OTP can access the package
compartment. The locking mechanism is controlled by the Arduino: after receiving the
correct OTP, it activates the unlocking mechanism, granting access.

Figure 48: Circuit diagram of the security system

After the rover reaches the respective location, the flight controller sends the signal to
the Arduino board. Arduino generates a unique OTP and sends it to the registered
mobile number associated with the system. The OTP is sent to the user's mobile via
SMS, providing a temporary access credential. The user receives the OTP and enters
it into the system by replying to the SMS. If the OTP is correct and within the validity
period, the Arduino activates the door lock to unlock it, granting access. The system
can log access attempts and send notifications or alerts to predefined contacts in case
of unauthorized access attempts.
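The OTP generate-and-verify flow described above can be sketched in Python for illustration; the actual system runs on an Arduino with a SIM800L module, and the 120-second validity window below is an assumed value rather than one specified in the design.

```python
import secrets
import time

OTP_VALID_SECONDS = 120   # assumed validity window, not from the thesis

def generate_otp(n_digits: int = 4):
    """Create a one-time password and remember when it was issued."""
    otp = "".join(secrets.choice("0123456789") for _ in range(n_digits))
    return otp, time.time()

def verify_otp(entered: str, otp: str, issued_at: float) -> bool:
    """Accept only the exact OTP, and only inside the validity window."""
    if time.time() - issued_at > OTP_VALID_SECONDS:
        return False                          # expired: request a new OTP
    return secrets.compare_digest(entered, otp)
```

The constant-time comparison and single-issue timestamp mirror the single-use, time-limited behavior the system relies on.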

3.2.15 Develop an Obstacle Avoiding Method 02 (Machine Vision)

The Python language was used to develop the image processing AI model, together with
the OpenCV, Keras/TensorFlow, NumPy, and PyTorch libraries. The YOLO object
detection algorithm was used for object detection. The processed image data is
transmitted from the Raspberry Pi's GPIO pins over UART communication to the
TELEM2 port of the Pixhawk flight controller, and the Pixhawk controls the mobile
robot according to that data.
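For illustration, a minimal lane-detection step can be sketched with NumPy alone. The project's actual pipeline uses OpenCV and a YOLO model, so this is only a simplified stand-in showing how a bright lane line in the camera frame can be reduced to a single steering offset.

```python
import numpy as np

def lane_offset(frame_gray: np.ndarray, thresh: int = 200):
    """Normalized lane-center offset in [-1, 1], or None if no lane is visible.

    Thresholds bright (white) pixels in the lower half of a grayscale frame
    and compares the centroid of the lane mask with the image center.
    """
    h, w = frame_gray.shape
    roi = frame_gray[h // 2:, :]        # lower half of the frame: the road region
    mask = roi >= thresh                # bright (white) lane-line pixels
    if not mask.any():
        return None                     # lane lost: the caller should stop
    cols = np.nonzero(mask)[1]          # column index of every lane pixel
    cx = cols.mean()                    # lane centroid column
    return (cx - w / 2) / (w / 2)       # negative: lane is left of center
```

A negative offset would steer the rover left and a positive offset right; a None result corresponds to the "lane not detected, stop" branch of the flow diagram.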

Figure 49: Flow diagram of obstacles avoidance using Machine vision.

Above is the flow chart showing how the rover is controlled according to the object
detection method. Here first, the rover detects the lane it is traveling in. If this is not
detected, the robot stops, and if the lane is detected and an object is not detected, the
robot moves forward according to the detected lane.

According to the image processing, a vertical boundary line and a horizontal boundary
line are used, as shown in the image, to generate the output data. The horizontal
boundary line is drawn according to the camera's mounting position so that objects
within a 20m limit of the mobile robot can be identified; the robot begins steering only
after detecting an object beyond this horizontal boundary line.

Figure 50: Boundary Lines

If an object is detected past the horizontal boundary line, the system checks whether the
object lies beyond the 1m distance limit from the robot, using data from the ultrasonic
sensor installed on the robot. If the object is less than 1m away, the robot stops; if it is
more than 1m away, the vertical boundary line is considered. If the object lies to the
left of the vertical line, the robot turns right; if it lies to the right, the robot turns left.
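The decision logic just described can be summarized in a small function. The 640-pixel frame width and the pixel-position convention are assumptions for illustration, while the 1m (100cm) stop distance comes from the text.

```python
def avoidance_action(object_x=None, frame_width=640, sonar_cm=None, stop_cm=100):
    """Decide the rover's next action from the camera and sonar readings.

    object_x: horizontal pixel position of a detected object already past the
    horizontal boundary line, or None if nothing was detected.
    sonar_cm: ultrasonic distance to that object in centimeters.
    """
    if object_x is None:
        return "forward"                   # no object: follow the detected lane
    if sonar_cm is not None and sonar_cm <= stop_cm:
        return "stop"                      # object closer than 1 m
    if object_x < frame_width / 2:         # object left of the vertical line
        return "turn_right"
    return "turn_left"                     # object right of the vertical line
```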

3.2.16 Finalize and Testing

All the objectives mentioned above were developed, and the rover's package delivery
performance was evaluated after assembling the obstacle avoidance systems, the
security system, and the rocker bogie mechanism onto the basic rover design. All the
components were installed, and it was planned to run the rover with a 5kg load; the
area in front of the department building of the University of Ruhuna was chosen for
this. The route was set up as shown in the map below, and the track test was carried out.

CHAPTER 04 – RESULTS AND DISCUSSION

4.1 Test Results of Package Delivery Capability

The package delivery rover exhibits an efficient and robust delivery capability,
accommodating packages with dimensions of up to 45cm in length, 30cm in width, and
25cm in height, and a weight limit of 10kg. Its operational range spans outdoor roads
within a 2km radius of the ground station computer, ensuring wide coverage for
seamless delivery.

The incorporation of a rocker bogie mechanism enhances its adaptability, enabling
smooth traversal even on uneven terrain with potholes. Notably, the rover can
effortlessly overcome obstacles such as bumps less than 10cm high, ensuring
unhindered progress along its delivery route.

Furthermore, the rover offers an impressive range, capable of covering distances
exceeding 2.75km on a single charge. This ensures extended operational efficiency,
reducing the need for frequent recharging and maximizing productivity in the package
delivery process.

4.2 Test without Obstacle Avoidance

After the sensor calibrations were completed and the parameters changed, the rover
could be tested without the other implementations. All the components required for the
test were assembled. The area in front of the Department building of the University of
Ruhuna was selected as the testing location, and the GPS accuracy at the location was
checked. Unfortunately, the GPS accuracy was low, so the learning mode had to be used
to overcome this problem. Paths were then generated using the radio transmitter: the
rover was driven from the start point to the final point with the transmitter, waypoints
were saved step by step with a transmitter switch, and the saved paths could then be
viewed in the software.

4.3 Test with Avoidance Method 01 – Using Sonar Sensors

After the obstacle avoidance system was created, its accuracy and performance were
evaluated. An object was placed in front of the rover and its behavior was observed:
when the object was detected by the left sensor, the rover started turning to the right,
and when the right sensor detected the object, the rover turned to the left. As the distance
between the object and the rover gradually decreased, the rover's turning rate increased,
in accordance with the calculations given in the methodology.

4.4 Test with Avoidance Method 02 – Using Machine Vision

The rover detects the lane it is traveling in. If the lane is not detected, the robot stops;
if the lane is detected and no object is detected, the robot moves forward along the
detected lane, as the figures below show. The green color in the figures shows the
detection of the lane; if there is a white line in the lane, it is detected more accurately,
as seen most clearly in the second figure (when the model was tested in the virtual
environment).

Figure 51: Object detection of real environment

Figure 52: Object detection of virtual environment

According to the image processing, a vertical boundary line and a horizontal boundary
line are used, as shown in the image, to generate the output data. The horizontal
boundary line is drawn according to the camera's mounting position so that objects
within a 20m limit of the mobile robot can be identified; the robot begins steering only
after detecting an object beyond this horizontal boundary line.

Figure 53: Object detection


If an object is detected past the horizontal boundary line, the system checks whether the
object lies beyond the 1m distance limit from the robot, using data from the ultrasonic
sensor installed on the robot. If the object is less than 1m away, the robot stops; if it is
more than 1m away, the vertical boundary line is considered. If the object lies to the
left of the vertical line, the robot turns right; if it lies to the right, the robot turns left.

Figure 54: Object detection and avoidance at testing time

4.5 Security System of the Rover

This security system was expected to ensure the security of the package, and it
accomplished this successfully. Although the rover moved correctly to the specified
point when delivering the package from the start point to the final point, the security of
the package could not otherwise be confirmed. Using this system, we were able to
arrange for the package to be received only by the intended recipient. After the rover
reaches the final location, an OTP number is sent to that person's phone using the GSM
module; only after the person enters the number through the keypad on the left side of
the rover can the solenoid lock be released and the lid opened. If a wrong OTP number
is entered, the lid cannot be opened.

The features of this system are that OTPs are valid for a single use and for a limited
time, providing an additional layer of security. Users can request and receive access
credentials remotely via SMS, providing easy and flexible access control. The system
requires basic hardware components, making it cost-effective and suitable for a variety
of applications. The system can be expanded to manage multiple users and doors, and
it can adapt to different situations. The modular nature of the system allows integration
with other security features or home automation systems.

Figure 55: Security system

4.6 Rocker Bogie Mechanism

The main purpose of using the rocker bogie mechanism on this rover was to move easily
over bumps and small steps and to prevent the rover's movement from being blocked
by rock fragments on the road. When one wheel of a normal four-wheeled rover collides
with a bump, the other wheels are disturbed simultaneously. This mechanism avoided
that problem, and running between two locations was very efficient. The rover is
capable of traveling on very rough roads and can climb over bumps using this
mechanism.

Figure 56: Rocker bogie mechanism

4.7 Discussion About the Overall System

The rover with GPS technology that we designed successfully delivers the package
from one location to another. Since various obstacles must be faced while carrying the
package, we created the two obstacle avoidance systems mentioned above: one using
ultrasonic sensors and one using real-time video processing on the Raspberry Pi. The
reason for using two methods is that if the camera module fails to detect an object
approaching from very low down, the ultrasonic sensor installed below can identify
those objects as a fallback.

The navigation and control architecture of the robot was intricately designed, leveraging
a GPS-guided system. This system facilitated waypoint-based path planning
effortlessly using the Mission Planner software. Notably, the robot's implementation
includes the "Smart Return to Launch" mode, enabling autonomous return trips.

The robot features a robust sensor system for obstacle avoidance, including sonar
sensors and an AI-powered camera module, all programmed in Python. The system
utilizes libraries such as OpenCV, Keras/TensorFlow, NumPy, and PyTorch,
demonstrating the technological depth of the implementation.

The YOLO algorithm was strategically used for object detection and lane tracking,
enhancing the robot's ability to effectively perceive and respond to its environment.

The robot demonstrated strong performance in outdoor package transportation,
showcasing adept navigation and obstacle avoidance capabilities, as demonstrated by
its comprehensive evaluation in predefined paths.

The findings suggest that a low-cost GPS-controlled self-driving mobile robot can
effectively tackle challenges in outdoor package delivery in Sri Lanka's industrial
landscape. The successful integration of cutting-edge technologies and the
demonstrated performance outcomes position this robot as a viable solution for
enhancing logistics and automation within Sri Lankan industries.

CHAPTER 05 – CONCLUSION

The push to use new technology to transport goods around the world has made delivery
efficient and time-saving. Today, the world is moving toward self-driving technology
for the transportation of packages through various methods. However, in countries like
Sri Lanka, this self-driving technology is still not used, and several factors have
contributed to this, chiefly the level of technology required and its high cost.

Since there is no technology for self-driving outdoor package delivery from one
location to another in Sri Lanka, and as a solution to the high cost of other technologies
currently used in the world, we created a rover using flight controller technology, which
is currently used mainly by UAV and drone manufacturing organizations. The rover's
route is set by GPS signal, and the rover can travel on any route we provide. As planned,
the primary purpose of this rover is to deliver industrial packages, so it was developed
to deliver packages within an area of 2km.

The rover encountered various obstacles while moving forward, and the facility to
avoid them was implemented using two methods. It was able to successfully detect and
avoid obstacles and drive correctly from the starting point to the final point. It was also
necessary to ensure the security of the package, so a security system was created so that
only the intended recipient can collect it; this made it impossible for outsiders to take
the package. The rocker bogie mechanism was also used, enabling the rover to run
smoothly even on rough roads. The rover can carry a load of 10kg and can run more
than 2.5km on a full battery charge.

However, many problems were faced during the creation of this rover. It cost a lot to
build, and most of the components had to be imported from abroad, so the project took
more time to complete, and material selection had to be repeated several times.
Nevertheless, we successfully completed the project and achieved the level of progress
we expected.

REFERENCES

Borenstein, J., & Koren, Y. (1988). Obstacle Avoidance with Ultrasonic Sensors. IEEE
Journal on Robotics and Automation, 4(2), 213–218. https://doi.org/10.1109/56.2085

Cuevas Garc´ıa, C. (2018). Obstacle alert and collision avoidance system development
for UAVs with Pixhawk flight controller.

Devos, A., Ebeid, E., & Manoonpong, P. (2018). Development of autonomous drones
for adaptive obstacle avoidance in real world environments. Proceedings - 21st
Euromicro Conference on Digital System Design, DSD 2018, 707–710.
https://doi.org/10.1109/DSD.2018.00009

Ubale, V., Bhor, S., Kasar, S., Khule, S., & Umbarkar, V. (2023). Autonomous delivery
robot. International Journal of Advanced Research in Science, Communication and
Technology, May, 257–262. https://doi.org/10.48175/ijarsct-9203

Hu, J., Niu, Y., & Wang, Z. (2017). Obstacle avoidance methods for rotor UAVs using
RealSense camera. Proceedings - 2017 Chinese Automation Congress, CAC 2017,
2017-Janua, 7151–7155. https://doi.org/10.1109/CAC.2017.8244068

Kasangottuwar, A. A. (2017). Autonomous Rover Delivery System. 3(12), 132–135.

Lee, J., Park, G., Cho, I., Kang, K., Pyo, D., Cho, S., Cho, M., & Chung, W. (2022).
ODS-Bot: Mobile Robot Navigation for Outdoor Delivery Services. IEEE Access,
10(September), 107250–107258. https://doi.org/10.1109/ACCESS.2022.3212768

Richards, B., Gan, M., Dayton, J., Enriquez, M., Liu, J., Quintanna, J., & Bhandari, S.
(2014). Obstacle avoidance system for UAVs using computer vision. AIAA Infotech at
Aerospace, January, 1–9. https://doi.org/10.2514/6.2015-0986

Umbarkar, S., Rajput, G., Halder, S., Harname, P., & Mendgudle, S. (2017).
Keypad/Bluetooth/GSM Based Digital Door Lock Security System. 137, 749–757.
https://doi.org/10.2991/iccasp-16.2017.102

Vishwas, J. (2017). Rocker Bogie Mechanism. 6(10), 88–93.

What is Arduino? | Arduino Documentation. (n.d.). https://docs.arduino.cc/learn/
starting-guide/whats-arduino.

Sawkare, R. (2023, August 21). Learn everything about ultrasonic sensor HC-SR04.
Vayuyaan. https://vayuyaan.com/blog/learn-everything-about-ultrasonic-sensor hcsr04.

Electrified lock - an overview | ScienceDirect Topics. (n.d.). www.sciencedirect.com.
https://www.sciencedirect.com/topics/computer-science/electrified-lock

LME Editorial Staff. (2022, November 8). Interface 4×3 & 4×4 membrane keypad with
Arduino. Last Minute Engineers. https://lastminuteengineers.com/arduino-keypad-tutorial/

Infrared sensor - IR sensor | Sensor division knowledge. (n.d.).
https://www.infratec.eu/sensor-division/service-support/glossary/infrared-sensor.

Wikipedia contributors. (2023, November 12). Electronic speed control. Wikipedia.
https://en.wikipedia.org/wiki/Electronic_speed_control

Bacheti, V. P., Brandão, A. S., & Sarcinelli-Filho, M. (2021). A path-following
controller for a UAV-UGV formation performing the final step of last-mile delivery.
IEEE Access, 9, 142218–142231.

Bhalla, A., Nikhila, M. S., & Singh, P. (2020, December). Simulation of self-driving
car using deep learning. In 2020 3rd International Conference on Intelligent Sustainable
Systems (ICISS) (pp. 519-525). IEEE.

Ylimäki, T. (2021). Developing capabilities of accurate locating and object avoidance
of an autonomous rover: Applicability of LiDAR rangefinder and RTK NTRIP
technology in the conditions of an athletics event (Master's thesis).

Abrar, M. M., Islam, R., & Shanto, M. A. H. (2020, October). An autonomous delivery
robot to prevent the spread of coronavirus in product delivery system. In 2020 11th
IEEE Annual Ubiquitous Computing, Electronics & Mobile Communication
Conference (UEMCON) (pp. 0461-0466). IEEE.

Dave, D., Parsana, P., & Ajmera, A. (2020). Autonomous Delivery Robot.

James, C., & Matveev, K. I. (2021, November). Initial Development of Low-Cost
Autonomous Rover for Pursuit of Moving Targets. In ASME International Mechanical
Engineering Congress and Exposition (Vol. 85628, p. V07BT07A013). American
Society of Mechanical Engineers.

Rani, S. S., Pradeep, S., Dinesh, R. M., & Prabhu, S. G. (2022, July). OTP Based
Authentication Model for Autonomous Delivery Systems Using Raspberry Pi. In 2022
International Conference on Intelligent Controller and Computing for Smart Power
(ICICCSP) (pp. 1-5). IEEE.

Chinchkar, D., Gajghate, S., Panchal, R., Shetenawar, R., & Mulik, P. (2017). Design
of rocker bogie mechanism. International Advanced Research Journal in Science,
Engineering and Technology, 4(1), 46-50.

Gwak, J., Jung, J., Oh, R., Park, M., Rakhimov, M. A. K., & Ahn, J. (2019). A review
of intelligent self-driving vehicle software research. KSII Transactions on Internet and
Information Systems (TIIS), 13(11), 5299-5320.

Loading Firmware | PX4 User Guide (main). (n.d.).
https://docs.px4.io/main/en/config/firmware.html

Standard Configuration | PX4 User Guide (main). (n.d.).
https://docs.px4.io/main/en/config/.html

Full Parameter Reference · PX4 V1.9.0 User Guide. (n.d.).
https://docs.px4.io/v1.9.0/en/advanced_config/parameter_reference.html

Getting and Setting Parameters — Dev documentation. (n.d.).
https://ardupilot.org/dev/docs/mavlink-get-set-params.html

