
Sensors for Robotics 2023-2043:

Technologies, Markets, and Forecasts


SAMPLE PAGES

Yulin Wang and James Jeffs

www.IDTechEx.com/RoboticSensors / [email protected]

IDTechEx provides clarity on technology innovation

• Technology assessment
• Technology scouting
• Company profiling
• Market sizing
• Market forecasts
• Strategic advice

Reports | Subscriptions | Consulting | Events | Journals | Webinars


Table of Contents

1. Executive summary – 20 slides


2. Introduction – 4 slides
3. Sensors by functions and tasks – 3 slides
4. Sensors for navigation and mapping – 52 slides
5. Sensors for collision detection and safety – 46 slides
6. Other sensors in robots – 13 slides
7. Sensors by robot type – 98 slides
8. Company Profiles – Access to 29 IDTechEx Portal profiles

Acronyms

AGV: Automated guided vehicle
AMR: Autonomous mobile robot
CCD: Charge-coupled device
CMOS: Complementary metal oxide semiconductor
Cobot: Collaborative robot
CPS: Capacitive proximity sensor
DVL: Doppler velocity log
EKF: Extended Kalman filter
EoAT: End-of-the-arm tooling
FLS: Forward-looking sonar
FoV: Field of view
GPS: Global positioning system
HRI: Human-robot interaction
IFM: Intelligent flying machines
IMU: Inertial measurement unit
IoT: Internet of things
IR: Infrared
LBL: Long baseline
LCTF: Liquid-crystal tunable filter
LiDAR: Light detection and ranging
MEMS: Microelectromechanical system
MWIR: Mid-wave infrared
NIR: Near infrared
OEM: Original equipment manufacturer
PAC: Perimeter access control
Radar: Radio detection and ranging
ROI: Return on investment
RPAS: Remotely piloted aircraft system
RTK: Real-time kinematics
SLAM: Simultaneous localization and mapping
SMD: Surface-mount device
SONAR: Sound navigation and ranging
ToF: Time of flight
UAV: Unmanned aerial vehicle
UGV: Unmanned ground vehicle
VRT: Variable rate technology
VTOL: Vertical take-off and landing

Overview of the report

Sensors in robots are used for a variety of tasks, ranging from measuring force and detecting objects to navigation and localization, collision detection, and mapping. With recent advances in sensor technologies and software, many sensors can serve multiple purposes; for instance, cameras combined with computer vision systems can be used for collision detection as well as navigation and localization. The chart below summarizes the commonly used sensors by application. This report splits the tasks into four main themes: navigation and localization, collision and proximity detection, force and torque measurement, and others.

It is worth noting that this report mainly focuses on the sensors equipped directly on the robots and largely ignores the sensors used within components (e.g., servo motors, controllers), such as current sensors and optical encoders.

Are 3D sensors getting increasingly popular or heading
nowhere? (2)

The data gathered by 3D sensors typically have lower resolution than data from conventional 2D sensors such as cameras. In the case of LiDAR, a standard sensor discretizes the vertical space into lines (the number of lines varies), each containing several hundred detection points. This produces approximately 1,000 times fewer data points than a standard HD picture, meaning that the resolution can be significantly compromised. Furthermore, the further away an object is, the fewer samples land on it: for a fixed angular resolution, the number of returns on an object falls roughly with the square of its range, so detection becomes much harder for distant objects.
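To make the point counts concrete, the short Python sketch below (not from the report) estimates the returns per frame of a spinning LiDAR and how many of them land on a pedestrian-sized target at different ranges. The channel count, angular resolutions, and target size are illustrative assumptions; the exact ratio to an HD image depends heavily on which sensor and camera are compared.

```python
import math

# Illustrative parameters (assumptions, not figures from the report): a spinning
# LiDAR with 16 vertical lines, 0.2 deg horizontal and 2.0 deg vertical angular
# resolution, compared against a 1920x1080 camera frame.
LINES = 16
H_RES_DEG = 0.2
V_RES_DEG = 2.0
HD_PIXELS = 1920 * 1080

points_per_line = int(360 / H_RES_DEG)        # returns per revolution, per line
points_per_frame = LINES * points_per_line    # total returns per LiDAR frame
print(f"LiDAR returns/frame: {points_per_frame:,} vs HD pixels: {HD_PIXELS:,} "
      f"(~{HD_PIXELS / points_per_frame:.0f}x fewer)")

def returns_on_target(distance_m, width_m=0.5, height_m=1.7):
    """Approximate number of returns landing on a flat, beam-facing target."""
    h_spacing = distance_m * math.radians(H_RES_DEG)   # horizontal spacing at range
    v_spacing = distance_m * math.radians(V_RES_DEG)   # vertical spacing at range
    cols = int(width_m / h_spacing)
    rows = min(int(height_m / v_spacing), LINES)
    return cols * rows

for d in (5, 10, 20, 40, 80):
    print(f"{d:>3} m: ~{returns_on_target(d)} returns on a pedestrian-sized target")
```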

Company Profile Access – IDTechEx Online Portal

The purchase of this report provides access to a selection of relevant company profiles available on the IDTechEx portal. Further profiles and
updates are available through the IDTechEx subscription service. Please email [email protected] for more information.

− Aidin Robotics
− Airskin
− Anybotics
− Audite Robotics
− ClearPath Robotics
− Clearview Imaging
− Ecovacs
− F&P Personal Robotics
− Franka Emika
− Inivation
− Interlink Electronics
− LuxAI
− Mov.ai
− Neura Robotics
− Omron
− OnRobot
− Pal Robotics
− Peratech
− Qineto
− Robotnik
− SICK
− Tacterion
− TE Connectivity
− Techman Robot
− Universal Robots
− Velodyne
− VitiBot
− Vitirover
− Yujin Robot

Typical sensors used for robots

As a simple categorization, the sensors used in robots can be divided into two categories: proprioceptive and exteroceptive.

Internal data such as joint speed, torque, position, and force are measured by proprioceptive sensors (e.g., motor encoders and gyroscopes). These sensors are typically used for robotic control.

Exteroceptive sensors collect information about the robot's surroundings and sense environmental parameters, such as the distance and speed of a moving or stationary item, light intensity, temperature, chemicals, and more. This type may include tactile sensors, force and torque sensors, proximity sensors, range sensors, vision sensors, and others, used for robot guidance, obstacle identification, monitoring, etc.

Exteroceptive sensing can be further categorized as extrinsic or intrinsic. More details can be found in the chart on the right.

[Chart: sensor taxonomy. Proprioceptive (internal data): IMU, inclinometers, torque sensors, magnetic encoders, compass. Exteroceptive (external data / environmental perception), split into extrinsic and intrinsic: 2D LiDAR, ultrasonic, motion capture, time-of-flight (ToF), RGB-D cameras, tactile, on-hand depth cameras, IR sensors.]
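As a minimal illustration of this two-level classification (a sketch that assumes the groupings summarized above; the names and routing are hypothetical, not an API from the report), a lookup table can route each sensor reading to the right consumer, e.g. the control loop for proprioceptive data and the perception stack for exteroceptive data.

```python
# Hypothetical mapping based on the taxonomy summarized above; category
# membership of individual sensors may differ between robots.
SENSOR_CATEGORY = {
    "imu": "proprioceptive",
    "torque_sensor": "proprioceptive",
    "magnetic_encoder": "proprioceptive",
    "2d_lidar": "exteroceptive",
    "ultrasonic": "exteroceptive",
    "rgbd_camera": "exteroceptive",
    "tactile": "exteroceptive",
}

def route_reading(sensor_name: str) -> str:
    """Send internal data to control, environmental data to perception."""
    category = SENSOR_CATEGORY.get(sensor_name, "exteroceptive")
    return "control_loop" if category == "proprioceptive" else "perception_stack"

print(route_reading("imu"))        # control_loop
print(route_reading("2d_lidar"))   # perception_stack
```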

Sensors by applications

Although different sensors are typically used together to conduct certain tasks, it is worth classifying these sensors based on their use cases
and application scenarios.

Notably, there is no universal classification for this. In this report, IDTechEx specifically focuses on robotic sensors based on the following
applications.

Navigation and mapping sensors

Autonomy, as one of the key features of robots, refers to the ability of machines/robots to work independently without human control or intervention. Autonomy covers many concepts, including autonomous mobility, autonomously identifying and manipulating objects, and many others. In recent years, autonomous mobility has gained significant momentum, particularly in autonomous mobile robots (AMRs), automated guided vehicles (AGVs), unmanned aerial vehicles (UAVs), and some self-driving agricultural robots. Although these robots are utilized for different tasks and purposes, they all need a robust autonomous mobility system.

Autonomous mobility requires navigation, localization, and mapping. With the increasing demand for autonomous mobility, navigation, localization, and mapping sensors are becoming increasingly important. Typical navigation and mapping sensors include cameras (2D RGB cameras or 3D stereo cameras), ultrasonic sensors, LiDAR, radar, GPS, and IMU.
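These sensors are rarely used in isolation; navigation stacks typically fuse them, for example with an extended Kalman filter (EKF, listed in the acronyms). The sketch below is a minimal, hypothetical 1D Kalman-filter fusion of IMU acceleration (prediction step) and GPS position (correction step); it is not taken from the report, and all noise values and rates are illustrative assumptions.

```python
import numpy as np

# Minimal 1D fusion sketch: IMU acceleration drives the prediction step,
# GPS position drives the correction step. All values are assumptions.
dt = 0.1                                  # time step [s]
F = np.array([[1, dt], [0, 1]])           # state transition for [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])       # control matrix for measured acceleration
H = np.array([[1.0, 0.0]])                # GPS observes position only
Q = np.diag([0.01, 0.1])                  # process noise (IMU drift, assumed)
R = np.array([[4.0]])                     # GPS noise, ~2 m standard deviation (assumed)

x = np.zeros((2, 1))                      # state estimate [position; velocity]
P = np.eye(2)                             # state covariance

def predict(accel):
    """Propagate the state with an IMU acceleration measurement."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(gps_pos):
    """Correct the state with a GPS position fix."""
    global x, P
    y = np.array([[gps_pos]]) - H @ x     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Example: robot accelerating at 0.2 m/s^2, GPS fix every 10 IMU samples.
true_pos, true_vel = 0.0, 0.0
for step in range(100):
    true_vel += 0.2 * dt
    true_pos += true_vel * dt
    predict(0.2 + np.random.randn() * 0.05)          # noisy IMU reading
    if step % 10 == 0:
        update(true_pos + np.random.randn() * 2.0)   # noisy GPS reading
print(f"estimated position: {x[0, 0]:.2f} m, true position: {true_pos:.2f} m")
```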

The emergence of 3D cameras/3D robotic vision

The use of 3D automated vision in robotic work cells is on the rise. The robot can recognize an object's position, size, depth, and color thanks
to this technology. Using visual components, industries like logistics, food processing, life science, and manufacturing are exploring ways to
automate their processes.
It is worth noting that there is no “one size fits all” solution because the integration of vision sensors depends on a number of factors including
applications, equipment, product, environment/workspace, and budget. Therefore, IDTechEx believes that there is no ‘standard solution’ when
it comes to setting up real-time 3D imaging in a robotic system. However, there are indeed several standard techniques although, in reality,
they all need to be tailored to benefit specific tasks. These techniques are introduced as follows:
Laser triangulation – a laser scanner sweeps its light beam across the object being scanned. As the object passes through the laser line, a camera positioned at a known angle captures an image of the laser line, distorted by the object's profile; this distortion encodes the surface shape. More details are explained in the report.
Structured light – a projector casts thin bands of light to project a pattern onto an object. Cameras at different angles observe how the lines curve over the surface and use this to develop a 3D image of the object.
Time of flight (ToF) – a camera uses a high-power emitter to send out light, which is reflected from the object back to the image sensor. The distance of the object is calculated from the time delay between the transmitted and the received light; more details can be found in the report.
Stereo vision – the robotic system uses two cameras to record 2D views of the same object from two different angles. The software then uses the known positions of the two cameras and compares corresponding points in the two images to measure their disparity and produce an image with depth information. This is particularly useful in autonomous mobile robots (AMRs) and automated guided vehicles (AGVs).
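As a concrete illustration of the last two techniques (a minimal sketch, not code from the report), ToF range follows d = c·Δt/2, and stereo depth follows Z = f·B/disparity for a rectified pinhole camera pair; all numeric values below are illustrative assumptions.

```python
C = 299_792_458.0  # speed of light [m/s]

def tof_distance(round_trip_delay_s: float) -> float:
    """Time-of-flight range: the light travels to the object and back."""
    return C * round_trip_delay_s / 2.0

def stereo_depth(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth from stereo disparity for a rectified pinhole camera pair."""
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers (assumptions): a 20 ns round-trip delay, and a stereo pair
# with a 700-pixel focal length and 12 cm baseline observing a 35-pixel disparity.
print(f"ToF range:    {tof_distance(20e-9):.2f} m")          # ~3.00 m
print(f"Stereo depth: {stereo_depth(35, 700, 0.12):.2f} m")  # 2.40 m
```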

Torque sensors – introduction

Force sensors play an important part in collision detection and force measurement. They can be installed in different positions in a robot (e.g., at the joints for collaborative robots, or at grippers/end-effectors for industrial/collaborative robots) to enable it to manipulate parts with high precision and accuracy.

Force sensors can detect the force and moment (x, y, z, yaw, pitch, and roll) applied to a robot from external sources. Hence, according to FANUC, one of the biggest manufacturers of industrial robots globally, force sensors can be applied in a robotic system to control velocity and force when objects are fit, aligned, buffed, trimmed, or assembled, thereby improving product quality and process integrity. In essence, force sensors are integrated to create intelligent robots that can “feel”, enabling the handling of parts with varying textures, the most demanding mechanical assembly, and material removal operations.
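One common use of such a six-axis force/torque reading is a simple protective-stop check in the control loop. The sketch below is hypothetical (the Wrench type, thresholds, and callables are assumptions for illustration, not a vendor API); real limits come from the application's safety assessment.

```python
from dataclasses import dataclass

@dataclass
class Wrench:
    """One six-axis force/torque sample: forces in newtons, moments in newton-metres."""
    fx: float
    fy: float
    fz: float
    mx: float
    my: float
    mz: float

# Illustrative thresholds (assumptions); cobot applications typically derive
# limits from a risk assessment, e.g. ISO/TS 15066 guidance.
FORCE_LIMIT_N = 140.0
TORQUE_LIMIT_NM = 25.0

def exceeds_limits(w: Wrench) -> bool:
    """True if any force or moment axis exceeds its configured limit."""
    force_ok = max(abs(w.fx), abs(w.fy), abs(w.fz)) <= FORCE_LIMIT_N
    torque_ok = max(abs(w.mx), abs(w.my), abs(w.mz)) <= TORQUE_LIMIT_NM
    return not (force_ok and torque_ok)

def control_step(read_wrench, stop_robot) -> str:
    """One loop iteration: read the sensor, trigger a stop on a suspected collision."""
    if exceeds_limits(read_wrench()):      # read_wrench is a hypothetical callable
        stop_robot()                       # stop_robot is a hypothetical callable
        return "protective_stop"
    return "running"

# Example with stubbed-in callables: a 180 N spike on the z axis triggers a stop.
print(control_step(lambda: Wrench(5, 3, 180, 1, 0.5, 0.2), lambda: None))
```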

Piezoresistive vs. Piezoelectric vs. Capacitive technologies

[Radar chart: piezoelectric vs. capacitive vs. piezoresistive technologies, ranked 1-3 (1 = lowest performance, 3 = best) on design/construction simplicity, response speed, tolerance to high pressure, tolerance to high temperature, power consumption, and price.]

Capacitive sensors can operate over a wide range of temperatures and tolerate short-term overpressure conditions. They are not significantly affected by changes in temperature, and the temperature coefficient of sensitivity of a capacitive sensor is 10 times better than that of a piezoresistive pressure sensor. Meanwhile, the power consumption of capacitive sensors is low.

Piezoresistive sensors are less expensive than capacitive sensors. They are also highly resistant to pressure changes, shocks, and vibrations. They can be applied over a wide range of pressures (up to 20,000 psi), and many of these sensors with thin-film resistors are much more resistant to high temperatures and overpressures.

Piezoelectric sensors can tolerate very high temperatures (e.g., some materials can be used at up to 1,000 ºC). These sensors are usually self-powered, giving them very low power consumption. Although they can be used over a wide range of pressures (e.g., 0.1 psi to 10,000 psi), their maximum pressure tolerance is lower than that of piezoresistive sensors.

Comparison of proximity sensors
[Bubble chart, Source: IDTechEx – maximum sensing range (mm) vs. response time (ms) for light reflection, time-of-flight, triangulation, capacitive, and ultrasonic proximity sensors; bubble size indicates footprint (mm³).]

The diagram on the right compares different proximity sensors. Most proximity sensors are relatively small (footprint smaller than 40,000 mm³), with maximum sensing distances shorter than 2,000 mm (2 meters) and typical response times below 10 ms. Time-of-flight sensors are significantly smaller than the other types, whereas ultrasonic and light reflection sensors usually have a large footprint.

Ultrasonic sensors typically have the largest sensing distance compared with the other proximity sensors, whereas capacitive proximity sensors have the shortest. The correlation between footprint and maximum sensing distance determines the ideal application scenarios for these sensors. For instance, sensors with large sensing distances and a large footprint (e.g., ultrasonic sensors) are widely used in tasks that need long-range detection, such as underwater robots. By contrast, sensors with relatively short detection distances and small footprints are more suitable for tasks in limited space, such as collaborative robots on production lines.
What are industrial robots and what does the current
market look like?

Industrial robot categories

Industrial robots are generally composed of three basic parts: the main body, the driving system, and the control system. The main body is the base and the actuator, including the arm, the wrist, and the hand; some robots also have a walking mechanism. Most industrial robots have 3-6 degrees of freedom (DoF) of movement, of which the wrist usually has 1-3. The drive system includes a power device and a transmission mechanism; finally, a reducer and a servo motor make the actuator produce the corresponding movements (a minimal kinematics sketch follows this slide's text).

Regarding the classification of industrial robots, IDTechEx is yet to see globally specified standards. However, they can be divided according to load weight, control method, degrees of freedom, structure, and application field. By configuration, they divide into multi-joint robots, rectangular coordinate robots, SCARA robots, parallel robots, and collaborative robots. Classified by application, the functions of industrial robots include welding, painting, assembly, picking and machine tending, and others.

Market trend

Japan, Germany, and Switzerland remain strong in industrial robots because reputable industrial robot companies are mainly located in these well-developed countries. However, the emerging markets in China and South Korea are catching up very quickly.

Industrial robots are also transitioning to intelligent and modular designs. With the increasing complexity of tasks, industrial robots are required to ‘perceive’ their environments and accurately identify and inspect complex situations. They are transitioning from ‘pre-programming’, ‘on-site control’, and ‘remote control’ to self-learning and independent working.

To fulfil the requirements of these tasks, typical sensors used in industrial robotic arms include vision sensors, force and torque sensors, and photoelectric sensors, along with a few sensors used indirectly on the robotic arms, such as IMUs, voltage sensors, optical encoders, and many others. In this section, we will primarily focus on vision sensors (cameras), force and torque sensors, and photoelectric sensors used in industrial robotic arms.
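The kinematics sketch referenced above: a minimal forward-kinematics example for a hypothetical two-DoF planar arm (link lengths and angles are illustrative assumptions, not a specific robot model). Real industrial arms have 3-6 DoF, but the principle is the same: each servo's joint angle feeds a kinematic chain that places the end effector.

```python
import math

# Hypothetical 2-DoF planar arm: two revolute joints, illustrative link lengths.
L1, L2 = 0.40, 0.30   # link lengths [m] (assumed)

def forward_kinematics(theta1_deg: float, theta2_deg: float):
    """Return the (x, y) position of the end effector for two joint angles."""
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

print(forward_kinematics(30, 45))   # approximately (0.42, 0.49) m
```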

Sensors for AGV and AMR – overview

Mobile robots primarily refer to automated guided vehicles (AGVs), autonomous mobile robots (AMRs), and many others. As the name indicates, autonomous driving, also commonly known as autonomous mobility, is one of the key functions of mobile robots. To achieve autonomous driving, object detection, collision detection, navigation, and localization are important.

Multiple sensors are often combined to achieve autonomous navigation, object detection, and collision avoidance. Typical sensors include cameras (e.g., RGB cameras, IR cameras, etc.), laser scanners, LiDARs, radars, force sensors, GPS, ultrasonic sensors, and many others. The chart shows an overview of typical sensors and typical functions needed for a fully functional mobile robot.

Functions: exploring/detecting, transporting, delivery/logistics, picking/placing.

Requirements needed to fulfil functions: navigation and localization, collision/proximity detection, object detection.

Common sensors: LiDAR, GPS, ultrasonic sensors, encoders (e.g., optical, magnetic, etc.), force and torque sensors, IMU (gyroscope, accelerometer, etc.).

Cobot – functions and typical sensors

Collaborative robots, also known as cobots, refer to robots that work side-by-side with human operators. Cobots can be used in many industries and tasks, such as material handling, picking and placing, quality inspection, assembly, and a few others.

Unlike traditional industrial robots, there is no physical separation between cobots and human operators; therefore, safety always comes first for cobots. In order to ensure safe collaboration, proximity and collision detection sensors are commonly used. Examples include force and torque sensors, tactile sensors, and vision sensors.

Aside from the safety requirement, accurate control of the cobot's position, along with the ability to control the force exerted on the object, also plays an important role. At this stage, torque and force sensors are usually equipped to measure the force exerted. To determine the position of the cobot, cameras and IMUs are often used. Cameras, along with computer vision technology, can be used to detect the distance between the target object and the robotic arm, thereby informing its movements. IMUs are used to determine the posture of the cobot.

Functions: material handling, inspection, assembly, picking and placing.

Requirements needed to fulfil functions: safety (priority over all other functions), force measurement, object proximity detection, collision detection.

Common sensors: IMU, cameras, tactile sensors, force and torque sensors.

Overview of the sensors in drones (1)

Drones have become increasingly popular over the past several years. Depending on the purpose, a wide suite of sensors can be equipped; LiDAR, thermal cameras, RGB cameras, and IMUs are just a few examples. Below is a chart showing different sensors for various applications.

Tasks: posture control, terrain mapping, search and rescue, security and monitoring, inspection, cargo delivery, and others.

Key sensory components – proprioceptive: IMU (accelerometer, gyroscope, etc.), tilt sensors, magnetic position sensors; exteroceptive: GPS, cameras (thermal, RGB, multispectral, etc.), humidity sensors, LiDAR, radar.

Application themes: military, law enforcement, firefighting, border control, emergency response, agriculture, delivery and logistics, and recreation, spanning commercial and consumer drones.

* This chart only shows a few sensory systems and typical tasks; it is not designed to be mutually exclusive or comprehensively exhaustive.

Overview for sensors for service robots

A service robot, by definition, refers to a robot that frees humans by performing some useful tasks for them. Service robots can present full
or partial autonomy. In order to achieve full or partial autonomy, service robots have a series of sensors onboard.

Tasks: cleaning, logistics and delivery, search and rescue, socializing (e.g., signal recording), navigation, object detection/identification, weed detection and spraying, etc.

Key sensory components: IMU (accelerometer, gyroscope, etc.), tilt sensors, ultrasonic sensors, GPS, cameras (thermal, RGB, multispectral, etc.), collision sensors, LiDAR, radar.

Categories: cleaning robots, delivery and logistics robots, agricultural robots, underwater robots, social robots, kitchen and restaurant robots.

* This chart only shows a few sensory systems and typical tasks; it is not designed to be mutually exclusive or comprehensively exhaustive.

Cleaning robots – overview of tasks and sensors

As one of the top applications of service robotics, cleaning robots have been investigated for many years. Recently, due to COVID, cleaning robots gained a lot more momentum: the International Federation of Robotics (IFR) reports that there are at least 50 more cleaning robot manufacturers today than two years ago.

Cleaning robots refer to all robots capable of cleaning and disinfecting their surrounding environments, regardless of the measures used. These measures can be categorized into two types: mechanical cleaning and non-mechanical cleaning (often referred to as killing microorganisms). Typical mechanical cleaning methods include physical wiping or scrubbing, whereas non-mechanical methods usually involve ultraviolet radiation or spraying disinfectants.

The chart on the right (Source: IDTechEx) shows a few key criteria of cleaning robots, along with their key enabling technologies and components:

• Cleaning efficiency – dust detection; mopping/scrubbing speed (autonomous mobility); capacity of dust bags/spraying bucket.
• Navigation – sensors (LiDAR, radar, cameras, photoelectric sensors, etc.); software (SLAM, GPS, obstacle avoidance, etc.).
• Endurance – battery capacity (operation time); charging time.
• Interaction – remote control (app/voice control); compatibility with other smart appliances; interaction with targets (dust), direct and indirect.

Sensors in social robots – overview

Social robots are designed to interact with humans. Interaction, as one of the core features of social robots, needs the support of many sensors, including voice-detecting sensors (microphones), motion detection sensors, touch sensors, cameras, and many others.

Aside from sensors used for interaction purposes, safety sensors also play an important role. Many social robots are designed for children, whose actions are unpredictable; it is therefore crucial to ensure that potential risks and injuries can be mitigated during human-robot interaction. Safety requirements can be divided into physical safety and emotional safety. With regard to physical safety (i.e., obstacle avoidance, an emergency system if kids try to disassemble the robot, etc.), sensing and object detection systems are usually equipped in a social robot to make sure it can perceive the environment and identify objects. The sensing systems and types of sensors equipped in mobile social robots are very similar to those used in mobile delivery robots. Typical sensors, navigation, and localization systems include cameras, LiDAR, SLAM (simultaneous localization and mapping), obstacle avoidance, machine vision, and many others.

By contrast, emotional safety primarily requires intelligent software so that robots can correctly perceive and understand users, thereby responding in ways that do not make users feel emotionally uncomfortable. This report therefore does not discuss this part in detail; more information can be found in IDTechEx's latest research on Service Robots 2022-2032.

Common functions of social robots (Source: IDTechEx): interacting with people (voice detection); guidance (some social robots have autonomous mobility); motion detection (responding to the user's motion).

Safety assurance measures – physical safety (avoiding collisions/physical crashes): LiDAR, cameras, touch/force sensors, ultrasonic sensors, IMU, and others.

Overview of common sensors in different applications –
market size (USD billions)

By 2043, the total market size of sensors used in the robotics industry will exceed US$80 billion, with force and torque sensors for cobots and AMRs taking the largest share of the market.

Source: IDTechEx

Contact us
IDTechEx guides your strategic business decisions through its
Research, Subscription and Consultancy products, helping you profit
from emerging technologies.

For more information, contact [email protected] or visit www.IDTechEx.com.

Europe (UK) - Headquarters
IDTechEx, 9 Hills Road, Cambridge CB2 1GE, United Kingdom
+44 1223 812300

Americas (USA)
One Boston Place, Suite 2600, Boston, MA 02108, United States
+1 617 577 7890

Germany
IDTechEx GmbH, c/o ljh Lindlbauer Rechtsanwälte PartmbB, Heimeranstraße 35, 80339 München, Deutschland

Asia (Japan) - Headquarters
21F Shin Marunouchi Center Bldg, 1-6-2 Marunouchi, Chiyoda-ku, Tokyo 100-0005, Japan
+81 3 3216 7209

China, Hong Kong, Taiwan
+886 9 3999 9792

South Korea
+82 10 3896 6219

