Review
Human–Robot Collaboration in Manufacturing Applications: A Review
Eloise Matheson 1, Riccardo Minto 2,*, Emanuele G. G. Zampieri 2, Maurizio Faccio 3 and
Giulio Rosati 2
1 Department of Mechanical Engineering, Imperial College London, London SW7 2AZ, UK;
[email protected]
2 Department of Industrial Engineering, University of Padova, 35131 Padova, Italy;
[email protected] (E.G.G.Z.); [email protected] (G.R.)
3 Department of Management and Engineering, University of Padova, 36100 Vicenza, Italy;
[email protected]
* Correspondence: [email protected]; Tel.: +39-049-827-6810
Received: 30 September 2019; Accepted: 3 December 2019; Published: 6 December 2019
1. Introduction
Traditional industrial robotic systems require heavy fence guarding and peripheral safety
equipment that reduce flexibility while increasing costs and required space. The current market,
however, asks for reduced lead times and mass customization, thus imposing flexible and multi-purpose
assembly systems [1]. These needs are particularly common for small- and medium-sized enterprises
(SMEs). Collaborative robots (or cobots [2]) represent a natural evolution that can solve existing
challenges in manufacturing and assembly tasks, as they allow for a physical interaction with humans
in a shared workspace; moreover, they are designed to be easily reprogrammed even by non-experts
in order to be repurposed for different roles in a continuously evolving workflow [3]. Collaboration
between humans and cobots is seen as a promising way to achieve increases in productivity while
decreasing production costs, as it combines the ability of a human to judge, react, and plan with the
repeatability and strength of a robot.
Several years have passed since the introduction of collaborative robots in industry, and cobots have now been deployed in many different applications; furthermore, collaboration with traditional robots is also being investigated in research, as it takes advantage of these devices' power and performance. Therefore,
we believe that it is the proper time to review the state of the art in this area, with a particular focus on
industrial case studies and the economic convenience of these systems. A literature review is considered
a suitable way to identify the modern approaches towards Human–Robot Collaboration (HRC), in order to better understand the capabilities of collaborative systems and to highlight possible existing gaps on the basis of the future work these studies present.
The paper is organized as follows: After a brief overview of HRC methods, Section 2 discusses the economic advantages of collaborative systems, with a brief comparison with
traditional systems. Our literature review analysis is presented in Section 3, and Section 4 contains
a discussion of the collected data. Lastly, Section 5 concludes the work.
Background
Despite their relatively recent spread, cobots were invented in 1996 by J. Edward Colgate and Michael Peshkin [2,4]. These devices were passive and operated by humans, and are quite different from modern cobots, which are better represented by lightweight robots such as the KUKA LBR iiwa, developed since the 1990s by KUKA Roboter GmbH and the Institute of Robotics and Mechatronics at the German Aerospace Center (DLR) [5], or the first commercial collaborative robot sold in 2008, a UR5 model produced by the Danish company Universal Robots [6].
First of all, we believe it is important to distinguish between the different forms of collaboration, since the term collaboration is often used ambiguously.
Müller et al. [7] proposed a classification for the different methodologies in which humans and cobots
can work together, as summarized in Figure 1, where the final state shows a collaborative environment.
• Coexistence, when the human operator and cobot are in the same environment but generally do
not interact with each other.
• Synchronised, when the human operator and cobot work in the same workspace, but at
different times.
• Cooperation, when the human operator and cobot work in the same workspace at the same time,
though each focuses on separate tasks.
• Collaboration, when the human operator and the cobot must execute a task together; the action of
the one has immediate consequences on the other, thanks to special sensors and vision systems.
It should be noted that neither this classification nor the terminology used are unique, and others
may be found in the literature [8–11].
To provide definitions and guidelines for the safe and practical use of cobots in industry, several
standards have been proposed. Collaborative applications fall within the general scope of machinery safety regulated by the Machinery Directive, which defines the Essential Health and Safety Requirements (EHSRs). For further documentation, we refer to [12].
The reference standards as reported in the Machinery Directive are:
• UNI EN ISO 12100:2010 “Machine safety, general design principles, risk assessment, and risk reduction”.
• UNI EN ISO 10218-2:2011 “Robots and equipment for robots, Safety requirements for industrial
robots, Part 2: Systems and integration of robots”.
• UNI EN ISO 10218-1:2012 “Robots and equipment for robots, Safety requirements for industrial
robots, Part 1: Robots”.
In an international setting, the technical specification ISO/TS 15066:2016 “Robots and robotic
devices, Collaborative Robots” is dedicated to the safety requirements of the collaborative methods
envisaged by the Technical Standard UNI EN ISO 10218-2:2011.
According to the international standards UNI EN ISO 10218-1 and -2, and as more widely explained
in ISO/TS 15066:2016, four classes of safety requirements are defined for collaborative robots:
• Safety-rated monitored stop (SMS) is used to cease robot motion in the collaborative workspace
before an operator enters the collaborative workspace to interact with the robot system and
complete a task. This mode is typically used when the cobot mostly works alone, but occasionally
a human operator can enter its workspace.
• Hand-guiding (HG), where an operator uses a hand-operated device, located at or near the robot
end-effector, to transmit motion commands to the robot system.
• Speed and separation monitoring (SSM), where the robot system and operator may move
concurrently in the collaborative workspace. Risk reduction is achieved by maintaining at least
the protective separation distance between operator and robot at all times. During robot motion,
the robot system never gets closer to the operator than the protective separation distance. When
the separation distance decreases to a value below the protective separation distance, the robot
system stops. When the operator moves away from the robot system, the robot system can resume
motion automatically according to the requirements of the standard. When the robot system reduces its speed, the protective separation distance decreases correspondingly (a minimal control sketch of this logic follows this list).
• Power and force limiting (PFL), where the robot system shall be designed to adequately reduce
risks to an operator by not exceeding the applicable threshold limit values for quasi-static and
transient contacts, as defined by the risk assessment.
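To make the SSM behavior concrete, the following minimal Python sketch scales the commanded robot speed against a simplified protective separation distance. It only mirrors the structure of the ISO/TS 15066 formulation: the reaction and stopping times, the margins, and the linear braking model are illustrative assumptions, not certified values.

```python
def protective_distance(v_human, v_robot, t_react, t_stop, c=0.2, z=0.1):
    """Simplified protective separation distance [m] (ISO/TS 15066-style).

    v_human: operator speed towards the robot [m/s]
    v_robot: current robot speed towards the operator [m/s]
    t_react, t_stop: system reaction and stopping times [s] (assumed)
    c, z: intrusion margin and sensor uncertainty [m] (assumed)
    """
    d_human = v_human * (t_react + t_stop)  # operator travel until the robot halts
    d_robot = v_robot * t_react             # robot travel before braking starts
    d_brake = 0.5 * v_robot * t_stop        # braking travel, assuming linear deceleration
    return d_human + d_robot + d_brake + c + z


def ssm_speed_command(separation, v_human, v_robot, v_nominal,
                      t_react=0.1, t_stop=0.3):
    """Commanded robot speed for the current human-robot separation [m]."""
    s_p = protective_distance(v_human, v_robot, t_react, t_stop)
    if separation <= s_p:
        return 0.0  # protective stop; motion may resume once the operator retreats
    # scale the commanded speed with the available margin, saturating at nominal
    return min(v_nominal, v_nominal * (separation - s_p) / s_p)
```

Both behaviors described above emerge from this logic: the robot stops whenever the separation falls below the protective distance, and reducing the commanded speed shrinks the protective distance itself, allowing closer coexistence.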
Collaborative modes can be adopted even when using traditional industrial robots; however,
several safety devices, e.g., laser sensors and vision systems, or controller alterations are required. Thus,
a commercial cobot that does not require further hardware costs and setup can be a more attractive
solution for industry.
Lastly, cobots are designed with particular features that distinguish them considerably from
traditional robots, defined by Michalos et al. [13] as technological and ergonomic requirements.
Furthermore, they should be equipped with additional features with respect to traditional robots,
such as force and torque sensors, force limits, vision systems (cameras), laser systems, anti-collision
systems, recognition of voice commands, and/or systems to coordinate the actions of human operators
with their motion. For a more complete overview, we refer to [8,13]. Table A1 shows the characteristics
of some of the most popular cobots, with a brief overview of some kinematic schemes in Table A2.
Collaborative systems can also achieve lower direct unit production costs: [17] observed that a higher degree of collaboration, denoted c%, has a strong impact on throughput; moreover, depending on the assembly process considered, the throughput can be higher than in traditional systems.
Table 1 provides a comparison between collaborative and traditional systems for four different
jobs: assembly (the act of attaching two or more components), placement (the act of positioning each
part in the proper position), handling (the manipulation of the picked part), and picking (the act of
taking from the feeding point). In order to adapt to market needs, a manual assembly system could be
used, though this can lead to a decrease in productivity due to variations in quality and fluctuations in
labor rates [18]. Comparing the human operator capabilities to automated systems, it is clear that the
performance of manual assembly is greatly influenced by ergonomic factors, which restrict the product
weight and the accuracy of the human operator [19]. Therefore, these restrictions limit the capabilities
of human operators in the handling and picking tasks of heavy/bulky parts. These components
can be manipulated with handling systems such as jib cranes: These devices could be considered as
large workspace-serving robots [20], used for automated transportation of heavy parts. However,
to the authors’ knowledge, there are no commercial end-effectors that allow these systems to carry out
complex tasks, such as assembly or precise placing, since they are quite limited in terms of efficiency
and precision [21].
Traditional robotic systems [22] partially bridge this gap, offering manipulators with both high payload (e.g., the FANUC M-2000 series with a payload of 2.3 t [23]) and high repeatability. However,
the flexibility and dexterity required for complex assembly tasks could be too expensive, or even
impossible, to achieve with traditional robotic systems [24]. This gap can be closed by collaborative
systems, since they combine the capabilities of a traditional robot with the dexterity and flexibility of
the human operator. Collaborative robots are especially advantageous for assembly tasks, particularly
if the task is executed with a human operator. They are also suitable for pick and place applications,
though the adoption of a traditional robot or a handling system can offer better results in terms of
speed, precision, and payload.
Table 1. Qualitative evaluation of the most suitable solutions for the main industry tasks.
• Assembly. Manual system: high dexterity. Handling system: no complex tasks with commercial end-effectors [21]. Collaborative system: combines human dexterity and flexibility with robot capabilities [24]. Traditional robotic system: dexterity/flexibility could be unreachable [24].
3. Literature Review
This literature review analyses works from 2009–2018 that involved collaborative robots for
manufacturing or assembly tasks. Reviewed papers needed to include a practical experiment involving
a collaborative robot undertaking a manufacturing or assembly task; we excluded those that only considered the task in simulation. This criterion was applied because often only practical experiments with real hardware can highlight both the challenges and advantages of cobots.
For this literature review, three search engines were used to collect papers over our time period
that were selected using the following boolean string: ((collaborative AND robot) OR cobot OR cobotics)
AND (manufacturing OR assembly). Our time period of 2009–2018 was chosen as the timeline for this
literature review, as it is only in the last 10 years that we have seen the availability of collaborative
robots in the market.
• ScienceDirect returned 124 results, from which 26 were found to fit our literature review criteria
after reading the title and abstract.
• IEEExplore returned 234 results, from which 44 were found to fit our literature review criteria
after reading the title and abstract.
• Web of Science returned 302 results, from which 62 were found to fit our literature review criteria
after reading the title and the abstract.
Of all these relevant results, 16 were duplicated results, leaving us with 113 papers to analyze.
Upon a complete read-through of the papers, 41 papers were found to fully fit our criteria and have
been included in this review. It should be noted that in the analysis regarding industry use cases, only
35 papers are referenced, as 6 papers were focused on the same case study as others and did not add
extra information to our review.
The following parameters were studied: The robot used, control system, application, objectives,
key findings, and suggested future work for all these studies, as summarized in Table A3. These were
chosen for the following reasons. The robot choice is important, as it highlights which systems are
successfully implemented for collaborative applications. The control system is interesting to analyze,
as it dictates both safety and performance considerations of the task. Furthermore, when a human is in
the control loop, the control system choice is specific to the manner of human–machine interaction—
by seeing which methods are more popular and successfully implemented, we can identify trends
and future directions. We characterized control systems as vision systems (such as cameras and
laser sensors), position systems (such as encoders which are typical of traditional industrial robots),
impedance control systems (through haptic interfaces), admittance control (taking advantage of the
cobot torque sensors or voltage measurement), audio systems (related to voice command and used for
voice/speech recognition), and other systems (that were not easily classified, or that were introduced
only in one instance).
The application represents the task given to the cobot, which we believe allows a better
understanding to be made regarding the capabilities of collaborative robots. These tasks were divided
into assembly (when the cobot collaborates with the operator in an assembly process), human
assistance (when the cobot acts as an ergonomic support for the operator, e.g., movable fixtures,
quality control, based on vision systems), and lastly, machine tending (when the cobot performs
loading/unloading operations).
Furthermore, we divided the objectives into three main topics: Productivity, representing the
studies focused on task allocation, quality increase, and reduction of cycle time; safety, which includes
not only strictly safety-related topics such as collision avoidance, but also an increase in human
ergonomics and reduction of mental stress; and HRI (Human–Robot Interaction), which is focused
on the development of new HRI methodologies, e.g., voice recognition. It should be noted that the proposed subdivision is by no means univocal; interesting examples are [25–27]. These works were classified under safety because, even though the proposed solutions maintain a high level of productivity, they act on HRC safety.
The key findings were not grouped, since we believe they depend on the specific study and are
too varied; however, they have been summarized in Table A3. Key findings were useful to present the
capabilities of the collaborative systems and what HRC studies have achieved. They were included in
our analysis in order to identify common solutions. Future work has been grouped into: HRI (works
that focus on increasing HRI knowledge and design), safety (works that focus on increasing the
operator safety when working with the cobot), productivity (works focusing on increasing the task
productivity in some manner), task complexity (works that focus on increasing the complexity of the
task for a particular application), applicability (works that focus on increasing the scope of the work to
be used for other industrial applications), and method (works that focus on enhancing the method of
HRI via modeling, using alternative robots, or applying general rules and criteria to the design and
evaluation process). From these groupings, we can identify ongoing challenges that still need to be
solved in the field; by seeing what researchers identify as future work for industrial uptake, we can discern the directions on which research across the industry is focused. Our analysis of these
parameters is presented in Section 4.
Figure 2. Robot usage in selected human–robot collaboration studies in the period 2009–2018: the number of studies per year using traditional industrial robots and cobots.
Figure 3 presents the different control systems in the selected human–robot collaboration (HRC)
studies. Position control systems were only used for traditional industrial robots, often using extra
vision systems for safety reasons. Due to the inherent compliance of cobots, impedance control was
more commonly chosen for these systems, though in many cases where an inherently compliant cobot
was used, vision was also included for feedback [31–33]. Robot compliance can often be a trade-off
with robot precision, so including a separate channel for feedback to monitor collisions and increase
safety can be a useful method of maintaining manipulation performance. Vision is indeed the prevalent
sensor used in HRC studies, also due to the flexibility and affordability of the systems, especially
when using depth cameras such as Microsoft Kinect cameras. It is interesting to note that in recent
years, Augmented Reality (AR) systems, such as the Microsoft Hololens, have been used more in
HRC research, as they are able to provide information to the operator without obscuring their view
of the assembly process. In one study, a sensitive skin was incorporated with the cobot to provide
environmental information and maintain the operator’s safety. As these skins become more widely
studied and developed, we could see this feedback control input become more common, though
challenges such as response time must still be solved [34].
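As an illustration of the admittance-based control mentioned above, the sketch below maps a measured interaction force to a commanded velocity through a virtual mass-damper. The virtual parameters, the control period, and the velocity interface are illustrative assumptions; commercial cobots implement this scheme in certified firmware on top of their joint-torque sensors.

```python
# Minimal 1-DOF admittance control sketch: solve M*dv/dt + D*v = f_ext and
# command the resulting velocity. M and D are illustrative virtual parameters.
M = 2.0     # virtual mass [kg]
D = 25.0    # virtual damping [N*s/m]
DT = 0.001  # control period [s]

def admittance_step(f_ext, v_prev):
    """One integration step; returns the commanded Cartesian velocity [m/s]."""
    dv = (f_ext - D * v_prev) / M
    return v_prev + dv * DT

# Example: the operator pushes with 10 N for 0.5 s, then releases;
# the commanded velocity rises towards f/D = 0.4 m/s, then decays to zero.
v = 0.0
for k in range(1000):
    f = 10.0 if k < 500 else 0.0
    v = admittance_step(f, v)  # forwarded to the robot velocity interface
```

The same force-to-motion mapping underlies hand-guiding: reducing the virtual damping makes the robot easier to push by hand, at the cost of the compliance/precision trade-off noted above.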
Figure 3. Control systems used in selected human–robot collaboration studies in the period 2009–2018:
In red, the number of vision systems; in orange, position-controlled systems (used especially for
traditional industrial robots); in gray, the cases for impedance control (e.g., through haptic interfaces);
in yellow, admittance control (e.g., through torque sensors); in blue, audio systems (for voice/speech
recognition); and green for other systems.
The considered studies used the aforementioned robots, both traditional industrial robots and
cobots with different collaborative methodologies. Early studies were focused on SSM and PFL
methodologies; we believe this focus is due to the need for safety and flexibility in traditional
robotic systems and the early spread of cobots. Since 2016 and the introduction of ISO/TS 15066:2016,
the considered research sample began to study other methodologies, especially the HG method, which,
as shown in Figure 4, has become prevalent in recent years. The HG method is indeed a representative
function of collaborative robots [30], since it allows even unskilled users to interact with and program
the cobot, which can allow some degree of flexibility, even if the robot moves only along predefined directions, without the need for expensive algorithms [35]. It should be noted that the HG method
could also be employed with traditional industrial robots, such as a COMAU NJ130 [36]: This allows
one to take advantage of the robot’s characteristics, such as high speed and power, and increase the
system’s flexibility.
Figure 4. Collaboration methods used in selected human–robot collaboration in the period 2009–2018:
In blue, hand guiding (HG); in orange, safety-rated monitored stop (SMS); in gray, speed and separation
monitoring (SSM); in yellow, power and force limiting (PFL).
As stated previously, the collaborative mode depends on the considered application. Figure 5
depicts the considered tasks over the last decade. The most studied task is assembly, likely due to the
required flexibility in the task, which makes traditional robotic systems too expensive or difficult to
implement. However, the task of production also requires flexibility, and could greatly benefit from
collaborative applications. Likely, until the fundamental challenges of setting up collaborative workcells
are solved for the easier tasks of assembly, we will not see many case studies targeting production.
Figure 5. Tasks assigned to the robot in selected collaborative applications in research in the period 2009–2018: assembly tasks; human assistance, e.g., handover of parts; quality control tasks; and machine tending, i.e., loading and/or unloading.
In our review, 35 papers presented unique case studies of industrial applications. Two industries
seem to drive this research—the automotive industry accounted for 22.85% of studies, and the
electronics industry a further 17.14%. Interestingly, research for the automotive industry only began
after 2015, and will likely continue to drive research in this area.
HRC studies present several objectives that can be grouped into three main topics. Figure 6
depicts the focus of HRC studies in the last decade. It is interesting to note that the first phase of HRC
study [37–41] was more focused on increasing the productivity and safety aspects of HRC, at least in a manufacturing context. As the research progressed, an increasing number of studies focused on HRI methodologies, which became the predominant objective in 2017. The apparent reduction in 2018 should not suggest that HRI studies were abandoned in that year: As stated before, the presented classification is not univocal, and studies such as [42–44] could also be considered HRI studies.
Figure 6. Main topics or objectives in HRC studies. The objectives were divided into productivity studies (blue); safety studies (orange), e.g., ergonomics and collision avoidance; and HRI (Human–Robot Interaction) studies (gray), e.g., development or improvement of HRI methodologies.
The key findings of these studies highlight challenge areas that research has successfully
addressed, or even solved, when cobots are used for industrial tasks. Multiple studies reported
an increase in task performance—e.g., by reducing completion time and minimizing error [25,37,38,43]—
as well as a better understanding of the operator space [29,31,32,41] and higher precision of workpiece
manipulation [28,30,45]. Thematic areas of research intent can be identified, such as increasing and
quantifying the trust of the operator in the robotic system [29,46,47], as well as improving safety by
minimizing collisions [40].
The directions of future work identified in literature are summarized in Figure 7. Historically,
researchers aimed to increase the HRI relevance of their work, also with a focus on higher safety
requirements and more complex tasks. In recent years, the scope of future work has expanded,
with researchers focusing on more complex methods that improve the performance of their systems—
whether this is by applying their method to different application fields or more complex tasks. This is
likely due to the prevalence of new cobots and sensing methodologies coming onto the market,
maturing algorithms, and experience in designing collaborative workcells.
Figure 7. Future work topics from HRC studies. The work was divided into directions of HRI
(dark blue), safety (orange), task complexity (gray), applicability (yellow), method (light blue),
and productivity (green).
Many of the reviewed works highlight future work in terms of the method they used, whether it
be by increasing the complexity of their modeling of the operator and/or environment [48], or using
different metrics to evaluate performance [33,49,50] and task choice [51]. Others believe that expanding
their research setup to other application areas is the next step [31,45,52]. In our view, these extensions can be achieved without any step change in existing technology or algorithms; rather, they require more testing time. To increase safety, productivity, and task performance, researchers will need to improve planners [39,53], environment and task understanding [28,40,54,55], operator intention
understanding [38], and ergonomic cell setups [37,56]. To improve HRI systems, common future work
focuses on increasing the robots’ and operators’ awareness of the task and environment by object
recognition [44] and integrating multi-modal sensing in an intuitive manner for the operator [3,32,36].
In essence, this future direction focuses on having better understanding of the scene—whether
this is what the operator intends to do, what is happening in the environment, or the status of the task.
Researchers propose solving this by using more sensors and advanced algorithms, and fusing this
information in a way that is easy to use and intuitive for the operator to understand. These systems
will inherently lead to better safety, as unexpected motions will be minimized, leading consequently
to more trust and uptake. We can expect that many of these advances can come from other areas of
robotics research, such as learning by demonstration through hand-guiding or simulation techniques
that make it easy to teach a robot a task, and advances in computer vision and machine learning for
object recognition and semantic mapping. Other reviews, such as [8], identify similar trends, namely
those of improved modeling and understanding, better task planning, and adaptive learning. It will
be very interesting to see how this technology is incorporated into the industrial setting to take full
advantage of the mechanics and control of cobots and the HRI methodologies of task collaboration.
The market outlook is also promising for cobots, especially considering that small- and medium-sized enterprises (SMEs), which represent
almost 70% of the global number of manufacturers [60] and could not afford robotic applications due
to the high capital costs, are now adopting cobots, as they require less expertise and lower installation
expenses, confirming a trend presented in scientific works [3].
Finally, [57] highlights that, among cobots with different payload capacities, those with up to 5 kg payload were preferred: they held the largest market share in 2017, and a similar trend is expected to continue from 2018 to 2025. This preference of the market towards lightweight robots, which are
safer but do not present the high speed and power typically connected with industrial robots [36,61],
restrains the HRC possibilities in the current manufacturing scenario. However, we believe that without
proper regulation, the current market will continue to mark a dividing line between heavy-duty tasks
and HRC methods.
5. Conclusions
Human–robot collaboration is a new frontier for robotics, and human–robot synergy will constitute a relevant factor in industry for improving production lines in terms of performance and
flexibility. This will only be achieved with systems that are fundamentally safe for human operators,
intuitive to use, and easy to set up. This paper has provided an overview of the current standards
related to Human–Robot Collaboration, showing that it can be applied in a wide range of different
modes. The state of the art was presented and the kinematics of several popular cobots were described.
A literature analysis was carried out and 41 papers, presenting 35 unique industrial case studies,
were reviewed.
Within the context of manufacturing applications, we focused on the control systems,
the collaboration methodologies, and the tasks assigned to the cobots in HRC studies. From our
analysis, we can identify that the research is largely driven by the electronics and automotive industries,
but as cobots become cheaper and easier to integrate into workcells, we can expect SMEs from a wide
range of industrial applications to lead their adoption. Objectives, key findings, and future research
directions are also identified, the latter highlighting ongoing challenges that still need to be solved.
We can expect that many of the advances needed in the identified directions could come from other
areas of robotics research; how these will be incorporated into the industrial setting will lead to new
challenges in the future.
Author Contributions: Conceptualization, E.M. and G.R.; Methodology, E.M. and G.R.; Formal analysis,
E.M., R.M., and E.G.G.Z.; Investigation, E.M., R.M. and E.G.G.Z.; Data curation, E.M., R.M., and E.G.G.Z.;
Writing—original draft preparation, E.M., R.M., and E.G.G.Z.; Writing—review and editing, M.F., E.M., R.M., G.R.,
and E.G.G.Z.; Supervision, M.F. and G.R.; Project administration, M.F. and G.R.; Funding acquisition, G.R.
Funding: This research was funded by University of Padua—Program BIRD 2018—Project no. BIRD187930,
and by Regione Veneto FSE Grant 2105-55-11-2018.
Conflicts of Interest: The authors declare no conflict of interest.
Appendix A. Tables
Table A1. List of characteristics of some of the most used cobots for different kinematics.
Table A2. Denavit–Hartenberg parameters and singularity configurations for the considered kinematic schemes.

Six axes with spherical wrist:
    T     α     a     θ     d
    T10   0     0     q1    d1
    …
    T54   90    0     q5    0
    T65   −90   0     q6    d6
    Singularities. Elbow: wrist coplanar with J2 and J3; Wrist: J6 // J4.

Six axes with three parallel axes:
    T     α     a     θ          d
    T10   0     0     q1         d1
    T21   −90   0     −90 + q2   d2
    T32   0     a2    q3         −d3
    T43   0     a3    −90 + q4   d4
    T54   −90   0     180 + q5   d5
    T65   −90   0     q6         d6
    Singularities. Shoulder: intersection of J5 and J6 coplanar with J1 and J2; Elbow: J2, J3, and J4 coplanar.

Six axes with offset wrist:
    T     α     a     θ          d
    T10   0     0     q1         d1
    T21   90    0     90 + q2    −d2
    T32   180   a2    90 + q3    −d3
    T43   90    0     180 + q4   −d4
    T54   90    0     180 + q5   −d5
    T65   90    0     q6         −d6
    Singularities. Shoulder: wrist point near the column around the base axis (300 mm of radius); Elbow: J3 ≈ 0° or 180° ± 15°.

Seven axes with spherical joints:
    T     α     a     θ     d
    …
    T65   −90   0     q6    0
    T76   90    0     q7    d7
    Singularities. Wrist: J5 // J7.

Seven axes without spherical joints:
    T     α     a     θ          d
    T10   0     0     q1         d1
    T21   −90   −a1   q2         0
    T32   90    a2    q3         d3
    T43   90    a3    90 + q4    0
    T54   90    a4    q5         d5
    T65   90    −a5   q6         0
    T76   −90   a6    180 + q7   d7
    Singularities. Shoulder: wrist point near the J1 direction; Elbow: J3 // J5.
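As a worked example of how the rows of Table A2 compose into a forward kinematic model, the following sketch builds a single Denavit–Hartenberg transform (assuming the classic distal convention) and chains the rows of the six-axis scheme with three parallel axes; the numeric link lengths are placeholder assumptions, not the parameters of any specific robot.

```python
import numpy as np

def dh(alpha_deg, a, theta_deg, d):
    """Homogeneous transform for one DH row (classic convention assumed):
    Rot_z(theta) * Trans_z(d) * Trans_x(a) * Rot_x(alpha)."""
    al, th = np.radians(alpha_deg), np.radians(theta_deg)
    ca, sa, ct, st = np.cos(al), np.sin(al), np.cos(th), np.sin(th)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def fk_three_parallel(q, d1=0.152, d2=0.120, a2=0.244, d3=0.093,
                      a3=0.213, d4=0.083, d5=0.083, d6=0.082):
    """Base-to-flange transform for the six-axis scheme with three parallel
    axes (Table A2); q is the joint vector in degrees, link values assumed."""
    rows = [(0,   0.0, q[0],       d1),    # (alpha, a, theta, d) as in Table A2
            (-90, 0.0, -90 + q[1], d2),
            (0,   a2,  q[2],      -d3),
            (0,   a3,  -90 + q[3], d4),
            (-90, 0.0, 180 + q[4], d5),
            (-90, 0.0, q[5],       d6)]
    T = np.eye(4)
    for row in rows:
        T = T @ dh(*row)
    return T

# Example: flange pose at the zero configuration.
print(fk_three_parallel([0, 0, 0, 0, 0, 0]))
```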
Table A3. Summary of the reviewed studies: robot, control system, collaborative method, application, objectives, key findings, and suggested future work.

• T. Ende et al. [41], 2011. Robot: DLR LWR-III. Control: gesture recognition. Application: human assistant. Objective: productivity, gathering gestures for HRC from humans. Key findings: eleven gestures present a recognition rate over 80%; recognition problems arise when the torso position is part of the gesture. Future work: N/A.
• H. Ding et al. [25], 2013. Robot: ABB FRIDA (YuMi). Control: vision. Method: speed and separation monitoring. Application: assembly. Objective: safety, collaborative behavior with operator safety and without productivity losses due to emergency stops. Key findings: speed reduction applied based on the distance between the human arm and the Kinect position avoids emergency stops. Future work: N/A.
• H. Ding et al. [26], 2013. Robot: ABB FRIDA (YuMi). Control: vision. Method: speed and separation monitoring. Application: assembly. Objective: safety, collaborative behavior with multiple operators, addressing operator safety without productivity losses. Key findings: development of a finite state automaton; speed reduction improves the uptime while respecting safety constraints. Future work: N/A.
• A. M. Zanchettin et al. [27], 2013. Robot: industrial robot. Control: position. Method: speed and separation monitoring. Application: quality control. Objective: safety, compromise between safety and productivity; adaptable robot speed. Key findings: development of a safety-oriented path-constrained motion planning, tracking the operator and reducing robot speed. Future work: N/A.
• K. P. Hawkins et al. [32], 2014. Robot: Universal Robots UR10. Control: vision. Method: power and force limiting. Application: assembly. Objective: productivity, robots need to anticipate human actions even with task or sensor ambiguity. Key findings: compromises between human wait times and confidence in the human action detection. Future work: HRI, impact of the system on the operator's sense of fluency, i.e., synchronization between cobot and operator.
• K. R. Guerin et al. [3], 2015. Robot: Universal Robots UR5. Control: impedance. Application: machine tending. Objective: productivity, robot assistant with a set of capabilities for typical SMEs. Key findings: test of machine tending, with 82% of parts taken from the machine (failures due to bad weld or bad grasp). Future work: HRI, test ease of use and focus on HRC; gesture recognition for learning.
• I. D. Walker et al. [46], 2015. Robot: industrial robot. Control: position. Application: human assistant. Objective: HRI, trust not considered in handoffs; derive a model for robot trust in the operator. Key findings: the robot pose changes according to trust in the human, reducing impact forces in case of low trust. Future work: applicability, effectiveness of the approach in SME scenarios.
• T. Hamabe et al. [54], 2015. Robot: Kawada HIRO. Control: vision. Method: power and force limiting. Application: assembly. Objective: HRI, learn the task from human demonstration. Key findings: human and robot roles are fixed, due to limits in the robot's manipulation capabilities; changing the task order increases time due to the recognition system. Future work: task complexity, a complete set of scenarios is assumed; the cobot should be able to recognize new tasks autonomously.
• S. M. M. Rahman et al. [29], 2016. Robot: Kinova MICO 2-finger. Control: vision. Application: assembly. Objective: HRI, derive a model for robot trust in the human; trust-based motion planning for handover tasks. Key findings: pHRI improves with trust for contextual information transparent to the human; increases of 20% in safety, 30% in handover success, and 6.73% in efficiency. Future work: method, apply the proposed method with a kinematically redundant robot.
• S. M. M. Rahman et al. [49], 2016. Robot: Rethink Baxter. Control: vision. Application: assembly. Objective: productivity, autonomous error detection with human intervention. Key findings: the regret-based method improves fluency, due to an increase in synchronization, a reduction in mean cognitive workload, and an increase in human trust compared to a Bayesian approach. Future work: method, different objective criteria for the regret-based approach to evaluate HRC performance.
• L. Rozo et al. [48], 2016. Robot: WAM robot / KUKA LWR iiwa. Control: impedance/admittance. Method: hand-guiding. Application: assembly. Objective: HRI, a robotic assistant needs to be easily reprogrammed, hence programming by demonstration. Key findings: the model adapts to changes in starting and ending point, task, and control mode; with high compliance, the robot cannot follow the trajectory. Future work: method, estimation of the damping matrix for the spring-damper model; study how interaction forces can change the robot behaviors.
• A. M. Zanchettin et al. [62], 2016. Robot: ABB FRIDA (YuMi). Control: vision. Method: speed and separation monitoring. Application: assembly. Objective: safety, collision avoidance strategy that decreases the speed of the cobot. Key findings: speed reduction method based on minimum distance; distance threshold adaptable to the programmed speed; continuous speed scaling. Future work: N/A.
• A. Cherubini et al. [63], 2016. Robot: KUKA LWR4+. Control: admittance. Method: safety-rated monitored stop. Application: assembly. Objective: safety, collaborative human–robot application to assemble a car homokinetic joint. Key findings: a framework that integrates many state-of-the-art robotics components, applied in real industrial scenarios. Future work: productivity, deploying the proposed methodologies on mobile manipulator robots to increase flexibility.
• H. Fakhruldeen et al. [51], 2016. Robot: Rethink Baxter. Control: vision and audio recognition. Method: power and force limiting. Application: assembly. Objective: HRI, implementation of a self-built planner in a cooperative task where the cobot actively collaborates. Key findings: API development combining an object-oriented programming scheme with a Prolog meta-interpreter to create these plans and execute them. Future work: method, add cost to action evaluation, add non-productive actions, and consider the action completion percentage.
• S. Makris et al. [64], 2016. Robot: industrial robot. Control: AR system. Method: hand-guiding. Application: human assistant. Objective: safety, development of an AR system in aid of operators in a human–robot collaborative environment to reduce mental stress. Key findings: the proposed system minimizes the time required for the operator to access information and send feedback; it decreases the stoppage and enhances the training process. Future work: task complexity, application in other industrial environments.
• B. Whitsell et al. [28], 2017. Robot: KUKA LBR iiwa. Control: impedance/admittance. Method: hand-guiding / power and force limiting. Application: assembly. Objective: HRI, cooperate in an everyday environment; robots need to adapt to the human; haptics should adapt to the operator's ways. Key findings: 100% correct placement of a block in 1440 trials; the robot can control a DOF if the operator does not control it (95.6%); lessening the human responsibility by letting the robot control an axis reduces the completion time. Future work: applicability, adapt variables to the environment, e.g., task and robot coordinate systems not aligned.
• J. Bös et al. [65], 2017. Robot: ABB YuMi. Control: admittance. Method: power and force limiting. Application: assembly. Objective: productivity, increase assembly speed without reducing flexibility or increasing contact forces, using iterative learning control (ILC). Key findings: increased acceleration by applying Dynamic Movement Primitives to an ILC; reduced contact forces by adding a learning controller; stochastic disturbances do not have a long-term effect; task duration decreases by 40% and required contact force by 50%. Future work: method, study and theoretical proof of long-term stability.
• M. Wojtynek et al. [33], 2017. Robot: KUKA LBR iiwa. Control: vision. Method: hand-guiding. Application: assembly. Objective: productivity, create a modular and flexible system; abstraction of any equipment. Key findings: easy reconfiguration, without complex programming. Future work: method, introduce metrics for quantitative measurement of HRC.
• B. Sadrfaridpour et al. [47], 2017. Robot: Rethink Baxter. Control: human tracking system. Method: power and force limiting. Application: assembly. Objective: HRI, combination of pHRI and sHRI in order to predict human behavior and choose robot path and speed. Key findings: augmenting physical/social capabilities increases one subjective measure (trust, workload, usability); assembly time does not change. Future work: N/A.
• I. El Makrini et al. [56], 2017. Robot: Rethink Baxter. Control: vision. Method: power and force limiting. Application: assembly. Objective: HRI, HRC based on natural communication; a framework for the cobot to communicate. Key findings: the framework is validated; more intuitive HRI. Future work: task complexity, adapt the robot to the user; adjust parts position based on the user's height.
• P. J. Koch et al. [52], 2017. Robot: KUKA LWR4+. Control: admittance. Application: screwing for maintenance. Objective: HRI, cobot development with a focus on an intuitive human–robot interface. Key findings: HR interface simple for user reconfiguration; steps to transform a mobile manipulator into a cobot. Future work: applicability, expand to several industrial maintenance tasks.
• M. Haage et al. [66], 2017. Robot: ABB YuMi. Control: vision. Application: assembly. Objective: HRI, reduce the time and required expertise to set up a robotized assembly station. Key findings: a web-based HRI for assisting human instructors to teach assembly tasks in a straightforward and intuitive manner. Future work: N/A.
• P. Gustavsson et al. [50], 2017. Robot: Universal Robots UR3. Control: impedance and audio recognition. Method: hand-guiding / power and force limiting. Application: assembly. Objective: HRI, joint speech recognition and haptic control in order to obtain an intuitive HRC. Key findings: developed a simplified HRI responsive to vocal commands that guides the user in the progress of the task with haptics. Future work: method, test whether haptic control can be used to move the robot with linear motions; an automatic way of logging the accuracy.
• M. Safeea et al. [30], 2017. Robot: KUKA LBR iiwa. Control: admittance. Method: hand-guiding / safety-rated monitored stop. Application: assembly. Objective: safety, precise and intuitive hand guiding. Key findings: possible to hand-guide the robot with accuracy, with no vibration, and in a natural and intuitive way. Future work: method, utilizing the redundancy of the iiwa to achieve better stiffness in hand-guiding.
• W. Wang et al. [45], 2018. Robot: industrial robot. Control: gesture recognition. Method: hand-guiding / power and force limiting. Application: assembly. Objective: productivity, easier reconfiguration of the cobot using a teaching-by-demonstration model. Key findings: 95% accuracy (higher than previous methods based on vision); lower inefficient time. Future work: applicability, use multimodal information to investigate different applications.
• N. Mendes et al. [31], 2018. Robot: KUKA LBR iiwa. Control: vision. Method: speed and separation monitoring. Application: assembly. Objective: HRI, need for a flexible system with a simple and fast interface. Key findings: gesture is intuitive but delays the process; constrained flexibility. Future work: applicability, expand the use case to several industrial fields.
• K. Darvish et al. [42], 2018. Robot: Rethink Baxter. Control: vision. Method: power and force limiting. Application: assembly. Objective: productivity, increase robot adaptability, integrated in the FlexHRC architecture by an online task planner. Key findings: planning and task representation require little time (less than 1% of the total); the simulation follows the real data very well. Future work: N/A.
• A. Zanchettin et al. [43], 2018. Robot: ABB YuMi. Control: vision. Method: power and force limiting. Application: assembly. Objective: productivity, predict human behavior in order to increase the robot adaptability. Key findings: decrease of task time equal to 17%. Future work: N/A.
• G. Pang et al. [34], 2018. Robot: ABB YuMi. Control: sensitive skin. Method: safety-rated monitored stop / speed and separation monitoring. Application: test of collision. Objective: safety, the cobot perceives stimuli only through its torque sensors, which does not guarantee collision avoidance. Key findings: integration of a sensitive skin on the cobot; delay in the system reaction. Future work: method, reduce the contact area of the sensitive skin; test with multisensing systems.
• V. V. Unhelkar et al. [53], 2018. Robot: Universal Robots UR10. Control: vision. Method: speed and separation monitoring. Application: human assistant. Objective: productivity, cobots can be successful but have restricted range; mobile cobots for delivering parts. Key findings: prediction of long motions (16 s) with a prediction time horizon up to 6 s; in simulation, reduced safety-rated monitored stops, increasing task efficiency. Future work: safety, recognize unmodeled motion and incremental planners.
• V. Tlach et al. [55], 2018. Robot: industrial robot. Control: admittance. Method: hand-guiding / safety-rated monitored stop. Application: assembly. Objective: productivity, design of collaborative tasks in an application. Key findings: the method is flexible to the type of product. Future work: productivity, improve methods for recognizing objects.
• S. Heydaryan et al. [35], 2018. Robot: KUKA LBR iiwa 14 R820. Control: vision/admittance. Method: hand-guiding / safety-rated monitored stop. Application: assembly. Objective: safety, task allocation to ensure the safety of the operator and increase productivity by increasing ergonomics. Key findings: assembly time of 203 s in SMS, but the robot obstructs the access to some screws; a hand-guided solution is proposed (210 s). Future work: N/A.
• G. Michalos et al. [36], 2018. Robot: industrial robot. Control: admittance. Method: hand-guiding. Application: assembly. Objective: safety, implementation of a robotic system for HRC assembly. Key findings: development of an HRC assembly cell with high-payload industrial robots and human operators. Future work: HRI, improve human immersion in the cell; integrate all the sensing and interaction equipment.
References
1. Barbazza, L.; Faccio, M.; Oscari, F.; Rosati, G. Agility in assembly systems: A comparison model.
Assem. Autom. 2017, 37, 411–421. [CrossRef]
2. Colgate, J.E.; Peshkin, M.A.; Wannasuphoprasit, W. Cobots: Robots for Collaboration with Human Operators. In Proceedings of the 1996 ASME International Mechanical Engineering Congress and Exposition, Atlanta, GA, USA, 17–22 November 1996; pp. 433–439.
3. Guerin, K.R.; Lea, C.; Paxton, C.; Hager, G.D. A framework for end-user instruction of a robot assistant for
manufacturing. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation
(ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 6167–6174.
4. Peshkin, M.A.; Colgate, J.E.; Wannasuphoprasit, W.; Moore, C.A.; Gillespie, R.B.; Akella, P. Cobot architecture.
IEEE Trans. Robot. Autom. 2001, 17, 377–390. [CrossRef]
5. DLR—Institute of Robotics and Mechatronics. History of the DLR LWR. Available online: https://www.dlr.
de/rm/en/desktopdefault.aspx/tabid-12464/21732_read-44586/ (accessed on 30 November 2019).
6. Universal Robots. Low Cost and Easy Programming Made the UR5 a Winner. Available online: https:
//www.universal-robots.com/case-stories/linatex/ (accessed on 30 November 2019).
7. Müller, R.; Vette, M.; Geenen, A. Skill-based dynamic task allocation in Human-Robot-Cooperation with the
example of welding application. Procedia Manuf. 2017, 11, 13–21. [CrossRef]
8. Wang, L.; Gao, R.; Váncza, J.; Krüger, J.; Wang, X.V.; Makris, S.; Chryssolouris, G. Symbiotic human-robot
collaborative assembly. CIRP Ann. 2019, 68, 701–726. [CrossRef]
9. Müller, R.; Vette, M.; Mailahn, O. Process-oriented task assignment for assembly processes with human-robot
interaction. Procedia CIRP 2016, 44, 210–215. [CrossRef]
10. Wang, X.V.; Kemény, Z.; Váncza, J.; Wang, L. Human–robot collaborative assembly in cyber-physical
production: Classification framework and implementation. CIRP Ann. 2017, 66, 5–8. [CrossRef]
11. Krüger, J.; Lien, T.K.; Verl, A. Cooperation of human and machines in assembly lines. CIRP Ann.
2009, 58, 628–646.
12. Gaskill, S.; Went, S. Safety issues in modern applications of robots. Reliab. Eng. Syst. Saf. 1996, 53, 301–307.
[CrossRef]
13. Michalos, G.; Makris, S.; Tsarouchi, P.; Guasch, T.; Kontovrakis, D.; Chryssolouris, G. Design considerations
for safe human-robot collaborative workplaces. Procedia CIrP 2015, 37, 248–253. [CrossRef]
14. Gravel, D.P.; Newman, W.S. Flexible robotic assembly efforts at Ford Motor Company. In Proceeding of the
2001 IEEE International Symposium on Intelligent Control (ISIC’01) (Cat. No. 01CH37206), Mexico City,
Mexico, 5–7 September 2001; pp. 173–182.
15. Zhu, Z.; Hu, H. Robot learning from demonstration in robotic assembly: A survey. Robotics 2018, 7, 17.
16. Fechter, M.; Foith-Förster, P.; Pfeiffer, M.S.; Bauernhansl, T. Axiomatic design approach for human-robot
collaboration in flexibly linked assembly layouts. Procedia CIRP 2016, 50, 629–634. [CrossRef]
17. Faccio, M.; Bottin, M.; Rosati, G. Collaborative and traditional robotic assembly: A comparison model. Int. J.
Adv. Manuf. Technol. 2019, 102, 1355–1372. [CrossRef]
18. Edmondson, N.; Redford, A. Generic flexible assembly system design. Assem. Autom. 2002, 22, 139–152.
[CrossRef]
19. Battini, D.; Faccio, M.; Persona, A.; Sgarbossa, F. New methodological framework to improve productivity
and ergonomics in assembly system design. Int. J. Ind. Ergon. 2011, 41, 30–42. [CrossRef]
20. Sawodny, O.; Aschemann, H.; Lahres, S. An automated gantry crane as a large workspace robot. Control Eng.
Pract. 2002, 10, 1323–1338. [CrossRef]
21. Krüger, J.; Bernhardt, R.; Surdilovic, D.; Spur, G. Intelligent assist systems for flexible assembly. CIRP Ann.
2006, 55, 29–32. [CrossRef]
22. Rosati, G.; Faccio, M.; Carli, A.; Rossi, A. Fully flexible assembly systems (F-FAS): A new concept in flexible
automation. Assem. Autom. 2013, 33, 8–21. [CrossRef]
23. FANUC Italia, S.r.l. M-2000—The Strongest Heavy Duty Industrial Robot in the Market. Available online:
https://www.fanuc.eu/it/en/robots/robot-filter-page/m-2000-series (accessed on 30 November 2019).
24. Hägele, M.; Schaaf, W.; Helms, E. Robot assistants at manual workplaces: Effective co-operation and
safety aspects. In Proceedings of the 33rd ISR (International Symposium on Robotics), Stockholm, Sweden,
7–11 October 2002; Volume 7.
25. Ding, H.; Heyn, J.; Matthias, B.; Staab, H. Structured collaborative behavior of industrial robots in mixed
human-robot environments. In Proceedings of the 2013 IEEE International Conference on Automation
Science and Engineering (CASE), Madison, WI, USA, 17–20 August 2013; pp. 1101–1106.
26. Ding, H.; Schipper, M.; Matthias, B. Collaborative behavior design of industrial robots for multiple
human-robot collaboration. In Proceedings of the IEEE ISR 2013, Seoul, Korea, 24–26 October 2013; pp. 1–6.
27. Zanchettin, A.M.; Rocco, P. Path-consistent safety in mixed human-robot collaborative manufacturing
environments. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and
Systems, Tokyo, Japan, 3–7 November 2013; pp. 1131–1136.
28. Whitsell, B.; Artemiadis, P. Physical human–robot interaction (pHRI) in 6 DOF with asymmetric cooperation.
IEEE Access 2017, 5, 10834–10845. [CrossRef]
29. Rahman, S.M.; Wang, Y.; Walker, I.D.; Mears, L.; Pak, R.; Remy, S. Trust-based compliant robot-human
handovers of payloads in collaborative assembly in flexible manufacturing. In Proceedings of the 2016
IEEE International Conference on Automation Science and Engineering (CASE), Fort Worth, TX, USA,
21–25 August 2016; pp. 355–360.
30. Safeea, M.; Bearee, R.; Neto, P. End-effector precise hand-guiding for collaborative robots. In Iberian Robotics
Conference; Springer: Berlin, Germany, 2017; pp. 595–605.
31. Mendes, N.; Safeea, M.; Neto, P. Flexible programming and orchestration of collaborative robotic
manufacturing systems. In Proceedings of the 2018 IEEE 16th International Conference on Industrial
Informatics (INDIN), Porto, Portugal, 18–20 July 2018; pp. 913–918.
32. Hawkins, K.P.; Bansal, S.; Vo, N.N.; Bobick, A.F. Anticipating human actions for collaboration in the presence
of task and sensor uncertainty. In Proceedings of the 2014 IEEE International Conference on Robotics and
Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 2215–2222.
33. Wojtynek, M.; Oestreich, H.; Beyer, O.; Wrede, S. Collaborative and robot-based plug & produce for rapid
reconfiguration of modular production systems. In Proceedings of the 2017 IEEE/SICE International
Symposium on System Integration (SII), Taipei, Taiwan, 11–14 December 2017; pp. 1067–1073.
34. Pang, G.; Deng, J.; Wang, F.; Zhang, J.; Pang, Z.; Yang, G. Development of flexible robot skin for safe and
natural human–robot collaboration. Micromachines 2018, 9, 576. [CrossRef]
35. Heydaryan, S.; Suaza Bedolla, J.; Belingardi, G. Safety design and development of a human-robot
collaboration assembly process in the automotive industry. Appl. Sci. 2018, 8, 344. [CrossRef]
36. Michalos, G.; Kousi, N.; Karagiannis, P.; Gkournelos, C.; Dimoulas, K.; Koukas, S.; Mparis, K.;
Papavasileiou, A.; Makris, S. Seamless human robot collaborative assembly—An automotive case study.
Mechatronics 2018, 55, 194–211. [CrossRef]
37. Tan, J.T.C.; Zhang, Y.; Duan, F.; Watanabe, K.; Kato, R.; Arai, T. Human factors studies in information support
development for human-robot collaborative cellular manufacturing system. In Proceedings of the RO-MAN
2009—The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama,
Japan, 27 September–2 October 2009; pp. 334–339.
38. Arai, T.; Duan, F.; Kato, R.; Tan, J.T.C.; Fujita, M.; Morioka, M.; Sakakibara, S. A new cell production assembly
system with twin manipulators on mobile base. In Proceedings of the 2009 IEEE International Symposium
on Assembly and Manufacturing, Suwon, Korea, 17–20 November 2009; pp. 149–154.
39. Tan, J.T.C.; Duan, F.; Zhang, Y.; Watanabe, K.; Kato, R.; Arai, T. Human-robot collaboration in cellular
manufacturing: Design and development. In Proceedings of the 2009 IEEE/RSJ International Conference on
Intelligent Robots and Systems, Saint Louis, MO, USA, 10–15 October 2009; pp. 29–34.
40. Lenz, C.; Rickert, M.; Panin, G.; Knoll, A. Constraint task-based control in industrial settings. In Proceedings
of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, Saint Louis, MO, USA,
10–15 October 2009; pp. 3058–3063.
41. Ende, T.; Haddadin, S.; Parusel, S.; Wüsthoff, T.; Hassenzahl, M.; Albu-Schäffer, A. A human-centered
approach to robot gesture based communication within collaborative working processes. In Proceedings of
the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA,
25–30 September 2011; pp. 3367–3374.
42. Darvish, K.; Bruno, B.; Simetti, E.; Mastrogiovanni, F.; Casalino, G. Interleaved Online Task Planning,
Simulation, Task Allocation and Motion Control for Flexible Human-Robot Cooperation. In Proceedings of
the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN),
Nanjing, China, 27–31 August 2018; pp. 58–65.
43. Zanchettin, A.; Casalino, A.; Piroddi, L.; Rocco, P. Prediction of human activity patterns for human-robot
collaborative assembly tasks. IEEE Trans. Ind. Inf. 2018, 15, 3934–3942. [CrossRef]
44. Blaga, A.; Tamas, L. Augmented Reality for Digital Manufacturing. In Proceedings of the 2018 26th Mediterranean
Conference on Control and Automation (MED), Akko, Israel, 1–4 July 2018; pp. 173–178.
45. Wang, W.; Li, R.; Diekel, Z.M.; Chen, Y.; Zhang, Z.; Jia, Y. Controlling Object Hand-Over in Human–Robot
Collaboration Via Natural Wearable Sensing. IEEE Trans. Human-Mach. Syst. 2018, 49, 59–71. [CrossRef]
46. Walker, I.D.; Mears, L.; Mizanoor, R.S.; Pak, R.; Remy, S.; Wang, Y. Robot-human handovers based on trust.
In Proceedings of the 2015 IEEE Second International Conference on Mathematics and Computers in Sciences
and in Industry (MCSI), Sliema, Malta, 17 August 2015; pp. 119–124.
47. Sadrfaridpour, B.; Wang, Y. Collaborative assembly in hybrid manufacturing cells: An integrated framework
for human–robot interaction. IEEE Trans. Autom. Sci. Eng. 2017, 15, 1178–1192. [CrossRef]
48. Rozo, L.; Calinon, S.; Caldwell, D.G.; Jimenez, P.; Torras, C. Learning physical collaborative robot behaviors
from human demonstrations. IEEE Trans. Robot. 2016, 32, 513–527. [CrossRef]
49. Rahman, S.M.; Liao, Z.; Jiang, L.; Wang, Y. A regret-based autonomy allocation scheme for human-robot shared
vision systems in collaborative assembly in manufacturing. In Proceedings of the 2016 IEEE International
Conference on Automation Science and Engineering (CASE), Fort Worth, TX, USA, 21–25 August 2016;
pp. 897–902.
50. Gustavsson, P.; Syberfeldt, A.; Brewster, R.; Wang, L. Human-robot collaboration demonstrator combining
speech recognition and haptic control. Procedia CIRP 2017, 63, 396–401. [CrossRef]
51. Fakhruldeen, H.; Maheshwari, P.; Lenz, A.; Dailami, F.; Pipe, A.G. Human robot cooperation planner using
plans embedded in objects. IFAC-PapersOnLine 2016, 49, 668–674. [CrossRef]
52. Koch, P.J.; van Amstel, M.K.; Dȩbska, P.; Thormann, M.A.; Tetzlaff, A.J.; Bøgh, S.; Chrysostomou, D.
A skill-based robot co-worker for industrial maintenance tasks. Procedia Manuf. 2017, 11, 83–90. [CrossRef]
53. Unhelkar, V.V.; Lasota, P.A.; Tyroller, Q.; Buhai, R.D.; Marceau, L.; Deml, B.; Shah, J.A. Human-aware robotic
assistant for collaborative assembly: Integrating human motion prediction with planning in time. IEEE Robot.
Autom. Lett. 2018, 3, 2394–2401. [CrossRef]
54. Hamabe, T.; Goto, H.; Miura, J. A programming by demonstration system for human-robot collaborative
assembly tasks. In Proceedings of the 2015 IEEE International Conference on Robotics and Biomimetics
(ROBIO), Zhuhai, China, 6–9 December 2015; pp. 1195–1201.
55. Tlach, V.; Kuric, I.; Zajačko, I.; Kumičáková, D.; Rengevič, A. The design of method intended for implementation
of collaborative assembly tasks. Adv. Sci. Technol. Res. J. 2018, 12, 244–250. [CrossRef]
56. El Makrini, I.; Merckaert, K.; Lefeber, D.; Vanderborght, B. Design of a collaborative architecture for
human-robot assembly tasks. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent
Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 1624–1629.
57. MarketsandMarketsTM Research Private Ltd. Collaborative Robots Market by Payload Capacity (Up to 5 kg,
Up to 10 kg, Above 10 kg), Industry (Automotive, Electronics, Metals & Machining, Plastics & Polymer, Food
& Agriculture, Healthcare), Application, and Geography—Global Forecast to 2023. Available online: https:
//www.marketsandmarkets.com/Market-Reports/collaborative-robot-market-194541294.html (accessed on
30 November 2019).
58. International Federation of Robotics (IFR). Robots and the Workplace of the Future. 2018. Available online:
https://ifr.org/papers (accessed on 30 November 2019).
59. Barclays Investment Bank. Technology’s Mixed Blessing. 2017. Available online: https://www.investmentbank.
barclays.com/our-insights/technologys-mixed-blessing.html (accessed on 30 November 2019).
60. Tobe, F. Why Co-Bots Will Be a Huge Innovation and Growth Driver for Robotics Industry. 2015.
Available online: https://spectrum.ieee.org/automaton/robotics/industrial-robots/collaborative-robots-
innovation-growth-driver (accessed on 30 November 2019).
61. Gopinath, V.; Ore, F.; Grahn, S.; Johansen, K. Safety-Focussed Design of Collaborative Assembly Station with
Large Industrial Robots. Procedia Manuf. 2018, 25, 503–510. [CrossRef]
62. Zanchettin, A.M.; Ceriani, N.M.; Rocco, P.; Ding, H.; Matthias, B. Safety in human-robot collaborative
manufacturing environments: Metrics and control. IEEE Trans. Autom. Sci. Eng. 2015, 13, 882–893.
[CrossRef]
63. Cherubini, A.; Passama, R.; Crosnier, A.; Lasnier, A.; Fraisse, P. Collaborative manufacturing with physical
human–robot interaction. Robot. Comput. Integr. Manuf. 2016, 40, 1–13. [CrossRef]
64. Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.S. Augmented reality system for operator support in
human–robot collaborative assembly. CIRP Ann. 2016, 65, 61–64. [CrossRef]
65. Bös, J.; Wahrburg, A.; Listmann, K.D. Iteratively Learned and Temporally Scaled Force Control with application
to robotic assembly in unstructured environments. In Proceedings of the 2017 IEEE International Conference
on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3000–3007.
66. Haage, M.; Piperagkas, G.; Papadopoulos, C.; Mariolis, I.; Malec, J.; Bekiroglu, Y.; Hedelind, M.; Tzovaras, D.
Teaching assembly by demonstration using advanced human robot interaction and a knowledge integration
framework. Procedia Manuf. 2017, 11, 164–173. [CrossRef]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).