Review
Human-Centric Digital Twins in Industry: A Comprehensive Review of Enabling Technologies and Implementation Strategies
Usman Asad 1,2, Madeeha Khan 3, Azfar Khalid 3,* and Waqas Akbar Lughmani 1
Abstract: The last decade saw the emergence of highly autonomous, flexible, re-configurable Cyber-
Physical Systems. Research in this domain has been enhanced by the use of high-fidelity simulations,
including Digital Twins, which are virtual representations connected to real assets. Digital Twins
have been used for process supervision, prediction, or interaction with physical assets. Interaction
with Digital Twins is enhanced by Virtual Reality and Augmented Reality, and Industry 5.0-focused
research is evolving with the involvement of the human aspect in Digital Twins. This paper aims to
review recent research on Human-Centric Digital Twins (HCDTs) and their enabling technologies. A
systematic literature review is performed using the VOSviewer keyword mapping technique. Current
technologies such as motion sensors, biological sensors, computational intelligence, simulation, and
visualization tools are studied for the development of HCDTs in promising application areas. Domain-
specific frameworks and guidelines are formed for different HCDT applications that highlight the
workflow and desired outcomes, such as the training of AI models, the optimization of ergonomics,
the security policy, task allocation, etc. A guideline and comparative analysis for the effective
development of HCDTs are created based on the criteria of Machine Learning requirements, sensors,
interfaces, and Human Digital Twin inputs.

Keywords: Digital Twin; human-centric; Industry 5.0; literature review; human-robot collaboration; artificial intelligence

Citation: Asad, U.; Khan, M.; Khalid, A.; Lughmani, W.A. Human-Centric Digital Twins in Industry: A Comprehensive Review of Enabling Technologies and Implementation Strategies. Sensors 2023, 23, 3938. https://doi.org/10.3390/s23083938
that in addition to technological aspirations, government and industry should move towards a value-driven future where the focus is on human well-being, sustainability, and resilience under Industry 5.0 [2].
To better integrate humans into CPS, the concept of the Human Digital Twin is gaining popularity as a means to monitor, evaluate, and optimize human performance, ergonomics, and well-being [18]. Developing a Human Digital Twin involves building a model of the human from sensor data, providing insight into their behavior and attributes, which may include their physical, physiological, cognitive,
and emotional states [19]. Although DTs of machines have found broad use in industry, the
use of DTs and parallel societies for human-centric social computing is still a developing
research topic [20]. Research on human intent recognition is likewise motivated by the aim of symbiotic human-robot collaboration: distinguishing accidental contact from active collaboration enables an intuitive and helpful cobot motion control strategy [21]. Creating a symbiotic human-robot collaboration
system requires the use of dynamic monitoring of humans and resources using smart
sensors, active collision avoidance, dynamic planning, and context-aware adaptive robot
control [22].
The main purpose of this study is to present state-of-the-art research conducted on
Human-Centric Digital Twins, their enabling technologies, and implementation frame-
works for different industrial applications. Firstly, the recent literature in the domain of
HCDTs, how it evolved over the years, and areas for future research are discussed using
a detailed literature review. Secondly, enabling technologies used by various researchers
and engineers in the past and those having potential in the future are discussed. Finally,
different applications of HCDT technology along with implementation frameworks are
presented, and general guidelines are discussed for the development of HCDTs, as shown
in Figure 2.
2. Review Methodology
A comprehensive literature review is conducted in this study: relevant research studies are exported into a digital library, then assessed and screened for relevance and duplication. The articles considered were published between 2012 and 2022 and were exported from Google Scholar, Science Direct, and Scopus.
These databases are selected because they provide the latest full-text peer-reviewed articles, offer advanced search options, and cover the largest body of published research. The keyword-based search used the following query: “Digital Twin”
AND (“Human-Centric” OR “Human Centered” OR “Human Robot Collaboration” OR
“Industry 4.0” OR “Human Robot Interaction” OR “Human Digital Twin” OR “Human-
Centered Design” OR “Industry 5.0”).
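As a concrete illustration of the search step, the boolean string above can be assembled programmatically. This is only a sketch: the query syntax shown is generic, and any database-specific wrapping (such as Scopus's TITLE-ABS-KEY fields) is left out.

```python
# Sketch: assembling the review's boolean search string programmatically.
base_term = '"Digital Twin"'
related_terms = [
    "Human-Centric", "Human Centered", "Human Robot Collaboration",
    "Industry 4.0", "Human Robot Interaction", "Human Digital Twin",
    "Human-Centered Design", "Industry 5.0",
]

def build_query(base: str, alternatives: list[str]) -> str:
    """Combine a mandatory base term with a group of OR-ed alternatives."""
    ored = " OR ".join(f'"{t}"' for t in alternatives)
    return f"{base} AND ({ored})"

query = build_query(base_term, related_terms)
print(query)
```

The same helper can generate variant queries per database when search syntaxes differ.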
In the second phase, a graph-based search method is used to find additional relevant
papers, where key papers identified from our initial search are used as seeds. For the
selection of key papers, the number of citations of the publications was considered, and
some of the most highly cited articles included in this review (Table 1) are used as seed
papers. Citation analysis tools (inciteful (https://inciteful.xyz/ (accessed on 12 January
2023)) and Litmaps (https://www.litmaps.com/ (accessed on 15 January 2023))) are utilized
to create citation network graphs (Figure 3), and related papers are explored and added to
the database using the network graph.
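The seed-based expansion that tools like Inciteful and Litmaps perform can be sketched in a few lines. Here `fetch_citations` is a hypothetical stand-in for their citation indexes, and the toy data is invented for illustration.

```python
# Sketch of graph-based search: expand outward from seed papers and rank
# candidates by how many links point at them.
from collections import defaultdict

def expand_from_seeds(seeds, fetch_citations, depth=1):
    """Expand `depth` hops from seed papers; count links pointing at each paper."""
    in_degree = defaultdict(int)
    frontier = list(seeds)
    for _ in range(depth):
        nxt = []
        for paper in frontier:
            for linked in fetch_citations(paper):
                in_degree[linked] += 1
                nxt.append(linked)
        frontier = nxt
    return dict(in_degree)

# Invented toy citation index: each seed links to a few papers
toy_index = {"seed1": ["A", "B"], "seed2": ["B", "C"]}
scores = expand_from_seeds(["seed1", "seed2"], lambda p: toy_index.get(p, []))
# Papers reached from several seeds are the strongest candidates for inclusion
ranked = sorted(scores, key=scores.get, reverse=True)
```

Paper "B" is linked from both seeds, so it ranks first; this mirrors how shared citations surface related work in the network graph.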
Using this search method, 237 research publications are first screened by abstract. Relevant papers are imported into a digital library, created using Mendeley, and further assessed for duplication and redundancy. Only those papers are retained in which the human element of DTs is an important consideration. The final volume of 119 publications, from 2016 to 2022, is included in this review.
Figure 5 highlights only the cluster of keywords related to ‘human centricity’ and
‘human centered design’. It shows that selected recent literature has discussed human
centricity in DTs, but it is not emphasized in all articles, as Digital Twin literature has
not considered human participation in previous years. The same trend is also shown in
Table 2, where the keyword ‘digital twin’ has the highest overall link strength of 88 and the
keywords ‘human-centricity’ and ‘human-centered design’ have a combined link strength
of 12, showing a weak correlation as they occur only six times.
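Total link strength as VOSviewer reports it is essentially a co-occurrence count: each pair of keywords appearing together in one document adds one to the link between them, and a keyword's total sums its links. A minimal sketch, using invented keyword sets rather than the review's actual data:

```python
# Sketch of VOSviewer-style co-occurrence link strength (toy data).
from itertools import combinations
from collections import Counter

def link_strengths(docs_keywords):
    """Count pairwise keyword co-occurrences and each keyword's total."""
    links = Counter()
    for kws in docs_keywords:
        for a, b in combinations(sorted(set(kws)), 2):
            links[(a, b)] += 1
    total = Counter()
    for (a, b), w in links.items():
        total[a] += w
        total[b] += w
    return links, total

docs = [
    {"digital twin", "human-centricity"},
    {"digital twin", "industry 5.0"},
    {"digital twin", "human-centered design", "industry 5.0"},
]
links, total = link_strengths(docs)
# "digital twin" participates in 4 co-occurrences across the three documents
```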
Table 2. Keywords.
3. Enabling Technologies
A number of technical challenges exist in the implementation of HCDTs. This paper discusses the key technologies for developing HCDTs, as implemented by researchers and engineers in the literature across different industrial applications, including Human-Robot Interaction (HRI). The following sections focus on sensing technologies, computational intelligence techniques involving artificial intelligence, optimization and control systems, and simulation and visualization tools. Finally, a generic HCDT framework incorporating the discussed enabling technologies is presented.
cameras [34,35], and Kinect [36–39], are also extensively used in the literature. Non-optical tracking devices, including wearable inertial and magnetic measurement units (IMUs) [40] and magnetometers, have been used to track human movement, trajectory, and position during collaboration with cobots. Mechanical motion-capture systems are also used when direct measurement of human motion is essential [41].
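As an illustration of how wearable IMU data becomes a motion estimate, here is a minimal one-axis complementary filter, a common building block in inertial tracking. The parameters and data are illustrative, not taken from any cited system.

```python
# Minimal 1-axis orientation estimate of the kind IMU trackers build on:
# integrate the gyroscope (smooth but drifting) and correct with the
# accelerometer tilt angle (noisy but drift-free).
def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    angle = accel_angles[0]
    out = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        out.append(angle)
    return out

# Stationary sensor with a small gyro bias (0.5 deg/s): pure integration
# would drift by 5 degrees over 10 s, but the filter stays near zero.
angles = complementary_filter([0.5] * 1000, [0.0] * 1000)
```

The fusion weight `alpha` trades gyro smoothness against accelerometer correction; full IMU suits apply the same idea per joint in three dimensions.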
Different biological sensors are also used to measure human physiological data and monitor human behavior during human-robot collaboration [42]. Physiological sensors, such as the Electrooculogram (EOG) [43], Electrocardiogram (ECG) [44], Electroencephalogram (EEG) [45], Magnetoencephalogram (MEG) [46], and Electromyogram (EMG) [47], capture signals generated by the human body, from which important information can be inferred. Lately, these signals have been widely used in HRC systems to predict the intention of human operators [46].
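A common first step when using such signals for intent prediction is windowed feature extraction. The sketch below computes root-mean-square (RMS) features, a standard time-domain EMG feature, over a synthetic signal; the window sizes and amplitudes are illustrative.

```python
# Windowed RMS feature extraction over a 1-D physiological signal.
def rms_features(signal, window=200, step=100):
    """Slide a window over the signal; return the RMS of each window."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        seg = signal[start:start + window]
        feats.append((sum(x * x for x in seg) / window) ** 0.5)
    return feats

# A burst of (synthetic) muscle activity shows up as a jump in RMS
rest, active = [0.01] * 400, [0.5] * 400
feats = rms_features(rest + active)
```

These feature vectors, rather than the raw samples, are what intent classifiers are typically trained on.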
Sensors are also required to collect and transmit data on various environmental
parameters such as airflow, humidity, light, noise, temperature, and others. This data
is then used to create an accurate digital representation of the physical environment, which
can be used for simulations, analysis, and decision-making with regard to human comfort
and well-being. For example, sensors can be used to monitor the air quality in a building,
which can help identify potential health hazards or optimize the operation of HVAC
systems in buildings, leading to improved comfort and energy savings.
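A minimal sketch of how such environmental readings might be structured and checked against comfort bands inside a DT. The field names and threshold values here are illustrative assumptions, not standards.

```python
# Sketch: one environmental sample and an assumed set of comfort bands.
from dataclasses import dataclass

@dataclass
class EnvReading:
    temperature_c: float
    humidity_pct: float
    co2_ppm: float

def comfort_flags(r: EnvReading) -> list[str]:
    """Flag any channel outside its (illustrative) comfort band."""
    flags = []
    if not 20 <= r.temperature_c <= 26:
        flags.append("temperature")
    if not 30 <= r.humidity_pct <= 60:
        flags.append("humidity")
    if r.co2_ppm > 1000:
        flags.append("co2")
    return flags

print(comfort_flags(EnvReading(28.0, 45.0, 1200.0)))  # ['temperature', 'co2']
```

In a real HCDT these flags would feed the decision-making layer, e.g. adjusting HVAC setpoints.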
interaction. Jin et al. developed a soft-robotic sensory gripper that uses an SVM-based
machine learning algorithm for object recognition through tactile feedback [56].
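The tactile classification step in [56] used an SVM. As a dependency-free stand-in with the same pipeline shape (feature vectors in, object label out), here is a nearest-centroid classifier on invented tactile features; it is not the cited method, only an illustration of the step.

```python
# Nearest-centroid classifier: a simple stand-in for the SVM step in [56].
def fit_centroids(samples):
    """samples: {label: [feature_vector, ...]} -> {label: centroid}"""
    return {
        label: [sum(col) / len(vecs) for col in zip(*vecs)]
        for label, vecs in samples.items()
    }

def predict(centroids, x):
    """Return the label whose centroid is closest to feature vector x."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, x))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl]))

# Invented 2-D tactile features for two object classes
train = {
    "sphere": [[0.9, 0.1], [0.8, 0.2]],
    "cube":   [[0.1, 0.9], [0.2, 0.8]],
}
model = fit_centroids(train)
print(predict(model, [0.85, 0.15]))  # sphere
```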
nicate between MATLAB and Unity. Other commonly used robotics simulators include
CoppeliaSim [81] and NVIDIA Isaac Sim [82].
text and voice communication through a fusion of technologies such as Task and Motion
Planning, Vision, Language, and Control in Robotics [101].
Figure 6. Enabling Technologies and framework of Human-Centric Digital Twins in Industry 5.0.
4. Application Domains
Digital Twin Technology has recently been applied in a range of industries and scenarios, including Smart City construction [104] and the monitoring and optimization of physical assets including machine tools, vehicles, machinery, mechanical structures, and materials [27–29].
Here, we have identified and placed particular emphasis on key application domains where
human-centricity is of paramount importance.
Figure 8. Nvidia Omniverse-based DTs for Amazon and Pepsi [111]: (c) Digital Twin for training vision models; (d) warehouse management.
A generic framework for using HCDTs for the training and testing of robotics systems is shown in Figure 9. The intent is to realize a symbiotic human-robot relationship by training a deep learning or deep reinforcement learning algorithm on human and machine data fed into the DT; this includes predicting human intent and learning a machine policy that assists the human in a flexible, intuitive, collaborative environment.
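The policy-learning loop in such a framework can be caricatured with a tiny tabular example, where the "state" is a (perfectly) predicted human intent and reward is given when the robot's assist action matches it. This is a toy stand-in for the deep RL described, not an implementation of it.

```python
# Toy tabular policy learning: reward the robot when its assist action
# matches the predicted human intent (epsilon-greedy exploration).
import random

random.seed(0)  # for reproducibility of this sketch

def train_assist_policy(episodes=2000, alpha=0.1, eps=0.1):
    n_intents, n_actions = 2, 2
    q = [[0.0] * n_actions for _ in range(n_intents)]
    for _ in range(episodes):
        intent = random.randrange(n_intents)      # stand-in for intent prediction
        if random.random() < eps:                 # explore
            action = random.randrange(n_actions)
        else:                                     # exploit current estimate
            action = max(range(n_actions), key=lambda a: q[intent][a])
        reward = 1.0 if action == intent else 0.0
        q[intent][action] += alpha * (reward - q[intent][action])
    return q

q = train_assist_policy()
policy = [max(range(2), key=lambda a: q[s][a]) for s in range(2)]
# With matching rewarded, the learned policy mirrors the intent: [0, 1]
```

In the actual framework, the state would be high-dimensional sensor and DT data and the table would be replaced by a deep network, but the reward-driven loop is the same.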
In ref. [94], an AR headset (Microsoft HoloLens) receives input from ROS through the Rosbridge communication package to enhance the user's experience and perception. VR/AR-based training with DTs is attracting interest in a wide range of industries, such as construction [118,119] and mining [120]. Using immersive virtual simulations, the safety perception of HRC among construction workers is tested and enhanced in [121].
Matsas et al. [122] created a virtual training and testing environment using 3D graphics
generated in 3ds Max in a gaming engine (Unity 3D). Using a VR HMD, user evaluations
of safety techniques adopted by the HRC AI are carried out and analyzed. Wang et al. [35]
used a combination of an industrial camera, a VR HMD, and Unity for the assessment of
Welder behavior in a teleoperation setting with a UR5 Cobot.
In the context of user training and education, the literature shows that immersive
technologies, audio-visual feedback, and the use of artificial intelligence with Digital Twin
technology are central to achieving adaptive, intuitive, and customized user assistance and
training experiences.
Figure 10. VR enhanced Digital Twin framework for HRC design [123].
Kousi et al. [79] employed ROS Gazebo, MoveIt, and Lanner Witness Simulation in an
automotive assembly case study, where a DT-based system was used to generate alternative
configurations for the assembly process with a dual-arm mobile robot and human operators
and validate the system’s performance. Wang et al. present a framework for Human-Robot
Collaborative Assembly using DTs that proposes a data fusion and visualization service to
process information coming from human-centric as well as robot sensors [124]. The data is
subsequently processed and used to generate events, schedule tasks, and run the robot’s
control service. Table 6 presents a brief summary of related literature.
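The fuse-data, generate-events, schedule-tasks flow described for [124] can be sketched abstractly. Every name below is illustrative and not taken from the cited framework.

```python
# Abstract sketch of a data-fusion-to-control pipeline (illustrative names).
def fuse(human_reading, robot_reading):
    """Merge human-centric and robot sensor data into one state snapshot."""
    return {"human_busy": human_reading["busy"], "robot_idle": robot_reading["idle"]}

def generate_events(snapshot):
    """Turn fused state into events the scheduler can act on."""
    events = []
    if snapshot["robot_idle"] and snapshot["human_busy"]:
        events.append("ASSIGN_TASK_TO_ROBOT")
    return events

def schedule(events):
    """Map events to robot control-service commands."""
    return ["robot.start_next_task()" for e in events if e == "ASSIGN_TASK_TO_ROBOT"]

cmds = schedule(generate_events(fuse({"busy": True}, {"idle": True})))
print(cmds)  # ['robot.start_next_task()']
```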
The literature shows that DTs have been leveraged within this application domain to
optimize cycle time, enhance workstation layout, ergonomics, and flexibility, and carry out
dynamic scheduling, line balancing, and process planning.
5. Discussion
Cyber-physical systems with synergistic human-machine interaction, aided by Human-Centric Digital Twins, are slowly making their mark in multiple application domains.
Regarding the human element in Digital Twin applications, the utility of different sensors, Human Digital Twin inputs, feedback mechanisms, and the prevalent types of machine learning algorithms in the identified application domains is shown in Table 9. Utility is ranked as low, medium, or high and displayed graphically.
The use of HCDTs with VR/AR may be leveraged to create personalized, adaptive learning and collaborative experiences. Audio-visual interfaces and direct input devices play a central role in this application, which may be aided by artificial intelligence, for example by leveraging Large Language Models (LLMs) for adaptive learning. Supervisors of the training regime should be able to intervene as required. Visual and motion
sensors are central to process design applications. HCDTs, using optimization techniques
and deep learning, can optimize cycle time, enhance workstation layout, ergonomics, and
flexibility, and carry out dynamic scheduling, line balancing, and process planning. An
immersive visual interface can be very beneficial for human designers to interact with the
DT for process monitoring and assessment.
In security applications, biometric and facial recognition, motion sensing, and the
use of data security techniques, including blockchain technology, are central. Here, deep
learning techniques can be leveraged to provide safeguards against the numerous types of
cyberattacks. Human intervention is required only in the event of any detected security
breach or threat. HCDTs are already widely accepted in medical applications, where DTs of patients are created using medical imaging and biological sensors and can be subjected to deep learning algorithms or FEA. These are used by doctors, who can train and operate with the aid of immersive sensory feedback, including haptic feedback and robotic assistance.
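For the security monitoring described above, here is a minimal anomaly-flagging sketch: a simple statistical stand-in for the deep learning detectors discussed, with invented data and an arbitrary threshold, illustrating how human review is triggered only on a detected anomaly.

```python
# Flag a sensor value that deviates strongly from its recent history,
# so a human supervisor is only pulled in on a potential breach.
import statistics

def zscore_alert(history, new_value, threshold=4.0):
    mu = statistics.fmean(history)
    sigma = statistics.pstdev(history) or 1e-9  # guard against zero variance
    return abs(new_value - mu) / sigma > threshold

# A temperature channel fluctuating normally vs. a spoofed spike
normal = [20.0, 20.5, 19.8, 20.2, 20.1, 19.9, 20.3]
ok = zscore_alert(normal, 20.4)      # ordinary fluctuation -> False
breach = zscore_alert(normal, 35.0)  # anomalous reading -> True
```

A learned detector would replace the z-score with a model of normal behavior, but the escalation logic (alert only on deviation) is the same.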
6. Conclusions
In this research work, a state-of-the-art literature review on Human-Centric Digital
Twins (HCDTs) and their enabling technologies is conducted. A key shortcoming observed
in current DT literature is that the focus is almost entirely on the physical assets of a CPS and
not on human operators. In the coming years, increasingly ambitious and sophisticated industry- and process-specific DTs may address this shortcoming through the development of HCDTs.
A generic framework is proposed to underline the enabling technologies, such as human-
focused sensors and computer vision, that can be used for the creation of HCDTs. The
enabling technologies for this purpose have received considerable individual attention.
For example, AI-driven algorithms for a range of problems such as object recognition
and collision avoidance or using DTs for monitoring and supervision of physical assets,
etc., have reached some degree of maturity. However, there is a lack of literature on how
all these enabling technologies are utilized together in a synergistic manner to enhance
human-machine interaction in industrial settings.
We identified six key application areas for DTs with strong human involvement, which
are ergonomics and safety, training and testing of robotics systems, user training and
education, product and process design, validation and testing, security of cyber-physical
systems, and finally rehabilitation, well-being, and health management. Implementation
frameworks for the selected domains highlight the workflow and desired outcomes (such
as optimization of ergonomics, security policy, task allocation, etc.). It has been found
that the use of HCDTs in literature on the security of CPS is currently underdeveloped
and merits further exploration. DTs are being extensively used to train robotic systems by
simulation and data generation in a virtual environment. These can be further enriched by
including more human-focused sensor data to enable synergistic, intuitive collaboration.
The development of increasingly specialized modeling tools and the ubiquitous spread of artificial intelligence can transform this expert-driven paradigm into a user-driven one. Over time, the use of HCDTs is expected to expand considerably, owing to the
immense interest shown by researchers and industry.
Author Contributions: Conceptualization, A.K. and W.A.L.; Supervision, A.K. and W.A.L.;
Writing—original draft, U.A. and M.K.; Methodology, U.A. and M.K.; Visualization, U.A. and
M.K.; Writing—review & editing A.K. and W.A.L. All authors have read and agreed to the published
version of the manuscript.
Funding: This research received no external funding.
References
1. Nahavandi, S. Industry 5.0-a human-centric solution. Sustainability 2019, 11, 4371. [CrossRef]
2. Lu, Y.; Zheng, H.; Chand, S.; Xia, W.; Liu, Z.; Xu, X.; Wang, L.; Qin, Z.; Bao, J. Outlook on human-centric manufacturing towards
Industry 5.0. J. Manuf. Syst. 2022, 62, 612–627. [CrossRef]
3. Gallala, A.; Kumar, A.A.; Hichri, B.; Plapper, P. Digital Twin for Human–Robot Interactions by Means of Industry 4.0 Enabling
Technologies. Sensors 2022, 22, 4950. [CrossRef]
4. Shafto, M.; Rich, M.C.; Glaessgen, D.E.; Kemp, C.; Lemoigne, J.; Wang, L. Modeling, Simulation, Information technology, and
Processing roadmap. Technology Area 11. Natl. Aeronaut. Space Adm. 2012, 32, 1–38.
5. Digital Twin: Definition & Value. An AIAA and AIA Position Paper. Available online: https://www.aia-aerospace.org/
publications/digital-twin-definition-value-an-aiaa-and-aia-position-paper/ (accessed on 27 October 2022).
6. Ammar, A.; Nassereddine, H.; Dadi, G. Roadmap to a Holistic Highway Digital Twin: A Why, How, & Why Framework. In
Critical Infrastructure—Modern Approach and New Developments; Pietro, D.A.D., Marti, P.J., Eds.; IntechOpen: London, UK, 2022;
Chapter 3. [CrossRef]
7. Liu, M.; Fang, S.; Dong, H.; Xu, C. Review of digital twin about concepts, technologies, and industrial applications. J. Manuf. Syst.
2021, 58, 346–361. [CrossRef]
8. Turner, C.J.; Garn, W. Next generation DES simulation: A research agenda for human centric manufacturing systems. J. Ind. Inf.
Integr. 2022, 28, 100354. [CrossRef]
9. Tao, F.; Zhang, H.; Liu, A.; Nee, A.Y. Digital Twin in Industry: State-of-the-Art. IEEE Trans. Ind. Inform. 2019, 15, 2405–2415.
[CrossRef]
10. Wang, L. A futuristic perspective on human-centric assembly. J. Manuf. Syst. 2022, 62, 199–201. [CrossRef]
11. Khalid, A.; Khan, Z.H.; Idrees, M.; Kirisci, P.; Ghrairi, Z.; Thoben, K.D.; Pannek, J. Understanding vulnerabilities in cyber physical
production systems. Int. J. Comput. Integr. Manuf. 2022, 35, 569–582. [CrossRef]
12. Michalos, G.; Makris, S.; Tsarouchi, P.; Guasch, T.; Kontovrakis, D.; Chryssolouris, G. Design Considerations for Safe Human-robot
Collaborative Workplaces. Procedia CIRP 2015, 37, 248–253. [CrossRef]
13. Xu, X.; Lu, Y.; Vogel-Heuser, B.; Wang, L. Industry 4.0 and Industry 5.0—Inception, conception and perception. J. Manuf. Syst.
2021, 61, 530–535. [CrossRef]
14. Hardt, L.; Barrett, J.; Taylor, P.G.; Foxon, T.J. What structural change is needed for a post-growth economy: A framework of
analysis and empirical evidence. Ecol. Econ. 2021, 179, 106845. [CrossRef]
15. Land, N.; Syberfeldt, A.; Almgren, T.; Vallhagen, J. A framework for realizing industrial human-robot collaboration through
virtual simulation. Procedia CIRP 2020, 93, 1194–1199. [CrossRef]
16. Wang, B.; Zheng, P.; Yin, Y.; Shih, A.; Wang, L. Toward human-centric smart manufacturing: A human-cyber-physical systems
(HCPS) perspective. J. Manuf. Syst. 2022, 63, 471–490. [CrossRef]
17. Longo, F.; Padovano, A.; Umbrello, S. Value-Oriented and Ethical Technology Engineering in Industry 5.0: A Human-Centric
Perspective for the Design of the Factory of the Future. Appl. Sci. 2020, 10, 4182. [CrossRef]
18. Löcklin, A.; Jung, T.; Jazdi, N.; Ruppert, T.; Weyrich, M. Architecture of a Human-Digital Twin as Common Interface for Operator
4.0 Applications. Procedia CIRP 2021, 104, 458–463. [CrossRef]
19. Miller, M.E.; Spatz, E. A unified view of a human digital twin. Hum.-Intell. Syst. Integr. 2022, 4, 23–33. [CrossRef]
20. Wang, F.Y.; Qin, R.; Li, J.; Yuan, Y.; Wang, X. Parallel Societies: A Computing Perspective of Social Digital Twins and Virtual-Real
Interactions. IEEE Trans. Comput. Soc. Syst. 2020, 7, 2–7. [CrossRef]
21. Zhang, R.; Lv, J.; Li, J.; Bao, J.; Zheng, P.; Peng, T. A graph-based reinforcement learning-enabled approach for adaptive
human-robot collaborative assembly operations. J. Manuf. Syst. 2022, 63, 491–503. [CrossRef]
22. Wang, L.; Gao, R.; Váncza, J.; Krüger, J.; Wang, X.; Makris, S.; Chryssolouris, G. Symbiotic human-robot collaborative assembly.
CIRP Ann. 2019, 68, 701–726. [CrossRef]
23. Hosamo, H.H.; Imran, A.; Cardenas-Cartagena, J.; Svennevig, P.R.; Svidt, K.; Nielsen, H.K. A Review of the Digital Twin
Technology in the AEC-FM Industry. Adv. Civ. Eng. 2022, 2022, 2185170. [CrossRef]
24. Kunz, A.; Rosmann, S.; Loria, E.; Pirker, J. The Potential of Augmented Reality for Digital Twins: A Literature Review. In
Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Christchurch, New Zealand, 12–16
March 2022; pp. 389–398. [CrossRef]
25. Lu, Y.; Liu, C.; Wang, K.I.; Huang, H.; Xu, X. Digital Twin-driven smart manufacturing: Connotation, reference model, applications
and research issues. Robot. Comput.-Integr. Manuf. 2020, 61, 101837. [CrossRef]
26. Dianatfar, M.; Latokartano, J.; Lanz, M. Review on existing VR/AR solutions in human–robot collaboration. Procedia CIRP 2021,
97, 407–411. [CrossRef]
27. Atalay, M.; Murat, U.; Oksuz, B.; Parlaktuna, A.M.; Pisirir, E.; Testik, M.C. Digital twins in manufacturing: Systematic literature
review for physical–digital layer categorization and future research directions. Int. J. Comput. Integr. Manuf. 2022, 35, 679–705.
[CrossRef]
28. Perno, M.; Hvam, L.; Haug, A. Implementation of digital twins in the process industry: A systematic literature review of enablers
and barriers. Comput. Ind. 2022, 134, 103558. [CrossRef]
29. Agnusdei, G.P.; Elia, V.; Gnoni, M.G. A classification proposal of digital twin applications in the safety domain. Comput. Ind. Eng.
2021, 154, 107137. [CrossRef]
30. Park, K.B.; Choi, S.H.; Lee, J.Y.; Ghasemi, Y.; Mohammed, M.; Jeong, H. Hands-Free Human–Robot Interaction Using Multimodal
Gestures and Deep Learning in Wearable Mixed Reality. IEEE Access 2021, 9, 55448–55464. [CrossRef]
31. Tuli, T.B.; Kohl, L.; Chala, S.A.; Manns, M.; Ansari, F. Knowledge-Based Digital Twin for Predicting Interactions in Human-Robot
Collaboration. In Proceedings of the 2021 26th IEEE International Conference on Emerging Technologies and Factory Automation
(ETFA), Västerås, Sweden, 7–10 September 2021; pp. 1–8. [CrossRef]
32. Yi, S.; Liu, S.; Xu, X.; Wang, X.V.; Yan, S.; Wang, L. A vision-based human-robot collaborative system for digital twin. Procedia
CIRP 2022, 107, 552–557. [CrossRef]
33. Dimitropoulos, N.; Togias, T.; Zacharaki, N.; Michalos, G.; Makris, S. Seamless Human–Robot Collaborative Assembly Using
Artificial Intelligence and Wearable Devices. Appl. Sci. 2021, 11, 5699. [CrossRef]
34. Ha, E.; Byeon, G.; Yu, S. Full-Body Motion Capture-Based Virtual Reality Multi-Remote Collaboration System. Appl. Sci. 2022,
12, 5862. [CrossRef]
35. Wang, Q.; Jiao, W.; Wang, P.; Zhang, Y. Digital Twin for Human-Robot Interactive Welding and Welder Behavior Analysis.
IEEE/CAA J. Autom. Sin. 2021, 8, 334–343. [CrossRef]
36. Khatib, M.; Khudir, K.A.; Luca, A.D. Human-robot contactless collaboration with mixed reality interface. Robot. Comput.-Integr.
Manuf. 2021, 67, 102030. [CrossRef]
37. Liu, Q.; Liu, Z.; Xiong, B.; Xu, W.; Liu, Y. Deep reinforcement learning-based safe interaction for industrial human-robot
collaboration using intrinsic reward function. Adv. Eng. Inform. 2021, 49, 101360. [CrossRef]
38. Greco, A.; Caterino, M.; Fera, M.; Gerbino, S. Digital Twin for Monitoring Ergonomics during Manufacturing Production. Appl.
Sci. 2020, 10, 7758. [CrossRef]
39. Choi, S.H.; Park, K.B.; Roh, D.H.; Lee, J.Y.; Mohammed, M.; Ghasemi, Y.; Jeong, H. An integrated mixed reality system for
safety-aware human-robot collaboration using deep learning and digital twin generation. Robot. Comput.-Integr. Manuf. 2022,
73, 102258. [CrossRef]
40. Yi, X.; Zhou, Y.; Xu, F. TransPose. ACM Trans. Graph. (TOG) 2021, 40, 1–13. [CrossRef]
41. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of Motion Tracking Methods Based on Inertial
Sensors: A Focus on Upper Limb Human Motion. Sensors 2017, 17, 1257. [CrossRef]
42. Tiberio, L.; Cesta, A.; Belardinelli, M.O. Psychophysiological Methods to Evaluate User’s Response in Human Robot Interaction:
A Review and Feasibility Study. Robotics 2013, 2, 92–121. [CrossRef]
43. Usakli, A.B.; Gurkan, S.; Aloise, F.; Vecchiato, G.; Babiloni, F. On the use of electrooculogram for efficient human computer
interfaces. Comput. Intell. Neurosci. 2010, 2010, 135629. [CrossRef]
44. Schalk, G.; Leuthardt, E.C. Brain-computer interfaces using electrocorticographic signals. IEEE Rev. Biomed. Eng. 2011, 4, 140–154.
[CrossRef]
45. Bi, L.; Fan, X.A.; Liu, Y. EEG-based brain-controlled mobile robots: A survey. IEEE Trans. Hum.-Mach. Syst. 2013, 43, 161–176.
[CrossRef]
46. Hakonen, M.; Piitulainen, H.; Visala, A. Current state of digital signal processing in myoelectric interfaces and related applications.
Biomed. Signal Process. Control 2015, 18, 334–359. [CrossRef]
47. Yeom, H.G.; Kim, J.S.; Chung, C.K. Estimation of the velocity and trajectory of three-dimensional reaching movements from
non-invasive magnetoencephalography signals. J. Neural Eng. 2013, 10, 026006. [CrossRef] [PubMed]
48. Mujica, M.; Crespo, M.; Benoussaad, M.; Junco, S.; Fourquet, J.Y. Robust variable admittance control for human–robot co-
manipulation of objects with unknown load. Robot. Comput.-Integr. Manuf. 2023, 79, 102408. [CrossRef]
49. Dröder, K.; Bobka, P.; Germann, T.; Gabriel, F.; Dietrich, F. A Machine Learning-Enhanced Digital Twin Approach for Human-
Robot-Collaboration. Procedia CIRP 2018, 76, 187–192. [CrossRef]
50. Islam, S.O.B.; Lughmani, W.A.; Qureshi, W.S.; Khalid, A.; Mariscal, M.A.; Garcia-Herrero, S. Exploiting visual cues for safe and
flexible cyber-physical production systems. Res. Artic. Adv. Mech. Eng. 2019, 11, 1–13. [CrossRef]
51. Zhang, R.; Li, J.; Zheng, P.; Lu, Y.; Bao, J.; Sun, X. A fusion-based spiking neural network approach for predicting collaboration
request in human-robot collaboration. Robot. Comput.-Integr. Manuf. 2022, 78, 102383. [CrossRef]
52. Liu, Z.; Liu, Q.; Xu, W.; Liu, Z.; Zhou, Z.; Chen, J. Deep learning-based human motion prediction considering context awareness
for human-robot collaboration in manufacturing. Procedia CIRP 2019, 83, 272–278. [CrossRef]
53. Huang, Z.; Shen, Y.; Li, J.; Fey, M.; Brecher, C. A Survey on AI-Driven Digital Twins in Industry 4.0: Smart Manufacturing and
Advanced Robotics. Sensors 2021, 21, 6340. [CrossRef]
54. Dmytriyev, Y.; Insero, F.; Carnevale, M.; Giberti, H. Brain–Computer Interface and Hand-Guiding Control in a Human–Robot
Collaborative Assembly Task. Machines 2022, 10, 654. [CrossRef]
55. Ji, Z.; Liu, Q.; Xu, W.; Yao, B.; Liu, J.; Zhou, Z. A closed-loop brain-computer interface with augmented reality feedback for
industrial human-robot collaboration. Int. J. Adv. Manuf. Technol. 2021, 124, 3083–3098. [CrossRef]
56. Jin, T.; Sun, Z.; Li, L.; Zhang, Q.; Zhu, M.; Zhang, Z.; Yuan, G.; Chen, T.; Tian, Y.; Hou, X.; et al. Triboelectric nanogenerator
sensors for soft robotics aiming at digital twin applications. Nat. Commun. 2020, 11, 5381. [CrossRef] [PubMed]
57. Mnih, V.; Kavukcuoglu, K.; Silver, D.; Graves, A.; Antonoglou, I.; Wierstra, D.; Riedmiller, M. Playing Atari with Deep
Reinforcement Learning. arXiv 2013, arXiv:1312.5602.
58. Naveed, K.; Anjum, M.L.; Hussain, W.; Lee, D. Deep introspective SLAM: Deep reinforcement learning based approach to avoid
tracking failure in visual SLAM. Auton. Robot. 2022, 46, 705–724. [CrossRef]
59. Liu, Y.; Xu, H.; Liu, D.; Wang, L. A digital twin-based sim-to-real transfer for deep reinforcement learning-enabled industrial
robot grasping. Robot. Comput.-Integr. Manuf. 2022, 78, 102365. [CrossRef]
60. Vrabič, R.; Škulj, G.; Malus, A.; Kozjek, D.; Selak, L.; Bračun, D.; Podržaj, P. An architecture for sim-to-real and real-to-sim
experimentation in robotic systems. Procedia CIRP 2021, 104, 336–341. [CrossRef]
61. Sun, X.; Zhang, R.; Liu, S.; Lv, Q.; Bao, J.; Li, J. A digital twin-driven human–robot collaborative assembly-commissioning method
for complex products. Int. J. Adv. Manuf. Technol. 2022, 118, 3389–3402. [CrossRef]
62. Xia, K.; Sacco, C.; Kirkpatrick, M.; Saidy, C.; Nguyen, L.; Kircaliali, A.; Harik, R. A digital twin to train deep reinforcement
learning agent for smart manufacturing plants: Environment, interfaces and intelligence. J. Manuf. Syst. 2021, 58, 210–230.
[CrossRef]
63. Matulis, M.; Harvey, C. A robot arm digital twin utilising reinforcement learning. Comput. Graph. 2021, 95, 106–114. [CrossRef]
64. Nourmohammadi, A.; Fathi, M.; Ng, A.H. Balancing and scheduling assembly lines with human-robot collaboration tasks.
Comput. Oper. Res. 2022, 140, 105674. [CrossRef]
65. Bansal, R.; Khanesar, M.A.; Branson, D. Ant colony optimization algorithm for industrial robot programming in a digital twin.
In Proceedings of the ICAC 2019—2019 25th IEEE International Conference on Automation and Computing, Lancaster, UK, 5–7
September 2019. [CrossRef]
66. Zhu, X.; Ji, Y. A digital twin–driven method for online quality control in process industry. Int. J. Adv. Manuf. Technol. 2022,
119, 3045–3064. [CrossRef]
67. Kennel-Maushart, F.; Poranne, R.; Coros, S. Multi-Arm Payload Manipulation via Mixed Reality; IEEE: Philadelphia, PA, USA, 2022;
pp. 11251–11257. [CrossRef]
68. Asad, U.; Rasheed, S.; Lughmani, W.A.; Kazim, T.; Khalid, A.; Pannek, J. Biomechanical Modeling of Human–Robot Accident
Scenarios: A Computational Assessment for Heavy-Payload-Capacity Robots. Appl. Sci. 2023, 13, 1957. [CrossRef]
69. Aubert, K.; Germaneau, A.; Rochette, M.; Ye, W.; Severyns, M.; Billot, M.; Rigoard, P.; Vendeuvre, T. Development of Digital
Twins to Optimize Trauma Surgery and Postoperative Management. A Case Study Focusing on Tibial Plateau Fracture. Front.
Bioeng. Biotechnol. 2021, 9, 722275. [CrossRef] [PubMed]
70. Liang, L.; Liu, M.; Martin, C.; Sun, W. A deep learning approach to estimate stress distribution: A fast and accurate surrogate of
finite-element analysis. J. R. Soc. Interface 2018, 15, 20170844. [CrossRef]
71. Aivaliotis, P.; Georgoulias, K.; Chryssolouris, G. The use of Digital Twin for predictive maintenance in manufacturing. Int. J.
Comput. Integr. Manuf. 2019, 32, 1067–1080. [CrossRef]
72. Kapteyn, M.G. Mathematical and Computational Foundations to Enable Predictive Digital Twins at Scale. Ph.D. Thesis,
Massachusetts Institute of Technology, Cambridge, MA, USA, 2021.
73. Calka, M.; Perrier, P.; Ohayon, J.; Grivot-Boichon, C.; Rochette, M.; Payan, Y. Machine-Learning based model order reduction of a
biomechanical model of the human tongue. Comput. Methods Programs Biomed. 2021, 198, 105786. [CrossRef] [PubMed]
74. Barricelli, B.R.; Casiraghi, E.; Gliozzo, J.; Petrini, A.; Valtolina, S. Human Digital Twin for Fitness Management. IEEE Access 2020,
8, 26637–26664. [CrossRef]
75. Lauzeral, N.; Borzacchiello, D.; Kugler, M.; George, D.; Rémond, Y.; Hostettler, A.; Chinesta, F. A model order reduction approach
to create patient-specific mechanical models of human liver in computational medicine applications. Comput. Methods Programs
Biomed. 2019, 170, 95–106. [CrossRef]
76. Thumm, J.; Althoff, M. Provably Safe Deep Reinforcement Learning for Robotic Manipulation in Human Environments. In
Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022.
77. Mahadevan, K.; Sousa, M.; Tang, A.; Grossman, T. “Grip-that-there”: An Investigation of Explicit and Implicit Task Allocation
Techniques for Human-Robot Collaboration. In Proceedings of the 2021 CHI Conference on Human Factors in Computing
Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–14. [CrossRef]
78. Macenski, S.; Foote, T.; Gerkey, B.; Lalancette, C.; Woodall, W. Robot Operating System 2: Design, architecture, and uses in the
wild. Sci. Robot. 2022, 7, 66. [CrossRef]
79. Kousi, N.; Gkournelos, C.; Aivaliotis, S.; Lotsaris, K.; Bavelos, A.C.; Baris, P.; Michalos, G.; Makris, S. Digital twin for designing
and reconfiguring human–robot collaborative assembly lines. Appl. Sci. 2021, 11, 4620. [CrossRef]
80. Andaluz, V.H.; Chicaiza, F.A.; Gallardo, C.; Quevedo, W.X.; Varela, J.; Sánchez, J.S.; Arteaga, O. Unity3D-MatLab Simulator in Real
Time for Robotics Applications; Springer: Cham, Switzerland, 2016; Volume 9768, pp. 246–263. [CrossRef]
81. Farley, A.; Wang, J.; Marshall, J.A. How to pick a mobile robot simulator: A quantitative comparison of CoppeliaSim, Gazebo,
MORSE and Webots with a focus on accuracy of motion. Simul. Model. Pract. Theory 2022, 120, 102629. [CrossRef]
Sensors 2023, 23, 3938 25 of 27
82. Rojas, M.; Hermosilla, G.; Yunge, D.; Farias, G. An Easy to Use Deep Reinforcement Learning Library for AI Mobile Robots in
Isaac Sim. Appl. Sci. 2022, 12, 8429. [CrossRef]
83. Clausen, C.S.B.; Ma, Z.G.; Jørgensen, B.N. Can we benefit from game engines to develop digital twins for planning the deployment
of photovoltaics? Energy Inform. 2022, 5, 42. [CrossRef]
84. Kuts, V.; Otto, T.; Tähemaa, T.; Bondarenko, Y. Digital twin based synchronised control and simulation of the industrial robotic
cell using virtual reality. J. Mach. Eng. 2019, 19, 128–144. [CrossRef]
85. Srinivasan, M.; Mubarrat, S.T.; Humphrey, Q.; Chen, T.; Binkley, K.; Chowdhury, S.K. The Biomechanical Evaluation of a
Human-Robot Collaborative Task in a Physically Interactive Virtual Reality Simulation Testbed. Proc. Hum. Factors Ergon. Soc.
Annu. Meet. 2021, 65, 403–407. [CrossRef]
86. Mania, P.; Kenfack, F.K.; Neumann, M.; Beetz, M. Imagination-enabled Robot Perception. IEEE Int. Conf. Intell. Robot. Syst. 2020,
936–943. [CrossRef]
87. Yun, H.; Jun, M.B. Immersive and interactive cyber-physical system (I2CPS) and virtual reality interface for human involved
robotic manufacturing. J. Manuf. Syst. 2022, 62, 234–248. [CrossRef]
88. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Closed-Loop Robotic Arm Manipulation Based on Mixed Reality. Appl. Sci. 2022,
12, 2972. [CrossRef]
89. Alexopoulos, K.; Nikolakis, N.; Chryssolouris, G. Digital twin-driven supervised machine learning for the development of
artificial intelligence applications in manufacturing. Int. J. Comput. Integr. Manuf. 2020, 33, 429–439. [CrossRef]
90. Wu, P.; Qi, M.; Gao, L.; Zou, W.; Miao, Q.; Liu, L.L. Research on the virtual reality synchronization of workshop digital twin. In
Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference, ITAIC 2019,
Chongqing, China, 24–26 May 2019; pp. 875–879. [CrossRef]
91. Hernigou, P.; Olejnik, R.; Safar, A.; Martinov, S.; Hernigou, J.; Ferre, B. Digital twins, artificial intelligence, and machine learning
technology to identify a real personalized motion axis of the tibiotalar joint for robotics in total ankle arthroplasty. Int. Orthop.
2021, 1, 3. [CrossRef]
92. Rabah, S.; Assila, A.; Khouri, E.; Maier, F.; Ababsa, F.; Bourny, V.; Maier, P.; Mérienne, F. Towards improving the future of
manufacturing through digital twin and augmented reality technologies. Procedia Manuf. 2018, 17, 460–467. [CrossRef]
93. Akar, C.A.; Tekli, J.; Jess, D.; Khoury, M.; Kamradt, M.; Guthe, M. Synthetic Object Recognition Dataset for Industries; IEEE:
Piscataway, NJ, USA, 2022; pp. 150–155. [CrossRef]
94. Blaga, A.; Tamas, L. Augmented Reality for Digital Manufacturing. In Proceedings of the MED 2018—26th Mediterranean
Conference on Control and Automation, Zadar, Croatia, 19–22 June 2018; pp. 173–178. [CrossRef]
95. Li, C.; Zheng, P.; Li, S.; Pang, Y.; Lee, C.K. AR-assisted digital twin-enabled robot collaborative manufacturing system with
human-in-the-loop. Robot. Comput.-Integr. Manuf. 2022, 76, 102321. [CrossRef]
96. Ke, S.; Xiang, F.; Zhang, Z.; Zuo, Y. A enhanced interaction framework based on VR, AR and MR in digital twin. Procedia CIRP
2019, 83, 753–758. [CrossRef]
97. Su, Y.P.; Chen, X.Q.; Zhou, T.; Pretty, C.; Chase, G. Mixed-Reality-Enhanced Human–Robot Interaction with an Imitation-Based
Mapping Approach for Intuitive Teleoperation of a Robotic Arm-Hand System. Appl. Sci. 2022, 12, 4740. [CrossRef]
98. Masood, T.; Egger, J. Augmented reality in support of Industry 4.0—Implementation challenges and success factors. Robot.
Comput.-Integr. Manuf. 2019, 58, 181–195. [CrossRef]
99. Han, S.; Liu, B.; Cabezas, R.; Twigg, C.D.; Zhang, P.; Petkau, J.; Yu, T.H.; Tai, C.J.; Akbay, M.; Wang, Z.; et al. MEgATrack:
Monochrome Egocentric Articulated Hand-Tracking for Virtual Reality. ACM Trans. Graph. (TOG) 2020, 39, 87. [CrossRef]
100. Skantze, G. Turn-taking in Conversational Systems and Human-Robot Interaction: A Review. Comput. Speech Lang. 2021,
67, 101178. [CrossRef]
101. Huang, W.; Xia, F.; Xiao, T.; Chan, H.; Liang, J.; Florence, P.; Zeng, A.; Tompson, J.; Mordatch, I.; Chebotar, Y.; et al. Inner
Monologue: Embodied Reasoning through Planning with Language Models. In Proceedings of the 6th Conference on
Robot Learning, Volume 205 of Proceedings of Machine Learning Research, Auckland, New Zealand, 14–18 December 2022;
pp. 1769–1782.
102. Qi, Q.; Tao, F.; Hu, T.; Anwer, N.; Liu, A.; Wei, Y.; Wang, L.; Nee, A.Y. Enabling technologies and tools for digital twin. J. Manuf.
Syst. 2021, 58, 3–21. [CrossRef]
103. Wang, J.; Wu, Q.; Remil, O.; Yi, C.; Guo, Y.; Wei, M. Modeling indoor scenes with repetitions from 3D raw point data. CAD
Comput. Aided Des. 2018, 94, 1–15. [CrossRef]
104. Wang, W.; Guo, H.; Li, X.; Tang, S.; Li, Y.; Xie, L.; Lv, Z. BIM Information Integration Based VR Modeling in Digital Twins in
Industry 5.0. J. Ind. Inf. Integr. 2022, 28, 100351. [CrossRef]
105. Gualtieri, L.; Rauch, E.; Vidoni, R. Development and validation of guidelines for safety in human-robot collaborative assembly
systems. Comput. Ind. Eng. 2022, 163, 107801. [CrossRef]
106. Havard, V.; Jeanne, B.; Lacomblez, M.; Baudry, D. Digital twin and virtual reality: A co-simulation environment for design and
assessment of industrial workstations. Prod. Manuf. Res. 2019, 7, 472–489. [CrossRef]
107. Maragkos, C.; Vosniakos, G.C.; Matsas, E. Virtual reality assisted robot programming for human collaboration. Procedia Manuf.
2019, 38, 1697–1704. [CrossRef]
108. Bobka, P.; Germann, T.; Heyn, J.K.; Gerbers, R.; Dietrich, F.; Dröder, K. Simulation Platform to Investigate Safe Operation of
Human-Robot Collaboration Systems. Procedia CIRP 2016, 44, 187–192. [CrossRef]
109. Maruyama, T.; Ueshiba, T.; Tada, M.; Toda, H.; Endo, Y.; Domae, Y.; Nakabo, Y.; Mori, T.; Suita, K. Digital Twin-Driven Human
Robot Collaboration Using a Digital Human. Sensors 2021, 21, 8266. [CrossRef]
110. Grandi, F.; Prati, E.; Peruzzini, M.; Pellicciari, M.; Campanella, C.E. Design of ergonomic dashboards for tractors and trucks:
Innovative method and tools. J. Ind. Inf. Integr. 2022, 25, 100304. [CrossRef]
111. Use Cases—Omniverse Digital Twin Documentation. Available online: https://docs.omniverse.nvidia.com/prod_digital-twins/
prod_digital-twins/warehouse-digital-twins/use-cases.html (accessed on 24 October 2022).
112. Lee, H.; Kim, S.D.; Amin, M.A.U.A. Control framework for collaborative robot using imitation learning-based teleoperation from
human digital twin to robot digital twin. Mechatronics 2022, 85, 102833. [CrossRef]
113. 5 Important Augmented And Virtual Reality Trends For 2019 Everyone Should Read. Available online: https://www.forbes.com/
sites/bernardmarr/2019/01/14/5-important-augmented-and-virtual-reality-trends-for-2019-everyone-should-read/ (accessed
on 5 July 2022).
114. Hagmann, K.; Hellings-Kuß, A.; Klodmann, J.; Richter, R.; Stulp, F.; Leidner, D. A Digital Twin Approach for Contextual
Assistance for Surgeons During Surgical Robotics Training. Front. Robot. AI 2021, 8, 735566. [CrossRef]
115. Rockwell Automation Deployed Collaborative Augmented Reality to Prevent Production Delays During Pandemic. Avail-
able online: https://www.ptc.com/en/case-studies/rockwell-automation-collaborative-augmented-reality (accessed on
2 October 2022).
116. Zahabi, M.; Razak, A.M.A. Adaptive virtual reality-based training: A systematic literature review and framework. Virtual Real.
2020, 24, 725–752. [CrossRef]
117. Blankendaal, R.A.M.; Bosse, T. Using Run-Time Biofeedback During Virtual Agent-Based Aggression De-Escalation Training. In
Proceedings of the Advances in Practical Applications of Agents, Multi-Agent Systems, and Complexity: The PAAMS Collection:
16th International Conference, PAAMS 2018, Toledo, Spain, 20–22 June 2018; pp. 97–109. [CrossRef]
118. Harichandran, A.; Johansen, K.W.; Jacobsen, E.L.; Teizer, J. A Conceptual Framework for Construction Safety Training using
Dynamic Virtual Reality Games and Digital Twins. In Proceedings of the International Symposium on Automation and Robotics
in Construction, Bogotá, Colombia, 13–15 July 2021.
119. Joshi, S.; Hamilton, M.; Warren, R.; Faucett, D.; Tian, W.; Wang, Y.; Ma, J. Implementing Virtual Reality technology for safety
training in the precast/prestressed concrete industry. Appl. Ergon. 2021, 90, 103286. [CrossRef]
120. Beloglazov, I.I.; Petrov, P.A.; Bazhin, V.Y. The concept of digital twins for tech operator training simulator design for mining and
processing industry. Eurasian Mining 2020, 2020, 50–54. [CrossRef]
121. You, S.; Kim, J.H.; Lee, S.H.; Kamat, V.; Robert, L.P. Enhancing perceived safety in human–robot collaborative construction using
immersive virtual environments. Autom. Constr. 2018, 96, 161–170. [CrossRef]
122. Matsas, E.; Vosniakos, G.C.; Batras, D. Prototyping proactive and adaptive techniques for human-robot collaboration in
manufacturing using virtual reality. Robot. Comput.-Integr. Manuf. 2018, 50, 168–180. [CrossRef]
123. Malik, A.A.; Masood, T.; Bilberg, A. Virtual reality in manufacturing: Immersive and collaborative artificial-reality in design of
human-robot workspace. Int. J. Comput. Integr. Manuf. 2019, 33, 22–37. [CrossRef]
124. Wang, Y.; Feng, J.; Liu, J.; Liu, X.; Wang, J. Digital Twin-based Design and Operation of Human-Robot Collaborative Assembly.
IFAC-PapersOnLine 2022, 55, 295–300. [CrossRef]
125. Malik, A.A.; Bilberg, A. Digital twins of human robot collaboration in a production setting. Procedia Manuf. 2018, 17, 278–285.
[CrossRef]
126. Nikolakis, N.; Alexopoulos, K.; Xanthakis, E.; Chryssolouris, G. The digital twin implementation for linking the virtual
representation of human-based production tasks to their physical counterpart in the factory-floor. Int. J. Comput. Integr. Manuf.
2019, 32, 1–12. [CrossRef]
127. Malik, A.A.; Brem, A. Digital twins for collaborative robots: A case study in human-robot interaction. Robot. Comput.-Integr.
Manuf. 2021, 68, 102092. [CrossRef]
128. Zhu, Z.; Liu, C.; Xu, X. Visualisation of the Digital Twin data in manufacturing by using Augmented Reality. Procedia CIRP 2019,
81, 898–903. [CrossRef]
129. Ding, K.; Chan, F.T.; Zhang, X.; Zhou, G.; Zhang, F. Defining a Digital Twin-based Cyber-Physical Production System for autonomous
manufacturing in smart shop floors. Int. J. Prod. Res. 2019, 57, 6315–6334. [CrossRef]
130. Müller, M.; Mielke, J.; Pavlovskyi, Y.; Pape, A.; Masik, S.; Reggelin, T.; Häberer, S. Real-time combination of material flow
simulation, digital twins of manufacturing cells, an AGV and a mixed-reality application. Procedia CIRP 2021, 104, 1607–1612.
[CrossRef]
131. Zhou, T.; Tang, D.; Zhu, H.; Zhang, Z. Multi-agent reinforcement learning for online scheduling in smart factories. Robot.
Comput.-Integr. Manuf. 2021, 72, 102202. [CrossRef]
132. Lv, Q.; Zhang, R.; Sun, X.; Lu, Y.; Bao, J. A digital twin-driven human-robot collaborative assembly approach in the wake of
COVID-19. J. Manuf. Syst. 2021, 60, 837–851. [CrossRef] [PubMed]
133. Chiriatti, G.; Ciccarelli, M.; Forlini, M.; Franchini, M.; Palmieri, G.; Papetti, A.; Germani, M. Human-Centered Design of a
Collaborative Robotic System for the Shoe-Polishing Process. Machines 2022, 10, 1082. [CrossRef]
134. Leng, J.; Zhang, H.; Yan, D.; Liu, Q.; Chen, X.; Zhang, D. Digital twin-driven manufacturing cyber-physical system for parallel
controlling of smart workshop. J. Ambient. Intell. Humaniz. Comput. 2018, 10, 1155–1166. [CrossRef]
135. Khalid, A.; Kirisci, P.; Khan, Z.H.; Ghrairi, Z.; Thoben, K.D.; Pannek, J. Security framework for industrial collaborative robotic
cyber-physical systems. Comput. Ind. 2018, 97, 132–145. [CrossRef]
136. Laaki, H.; Miche, Y.; Tammi, K. Prototyping a Digital Twin for Real Time Remote Control over Mobile Networks: Application of
Remote Surgery. IEEE Access 2019, 7, 20325–20336. [CrossRef]
137. Dietz, M.; Vielberth, M.; Pernul, G. Integrating Digital Twin Security Simulations in the Security Operations Center. In
Proceedings of the 15th International Conference on Availability, Reliability and Security (ARES ’20), Virtual Event, Ireland, 25–28
August 2020; ACM: New York, NY, USA, 2020. [CrossRef]
138. Suhail, S.; Malik, S.U.R.; Jurdak, R.; Hussain, R.; Matulevičius, R.; Svetinovic, D. Towards situational aware cyber-physical
systems: A security-enhancing use case of blockchain-based digital twins. Comput. Ind. 2022, 141, 103699. [CrossRef]
139. Lv, Z.; Qiao, L.; Li, Y.; Yuan, Y.; Wang, F.Y. BlockNet: Beyond reliable spatial Digital Twins to Parallel Metaverse. Patterns 2022,
3, 100468. [CrossRef]
140. Zhang, J.; Tai, Y. Secure medical digital twin via human-centric interaction and cyber vulnerability resilience. Connect. Sci.
2022, 34, 895–910. [CrossRef]
141. VIDEO: The Digital Athlete and How It’s Revolutionizing Player Health & Safety. Available online: https://www.nfl.com/
playerhealthandsafety/equipment-and-innovation/aws-partnership/digital-athlete-spot (accessed on 2 February 2023).
142. Greenbaum, D.; Lavazza, A.; Beier, K.; Bruynseels, K.; Sio, F.S.D.; Hoven, J.V.D. Digital Twins in Health Care: Ethical Implications
of an Emerging Engineering Paradigm. Front. Genet. 2018, 9, 31. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.