
sensors

Review
Hierarchical Human-Inspired Control Strategies for
Prosthetic Hands
Cosimo Gentile 1,2, * , Francesca Cordella 1 and Loredana Zollo 1
1 Unit of Advanced Robotics and Human-Centred Technologies, Università Campus Bio-Medico di Roma,
00128 Rome, Italy; [email protected] (F.C.); [email protected] (L.Z.)
2 INAIL Prosthetic Center, Vigorso di Budrio, 40054 Bologna, Italy
* Correspondence: [email protected]

Abstract: The abilities of the human hand have always fascinated people, and many studies have been
devoted to describing and understanding a mechanism so perfect and important for human activities.
Hand loss can significantly affect the level of autonomy and the capability of performing the activities
of daily life. Although the technological improvements have led to the development of mechanically
advanced commercial prostheses, the control strategies are rather simple (proportional or on/off
control). The use of these commercial systems is unnatural and not intuitive, and therefore frequently
abandoned by amputees. The components of an active prosthetic hand are the mechatronic device,
the decoding system of human biological signals into gestures and the control law that translates all
the inputs into desired movements. The real challenge is the development of a control law replacing
human hand functions. This paper presents a literature review of the control strategies of prosthetic
hands with a multiple-layer or hierarchical structure, and points out the main critical aspects of the
current solutions in terms of the human functions replicated by the prosthetic device. The paper
finally provides several suggestions for designing a control strategy able to mimic the functions of
the human hand.


Keywords: prostheses; prosthetic; hand; human-inspired; control; strategy; level

Citation: Gentile, C.; Cordella, F.; Zollo, L. Hierarchical Human-Inspired Control Strategies for Prosthetic Hands. Sensors 2022, 22, 2521. https://doi.org/10.3390/s22072521

Academic Editor: Paolo Mercorelli

Received: 11 January 2022; Accepted: 23 March 2022; Published: 25 March 2022

Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

Thousands of years ago, Aristotle described the hand as follows: «For the hands are instruments or organs, and the invariable plan of nature in distributing the organs is to give each to such animal as can make use of it [. . . ] man does not owe his superior intelligence to his hands, but his hands to his superior intelligence. For the most intelligent of animals is the one who would put the most organs to use; and the hand is not to be looked on as one organ but as many; for it is, as it were, an instrument for further instruments» [1].

A child learns about the world through the hands even before using the other senses. Human beings develop new skills with their hands and use them for every daily action.

A study about the hand was carried out by Sir Charles Bell in 1834 [2], who analyzed the hand starting with a comparison with animal anatomy. In 1900, two famous anatomists, Frederic Wood Jones [3] and Russell Napier [4], studied the primitive nature of the human hand and its similarity with the upper limbs of the other pentadactyl mammals, remarking that functions such as prehension or dexterity belong only to primates and humans [5]. The hand has always fascinated many people, from scientists to artists, and many studies have been conducted to describe and understand a mechanism so perfect and important for human activities.

The hand is one of the most important parts of the human body, used to learn and to interact with the environment. Therefore, hand loss represents irreparable damage for a person: life is upset and activities of daily living (ADLs) are compromised. Besides having suffered hand loss, the amputee will have to learn to perform everyday life actions with only one hand. To remedy this problem, since ancient Egypt, prostheses have been used

Sensors 2022, 22, 2521. https://doi.org/10.3390/s22072521 https://www.mdpi.com/journal/sensors



both for cosmetic and functional purposes [6]. The first documented amputee who used a
prosthetic limb is General Marcus Sergius, who lived in Ancient Rome [7]: «In his second
campaign Sergius lost his right hand. [. . . ] He had a right hand made of iron for him
and, going into battle with this bound to his arm, he raised the siege of Cremona, saved
Placentia and captured twelve enemy camps in Gaul—all of which exploits were confirmed
by the speech he made as praetor when his colleagues tried to debar him as infirm from the
sacrifices» [8].
The process of prostheses development began over 2000 years ago, but the first
externally powered prosthesis was made only in 1919, using pneumatic and electric power
sources [9]. The first myoelectric prosthesis was made in 1948 [10], a simple device with
actuators powered using the amplified superficial electromyographic signals (sEMG).
However, this idea had no future either in clinical and in commercial fields until 1969 when
it was reinvented [11].
Nowadays, the use of EMG signals is the most common approach to actively control
prosthetic hands [12]. After 1948, many research units developed myoelectric control in
complete autonomy, reaching comparable results among them. A simple solution using
EMG signals is the on/off control, i.e., when the signal exceeds a threshold, an output is
sent to the prosthetic hand motors [13]. In [14], two electrodes were placed on agonist
and antagonist muscle pairs, so a single motion (opening and closure) is associated with
a single muscle. Another solution envisaged the use of the sEMG signal dynamics to proportionally
modulate force or speed [15,16]. The number of controllable degrees of freedom (DoFs) is small; a
solution to overcome this limitation is offered by pattern recognition [17–19]. This technique
extracts several features from different time segments of sEMG and uses them as input
to the classifier to predict different grasps. The classifier output can be used to control a
prosthetic device.
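The two basic schemes above can be sketched as follows; this is a minimal illustration, in which the envelope estimator, thresholds, and all numeric values are choices made for the example, not taken from the cited works.

```python
def envelope(window):
    """Mean absolute value of a raw sEMG window, a common simple
    envelope estimator (RMS is another typical choice)."""
    return sum(abs(s) for s in window) / len(window)

def onoff_command(env, threshold=0.3):
    """On/off control: the motor is fully driven when the sEMG envelope
    exceeds a fixed threshold, otherwise stopped."""
    return 1.0 if env > threshold else 0.0

def proportional_command(env, threshold=0.3, saturation=0.8):
    """Proportional control: drive level (speed or force) scales
    linearly with the envelope between threshold and saturation."""
    if env <= threshold:
        return 0.0
    return min((env - threshold) / (saturation - threshold), 1.0)
```

In the on/off scheme the prosthesis sees only a binary command, while the proportional scheme lets a stronger contraction produce a faster or firmer closure, which is why commercial devices adopted it despite both schemes controlling very few DoFs.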
Many amputee subjects do not use myoelectric prostheses because their control is
unnatural and not intuitive [20]. Unfortunately, current commercial prostheses are typically
driven with proportional or on/off control [13], with a limited number of grasping
configurations. Commonly, just the opening and closing of the prosthetic hand are possible,
by using sEMG signals related to the flexion and extension of the wrist [21]. In the last
30 years, technologies and functioning of commercial prosthetics have not substantially
changed [22], resulting in the abandonment of the artificial hand [23,24]. Commercial
prostheses offer only marginal functional improvement in daily life [25], so that
rejection reaches 59% (for amputations close to the wrist [26]) or 75% (for myoelectric
prostheses [27]). Commercial prostheses are perceived as extraneous devices and not like
the lost limb [28]. More realistic rates for rejection and non-usage have been estimated to
be even higher due to the lack of contact between the clinic community and non-users [29].
An interesting result comes from the CYBATHLON 2016 [30], where competitors with
prosthetic arms attempted both fast and precise manipulations, performing generic tasks
based on ADLs. The winner of the competition wore a simple body-powered arm.
A study carried out in [31] analyzed the real needs of the amputee subjects and
provided insights into the development of prostheses more similar to the human hand. The
fundamental demands emerging from this study are: to carry out ADLs, to have sensory
feedback, to regulate force during grasping while reducing the visual attention and cognitive
burden required of the user, to avoid slippage of the grasped object [32], to manipulate objects,
and to handle small objects.
Leaving aside the recovery of sensory feedback in amputee subjects, the other issues
can be addressed by improving the current control strategies.
The functioning of an active prosthetic hand is guaranteed by (i) a mechatronic device,
with sensors and actuators, (ii) a system decoding human biological signals into gestures
and (iii) a control law to translate all the inputs (from the hand and the user) into desired
movements [33].
The real challenge is a control law that replaces human hand functions and makes a
prosthesis acceptable and simple for the amputee to use [34].

2. Aim of the Study


This paper intends to carry out an in-depth study of the literature on control strategies
for prosthetic hands with multiple layers or a hierarchical structure to consolidate current
knowledge in this field and highlight the lack of a control strategy allowing stability
and usability during a simple grasping task, encouraging the prosthesis acceptability by
the amputee. This work has the twofold purpose of (i) focusing research efforts on
the development of control strategies for hand prostheses replicating the performance
of the human hand; (ii) providing foundations for future studies that explore in depth
the neurophysiological behavior of the limb related to the hierarchical management of
prehension, with the aim of replicating its functioning on a robotic device. The expected added
value provided by this work is to update the current knowledge of control strategies with
more recent papers, by critically evaluating and (possibly) comparing the available results
and pointing out inconsistencies and neglected aspects. Indications for the development of
future strategies for making hand prostheses appealing to individuals with hand loss are
also provided.
The paper is organized as follows. Section 3 describes the methods used to select
the reviewed articles. Section 4 introduces an overview of the control laws for prosthetic
hands. Section 5 reports the hand functioning useful to understand the information to
use in the development of prostheses control strategies. Section 6 describes the control
strategies used in the analyzed papers. Section 7 underlines the principal limits of the
current control strategies and suggests a methodology to develop new control strategies.
Finally, conclusions are drawn in Section 8.

3. Methods
An extensive literature analysis was carried out on the following databases: PubMed,
Google Scholar, IEEE Xplore, and Scopus. The keywords (and their combinations) adopted
for the research are the following: control strategy, upper limb, prosthesis, prosthesis control,
grasping, pre-shaping, hierarchical control, multilevel control, and prehensile control. All
publications in English appearing between 1960 and 2021 were considered. Moreover,
from the selected papers, bibliographies were examined for extracting additional papers.
The inclusion criteria for selecting the publications relevant for the review purpose
are as follows: control strategies for prosthetic hands, prehensile control of a prosthetic
robotic hand, pre-shaping and grasping phases for controlling prosthesis, reach, and grasp.
A flowchart of the search and inclusion process is shown in Figure 1. The described
method yielded 506 papers. These were screened by applying the additional criteria
of a multilevel or hierarchical strategy. After analyzing the title
and abstract, all irrelevant papers were discarded. Of the initial 506 papers, 473 were
excluded because they were considered not relevant, and 33 papers were carefully
read. Twenty-five of these were further excluded because they reported redundant
information or did not meet the inclusion criteria.
The authors reviewed the remaining eight papers fulfilling the inclusion criteria.
In particular, each analyzed paper must:
1. Describe a control strategy for prosthetic hands that mimics human hand behavior;
2. Describe a control strategy with complete management of the different phases of the grasp;
3. Describe a control strategy with a multiple-layer or hierarchical structure;
4. Be a full-length publication in a peer-reviewed journal or conference proceedings.

Figure 1. Flowchart of the search and inclusion process.

4. The Beginning of Control Laws for Prosthetic Hands


Myoelectric prosthetic hands were initially inspired by robotic hands, building on
the experience achieved with them, as reported in [31,35,36]. The first computer-operated
mechanical hand and the robotic hand that can be considered the first dexterous
multi-finger hand were presented in [37] and [38], respectively.
When an object held in a stable grasp moves due to some disturbance, a control system is
necessary to allow the prosthetic hand to avoid losing the contact points between the
hand and the grasped object. The successive step is applying the correct force on the object
to be grasped and manipulated, guaranteeing grasp stability.
The development of both prosthetic hands and control strategies has been pursued in
parallel, as shown in [38] where in addition to the hand, a control strategy was developed
to mimic human reflexes and in [39] where a sequential controller made of relay circuits
was realized to drive a three-fingered hand. In particular, these control methods emphasize
reaction rather than stability.
Sensors play a significant role in control development, since they provide information
about positions, forces, and torques. This information makes it possible to realize
control strategies that regulate forces and avoid object slippage during
grasping [40].
The first prosthetic hands tried to imitate the behavior of human hands but due to the
limitation of technology and interfacing systems, they looked more like robotic grippers.
In the last 50 years, many control strategies were developed, exploring several scenarios to
overcome these limitations.
Until the 1990s, hardware and control strategies were developed without considering
the human hand as an inspiration, but around the 2000s there was a countertrend. Indeed,
the properties of the human hand, such as the opposition of the thumb [41] or the postural
synergies for the dimensionality reduction of hand DoFs [42], were introduced to develop
bio-inspired controllers and hand structures [43].
Empirical approaches based on the imitation of human grasping strategies have been
proposed [44] to reduce the computational burden of grasp control. In particular, a study
on whether humans use a combination of basic grasp configurations has been performed
to facilitate the replication of human-like behavior on robotic devices [45].
The development of a device replicating human behavior requires knowledge of
that behavior. In particular, the computational burden of control approaches needs to
respect the physiological reaction time of the human hand when performing a simple
task (i.e., between 50 and 100 ms) [46–48]. Therefore, the controller cycle time should
also take this aspect into account; in particular, prostheses with control strategies with
too rapid responses may not be manageable by a subject who has slower reaction times,
affecting the naturalness of the action.
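A minimal sketch of a controller loop paced to this time scale; the cycle period and the placeholder loop body are illustrative assumptions, not a design from the cited works.

```python
import time

def control_loop(cycle_period_s=0.05, cycles=3):
    """Pace a sense-decide-act cycle so its period matches the
    50-100 ms window of human hand reaction times discussed above.
    The loop body here is a placeholder for the real controller step."""
    outputs = []
    for _ in range(cycles):
        start = time.monotonic()
        outputs.append("command")  # placeholder: sense, decide, actuate
        elapsed = time.monotonic() - start
        # Sleep away the remainder of the cycle so the period stays fixed.
        time.sleep(max(0.0, cycle_period_s - elapsed))
    return outputs
```

The point of the fixed period is that the controller neither lags behind the user nor reacts so much faster than the user's own reflexes that the prosthesis feels uncontrollable.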
The human hand behavior should be the basis for the development of a control
strategy to be applied on prosthetic hands. Therefore, the first step to making an active
prosthesis inspired by the human hand is understanding the motor control of the hand by
the brain [49]. The evolution of neurosciences allowed the intensification of the study of
hand functioning.

5. Human Hand Functioning


A study performed by analyzing the brain could describe in detail the real functioning
of the human hand. The first studies were performed on the primate brain to find
similarities with the human one.
Thanks to innovative techniques [50–53], non-invasive studies were carried out on
humans, allowing similarities and differences in brain activity between macaques
and humans to be found [54–58] and showing that the grasping mechanism is based on
the properties of the object to be grasped [55], including weight, surface, etc. [59–64].
Furthermore, the choice of the grasping configuration is affected by the task to be
performed with the grasped object [65–68].
Information, in the form of electrical impulses, travels from one region of the nervous
system to another through a series of connected nerves formed by axons making synapses among
neurons [69]. The flow of information related to the various phases of prehension is carried
by two pathways: the dorsolateral, to code the grasping, and the dorsomedial, to code the
reaching (Figure 2).

Figure 2. Activation of brain areas during prehension.



The first one connects the anterior part of the intraparietal sulcus (AIP) [70], in the
inferior parietal lobule (IPL), with the F5 area in the ventral premotor cortex (PMv) [71,72].
This pathway is involved in the motor commands for hand pre-shaping, by transforming
the properties of the object to be grasped (e.g., texture, size, etc.), derived from visually
guided grasping [73], into the corresponding commands to open the hand.
The second one connects two regions within the posterior parietal cortex (PPC),
area V6A [74] and medial intraparietal area (MIP) [75], with the dorsal premotor cor-
tex (PMd [76]). This pathway integrates somatosensory and visual information [55] for
planning and controlling arm position during the transport phase.
However, a strict subdivision of the pathways is not possible, because each phase
involves an overlapping of the different areas [77–80]. The core region of
the dorsomedial pathway codes information for both grasping and reaching. In the same way,
some regions between the two pathways code reaching information. The areas forming
the pathways are highly distributed, and the overlap shifts toward the desired hand
movement with a gradient [81]. Nevertheless, many studies are presently still focused on
this mechanism to find out how prehension works in humans.
Despite the lack of a complete explanation of the neurophysiological behavior underlying
prehension, the development of some control strategies is possible by drawing inspiration
from the information derived from these studies [82].

5.1. Tactile Sensory Mechanisms


During object manipulation, the human brain uses tactile information related to contact
forces, the shape of the surfaces, and friction between the object surface and the fingertips.
The glabrous skin of the hand is equipped with about 17,000 sensory units sensitive
to mechanical skin deformation, which accounts for the enormous capability for spatial and
temporal discrimination in this skin area [83]. These sensory units are of four types with
distinctly different response properties: two fast adapting (FA-I, FA-II) and two slowly
adapting (SA-I, SA-II) [84–86].
FA-I and SA-I afferents terminate superficially in the skin, with a particularly high
density in the fingertips. FA-Is, connected to Meissner endings, exhibit sensitivity to
dynamic skin deformations of relatively high frequency [87,88]. A single FA-I unit elicits
a sensation of touch [83]. SA-Is, connected to Merkel cells, are most easily excited by
lower-frequency skin deformations [87,88].
FA-II and SA-II afferents innervate the hand with a lower and roughly uniform density
and terminate deeper in dermal and subdermal fibrous tissues [83–86]. The sensitivity
of FA-II units, presumably connected to Pacinian corpuscles, is extremely high for skin
deformation, particularly for rapidly moving stimuli [83–85]. The SA-II units, presumably
connected to the spindle-shaped Ruffini ending, respond to direct skin indentations and to
the skin stretching which normally occurs during the joints movements [83–85]. Moreover,
during the manipulation of an object with the hand, SA-II units respond to the tangential
forces in the skin and can provide information for controlling the grip force to avoid
slipping, eliciting a reflex response in the muscle [86].
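The four afferent classes above can be summarized in a small lookup table; the field layout and the helper function are an illustrative encoding of the text, not an established taxonomy or API.

```python
# Summary of the tactile afferent types described in Section 5.1.
AFFERENTS = {
    "FA-I":  {"ending": "Meissner", "adapting": "fast", "termination": "superficial",
              "responds_to": "high-frequency dynamic skin deformation"},
    "SA-I":  {"ending": "Merkel", "adapting": "slow", "termination": "superficial",
              "responds_to": "lower-frequency skin deformation"},
    "FA-II": {"ending": "Pacinian", "adapting": "fast", "termination": "deep",
              "responds_to": "rapidly moving stimuli"},
    "SA-II": {"ending": "Ruffini", "adapting": "slow", "termination": "deep",
              "responds_to": "skin stretch and tangential forces"},
}

def slip_relevant():
    """Afferents the text links to grip-force control against slipping."""
    return [name for name, props in AFFERENTS.items()
            if "tangential" in props["responds_to"]]
```

A table like this makes explicit which biological signals a prosthetic force sensor would stand in for (see Section 5.3).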

5.2. Grasp Stability


When moving and manipulating an object, the fingers involved in grasping apply
tangential forces to the object surface while they apply normal forces on it to ensure
grasp stability [89–94]. The grip force control is based on the prediction of the dynamic
properties of the objects influencing the mapping between motor commands of the arm and
resultant tangential forces and torques [95–98]. Dexterous manipulation involves balancing
grip and load forces with object surface properties, a capability lost with an amputation.
Indeed, healthy people regulate grip and load forces according to different frictional
conditions, using high grip forces with more slippery surfaces [89–93,99]. Similarly, people
adjust grip and load forces to the shape of the object to ensure grasp stability [90,100,101].
The result of these adaptations avoids an excessive grip force. The responses of the tactile
afferents at the initial contact provide information about surface properties. A mismatch
between predicted and actual sensory information can trigger corrective actions, leading to
changes in grip-to-load force ratios about 100 ms after contact and giving rise to an
update of the representation of the surface properties used in future interactions with the
object [68,102]. Visual cues about the object shape can provide the information required to
make predictions [100,101], but shape information provided by tactile signals after contact
can override predictions based on visual cues.
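As a rough illustration of this grip-load coupling, assuming a simple Coulomb friction model (an assumption made for the example, not a model proposed in the cited studies): the minimum grip (normal) force that prevents slip is the tangential load force divided by the friction coefficient, plus the safety margin healthy subjects are observed to add.

```python
def min_grip_force(load_force, friction_coefficient, safety_margin=0.2):
    """Minimum normal (grip) force to prevent slip under Coulomb friction:
    F_grip >= F_load / mu, scaled up by a safety margin."""
    return (load_force / friction_coefficient) * (1.0 + safety_margin)

# A more slippery surface (lower mu) demands a higher grip force
# for the same 5 N tangential load:
silk = min_grip_force(load_force=5.0, friction_coefficient=0.4)       # 15.0 N
sandpaper = min_grip_force(load_force=5.0, friction_coefficient=1.5)  # 4.0 N
```

This mirrors the observation above that people apply higher grip forces on more slippery surfaces while still avoiding excessive force.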

5.3. Link between Brain Organization and Prosthesis Control Levels


The previous paragraphs describe the functioning of the hand during grasping. Different
human brain areas manage each grasping phase:
• object recognition;
• object properties transformed into coordinates for the hand pre-shaping;
• object reaching;
• touch recognition with the object and slippage detection;
• evaluation of the forces to be applied during grasping and reactions to slippage events.
This organization can be replicated on a prosthetic device by organizing the control
strategy in levels. In particular, a high level could decode movement information from
the biological signals of the amputee, corresponding to the PPC, V6A, MIP and PMd areas
of the human brain (Section 5). At a middle level, thanks to the information from the
high level, the prosthetic hand fingers necessary for the grip are moved, on the basis of the
user intention, to start the reaching and preshaping phases. Similarly, in the human brain,
the IPL, F5 and PMv areas are activated during preshaping, while the core region of the
dorsomedial pathway and some regions between the dorsomedial and dorsolateral pathways are
responsible for the reaching phase (Section 5). The use of force sensors on the prosthetic
hand allows measuring the grip force and detecting object slippage. They have the same
role as SA-I, SA-II, and FA-II (Section 5.1) in the human hand. Once in contact with the
object, the human hand modulates the grip force and continuously checks that the object
does not fall, reacting if slippage events are detected (Section 5.2). To replicate this behavior,
a low-level control allows the prosthetic hand to detect the first contact with the object and to
automatically adjust the grasping force, increasing it when the object slips.
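The three-level organization described above can be sketched as follows; the gesture names, preshape vectors, thresholds, and gains are illustrative placeholders, not taken from any specific prosthesis.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    contact: bool      # contact detected by fingertip force sensors
    slipping: bool     # slip event detected
    grip_force: float  # measured grip force [N]

def high_level(emg_features):
    """Decode user intent from biosignals (stand-in for a classifier;
    the sum-threshold rule is purely illustrative)."""
    return "power_grasp" if sum(emg_features) > 1.0 else "rest"

def mid_level(intent):
    """Map the decoded intent to per-finger closing commands (preshape)."""
    preshapes = {"power_grasp": [1, 1, 1, 1, 1], "rest": [0, 0, 0, 0, 0]}
    return preshapes[intent]

def low_level(target_force, reading, slip_increment=0.5):
    """Regulate grip force after contact; react to slip by raising it."""
    if not reading.contact:
        return target_force
    if reading.slipping:
        return reading.grip_force + slip_increment
    return target_force
```

Each function corresponds to one tier of the mapping in this subsection: the high level to intent decoding (PPC/V6A/MIP/PMd), the middle level to preshaping (IPL/F5/PMv), and the low level to the reflex-like force regulation informed by SA/FA afferents.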

6. Control Strategies for Hand Prostheses


Over the years, several studies have been performed to restore good hand functioning to
amputees. The first attempts at control strategies for hand prostheses date back to the
1960s when the different prototypes were developed with electronic hardware or logical-
programming solutions [103]. Although knowledge of the brain was scarce at that time, it
nevertheless proved sufficient to develop multiple-layers or hierarchical control strategies
inspired by the distinct phases of the prehension [104,105]. The aim of this section is to
analyze the selected papers to evaluate the multiple layers/hierarchical structure of control
strategies, and the inspiration of each part in the functioning of the human hand.
The Southampton Adaptive Manipulation Scheme (SAMS) is the evolution of work begun
in the 1960s and subsequently expanded by other researchers through the addition of new
functionalities. In the 1960s, at the University of Southampton, a group of PhD students
researched the control of prosthetic hands. Their intention was to obtain control more similar
to that of a human hand, despite a device with a limited number of independently controlled
DoFs. In 1973,
Codd, Nightingale, and Todd [106] proposed their solutions by introducing a hierarchy
of control systems made of three levels. The lower level (reflex system) is automatic,
independent of conscious intervention, and generates a fast reflexive action. The intermediate
level (intermediate system) intervenes in object shape decision, grip configuration and
force control, receiving sensory information from the motor, accordingly with the sensory
mechanism explained in Section 5.1 and the grasp stability of Section 5.2. The last level
(supervisor system) receives command signals from the user, interpreting them in signals
for the lower levels (in the human brain PPC, V6A, MIP and PMd are devoted to this task,
Section 5). This strategy presents a hierarchical structure whereby each part is related
to a specific task, as it happens in the human brain (Section 5), but lacks a reach phase.
In 1985, Nightingale [107] expanded the concept of hierarchical control by introducing a
microprocessor to work as a coordinator between the user and the prosthesis. Feedback
about the position and the force for each drive is given by sensors such as encoders and
strain gauges, as well as the peripheral neural loops that receive information from the
muscles during a contraction to obtain a fine control during a movement (or a grasp).
To switch from one activity to another, the human hand involves various groups of muscles,
commanded by neural signals from the central nervous system (CNS). To achieve a similar
behavior, the intermediate level was split into two subsystems: the ‘posture logic’ and
the ‘force logic’. The first subsystem selects the motor drive for the movement chosen
by the user. The use of the hand is simple for the user: for example, the user sends the
command to close the hand and the hand automatically adapts its shape around the object.
The second subsystem, called ‘force logic’, regulates the force when the object is grasped.
In addition, the user can select a function among ‘touch’, ‘hold’, ‘squeeze’, ‘maneuver’,
and ‘release’ and the force controller automatically adjusts the input to the drive involved
during the grip phase. The force levels and the adjustments are automatically controlled
to reduce the burden on the user (the same behavior occurs in the human brain, as explained in
Sections 5.1 and 5.2). A ‘command logic’ level was introduced to abandon the use of the
EMG signal for proportional control and use it instead as a multilevel discrete switch; in this
case, after a muscle contraction, this level interprets the EMG signal as input for the level
below. In 1991, Chappell and Kyberd [108] described the transition among the functions
(here called states): flat hand, position, touch, hold, squeeze, release, and link. The EMG
signals from two antagonistic muscles (i.e., the extensor and flexor carpi radialis) are used to form
a bipolar signal sent to the microcontroller (which also receives information about position,
touch, forces, and slip). After the power or reset input has been sent, the first state is a
flat hand. Starting from this state, the user performs a flexion to enter the chosen
position, or an extension to return to the previous state. After contact with the object surface,
the controller moves to the touch state. With a flexion, and after overcoming a threshold
value, the controller goes into the hold state (in which the force control is activated). If the
applied force is not sufficient and the object slips, an automatic force increment occurs.
Another flexion signal places the controller in the squeeze state. Conversely, to release the
grasped object, an extension signal is necessary to return to the position state. The user
can choose from a set of hand postures (three types of precision, fist, small fist, side, flat
hand), based on Napier’s classification [109]. With a sequence of signals (full flexion, full
extension, full flexion, and relaxation) the hand adopts the full tips posture (where the
thumb is abducted and opposes the tip of the index finger), and with a flexion-relax sequence
the user switches the controller to the link state (a transition state), from which the precision
states P1 or P2 can be selected (in P1 the middle, ring and little fingers are flexed
and the other digits are available for the grasp, while P2 foresees the middle, ring and little
fingers extended and allows the same grasp as P1). The controller intervenes, thanks to sensor
information, if the hand posture is unsuitable for a specific task (for example, if the user
chooses a precision posture and the controller receives touch information from the sensors
on the palm, the controller moves to the first posture). In 1994, Kyberd et al. [110] added a
new state (called PARK) to power the hand off when unused.
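The SAMS-style state transitions described in this paragraph can be sketched as a simple transition table; the state and event names follow the text, while the dictionary encoding and the choice of which events to include are an illustrative simplification.

```python
# Simplified SAMS-like state machine (states and triggers from the text;
# the encoding, and the hold->position release transition, are assumptions).
TRANSITIONS = {
    ("flat_hand", "flexion"): "position",
    ("position", "extension"): "flat_hand",
    ("position", "contact"): "touch",
    ("touch", "flexion_over_threshold"): "hold",  # force control activated
    ("hold", "slip"): "hold",        # stays in hold; force auto-increment
    ("hold", "flexion"): "squeeze",
    ("hold", "extension"): "position",
    ("squeeze", "extension"): "position",
}

def step(state, event):
    """Return the next controller state; stay put on unknown events."""
    return TRANSITIONS.get((state, event), state)
```

Encoding the controller as an explicit transition table is what lets a single bipolar EMG signal drive many functions: the meaning of a flexion or extension depends on the current state rather than on a dedicated channel per function.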
A validation of this strategy was carried out with a subject with congenital, below-
elbow right hand loss who usually used a split hook. He was equipped with a laboratory
version of the original Southampton Hand [111] with the SAMS control, and a conventional
proportional myoelectric hand, to allow a comparison among the three prostheses. After a
training phase to familiarize him with the myoelectric and the SAMS controls, the subject
performed positional tasks, consisting of moving abstract objects from lower shelves to
upper ones and vice versa, and practical tests consisting of abstract tasks and simulated real
tasks (based on those devised by the Department of Health and Social Security (DHSS) in the
United Kingdom to assess artificial limbs [110]; see Table 1). Task times were recorded
Sensors 2022, 22, 2521 9 of 23

and an independent, experienced observer assigned scores comparing the hands, each with
its own control, with the hook (1: the hand was inferior to the split hook; 2: the hand was as
successful as the split hook; 3: the hand was superior to the split hook).
The Southampton Hand with the SAMS control, the hook, and the two-channel Viennatone
MM3 with a conventional myoelectric hand worked equally well for the larger abstract
prehension tests. However, the standard myoelectric hand showed grasp limitations for
small objects. The SAMS control with the Southampton Hand was able to adapt to the real
object shape during grasping, unlike the proportional control on the conventional
myoelectric hand. The hook exhibited limitations with the largest object due to its small
grasp capacity; moreover, it was very tiring for the user to open the hook wide enough to
grasp large objects. The SAMS control with the Southampton Hand did not show these
drawbacks and, despite longer execution times, was rated superior to the hook by the
independent, experienced observer (Table 1) in over half of the tasks (especially in power
grips with large objects) and equal in the rest. The proportional control on
the conventional myoelectric hand behaved similarly to or worse than the hook. Results
are reported in Table 1.

Table 1. Comparison of three controls (RH: Right hand—LH: Left hand) [110].

Task    SAMS time (s)    Hook time (s)    Myo time (s)    SAMS rating    Myo rating
Cutting
Fork LH Knife RH 57 26 - 2 1
Fork RH Knife LH 49 42 - 2 1
Change grip, Spear to scoop 12 14 - 2 1
Open bottle and pour
Top LH Bottle RH 26 29 - 3 1
Top RH Bottle LH 11 12 12 3 1
Carry tray 21 17 18 2 2
Cut slice of bread
Loaf LH Knife RH 41 42 - 3 1
Loaf RH Knife LH 17 26 17 3 2
Butter bread
Bread RH Knife LH 16 19 20 2 1
Bread LH Knife RH 36 31 29 3 2
Fasten belt 32 29 31 3 2
Toothpaste onto brush
Brush LH Tube RH 36 21 20 3 2
Brush RH Tube LH 42 - 15 3 2
Grasp telephone receiver 19 5 5 3 2
Grasp pen and write 30 20 22 2 2
Cigarette from pack
Pack LH Cig RH 28 20 44 2 2
Pack RH Cig LH 12 11 13 2 2
Use mallet and chisel
Mallet LH Chisel RH 16 11 9 3 3
Mallet RH Chisel LH 18 - 15 3 3
Pick up coins 34 17 27 3 2
Lift and pour kettle 29 15 - 3 1
Tear and fold paper 46 46 26 2 2
Put paper in envelope
Paper LH Env RH 19 18 22 2 2
Paper RH Env LH 13 18 20 2 2
Grasp cup 8 6 7 2 2

In 1987, Tomovic, Bekey, and Karplus [112] developed a control strategy based on the
reflex arc [113]. This strategy can be described in four phases:
1. Creation of a small number of geometric primitives to represent target objects with
arbitrary shapes.
2. Pre-shaping and alignment of the hand to select the appropriate primitive (AIP, IPL,
F5, and PMv in the human brain, Section 5).
3. Reduction of hand configurations to a limited number of standard configurations for
grasping tasks.
4. Separation of grasping into a target approach phase and a shape adaptation phase,
with reflex control application (Section 5).
The reflex control principle is based on the activation of movement patterns by
specific sensory input and the subsequent completion of the movement without further
intervention from the higher centers of the nervous system (Section 5). This principle assumes that
most reaching and grasping tasks in humans are derived from experience. During the
target approach phase, the hand performs a reorientation and a pre-shaping to facilitate
the grasp. This phase is divided into target identification, where the objects identified
by means of a vision system are replaced by geometric primitives such as cylinders,
cones, parallelepipeds, and spheres; hand structure; and grasp mode selection, to choose the
fingers involved in the grasp. When the hand touches the object, the target approach
phase ends and the grasp phase starts. In this phase, an automatic shape adaptation is
possible, employing a control that selects the force according to the coefficient of friction
between the finger material and the object surface and that senses slippage, increasing
the forces until slippage stops. Force and slippage information is derived from sensors
positioned on the fingers. The task selection is provided as input to the hand, but the
authors did not specify the procedure to obtain it, while the knowledge base (containing
shape, orientation, grasping, etc.) for the target approach phase is obtained from studies
on human subjects performing several approach and grasping tasks with a variety of
positions and orientations. This structure is similar to the human brain levels described
in Section 5. Moreover, the force and slippage management mirrors the tactile sensory
mechanism (Section 5.1), with the use of touch and slippage information to stably grasp an
object (Section 5.2).
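The grasp-phase behavior described above, an initial force related to the friction coefficient that is then incremented until slippage stops, can be sketched as a simple loop. The friction table, step size, and force cap below are illustrative assumptions, not values from the original work.

```python
# Sketch of a reflex-style grasp phase: pick an initial force according to
# the contact surface, then raise it until the slip signal stops. The table
# of initial forces, the step size, and the cap are illustrative assumptions.

INITIAL_FORCE = {"rubber": 1.0, "plastic": 2.5, "glass": 4.0}  # N, per surface

def grasp_force(surface, slipping, step=0.5, f_max=15.0):
    """Increment the grip force while the slip sensor fires, up to a cap."""
    force = INITIAL_FORCE.get(surface, 3.0)  # fallback for unknown surfaces
    while slipping(force) and force < f_max:
        force += step
    return force

# Toy slip model: this object stops slipping once the force reaches 4 N.
print(grasp_force("plastic", lambda f: f < 4.0))
```

In the real strategy the slip signal would come from the finger-mounted sensors rather than from a model of the object.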
The Belgrade hand [114] has five fingers but only two motors, allowing a three-finger
mode and a five-finger mode. The hand was equipped with touch and slippage sensors and
then mounted on the PUMA 560 manipulator. No experimental protocols or results are
reported, since the paper focuses only on the control strategy.
In 2006, Cipriani presented a control strategy composed of two parts: the first
devoted to high-level control and the second to low-level control [115].
The high level decodes the user's intention signals to choose the desired grasp
and force. The grasp can be selected among cylindrical, spherical, tri-digital,
and lateral [116], and the force between power and light. The low level is composed of two
subparts: the pre-shaping and grasping phases. During pre-shaping all fingers are involved,
while in the grasping phase only the fingers specified by a table correlating grasp types
with the involved fingers and grasp forces take part. After the pre-shaping phase, the desired
force is selected. In the grasping phase, the hand closes the fingers using force control
algorithms until the global force is reached. A global force error (on the total grip)
and the finger force errors are evaluated. The global force is calculated as the sum of the
desired forces of the fingers involved in the grip. Each finger can grip the object with the same
force (the global force divided among the fingers involved in the grasp), but if a finger closes
without touching the object, its share of the global force is redistributed among the remaining
involved fingers, with a safety margin that stops the finger to avoid breakage. In this strategy,
the low level corresponds to the AIP, IPL, F5 and PMv areas (Section 5).
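The global-force bookkeeping described above can be sketched as follows. The function name and the equal-split rule for the initial shares are assumptions based on the text, not the paper's actual code.

```python
# Sketch of the global force redistribution in the two-phase strategy: the
# global reference is split among the grasping fingers, and the share of a
# finger that closes without touching the object is reassigned to the rest.
# Names and the equal-split rule are illustrative assumptions.

def finger_references(global_force, fingers, touching):
    """Split global_force equally among the fingers that touch the object."""
    active = [f for f in fingers if touching[f]]
    if not active:
        return {f: 0.0 for f in fingers}  # nothing touched: no force applied
    share = global_force / len(active)
    return {f: (share if touching[f] else 0.0) for f in fingers}

refs = finger_references(6.0, ["thumb", "index", "middle"],
                         {"thumb": True, "index": True, "middle": False})
print(refs)  # middle missed the object, so thumb and index take 3.0 N each
```

The safety margin that stops a finger before breakage would act on top of these per-finger references.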
An underactuated five-fingered hand with 16 DoFs (three for each finger plus one for the
thumb opposition), of which only six are active (F/E for each finger and the thumb opposition),
has been used. The force information is derived from strain gauge sensors placed on
the tendons. Five able-bodied subjects were equipped with the hand assembled on
an orthopedic splint to reach and grasp different objects (Table 2); the grasp was
considered successful if the object was stably held.

Table 2. Grasp type and objects involved during the tests [115].

Grasp Type    Object          Size (mm)   Weight (g)
Cylindrical   Small bottle    Ø = 60      750
              Big bottle      Ø = 85      1500
              Cylinder        Ø = 70      100
              Cylinder        Ø = 50      500
              Cylinder        Ø = 50      50
Spherical     Round sponge    Ø = 100     30
              Sphere          Ø = 60      120
Tri-digital   Sphere          Ø = 35      20
              Sphere          Ø = 45      25
              Sphere          Ø = 55      30
              Felt-tip pen    Ø = 20      70
              Mobile phone    Ø = 40      200
              Cube            Ø = 50      80
Lateral       Postcard        1           10
              Key             2           80
              Floppy disk     3           40
              CD              1           30

Experiments showed that the control is stable: after a disturbance altering the force
distribution, the hand returns to a stable grasp in a short time. The control strategy
allowed performing stable grasps in 96% of the experiments (Table 3).
In 2012, Pasluosta presented a control strategy consisting of four stages: pre-shaping,
closing, force control, and detection [117]. In the pre-shaping stage, the user can
select the desired hand configuration among four possibilities: cylindrical, tip, lateral, and
open hand (corresponding to the AIP, IPL, F5 and PMv areas, Section 5). After this phase,
the hand closes at maximum velocity until contact with the object. This is the closing
stage, in which the velocity derivative is computed to detect the touch and trigger the
next stage. After contact with the object, the force control is activated, and the
modulation of the force exerted on the object surface is possible (Section 5.2). This stage is
alternated with the detection stage, activated when the stable grasp is reached (Section 5.2).
In this stage, to detect possible object slippage, the derivatives of the force sensing
resistor (FSR, for detecting disturbances) and resistive flex sensor (RFS, for detecting
unintended object movements) signals are used (much as FA-I and SA-II afferents
are used in the human hand, Section 5.1). If slippage is detected, the reference force
is empirically increased in the force control stage. The force reference
of 1 N is empirically determined as a trade-off between object deformation and initial
slippage. When the force is within 5% of the reference value, the motor is turned off to reduce
power consumption and possible oscillatory behavior and to prevent overshoots, because the
response of each finger is slowed down.
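The closing/force-control hand-off described above can be sketched as two small rules: contact inferred from a sharp drop in closing velocity, and a motor shut-off band around the force reference. The 1 N reference and the 5% band come from the text; the velocity threshold and the simple proportional update are illustrative assumptions.

```python
# Sketch of the closing and force-control stages: contact is inferred from an
# abrupt drop in finger velocity, then the motor is driven toward a 1 N
# reference and switched off within 5% of it. The derivative threshold and
# the proportional update are illustrative assumptions.

F_REF = 1.0          # N, empirical trade-off reported in the text
TOL = 0.05 * F_REF   # motor shut-off band (5% of the reference)

def touch_detected(v_prev, v_now, dv_thresh=-0.5):
    """Contact when the finger velocity derivative falls below a threshold."""
    return (v_now - v_prev) < dv_thresh

def motor_command(force, gain=2.0):
    """Proportional drive toward F_REF; 0.0 means the motor is turned off."""
    error = F_REF - force
    if abs(error) <= TOL:
        return 0.0
    return gain * error

print(touch_detected(1.0, 0.2))   # True: abrupt deceleration marks contact
print(motor_command(0.97))        # 0.0: within 5% of the reference
```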

Table 3. Results for the different tasks [115].

Grasp Type      Cylindrical   Spherical   Tri-Digital   Lateral
N° objects      5             2           6             4
N° trials       5             5           5             5
Success rate    25/25         10/10       27/30         20/20
Global success rate = 82/85

The five-fingered prototype has ten DoFs (only three of which are active, for the F/E
of the thumb, the index, and the remaining fingers), allowing five grasping configurations:
five-finger pinch, transverse volar grip, spherical volar grip, pulp pinch, and lateral
pinch [118]. FSR sensors (Interlink Electronics) were placed on the tips and RFS sensors
(Spectra Symbol) were placed on the dorsal part of the thumb, index, and middle fingers.
Two experiments were carried out to test the response to perturbations. In the first one,
an aluminum cylinder was attached to a mass hanger system through a dual-range force
sensor. The hand grasped the object with the transverse volar grip and the pulp pinch for
about 20 s while different weights were placed on the base of the hanger. The experiment
was repeated seven times for each weight. A motion sensor (Vernier MD-BTD) was used
to measure the displacement of the object. To verify the ability of this strategy to modulate
the applied force when a rotational force is applied, the hand grasped with a five-finger
pinch a plastic lid attached to a fixed axle connected to the hanger system and the force
sensor. Different weights were placed on the hanger base to produce different torques.
In all the experiments, the hand was able to quickly adjust the force during the grasp
to avoid dropping the object. The maximum average displacement in the transverse volar
grip experiments was 7.6 mm ± 2 mm, while for light objects it was within the resolution
of the motion sensor, i.e., 2 mm. The displacement for the pulp pinch configuration was
3.05 mm; for small weights, the average displacement was again less than the precision
of the sensor. During the torque experiments, the control strategy was able to modulate
the grasping force with no significant angular displacement: the maximum average angular
displacement was 10.7 degrees when 11 N·cm of torque was applied.
In 2017, Quinayás proposed a hierarchical human-inspired architecture [119], whose
levels are described below. The Human–Machine Interface (HMI) is devoted to measuring
and interpreting the human signals to identify four types of grip postures (rest, open hand,
power grip, and tripod grip) [120] and to sending this information to the next level (AIP,
IPL, F5 and PMv areas in the human brain, Section 5). The Haptic Perception (HP) level
receives information from the robotic hand sensors and the HMI and provides contact and
slip information to the high-level control (HLC) (FA-I and SA-II, Section 5.1). A contact is
identified by imposing a minimum threshold to differentiate between noise and actual
contact with the object; slip is detected with a first-order time derivative of the force.
The HLC receives information from the HP and HMI levels and coordinates the execution
order of the motor programs for the user task by sending commands to the mid-level
control (MLC). Moreover, the HLC also shares information with a learning module involved
in acquiring new behaviors and storing recently learned information. The MLC receives
information from the levels above, generates low-level commands (LLC), shares information
such as joint positions and motor primitives with the knowledge database, and sends newly
learned facts to be stored in memory. The motor programs are: Repose, the default state
of the hand; Pre-shaping, in which the hand is configured on a primitive to prepare object
grasping; Grasping, in which a PI force control strategy is executed (when the HLC detects
the contact) to obtain a stable grip without slippage; Slip, in which the force is proportionally
increased by 10% to counteract the slippage event (when the HLC detects slip); Release,
the complete opening of the hand; Point finger, the hand with the extended index finger;
Reaching, the forearm movement to reach the object; and Wait, the standby state in which
an action is executed. The LLC level receives information from the hand sensors and
generates the commands for the hand actuators and the patient sensory feedback system.
A PID position control is used for tracking the trajectories in the pre-shaping phase and a
PI control to maintain the desired force in the grasping phase. Furthermore, in this structure
the subdivision is inspired by the brain areas (Section 5), but reaching refers to the forearm
and not to the fingers' approach onto the object. The prototype prosthetic hand UC2 is
composed of three fingers (each with three phalanges) and nine DoFs: flexion/extension
and the opposition/repositioning of the thumb [121]. The hand is equipped with FSR
sensors on the fingertips, covered with silicone. Two different validation tests were carried
out: in the first one, a non-amputee subject performed object grasps while the performance
of the different modules of the architecture was monitored in real time; in the second one,
a non-amputee subject performed grasps of a cylindrical object of 190 g, with a supplementary
weight added to simulate slippage. This work only presented the control strategy without
testing it on amputees.
All of the aforementioned strategies present common levels mirroring the brain areas
described in Section 5 and use information about force, touch, and slippage (Section 5.1)
to stably control the object during grasping.
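The haptic-perception rules shared by these strategies, a noise threshold for contact and a signed force derivative distinguishing touch onset from incipient slip, can be sketched as a single classifier. All threshold values below are illustrative assumptions.

```python
# Sketch of a haptic-perception classifier: force above a noise threshold
# signals contact, a strongly positive force derivative marks touch onset,
# and a strongly negative one marks incipient slip. Thresholds are
# illustrative assumptions, not values from the reviewed papers.

NOISE_THRESH = 0.2  # N, minimum force treated as real contact

def haptic_event(f_prev, f_now, dt=0.01, d_thresh=5.0):
    """Classify a force sample as 'no_contact', 'touch', 'slip', or 'hold'."""
    if f_now < NOISE_THRESH:
        return "no_contact"
    dfdt = (f_now - f_prev) / dt
    if dfdt > d_thresh:
        return "touch"
    if dfdt < -d_thresh:
        return "slip"
    return "hold"

print(haptic_event(0.0, 1.0))  # touch: force rising sharply from zero
print(haptic_event(1.0, 0.5))  # slip: force dropping while still in contact
```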

7. Discussion
The control strategies summarized above (Table 4) show a subdivision into states
inspired by the human hand: the choice of the initial hand configuration is managed at a
high level; pre-shaping is obtained using predetermined force and position values [122],
corresponding to the AIP, IPL, F5, and PMv areas (Section 5); and touch (FA-I, Section 5.1) and
the control of force and slippage (SA-II, Section 5.1) are completely automatic (Section 5.2),
lightening the cognitive burden on amputees during the grasping task.

Table 4. Summary of the reported analysis.

SAMS [106–108,110]
  Control strategy subdivision: 1. Automatic loop; 2. Intermediate (2.1 Posture logic; 2.2 Force logic); 3. Command logic
  Robotic hand: laboratory version of the original Southampton Hand [111]
  Force sensor: force-sensitive resistive sheet
  Slippage detection: microphone [108,123]
  Touch detection: light-emitting diode and phototransistor [108,123]

Reflex control strategy [112]
  Control strategy subdivision: 1. Target approach phase (1.1 Target identification; 1.2 Hand structure; 1.3 Grasp mode); 2. Grasp phase
  Robotic hand: Belgrade hand [114]
  Force sensor: not specified
  Slippage detection: not specified
  Touch detection: not specified

Two-phase bio-inspired control strategy [115]
  Control strategy subdivision: 1. High level (1.1 Decoding of user intention signals; 1.2 Choice of desired grasp and forces); 2. Grasping task (2.1 Pre-shaping; 2.2 Grasping phase)
  Robotic hand: underactuated five-fingered hand with 16 DoFs (three per finger plus one for the thumb opposition), of which only six are active (F/E of each finger and the thumb opposition)
  Force sensor: strain gauge sensors on the tendons
  Slippage detection: not specified
  Touch detection: not specified

Neural network-based control strategy [117]
  Control strategy subdivision: 1. Pre-shaping; 2. Closing; 3. Force control; 4. Detection stage
  Robotic hand: five-fingered prototype with 10 DoFs (only three active, for the F/E of the thumb, the index, and the remaining fingers)
  Force sensor: FSR
  Slippage detection: derivatives of the FSR and RFS signals
  Touch detection: derivative of the RFS signal

Hierarchical human-inspired architecture [119]
  Control strategy subdivision: 1. Human–Machine Interface; 2. Haptic Perception; 3. High-level control; 4. Mid-level control; 5. Low-level command
  Robotic hand: UC2 [121], three fingers (each with three phalanges) and nine DoFs (flexion/extension and the opposition/repositioning of the thumb)
  Force sensor: FSR
  Slippage detection: negative derivative of the force signal
  Touch detection: positive derivative of the force signal

The studies [106–108,110,112,115,117,119] presented similar solutions in terms of
phases of prehension and management of forces and slippage events.
The SAMS strategy [106–108,110] replicates the behavior of motor control in the CNS.
Amputee subjects can choose the grasp by muscle contractions and move through the various
states. Once the configuration has been chosen, with a muscle contraction the hand starts the
movement until the touch with the object activates the control that regulates the interaction
force and manages slippage by incrementing the force during the event. In this phase,
the user can increment the force or release the object. This strategy has a limited number
of states, which keeps the cognitive burden on the user low and avoids having to manage
force and slippage without feedback. On the other hand, the transitions between states rely
on muscle contractions and co-contractions, an unnatural behavior.
The reflex control strategy [112] is based on the reflex arc: sensory inputs to the brain
enable complex cognitive and computational processes, such as trajectory planning, pattern
recognition, and hand structuring, and produce signals to the muscles to obtain the desired
movement. The various shapes of the objects are reduced to a small number of geometric
primitives. The hand, after the configuration has been chosen, can align and orient itself
to the object. After the touch, the hand adapts itself to the object shape by controlling
the interaction force and incrementing it during slippage. The simplification of the
configuration possibilities and an adaptable control seem to result in a light commitment
for amputee patients.
The two-phase bio-inspired control strategy [115] allows a choice of configurations
and forces directly from a table. After the configuration has been chosen, the hand closes
until it touches the object and the force references (both total and per finger) are reached.
The strategy leaves the opening and configuration choice to the user and the rest to the
automatic control, which independently closes the fingers and regulates the forces. In the
pre-shaping phase, the choice of the fingers involved in the grasp is possible. The table,
however, is limited to only two force levels for the four possible configurations.
The neural network-based control strategy [117] allows the selection of the hand
configuration among five possibilities covering the most commonly performed grasps.
After the configuration choice, the hand closes at maximum velocity until contact with
the object. The touch activates the control, which automatically regulates the grasp force.
Slippage management, however, only occurs once a stable grasp is reached: a slippage
occurring before the stable grasp is not detected by the control.
The hierarchical human-inspired architecture [119] is based on both the task planning
paradigm and the imitation of CNS behavior. The subject can choose from four
configurations, then the hand adjusts itself according to the primitives. Subsequently, the hand
closes and, after touching the object, the control regulates the force and manages slippage.
Reaching is the forearm movement to approach the object.
The examined approaches show some common critical points:
• The subject, through EMG signals, can only choose the grasp type; this simplicity
requires special attention when the hand starts closing, because the closing velocity
applied before touch [124] could push the object out of the grasping area [125];
• A reach phase in which the subject can voluntarily control the fingers during the
object approach is missing (Section 5);
• Without a reaching phase, predetermined configurations are necessary for the pre-
shaping phase, which cannot cover a great number of objects (or shapes) [126];
• Except for the SAMS, in the other approaches the user cannot increase the force
during the automatic control;
• The force reference is obtained from tests performed for one or a few levels of object
weight and cannot be changed;
• A coordination strategy among the fingers to ensure grasp stability is missing.

To overcome these critical issues, a new control strategy based on the human hand
should have distinct phases inspired by the neurophysiological subdivision of the brain,
with a continuous presence of the user, who can intervene at any time. A possible solution
is presented in Figure 3.

Figure 3. New solution inspired by human hand behavior.

At the high level (corresponding to the AIP, IPL, F5 and PMv areas, Section 5), the human
biological signals (EMG signals are typically used [127]; the feasibility of classifying
electroneural signals has also been demonstrated [128]) are acquired and processed, with
different techniques, to extract information about the desired movement. In this phase,
the correlation between the decoded movement and the natural way the user performs it is
very important to ensure the intuitiveness of the gesture. The choice of the movement is
obtained by decoding the human signals and can be changed at any time and in each phase,
to replicate the user's intention of switching the configuration during the grasp.
At the middle level (corresponding to the AIP, IPL, F5, and PMv areas, the core region in
the dorsomedial pathway, and some regions between the two pathways coding reaching
information, Section 5), the pre-shaping activates only the involved fingers and then closes
them with a gradual position increment until the object surface is reached (position
control performed by the user) [126]. Analogously, in this phase it is also possible to open
the hand.
Tactile information is important during object manipulation, as described in Section 5.1.
By mimicking FA-I afferents (Section 5.1), when the touch between the object and the involved
fingers is detected (from the force sensor information), automatic control at the low
level is activated. The control regulates the forces during the grasp and avoids object
slippage [129], using additional information from sensors and/or algorithms (Section 5.2).
Moreover, as with SA-II afferents (Section 5.1), slippage detection is important for a prosthesis
to prevent the object from falling since, in the absence of feedback, the amputee cannot modulate
strength to counteract slippage [32]. Using only the normal component of the force
allows slippage to be detected within a few milliseconds [130].
Weight perception is provided by both proprioceptive feedback and cutaneous
cues [131]; restricting one of them tends to reduce the perception of the object
weight [132]. The overall force reference can be obtained by combining the force and torque
information from a 6-axis sensor mounted on the wrist with that from the sensors
positioned on the fingertips. From the touch, and within a few cycles of the automatic control,
the mass of the object could be determined and the overall force reference updated to
adapt it to the real value of the grasped object. The overall force reference can then
be redistributed to each finger according to its contribution during the grasp, as in the
human hand [133].
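The proposed force-reference update can be sketched as two steps: derive an overall grip reference from the measured object weight, then redistribute it in proportion to each finger's measured contribution. The friction coefficient, safety factor, and function names below are illustrative assumptions.

```python
# Sketch of the proposed force-reference update: estimate the required grip
# force from the object weight measured at the wrist, then redistribute it to
# each finger proportionally to its contribution. The friction coefficient,
# safety factor, and names are illustrative assumptions.

G = 9.81  # m/s^2

def overall_reference(measured_weight_n, mu=0.6, safety=1.3):
    """Normal grip force needed to hold the weight, with a safety margin."""
    return safety * measured_weight_n / mu

def redistribute(overall_ref, finger_forces):
    """Share the overall reference according to each finger's contribution."""
    total = sum(finger_forces.values())
    if total == 0:
        n = len(finger_forces)  # no contact yet: split equally as a fallback
        return {f: overall_ref / n for f in finger_forces}
    return {f: overall_ref * v / total for f, v in finger_forces.items()}

ref = overall_reference(0.5 * G)  # e.g., a 0.5 kg object
print(redistribute(ref, {"thumb": 2.0, "index": 1.0, "middle": 1.0}))
```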
To guarantee grasp stability (Section 5.2) and to use the opposition DoF of the thumb
not only in the pre-shaping phase, an approach for controlling the fingers in a coordinated
manner, based on the virtual finger concept [133], can also be used. The approach takes into
account the normal forces acquired by the sensors on the fingertips to calculate the torque
needed to reposition the thumb during a slippage event.
Although these actions are automatic, the user can intervene to increase the force.
For instance, by using a hierarchical classification approach to assess the desired
hand/wrist gestures, as well as the desired force levels to exert during grasping tasks [134],
the user adds an input value to the reference force; the maximum force increment, built on
the corresponding maximum human signal, is limited to a safe range to avoid breaking the
object or the prosthesis. The relaxation of the human signal leads to the subtraction of
the previous increment until the reference returns to its initial value when the signal activity
is close to zero. If the user wants to open the hand during the automatic control, they perform
the corresponding signals and the prosthetic hand opens the involved fingers. After the
touch signal is lost, the control returns to the middle level; the opening occurs as the closing
does, but the user could re-close the hand, starting a new grasp. However, returning to the
open hand configuration is not necessary to choose a new configuration or a new grasp.
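The voluntary force override described above can be sketched as a bounded mapping from a normalized user signal to a force increment over the automatic reference. The scaling and bound below are illustrative assumptions.

```python
# Sketch of the voluntary force override: the user's normalized signal adds a
# safety-bounded increment to the automatic reference, and relaxation returns
# it to the baseline. The scaling and the bound are illustrative assumptions.

def force_reference(base_ref, emg_level, emg_max=1.0, max_increment=3.0):
    """Map a normalized user signal onto a safety-bounded force increment."""
    level = min(max(emg_level, 0.0), emg_max) / emg_max  # clamp to [0, 1]
    return base_ref + level * max_increment

print(force_reference(2.0, 0.5))   # halfway activation adds half the bound
print(force_reference(2.0, 0.0))   # relaxation returns the baseline value
```

Because the increment is recomputed from the instantaneous signal level, relaxing the muscle automatically subtracts the previous increment, as the text requires.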

8. Conclusions
The human hand is a complex system that has been studied for thousands of years and still
fascinates many researchers in different fields. Replicating its correct functioning in a
prosthesis [135] is still an open challenge. Current commercial myoelectric prosthetic hands are
simple devices allowing opening and closing, like a gripper [136], by using two antagonistic
muscles [137], such as the flexors and extensors of the wrist [138]. This behavior is unnatural
and unintuitive, and for this reason many amputees prefer cosmetic prostheses [139,140].
To replicate a human hand with a device, it is necessary to reproduce not only its external
aspect but also its functions [141]. In recent years there have been several attempts, but none
achieved a complete replication of the human hand [142]. Thanks to neuroscience, the
management of human hand function by the nervous system has been studied in sufficient
depth to start developing a control strategy replicating its functions [143].
The strategy presented in this work is based on the subdivision of prehension in
the brain (pre-shaping, reach, and grasp) and on the management of tactile information to
ensure grasp stability. The parts are not completely independent, because the corresponding
brain areas overlap [144,145]. Moreover, the user can intervene at any
time by assuming complete control, to stop or change the movement, as in human behavior,
without waiting for the end of the command in progress.
In recent years, new techniques have been developed allowing the creation of new muscle
units to obtain more precise biological signals for the high level [146–148]. Combined
with surgery, this would increase the quality of control, overcoming the limitations of
current technologies [149] and promoting the use of parallel decoding strategies for
intentional-movement information from EMGs [150].

A prosthetic hand is a device with many actuators and sensors [151], but in the absence
of a control strategy able to replicate the behavior of the human hand, it will remain a tool
that can never be accepted by those who have lost so fundamental and versatile a part of
the body [22,137].

Author Contributions: C.G. designed the paper, analyzed the literature and wrote the paper; F.C.
designed the paper, supervised the writing and wrote the paper; L.Z. designed the paper and
supervised the writing. All authors have read and agreed to the published version of the manuscript.
Funding: This research received no external funding.
Acknowledgments: This work was supported partly by the Italian Institute for Labour Accidents
(INAIL) prosthetic center with WiFi-MyoHand (CUP: E59E19001460005) project, partly by funding
from the innovation programme under grant agreement No. 899822, SOMA project. Thanks to
Simone Perini for improving the drawings.
Conflicts of Interest: The authors declare no conflict of interest. The funders had no role in the
design of the study; in the collection, analyses, or interpretation of data; in the writing of the
manuscript; or in the decision to publish the results.

References
1. Ogle, W. Aristotle: On the Parts of Animals; Kegan Paul, Trench & Co.: London, UK, 1882.
2. Bell, C. The Hand: Its Mechanism and Vital Endowments, as Evincing Design; Bell & Daldy: London, UK, 1865; Volume 4.
3. Jones, F.W. The Principles of Anatomy: As Seen in the Hand; Bailliere, Tindall & Cox: Paris, France, 1944.
4. Napier, J.R. Studies of the hands of living primates. In Proceedings of the Zoological Society of London; Wiley Online Library:
Hoboken, NJ, USA, 1960; Volume 134; pp. 647–657.
5. Lemelin, P.; Schmitt, D. On primitiveness, prehensility, and opposability of the primate hand: The contributions of Frederic Wood
Jones and John Russell Napier. In The Evolution of the Primate Hand; Springer: Berlin/Heidelberg, Germany, 2016; pp. 5–13.
6. Hernigou, P. Ambroise Paré IV: The early history of artificial limbs (from robotic to prostheses). Int. Orthop. 2013, 37, 1195–1197.
[CrossRef] [PubMed]
7. Wellerson, T.L. Historical development of upper extremity prosthetics. Orthop. Prosthet. Appl. J. 1957, 11, 73–77.
8. Bostock, J.; Riley, H.T. The Natural History of Pliny; G. Bell: London, UK, 1900; Volume 2.
9. Schlesinger, G. Der mechanische aufbau der künstlichen glieder. In Ersatzglieder und Arbeitshilfen; Springer: Berlin/Heidelberg,
Germany, 1919; pp. 321–661.
10. Reiter, R. Eine neue Electro Kunsthand. Grenzgeb. Med. 1948, 4, 133–135.
11. Scott, R.N. Myoelectric control of prostheses: A brief history. In Proceedings of the 1992 MyoElectric Controls/Powered
Prosthetics Symposium, Fredericton, NB, Canada, 1 August 1992; Volume 1.
12. Zecca, M.; Micera, S.; Carrozza, M.C.; Dario, P. Control of multifunctional prosthetic hands by processing the electromyographic
signal. Crit. Rev. Biomed. Eng. 2002, 30, 459–485. [CrossRef]
13. Scott, R.N.; Parker, P.A. Myoelectric prostheses: State of the art. J. Med Eng. Technol. 1988, 12, 143–151. [CrossRef]
14. Popov, B. The bio-electrically controlled prosthesis. J. Bone Jt. Surgery. Br. Vol. 1965, 47, 421–424. [CrossRef]
15. De Luca, C.J. The use of surface electromyography in biomechanics. J. Appl. Biomech. 1997, 13, 135–163. [CrossRef]
16. Oskoei, M.A.; Hu, H. Myoelectric control systems—A survey. Biomed. Signal Process. Control. 2007, 2, 275–294. [CrossRef]
17. Saridis, G.N.; Gootee, T.P. EMG pattern analysis and classification for a prosthetic arm. IEEE Trans. Biomed. Eng. 1982, 46, 403–412.
[CrossRef]
18. Christodoulou, C.I.; Pattichis, C.S. Unsupervised pattern recognition for the classification of EMG signals. IEEE Trans. Biomed.
Eng. 1999, 46, 169–178. [CrossRef]
19. Park, S.H.; Lee, S.P. EMG pattern recognition based on artificial intelligence techniques. IEEE Trans. Rehabil. Eng. 1998, 6, 400–405.
[CrossRef] [PubMed]
20. Ciancio, A.L.; Cordella, F.; Barone, R.; Romeo, R.A.; Dellacasa Bellingegni, A.; Sacchetti, R.; Davalli, A.; Di Pino, G.; Ranieri, F.;
Di Lazzaro, V.; et al. Control of prosthetic hands via the peripheral nervous system. Front. Neurosci. 2016, 10, 116. [CrossRef]
[PubMed]
21. Markovic, M.; Schweisfurth, M.A.; Engels, L.F.; Farina, D.; Dosen, S. Myocontrol is closed-loop control: Incidental feedback is
sufficient for scaling the prosthesis force in routine grasping. J. Neuroeng. Rehabil. 2018, 15, 81. [CrossRef] [PubMed]
22. Cipriani, C.; Zaccone, F.; Micera, S.; Carrozza, M.C. On the shared control of an EMG-controlled prosthetic hand: Analysis of
user—Prosthesis interaction. IEEE Trans. Robot. 2008, 24, 170–184. [CrossRef]
23. Biddiss, E.; Chau, T. Upper-limb prosthetics: Critical factors in device abandonment. Am. J. Phys. Med. Rehabil. 2007, 86, 977–987.
[CrossRef] [PubMed]
24. Biddiss, E.A.; Chau, T.T. Upper limb prosthesis use and abandonment: A survey of the last 25 years. Prosthet. Orthot. Int. 2007,
31, 236–257. [CrossRef] [PubMed]
Sensors 2022, 22, 2521 19 of 23
25. Resnik, L.; Borgia, M. Reliability, validity, and responsiveness of the QuickDASH in patients with upper limb amputation. Arch.
Phys. Med. Rehabil. 2015, 96, 1676–1683. [CrossRef]
26. Scotland, T.; Galway, H. A long-term review of children with congenital and acquired upper limb deficiency. J. Bone Jt. Surg.
Br. Vol. 1983, 65, 346–349. [CrossRef]
27. Berke, G.M.; Nielsen, C. Establishing parameters affecting the use of myoelectric prostheses in children: A preliminary
investigation. J. Prosthet. Orthot. 1991, 3, 162–167. [CrossRef]
28. Roeschlein, R.A.; Domholdt, E. Factors related to successful upper extremity prosthetic use. Prosthet. Orthot. Int. 1989, 13, 14–18.
[CrossRef]
29. Biddiss, E.; Chau, T. The roles of predisposing characteristics, established need, and enabling resources on upper extremity
prosthesis use and abandonment. Disabil. Rehabil. Assist. Technol. 2007, 2, 71–84. [CrossRef] [PubMed]
30. Riener, R. The Cybathlon promotes the development of assistive technology for people with physical disabilities. J. Neuroeng.
Rehabil. 2016, 13, 1–4. [CrossRef] [PubMed]
31. Cordella, F.; Ciancio, A.L.; Sacchetti, R.; Davalli, A.; Cutti, A.G.; Guglielmelli, E.; Zollo, L. Literature review on needs of upper
limb prosthesis users. Front. Neurosci. 2016, 10, 209. [CrossRef] [PubMed]
32. Kyberd, P.J.; Wartenberg, C.; Sandsjö, L.; Jönsson, S.; Gow, D.; Frid, J.; Almström, C.; Sperling, L. Survey of upper-extremity
prosthesis users in Sweden and the United Kingdom. JPO J. Prosthet. Orthot. 2007, 19, 55–62. [CrossRef]
33. Belter, J.T.; Segil, J.L.; Dollar, A.M.; Weir, R.F. Mechanical design and performance specifications of anthropomorphic prosthetic
hands: A review. J. Rehabil. Res. Dev. 2013, 50, 599–618. [CrossRef]
34. Rodriguez-Cheu, L.E.; Casals, A. Sensing and control of a prosthetic hand with myoelectric feedback. In Proceedings of the First
IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, 2006, BioRob 2006, Pisa, Italy, 20–22
February 2006; IEEE: Piscataway, NJ, USA, 2006; pp. 607–612.
35. Huang, H.P.; Liu, Y.H.; Lee, W.C.; Kuan, J.Y.; Huang, T.H. Rehabilitation robotic prostheses for upper extremity. Contemp. Issues
Syst. Sci. Eng. 2015, 661–697. [CrossRef]
36. Naidu, D.S.; Chen, C.H.; Perez, A.; Schoen, M.P. Control strategies for smart prosthetic hand technology: An overview. In
Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver,
BC, Canada, 20–24 August 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 4314–4317.
37. Ernst, H.A. MH-1, a computer-operated mechanical hand. In Proceedings of the Spring Joint Computer Conference, San Francisco,
CA, USA, 1–3 May 1962; ACM: New York, NY, USA, 1962; pp. 39–51.
38. Tomovic, R.; Boni, G. An adaptive artificial hand. Ire Trans. Autom. Control. 1962, 7, 3–10. [CrossRef]
39. Yamashita, T. Engineering approaches to function of fingers. Rep. Inst. Ind. Sci. Univ. Tokyo 1963, 13, 60–110.
40. Michelman, P. Precision object manipulation with a multifingered robot hand. IEEE Trans. Robot. Autom. 1998, 14, 105–113.
[CrossRef]
41. van Duinen, H.; Gandevia, S.C. Constraints for control of the human hand. J. Physiol. 2011, 589, 5583–5593. [CrossRef]
42. Santello, M.; Flanders, M.; Soechting, J.F. Postural hand synergies for tool use. J. Neurosci. 1998, 18, 10105–10115. [CrossRef]
[PubMed]
43. Ozawa, R.; Tahara, K. Grasp and dexterous manipulation of multi-fingered robotic hands: A review from a control view point.
Adv. Robot. 2017, 31, 1030–1050. [CrossRef]
44. Li, Y.; Fu, J.L.; Pollard, N.S. Data-driven grasp synthesis using shape matching and task-based pruning. IEEE Trans. Vis. Comput.
Graph. 2007, 13, 732–747. [CrossRef] [PubMed]
45. Cordella, F.; Zollo, L.; Salerno, A.; Accoto, D.; Guglielmelli, E.; Siciliano, B. Human hand motion analysis and synthesis of optimal
power grasps for a robotic hand. Int. J. Adv. Robot. Syst. 2014, 11, 37. [CrossRef]
46. Dafotakis, M.; Sparing, R.; Eickhoff, S.B.; Fink, G.R.; Nowak, D.A. On the role of the ventral premotor cortex and anterior
intraparietal area for predictive and reactive scaling of grip force. Brain Res. 2008, 1228, 73–80. [CrossRef]
47. Prabhu, G.; Voss, M.; Brochier, T.; Cattaneo, L.; Haggard, P.; Lemon, R. Excitability of human motor cortex inputs prior to grasp. J.
Physiol. 2007, 581, 189–201. [CrossRef]
48. Loh, M.N.; Kirsch, L.; Rothwell, J.C.; Lemon, R.N.; Davare, M. Information about the weight of grasped objects from vision and
internal models interacts within the primary motor cortex. J. Neurosci. 2010, 30, 6984–6990. [CrossRef]
49. Valero-Cuevas, F.J.; Santello, M. On neuromechanical approaches for the study of biological and robotic grasp and manipulation.
J. Neuroeng. Rehabil. 2017, 14, 101. [CrossRef]
50. Fox, P.T.; Raichle, M.E. Focal physiological uncoupling of cerebral blood flow and oxidative metabolism during somatosensory
stimulation in human subjects. Proc. Natl. Acad. Sci. USA 1986, 83, 1140–1144. [CrossRef]
51. Bandettini, P.A.; Jesmanowicz, A.; Wong, E.C.; Hyde, J.S. Processing strategies for time-course data sets in functional MRI of the
human brain. Magn. Reson. Med. 1993, 30, 161–173. [CrossRef]
52. Logothetis, N.K.; Pauls, J.; Augath, M.; Trinath, T.; Oeltermann, A. Neurophysiological investigation of the basis of the fMRI
signal. Nature 2001, 412, 150. [CrossRef] [PubMed]
53. Pascual-Leone, A.; Walsh, V.; Rothwell, J. Transcranial magnetic stimulation in cognitive neuroscience—Virtual lesion,
chronometry, and functional connectivity. Curr. Opin. Neurobiol. 2000, 10, 232–237. [CrossRef]
54. Culham, J.C.; Cavina-Pratesi, C.; Singhal, A. The role of parietal cortex in visuomotor control: What have we learned from
neuroimaging? Neuropsychologia 2006, 44, 2668–2684. [CrossRef]
55. Castiello, U. The neuroscience of grasping. Nat. Rev. Neurosci. 2005, 6, 726. [CrossRef] [PubMed]
56. Kroliczak, G.; McAdam, T.D.; Quinlan, D.J.; Culham, J.C. The human dorsal stream adapts to real actions and 3D shape processing:
A functional magnetic resonance imaging study. J. Neurophysiol. 2008, 100, 2627–2639. [CrossRef] [PubMed]
57. Grefkes, C.; Fink, G.R. The functional organization of the intraparietal sulcus in humans and monkeys. J. Anat. 2005, 207, 3–17.
[CrossRef] [PubMed]
58. Tunik, E.; Rice, N.J.; Hamilton, A.; Grafton, S.T. Beyond grasping: Representation of action in human anterior intraparietal sulcus.
Neuroimage 2007, 36, T77–T86. [CrossRef]
59. Savelsbergh, G.J.P.; Steenbergen, B.; Van der Kamp, J. The role of fragility information in the guidance of the precision grip. Hum.
Mov. Sci. 1996, 15, 115–127. [CrossRef]
60. Bootsma, R.J.; Marteniuk, R.G.; MacKenzie, C.L.; Zaal, F.T.J.M. The speed-accuracy trade-off in manual prehension: Effects of
movement amplitude, object size and object width on kinematic characteristics. Exp. Brain Res. 1994, 98, 535–541. [CrossRef]
61. Weir, P.L.; MacKenzie, C.L.; Marteniuk, R.G.; Cargoe, S.L.; Frazer, M.B. The effects of object weight on the kinematics of
prehension. J. Mot. Behav. 1991, 23, 192–204. [CrossRef]
62. Weir, P.L.; MacKenzie, C.L.; Marteniuk, R.G.; Cargoe, S.L. Is Object Texture a Constraint on Human Prehension: Kinematic
Evidence. J. Mot. Behav. 1991, 23, 205–210. [CrossRef]
63. Johansson, R.S.; Westling, G. Coordinated isometric muscle commands adequately and erroneously programmed for the weight
during lifting task with precision grip. Exp. Brain Res. 1988, 71, 59–71. [CrossRef] [PubMed]
64. Gordon, A.M.; Forssberg, H.; Johansson, R.S.; Westling, G. Visual size cues in the programming of manipulative forces during
precision grip. Exp. Brain Res. 1991, 83, 477–482. [CrossRef] [PubMed]
65. Ansuini, C.; Santello, M.; Massaccesi, S.; Castiello, U. Effects of end-goal on hand shaping. J. Neurophysiol. 2006, 95, 2456–2465.
[CrossRef] [PubMed]
66. Ansuini, C.; Giosa, L.; Turella, L.; Altoè, G.; Castiello, U. An object for an action, the same object for other actions: Effects on hand
shaping. Exp. Brain Res. 2008, 185, 111–119. [CrossRef]
67. Cohen, R.G.; Rosenbaum, D.A. Where grasps are made reveals how grasps are planned: Generation and recall of motor plans.
Exp. Brain Res. 2004, 157, 486–495. [CrossRef]
68. Armbrüster, C.; Spijkers, W. Movement planning in prehension: Do intended actions influence the initial reach and grasp
movement? Mot. Control. 2006, 10, 311–329. [CrossRef]
69. Moore, K.L.; Dalley, A.F.; Agur, A.M.R. Clinically Oriented Anatomy; Lippincott Williams & Wilkins: Philadelphia, PA, USA, 2013.
70. Baumann, M.A.; Fluet, M.C.; Scherberger, H. Context-specific grasp movement representation in the macaque anterior
intraparietal area. J. Neurosci. 2009, 29, 6436–6448. [CrossRef]
71. Rizzolatti, G.; Camarda, R.; Fogassi, L.; Gentilucci, M.; Luppino, G.; Matelli, M. Functional organization of inferior area 6 in the
macaque monkey. Exp. Brain Res. 1988, 71, 491–507. [CrossRef]
72. Fluet, M.C.; Baumann, M.A.; Scherberger, H. Context-specific grasp movement representation in macaque ventral premotor
cortex. J. Neurosci. 2010, 30, 15175–15184. [CrossRef]
73. Castiello, U.; Begliomini, C. The cortical control of visually guided grasping. Neuroscientist 2008, 14, 157–170. [CrossRef]
[PubMed]
74. Bosco, A.; Breveglieri, R.; Chinellato, E.; Galletti, C.; Fattori, P. Reaching activity in the medial posterior parietal cortex of monkeys
is modulated by visual feedback. J. Neurosci. 2010, 30, 14773–14785. [CrossRef] [PubMed]
75. Johnson, P.B.; Ferraina, S.; Bianchi, L.; Caminiti, R. Cortical networks for visual reaching: Physiological and anatomical
organization of frontal and parietal lobe arm regions. Cereb. Cortex 1996, 6, 102–119. [CrossRef] [PubMed]
76. Caminiti, R.; Johnson, P.B.; Galli, C.; Ferraina, S.; Burnod, Y. Making arm movements within different parts of space: The
premotor and motor cortical representation of a coordinate system for reaching to visual targets. J. Neurosci. 1991, 11, 1182–1197.
[CrossRef] [PubMed]
77. Davare, M.; Kraskov, A.; Rothwell, J.C.; Lemon, R.N. Interactions between areas of the cortical grasping network. Curr. Opin.
Neurobiol. 2011, 21, 565–570. [CrossRef]
78. Turella, L.; Lingnau, A. Neural correlates of grasping. Front. Hum. Neurosci. 2014, 8, 686. [CrossRef]
79. Macuga, K.L.; Frey, S.H. Neural representations involved in observed, imagined, and imitated actions are dissociable and
hierarchically organized. Neuroimage 2012, 59, 2798–2807. [CrossRef]
80. Rizzolatti, G.; Luppino, G. The cortical motor system. Neuron 2001, 31, 889–901. [CrossRef]
81. Filimon, F. Human cortical control of hand movements: Parietofrontal networks for reaching, grasping, and pointing.
Neuroscientist 2010, 16, 388–407. [CrossRef]
82. Jonas, E.; Kording, K.P. Could a neuroscientist understand a microprocessor? PLoS Comput. Biol. 2017, 13, e1005268. [CrossRef]
83. Quintero, D. Properties of cutaneous mechanoreceptors in the human hand related to touch sensation. Hum. Neurobiol. 1984,
3, 3–14.
84. Macefield, V. The signalling of touch, finger movements and manipulation forces by mechanoreceptors in human skin. In
Advances in Psychology; Elsevier: Amsterdam, The Netherlands, 1998; Volume 127, pp. 89–130.
85. Johansson, R.S.; Flanagan, J.R. Tactile sensory control of object manipulation in humans. In Handbook of the Senses: Volume
5-Somatosensation; Elsevier: Amsterdam, The Netherlands, 2007.
86. Vallbo, Å. Touch, Sensory Coding of, in the Human Hand. In Sensory Systems: II; Springer: Berlin/Heidelberg, Germany, 1988;
pp. 136–138.
87. Johansson, R.S.; Landström, U.; Lundström, R. Responses of mechanoreceptive afferent units in the glabrous skin of the human
hand to sinusoidal skin displacements. Brain Res. 1982, 244, 17–25. [CrossRef]
88. Löfvenberg, J.; Johansson, R.S. Regional differences and interindividual variability in sensitivity to vibration in the glabrous skin
of the human hand. Brain Res. 1984, 301, 65–72.
89. Johansson, R.S.; Westling, G. Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip
when lifting rougher or more slippery objects. Exp. Brain Res. 1984, 56, 550–564. [CrossRef] [PubMed]
90. Goodwin, A.W.; Jenmalm, P.; Johansson, R.S. Control of grip force when tilting objects: Effect of curvature of grasped surfaces
and applied tangential torque. J. Neurosci. 1998, 18, 10724–10734. [CrossRef]
91. Wing, A.M.; Lederman, S.J. Anticipatory load torques produced by voluntary movements. J. Exp. Psychol. Hum. Percept. Perform.
1998, 24, 1571. [CrossRef] [PubMed]
92. Johansson, R.S.; Backlin, J.L.; Burstedt, M.K.O. Control of grasp stability during pronation and supination movements. Exp. Brain
Res. 1999, 128, 20–30. [CrossRef]
93. Flanagan, J.R.; Wing, A.M. The stability of precision grip forces during cyclic arm movements with a hand-held load. Exp. Brain
Res. 1990, 105, 455–464. [CrossRef]
94. Flanagan, J.R.; Tresilian, J.R. Grip-load force coupling: A general control strategy for transporting objects. J. Exp. Psychol. Hum.
Percept. Perform. 1994, 20, 944. [CrossRef]
95. Johansson, R.S.; Westling, G. Programmed and triggered actions to rapid load changes during precision grip. Exp. Brain Res.
1988, 71, 72–86. [CrossRef]
96. Witney, A.G.; Goodbody, S.J.; Wolpert, D.M. Predictive motor learning of temporal delays. J. Neurophysiol. 1999, 82, 2039–2048.
[CrossRef]
97. Flanagan, J.R.; Wing, A.M. The role of internal models in motion planning and control: Evidence from grip force adjustments
during movements of hand-held loads. J. Neurosci. 1997, 17, 1519–1528. [CrossRef] [PubMed]
98. Flanagan, J.R.; Vetter, P.; Johansson, R.S.; Wolpert, D.M. Prediction precedes control in motor learning. Curr. Biol. 2003,
13, 146–150. [CrossRef]
99. Cadoret, G.; Smith, A.M. Friction, not texture, dictates grip forces used during object manipulation. J. Neurophysiol. 1996,
75, 1963–1969. [CrossRef] [PubMed]
100. Jenmalm, P.; Johansson, R.S. Visual and somatosensory information about object shape control manipulative fingertip forces. J.
Neurosci. 1997, 17, 4486–4499. [CrossRef] [PubMed]
101. Jenmalm, P.; Dahlstedt, S.; Johansson, R.S. Visual and tactile information about object-curvature control fingertip forces and grasp
kinematics in human dexterous manipulation. J. Neurophysiol. 2000, 84, 2984–2997. [CrossRef] [PubMed]
102. Johansson, R.S.; Flanagan, J.R. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat. Rev.
Neurosci. 2009, 10, 345. [CrossRef] [PubMed]
103. Salisbury, L.L.; Colman, A.B. A mechanical hand with automatic proportional control of prehension. Med. Biol. Eng. 1967,
5, 505–511. [CrossRef] [PubMed]
104. Baits, J.C.; Todd, R.W.; Nightingale, J.M. Paper 10: The Feasibility of an Adaptive Control Scheme for Artificial Prehension. Proc.
Inst. Mech. Eng. Conf. Proc. 1968, 183, 54–59. [CrossRef]
105. Childress, D.S. Closed-loop control in prosthetic systems: Historical perspective. Ann. Biomed. Eng. 1980, 8, 293–303. [CrossRef]
106. Codd, R.D.; Nightingale, J.M.; Todd, R.W. An adaptive multi-functional hand prosthesis. J. Physiol. 1973, 232, 55P. [PubMed]
107. Nightingale, J.M. Microprocessor control of an artificial arm. J. Microcomput. Appl. 1985, 8, 167–173. [CrossRef]
108. Chappell, P.H.; Kyberd, P.J. Prehensile control of a hand prosthesis by a microcontroller. J. Biomed. Eng. 1991, 13, 363–369.
[CrossRef]
109. Napier, J.R. The prehensile movements of the human hand. J. Bone Jt. Surg. Br. Vol. 1956, 38, 902–913. [CrossRef]
110. Kyberd, P.J.; Chappell, P.H. The Southampton Hand: An intelligent myoelectric prosthesis. J. Rehabil. Res. Dev. 1994, 31, 326.
111. Nightingale, J.; Todd, R. An adaptively-controlled prosthetic hand. Eng. Med. 1971, 1, 3–6. [CrossRef]
112. Tomovic, R.; Bekey, G.; Karplus, W. A strategy for grasp synthesis with multifingered robot hands. In Proceedings of the 1987
IEEE International Conference on Robotics and Automation, Raleigh, NC, USA, 31 March–3 April 1987; IEEE: Piscataway, NJ,
USA, 1987; Volume 4, pp. 83–89.
113. Tomovic, R. Control of assistive systems by external reflex arcs. In Advances in External Control of Human Extremities VIII; ETAN:
Belgrade, Yugoslavia, 1984; pp. 7–21.
114. Rakić, M. Paper 11: The ‘Belgrade Hand Prosthesis’. Proc. Inst. Mech. Eng. Conf. Proc. 1968, 183, 60–67.
115. Cipriani, C.; Zaccone, F.; Stellin, G.; Beccai, L.; Cappiello, G.; Carrozza, M.C.; Dario, P. Closed-loop controller for a bio-inspired
multi-fingered underactuated prosthesis. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation,
2006, ICRA 2006, Orlando, FL, USA, 15–19 May 2006; IEEE: Piscataway, NJ, USA, 2006; pp. 2111–2116.
116. Cutkosky, M.R. On grasp choice, grasp models, and the design of hands for manufacturing tasks. IEEE Trans. Robot. Autom. 1989,
5, 269–279. [CrossRef]
117. Pasluosta, C.F.; Chiu, A.W.L. Evaluation of a neural network-based control strategy for a cost-effective externally-powered
prosthesis. Assist. Technol. 2012, 24, 196–208. [CrossRef]
118. Sollerman, C.; Ejeskär, A. Sollerman hand function test: A standardised method and its use in tetraplegic patients. Scand. J. Plast.
Reconstr. Surg. Hand Surg. 1995, 29, 167–176. [CrossRef]
119. Quinayás, C.; Ruiz, A.; Torres, L.; Gaviria, C. Hierarchical-Architecture Oriented to Multi-task Planning for Prosthetic Hands
Controlling. In Proceedings of the International Work-Conference on the Interplay Between Natural and Artificial Computation,
Almeria, Spain, 3–7 June 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 157–166.
120. Quinayás-Burgos, C.A.; López, C.A.G. Sistema de identificación de intención de movimiento para el control mioeléctrico de una
prótesis de mano robótica. Ing. Univ. 2015, 19, 27–50. [CrossRef]
121. Quinayás-Burgos, C.A.; Muñoz-Añasco, M.; Vivas-Albán, Ó.A.; Gaviria-López, C.A. Diseño y construcción de la prótesis robótica
de mano UC-1. Ing. Univ. 2010, 14, 223–237.
122. Iberall, T. The nature of human prehension: Three dextrous hands in one. In Proceedings of the 1987 IEEE International
Conference on Robotics and Automation, Raleigh, NC, USA, 31 March–3 April 1987; IEEE: Piscataway, NJ, USA, 1987; Volume 4,
pp. 396–401.
123. Chappell, P.; Nightingale, J.; Kyberd, P.; Barkhordar, M. Control of a single degree of freedom artificial hand. J. Biomed. Eng. 1987,
9, 273–277. [CrossRef]
124. Light, C.M.; Chappell, P.H.; Hudgins, B.; Engelhart, K. Intelligent multifunction myoelectric control of hand prostheses. J. Med.
Eng. Technol. 2002, 26, 139–146. [CrossRef] [PubMed]
125. Zhu, G.; Duan, X.; Deng, H. Hybrid force-position fuzzy control for a prosthetic hand. In Proceedings of the International
Conference on Intelligent Robotics and Applications, Busan, Korea, 25–28 September 2013; Springer: Berlin/Heidelberg, Germany,
2013; pp. 415–426.
126. Betti, S.; Castiello, U.; Begliomini, C. Reach-to-Grasp: A Multisensory Experience. Front. Psychol. 2021, 12, 213. [CrossRef]
[PubMed]
127. Muzumdar, A. Powered Upper Limb Prostheses: Control, Implementation and Clinical Application; Springer Science & Business Media:
Berlin/Heidelberg, Germany, 2004.
128. Noce, E.; Gentile, C.; Cordella, F.; Ciancio, A.; Piemonte, V.; Zollo, L. Grasp control of a prosthetic hand through peripheral neural
signals. J. Phys. Conf. Ser. 2018, 1026, 012006. [CrossRef]
129. Cordella, F.; Gentile, C.; Zollo, L.; Barone, R.; Sacchetti, R.; Davalli, A.; Siciliano, B.; Guglielmelli, E. A force-and-slippage
control strategy for a poliarticulated prosthetic hand. In Proceedings of the 2016 IEEE International Conference on Robotics and
Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 3524–3529.
130. Gentile, C.; Cordella, F.; Rodrigues, C.R.; Zollo, L. Touch-and-slippage detection algorithm for prosthetic hands. Mechatronics
2020, 70, 102402. [CrossRef]
131. McCloskey, D. Muscular and cutaneous mechanisms in the estimation of the weights of grasped objects. Neuropsychologia 1974,
12, 513–520. [CrossRef]
132. Giachritsis, C.; Wright, R.; Wing, A. The contribution of proprioceptive and cutaneous cues in weight perception: Early evidence
for maximum-likelihood integration. In Proceedings of the International Conference on Human Haptic Sensing and Touch
Enabled Computer Applications, Amsterdam, The Netherlands, 8–10 July 2010; Springer: Berlin/Heidelberg, Germany, 2010;
pp. 11–16.
133. Kargov, A.; Pylatiuk, C.; Martin, J.; Schulz, S.; Döderlein, L. A comparison of the grip force distribution in natural hands and in
prosthetic hands. Disabil. Rehabil. 2004, 26, 705–711. [CrossRef]
134. Leone, F.; Gentile, C.; Ciancio, A.L.; Gruppioni, E.; Davalli, A.; Sacchetti, R.; Guglielmelli, E.; Zollo, L. Simultaneous sEMG
classification of wrist/hand gestures and forces. Front. Neurorobot. 2019, 13, 42. [CrossRef]
135. Provenzale, A.; Cordella, F.; Zollo, L.; Davalli, A.; Sacchetti, R.; Guglielmelli, E. A grasp synthesis algorithm based on postural
synergies for an anthropomorphic arm-hand robotic system. In Proceedings of the 5th IEEE RAS/EMBS International Conference
on Biomedical Robotics and Biomechatronics, Sao Paulo, Brazil, 12–15 August 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 958–963.
136. Laschi, C.; Dario, P.; Carrozza, M.C.; Guglielmelli, E.; Teti, G.; Taddeucci, D.; Leoni, F.; Massa, B.; Zecca, M.; Lazzarini, R. Grasping
and Manipulation in Humanoid Robotics; Scuola Superiore Sant Anna: Pisa, Italy, 2000.
137. Kent, B.A.; Karnati, N.; Engeberg, E.D. Electromyogram synergy control of a dexterous artificial hand to unscrew and screw
objects. J. Neuroeng. Rehabil. 2014, 11, 41. [CrossRef]
138. Scott, R.N. Biomedical Engineering in upper-limb prosthetics. In The Comprehensive Management of the Upper-Limb Amputee;
Atkins, D.J., Meier, R.H., Eds.; Springer: New York, NY, USA, 1989; pp. 173–189.
139. Carrozza, M.C.; Massa, B.; Micera, S.; Lazzarini, R.; Zecca, M.; Dario, P. The development of a novel prosthetic hand-ongoing
research and preliminary results. IEEE/ASME Trans. Mechatron. 2002, 7, 108–114. [CrossRef]
140. Kyberd, P.J.; Holland, O.E.; Chappell, P.H.; Smith, S.; Tregidgo, R.; Bagwell, P.J.; Snaith, M. MARCUS: A two degree of freedom
hand prosthesis with hierarchical grip control. IEEE Trans. Rehabil. Eng. 1995, 3, 70–76. [CrossRef]
141. Kyberd, P.J.; Gow, D.; Scott, H.; Griffiths, M.; Sperling, L.; Sandsjo, L.; Almstrom, C.; Wartenberg, C.; Jonsson, S. A comparison
of upper limb prostheses users in Europe. In Proceedings of the 1999 MyoElectric Controls/Powered Prosthetics Symposium,
Fredericton, NB, Canada, 25–27 August 1999.
142. Woodward, R.B.; Hargrove, L.J. Adapting myoelectric control in real-time using a virtual environment. J. Neuroeng. Rehabil. 2019,
16, 11. [CrossRef] [PubMed]
143. Jacobs, S.; Danielmeier, C.; Frey, S.H. Human anterior intraparietal and ventral premotor cortices support representations of
grasping with the hand or a novel tool. J. Cogn. Neurosci. 2010, 22, 2594–2608. [CrossRef]
144. Betti, S.; Zani, G.; Guerra, S.; Castiello, U.; Sartori, L. Reach-to-grasp movements: A multimodal techniques study. Front. Psychol.
2018, 9, 990. [CrossRef]
145. Handjaras, G.; Bernardi, G.; Benuzzi, F.; Nichelli, P.F.; Pietrini, P.; Ricciardi, E. A topographical organization for action
representation in the human brain. Hum. Brain Mapp. 2015, 36, 3832–3844. [CrossRef]
146. Mereu, F.; Leone, F.; Gentile, C.; Cordella, F.; Gruppioni, E.; Zollo, L. Control Strategies and Performance Assessment of
Upper-Limb TMR Prostheses: A Review. Sensors 2021, 21, 1953. [CrossRef]
147. Vu, P.P.; Vaskov, A.K.; Irwin, Z.T.; Henning, P.T.; Lueders, D.R.; Laidlaw, A.T.; Davis, A.J.; Nu, C.S.; Gates, D.H.;
Gillespie, R.B.; et al. A regenerative peripheral nerve interface allows real-time control of an artificial hand in upper
limb amputees. Sci. Transl. Med. 2020, 12, eaay2857. [CrossRef]
148. Kumar, N.G.; Kung, T.A.; Cederna, P.S. Regenerative Peripheral Nerve Interfaces for Advanced Control of Upper Extremity
Prosthetic Devices. Hand Clin. 2021, 37, 425–433. [CrossRef]
149. Beck, M.M.; Spedden, M.E.; Dietz, M.J.; Karabanov, A.N.; Christensen, M.S.; Lundbye-Jensen, J. Cortical signatures of precision
grip force control in children, adolescents and adults. Elife 2021, 10, e61018. [CrossRef]
150. Leone, F.; Gentile, C.; Cordella, F.; Gruppioni, E.; Guglielmelli, E.; Zollo, L. A parallel classification strategy to simultaneous
control elbow, wrist, and hand movements. J. Neuroeng. Rehabil. 2022, 19, 1–17. [CrossRef] [PubMed]
151. Matulevich, B.; Loeb, G.E.; Fishel, J.A. Utility of contact detection reflexes in prosthetic hand control. In Proceedings of the 2013
IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; IEEE: Piscataway, NJ,
USA, 2013; pp. 4741–4746.