A Taxonomy of Everyday Grasps in Action

*This research is supported by NSF award IIS-1218182
1 School of Computer Science, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213, USA. [email protected]

Abstract— Grasping has been well studied in the robotics and human subjects literature, and numerous taxonomies have been developed to capture the range of grasps employed in work settings or everyday life. But how completely do these taxonomies capture grasping actions that we see every day? We asked two subjects to monitor every action that they performed with their hands during a typical day, as well as to role-play actions important for self-care, rehabilitation, and various careers, and then to classify all grasping actions using existing taxonomies. While our subjects were able to classify many grasps, they also found a collection of grasps that could not be classified. In addition, our subjects observed that single entries in the taxonomy captured not one grasp, but many. When we investigated, we found that these grasps were distinguished by features related to the grasping action, such as intended motion, force, and stiffness – properties also needed for robot control. We suggest a format for augmenting grasp taxonomies that includes features of motion, force, and stiffness using a language that can be understood and expressed by subjects with light training, as would be needed, for example, for annotating examples or coaching a robot. This paper describes our study and its results, and documents our annotated database.

I. INTRODUCTION

Grasping is an essential part of people's daily lives and is critical for creating robots that can interact with and make changes to their environment. Grasping has been the focus of numerous human studies (e.g. [1]), and a large body of robotics research has worked within a grasp-move-ungrasp paradigm. Within these studies, one area of focus has been hand shape and the contacts between hand and object, which constrain how the hand and object can interact.

A number of taxonomies with hand shape and object contact as central elements have been developed to classify grasps, e.g. [2][3][4][5]. These taxonomies have been widely used in robotics, for applications including grasp recognition [6][7], robot hand design and evaluation [8], programming by demonstration [9], and even to support more sophisticated interaction with grasp sensitive objects [10]. They also allow researchers to communicate grasp differences in research and discussion, distinguishing power from precision grasps, tripod vs. pinch, or spherical vs. cylindrical, for example.

However, the actual act of grasping can be complex. We may slide, rotate, tumble, or bend an object in order to grasp it, e.g. [11][12]. In fact, pushing, rotating, tumbling, and other manipulations have been studied independently (e.g., [13][11][14]) and can be considered essential aspects of any dexterous robot's repertoire.

In this paper, we ask to what extent our existing grasp taxonomies capture the actions we do in everyday life. Our goal is to build a taxonomy / database that captures most everyday grasping and manipulation actions.

Towards this goal, two subjects attempted to capture all actions accomplished during a typical day, with a focus on critical humanoid robot capabilities such as home care and manipulation in unstructured environments such as a home or workplace. For each observed grasp or manipulation action, our subjects attempted to classify it using the Comprehensive Grasp Taxonomy of Feix and colleagues [5]. In all, 179 distinct grasping actions were captured and classified.

We found that although many grasping actions could be classified in the existing taxonomies, there were important differences between grasps that the taxonomy did not consider. To capture these differences, we propose an extended set of annotations capturing force, motion, and stiffness information. Table XII shows an example. Our goal in devising this classification was to communicate these grasping action characteristics as precisely as possible while still making it possible for individuals with light training to understand and classify examples of manipulation actions or communicate differences between those actions to a robot, for example.

Furthermore, we found 40 grasp types which could not be well captured by existing taxonomies, including actions of pushing, grasping while pressing a button or lever, and grasping with extension (inside-out) forces. We believe our database is an improvement on our prior work, because we characterize human grasps by taking into account forces and motion exerted after a grasp is achieved. These added properties may tie into existing impedance [15] and operational space controllers [16] used in robotics.

We report our complete process and findings below. The database of classified and annotated grasps from our study can be viewed at [17].

II. RELATED WORK

A. Grasp taxonomies

Perhaps the earliest often cited grasp taxonomy is that of Schlesinger [18]. Napier [19] also contributes a basic taxonomy and an interesting discussion of hand evolution and use. Grasp taxonomies have been developed that are targeted at tasks of everyday living, including those of Kapandji [20], Edwards et al. [4] and Kamakura et al. [2]. Kamakura and colleagues, for example, classified static prehensile patterns of normal hands into 14 patterns under 4 categories (power grip, intermediate grip, precision grip and grip involving no thumb). They illustrated detailed contact areas on the hand for each grasp and analyzed for which objects the grasp may be used.
Perhaps the most widely cited taxonomy in robotics is that of Cutkosky [3], which includes 16 grasp types observed in skilled machining tasks. The Cutkosky taxonomy consists of a hierarchical tree of grasps, with categories classified under power and precision. Moving from left to right in the tree, the grasps become less powerful and the grasped objects become smaller. Zheng and his colleagues [21] used this taxonomy to capture the daily activities of a skilled machinist and a house maid, giving for the first time a count of how frequently different grasps are used. The intent of our study is similar. However, we consider a broader variety of actions beyond static grasps and make special note of differences observed in grasps that have the same entries within the grasp taxonomy.

Feix et al. [5] recently developed a comprehensive taxonomy of grasps that brings together previous research with their own observations. They propose a definition of a grasp as follows: “A grasp is every static hand posture with which an object can be held securely with one hand.” This definition excludes intrinsic movements, bimanual tasks, gravity dependent grasps, and flat hand grasps, for a total of 33 classified grasp types. Because it was developed with the goal of being inclusive, we selected this taxonomy as a starting place in our experiments. However, our taxonomy is not limited to grasps, and reincorporates non-prehensile manipulation tasks that people do every day.

B. Manipulation Taxonomies

A number of taxonomies have been developed to express manipulation actions. Chang and Pollard [22] classify manipulations prior to grasping, with a focus on how the object is adjusted, considering both rigid transformations and non-rigid reconfigurations. Worgotter and colleagues [23] discuss how manipulation actions are structured in space and time. Focusing on actions of bringing together and breaking apart, they identify 30 fundamental manipulations that allow sequences of activities to be encoded. Elliott and Connolly [24] classify coordinated motions of the hand that are used to manipulate objects, identifying three classes of intrinsic movements: simple synergies such as squeeze, reciprocal synergies such as roll, and sequential patterns such as a rotary stepping motion of the fingers to change contact positions on the object. Bullock et al. [25] encode manipulation instances at a more abstract level, focusing on motion of the hand and relative motion of the hand and object at contact, with the goal of creating a classification scheme that does not assume a specific hand design. We adopt a structure similar to theirs for expressing the intended motion of a grasped object, but incorporate it as extra information within the context of a more conventional grasp taxonomy.

Torigoe [26] investigated manipulation in 74 species of non-human primates, identifying over 500 different body part manipulation acts, 300 of which are related to hand manipulation, including drape, flip, pick up, pull, push or press, roll, rotate, throw, untwist and so on. We find that a similar approach is useful for distinguishing between different grasps that have the same grasp taxonomy label, and we use this approach to capture the types of force applied in human manipulation.

III. METHODS

We compiled a task list from various sources for our study. First, we studied previous literature that measured self-care and mobility skills for patient rehabilitation [27][28][29][30]. The skills measured in these papers, such as dressing, eating, and grooming, cover typical and important tasks humans need to do, even for those who are disabled. Our initial list of actions was the union of the tasks mentioned in those papers. In work such as Choi et al. [31], tasks were ranked by importance, and tasks like buttoning, putting on socks, and personal hygiene were discarded because they received a low ranking and are difficult for a robot to accomplish. However, we included even these lower-ranked tasks in our list, with the goal of having a more inclusive study.

We next observed two college students' lives from the time they woke up until the time they went to bed. We categorized all the hand gestures and motions that each subject used into hundreds of tasks. However, we felt this was insufficient, since there are many skilled gestures (e.g. of tradespeople) that are not found in everyday life, and the task list so far was biased toward the office settings of the subjects. Therefore, we expanded our task list to include specific tasks that people from various careers would accomplish in their workplace.

Next, we further separated the compound tasks into small task-components and movement elements, as in Kopp et al. [27]. For example, wearing a T-shirt was broken down into three basic tasks: (1) put the arms into the T-shirt sleeves, (2) grab the neck hole and move the head through it, (3) pull down and straighten the shirt. We collapsed similar gestures together and classified these movements into the existing 33-grasp database of [5]. When we encountered daily-use hand gestures that were not in the basic database, we added them to the database.

Our final database contains 50 different grasp types, 4 press types, 10 grasp-and-press types, 2 extend types and 7 other hand types. We also illustrate where each movement may be used in daily life with corresponding pictures.

IV. DATABASE ANNOTATIONS

Fig. 1 shows the classification we have developed in order to distinguish the different manipulation actions we have observed. The focus of previous grasp taxonomies has often been on hand shape (highlighted in purple).

With our observations, however, we annotated grasps with four features: (1) hand shape, (2) force type, (3) direction, and (4) flow. Object-related properties are another factor that influences hand shape and motion, but these relationships are not made explicit in our database. In contrast to traditional grasp taxonomy research, our research focuses on grasps within the context of the intended action. The rationale behind this focus came about when we mapped the grasping actions we encountered onto the existing grasp taxonomy of [5] and realized that a wide variety of these grasping actions belonged to one grasp type within the taxonomy, but involved very different motion, force, or flow.
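The four annotation features can be pictured as a simple record type. The following is a hypothetical sketch, not the authors' actual database format: the class and field names are our invention, and the two example entries are transcribed from the paper's Large Diameter examples (Table XII).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class GraspAnnotation:
    """One annotated manipulation action (field names are our invention)."""
    hand_shape: str            # grasp type from the Feix taxonomy, e.g. "Large Diameter"
    force_type: str            # e.g. "Hold", "Grab", "Press", "Pull", "Twist"
    motion_dir: Optional[str]  # e.g. "around y axis (hand)"; None when there is no motion
    force_dir: Optional[str]   # e.g. "-z (global)"; None when unspecified
    flow: str                  # combination of "Free", "Bound", "Half Bound"
    annotation: str            # short natural-language description

# Two entries sharing one hand shape but differing in action-level features
drink = GraspAnnotation("Large Diameter", "Hold", "around y axis (hand)",
                        None, "Free Motion/Bound Force", "Drink water")
bottle = GraspAnnotation("Large Diameter", "Hold", None,
                         "-z (global)", "Bound Force", "Hold a bottle")

# A hand-shape-only taxonomy would collapse these two actions into one entry
assert drink.hand_shape == bottle.hand_shape
assert (drink.motion_dir, drink.force_dir, drink.flow) != \
       (bottle.motion_dir, bottle.force_dir, bottle.flow)
```

Keeping motion, force, and flow alongside hand shape is what lets two actions that share a single taxonomy entry remain distinguishable in the database.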
In our database, both force and motion are important. For this reason, grab and hold are not the same, even though they feature the same motion (i.e. no motion). We define grab as touching or securing an object that is resting on a surface. We define hold with a gravitational factor, where the hand/arm is applying an upward force to counteract gravity (Table IV).

TABLE IV: Hold & Grab Examples
Force Type: Grab | Hold
Annotation: Grab the ladder | Hold a laundry detergent

[Table residue, caption lost — further force type examples: Throw: Shoot a basketball; Grab&Press: Press down a door handle]

C. Direction

In order to specify the direction of a force or motion, we need to specify the direction subspace and the coordinate frame, as shown in Table V. The direction subspace describes a subset of the six-dimensional space within which the motion is occurring. Examples of direction subspaces that we use include: (1) along a linear axis, (2) rotation around an axis, (3) movement within a plane, or (4) inwards/outwards (towards or away from the center of an object). We note that the motion direction can be very different from the force direction. For example, when we zip a zipper, the internal force direction of the hand is inwards on the zipper (i.e. grab the zipper tightly), but the direction of motion is along the zipper. Similarly, the internal force direction is inwards to hold the egg beater, but the direction of motion is around the x-axis (Table VI). We use the notation x(45)y to describe a direction rotated 45 degrees from the x-axis toward the y-axis.

TABLE VI: Motion and Force Direction Examples
Motion Axes: along x/-x (object) | around x axis (hand) | along xz plane (hand)
Force Axes: inward, hold zipper | inward, hold egg beater | against the surface
Annotation: Zip a zipper | Beat with egg beater | Move a mouse

Most of the time, we use the local coordinates of the hand to describe the direction of movement. However, we also sometimes use global coordinates of the world or local coordinates of the object, depending on their usefulness.

Hand coordinates: The local coordinates of the hand are defined as follows: the direction of the four fingers defines the x-axis. The y-axis comes out of the palm in the ventral/palmar direction. The z-axis is defined by the thumb pointing away from the little finger, for both hands (Figures 4 and 5). This results in using either the left-hand rule for the left hand or the right-hand rule for the right hand to compute the z-axis. This unorthodox use of coordinate frames results in symmetrical descriptions of movements and grasps for the two hands. Local coordinates of the hand are mostly used when the motion is along one of the hand coordinate axes. For example, Table VII, first column, shows rubbing the hands along the local x-axis.

Fig. 4: Local coordinates and thumb positions of the left hand

Global coordinates: Global coordinates of the world are used when the motion is along the direction of gravity or within a coordinate system that can be fixed to our local environment. For example, when we dribble a basketball, we maneuver the ball within a coordinate frame fixed to the world, not the hand or the ball (Table VII, second column). The direction of gravity is defined as the global z-axis.

Object coordinates: Finally, occasionally the local coordinates of the object must be used since, in some motions, the object shape decides the direction of motion. If the object is a long stick or string type, we define the direction along the stick to be the x-axis. If the object is rectangular in shape, we define the direction along the long side to be the x-axis and the direction along the short side as the z-axis. For example, when we pull out measuring tape, the motion direction is along the tape's long dimension: the x-axis (Table VII, third column).

Many motions or forces can be described naturally in multiple coordinate frames. For example, plugging in a charger could be expressed in the coordinate frame of the charger, the wall, or the hand. We asked our subjects to make the annotations that were most intuitive for them. The important point is that all three coordinate frames are useful, as different actions may have their focus on different frames.

TABLE VII: Coordinate Frame Examples
Coordinate Frame: Hand | Global | Object
Motion Axes: along x/-x | along z/-z | along x/-x
Annotation: Rub hands | Dribble basketball | Measure with a tape measure

D. Flow

The effort factor we use here is flow. Flow comes from the Laban Effort/Shape notation [33]. It refers to “attitude toward bodily tension and control” and can be free, bound, or half-bound. Free refers to the movement of the gesture being very casual, while bound refers to the action being very stiff or tightly controlled. The half-bound annotation is used when the action is bound along one or more axes and free along the rest. For example, in Table XIII, the flow of motion in dragging toilet paper is half-bound because the motion is still free in the plane perpendicular to the axis of the toilet paper. Our informal observation is that most of the time we specify an action as free or bound depending on whether the action includes a goal location. For example, if we try to plug a charger into a wall or stick a key into a key hole, the motion is bound, but if we just throw the key for fun, the action is entirely free (Table VIII).

TABLE VIII: Flow Factor Examples
Flow: Bound Motion/Bound Force | Free Motion/Half Bound Force
Annotation: Stick key into key hole | Hold keys

E. Object related factors

Most grasps depend on the object our hands manipulate; thus, object-related factors are another important feature in describing hand gestures.

From our observations, weight is an important factor, since it affects both the internal and cumulative force applied to the object. A simple example is holding an empty box versus a full box. If the box is empty, we tend to grab the top piece of the box, but if the box is heavy, we hold it from the bottom and lift it up (Table IX).

TABLE IX: Weight of Object Examples
Object weight: Light | Heavy
Annotation: Grab an empty box | Hold a heavy box

The material of the object also strongly affects grasping strategy. For example, grabbing highly deformable material requires continuous adjustment of grasp shape as the object changes shape. Another example of the effect of material is that people will grab raw meat differently than paper.

The shape and size of the object affect hand shape. We usually pinch a thin wire but grab a thick string (see Table X).

Finally, the friction coefficient of an object determines how hard we grab the object. The thick string in Table X is rougher than the exercise bar, so we would grab the bar harder to prevent our hand from sliding.

TABLE X: Shape & Size & Roughness of Object Examples
Size: Thin | Thick | Thick
Roughness: Slippery | Rough | Slippery
Annotation: Grab a wire | Grab a rope | Grab exercise bar

V. RESULTS

Our main result can be found on our website [17]. Of all the force types, as shown in Table I (frequency column), hold (41), grab (32), press (31) and pull (18) make up the majority of tasks that we observed in our study.

We show two examples from our database here, including all entries for the Large Diameter Grasp and all entries for the Lateral Grasp (Table XI).
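Before the example tables, the coordinate conventions described in Section IV can be made concrete in code. This is a sketch under our own assumptions: `hand_frame` and `direction_between` are hypothetical helpers, not the authors' implementation, and reading x(45)y as a direction 45 degrees from the x-axis toward the y-axis is our interpretation of the notation.

```python
import numpy as np

def hand_frame(x_fingers, y_palm, hand="right"):
    """Complete a hand frame from the paper's convention: x along the four
    fingers, y out of the palm, z toward the thumb. Using the right-hand
    rule for the right hand and the left-hand rule for the left hand makes
    descriptions of the two hands symmetric."""
    x = np.asarray(x_fingers, dtype=float)
    y = np.asarray(y_palm, dtype=float)
    x /= np.linalg.norm(x)
    y /= np.linalg.norm(y)
    z = np.cross(x, y)          # right-hand rule
    if hand == "left":
        z = -z                  # mirrored: left-hand rule for the left hand
    return x, y, z

def direction_between(a, b, degrees=45.0):
    """Direction rotated `degrees` from axis a toward axis b, e.g. the
    motion direction written x(45)y in the tables (our reading)."""
    t = np.radians(degrees)
    return np.cos(t) * np.asarray(a, dtype=float) + np.sin(t) * np.asarray(b, dtype=float)

# With fingers along x and the palm normal along y, the thumb axis z flips
# sign between the two hands, so mirrored grasps get identical annotations.
_, _, z_r = hand_frame([1, 0, 0], [0, 1, 0], hand="right")  # z_r = [0, 0, 1]
_, _, z_l = hand_frame([1, 0, 0], [0, 1, 0], hand="left")   # z_l = [0, 0, -1]
```

The sign flip is what lets, say, a left-handed and a right-handed zipping motion share one annotation instead of requiring mirrored entries.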
A. Large diameter cylinder

In a large-diameter grasp (as shown in Table XI), the hand shape is appropriate for larger-diameter cylinder-shaped objects and uses all five fingers. The opposition type is palm, and we use all finger pads. The thumb is clearly abducted.

TABLE XI: Large Diameter and Lateral Grasp
Name: Large Diameter | Lateral
Type: Power | Intermediate
Opp. Type: Palm | Side
Thumb Pos: Abd | Add
VF2: 2-5 | 2
Shape: Cylinder/Cuboid | Card piece
Size: Large Diameter | Thin

TABLE XII: Large Diameter Cylinder Grasp Examples
Force Type: Hold | Hold
Motion Dir: around y axis (hand) | -
Force Dir: - | -z (global)
Flow: Free Motion/Bound Force | Bound Force
Annotation: Drink water | Hold a bottle

Force Type: Hold | Grab&Press
Motion Dir: x(45)y (hand) | -
Force Dir: - | z (global)
Flow: Free Motion/Half Bound Force | Bound Force
Annotation: Throw paper | Grab cabbage

Force Type: Pull | Pull
Motion Dir: -x (hand) | xz plane (hand)
Force Dir: - | -
Flow: Bound Motion/Bound Force | Half Bound Motion/Bound Force
Annotation: Put on gloves (along the arm) | Drag toilet paper

Force Type: Twist | Twist
Motion Dir: around y axis (hand) | around x axis (hand)
Force Dir: - | -
Flow: Bound Motion | Bound Motion
Annotation: Twist the key to start the car | Twist the knob in the car

Force Type: Hold | Rub/Stroke
Motion Dir: xy plane (hand) | xy plane (hand)
Force Dir: - | inwards (hand)
Flow: Free Motion/Half Bound Force | Half Bound Motion/Bound Force
Annotation: Give card to someone | Wipe glasses

Force Type: Hold | Hold
Motion Dir: z (global)/around x axis (hand) | -z (global)/around x axis (hand)
Force Dir: - | -
Flow: Free Motion/Bound Force | Half Bound Motion/Bound Force
Annotation: Eat with scoop | Pour washing powder

[Table residue, caption lost — Lateral grasp example annotations: Tie; Shuffle cards; Lift up the switch; Scratch; Press perfume bottle; Open soda bottle; Use screwdriver; Use pliers]

Some, but not all, of the new grasp types can be found in other taxonomies, such as [20] and [4].

VI. DISCUSSION

Effective grasp taxonomies capture not only hand shape, but also the nature of contact between the hand and object. The best in this regard is perhaps the Kamakura taxonomy [2], which illustrates in great detail the regions on the hand that come in contact with the object. The patterns and extent of these regions reveal much, especially when considering grasp control and robot hand design.

However, we find annotating only shape and contact to be insufficient to convey important differences between everyday actions; in part because this set of actions is broader than grasping, but also because many grasps that may look similar from a snapshot involve very different uses of the hand to accomplish a task.

We find that to communicate these differences, we need to express the type of force, directional information, and stiffness information for the action.

It is interesting to note the similarities between our annotations and the parameters required for impedance control [15] or operational space control [16], where one expresses a task in terms of the desired impedance or motion/force/stiffness properties of the manipulator. It is possible that alternative annotation schemes could be developed that fall even more closely in line with these control approaches. However, through extensive trial and error, we found that the annotation scheme we suggest supports clear communication between people as well as ease of labeling examples. It certainly seems possible that these annotations can be used to inform robot control approaches or be incorporated into a demonstration and coaching framework. Demonstrating these connections is left as future research.

One limitation of this database is that we need a more accurate system for describing the direction of motion and force, one that accounts for directions that are not perfectly aligned with a single axis.

We can also ask whether all entries in our database are relevant for humanoid robots. We believe that as robots become more pervasive, especially in the home and in health care and rehabilitation scenarios, a large majority of the grasps depicted here will become of interest. However, we did not attempt to make this distinction, in order to have a more inclusive and comprehensive database.

It may be possible to organize this database from a different point of view, such as making the force types or motion types the central classification rather than grasp type. We chose grasp type as the first level of organization in order to be consistent with existing taxonomies. However, it is interesting to consider whether a different organization may lead to a simpler or more intuitive way of describing these results.

ACKNOWLEDGMENT

This research was supported by NSF awards IIS-1218182 and IIS-1344361. We would like to thank Daniel Troniak and Michael Koval for many helpful comments, discussion and suggestions.

REFERENCES

[1] C. L. MacKenzie and T. Iberall, The Grasping Hand. Elsevier, 1994, vol. 104.
[2] N. Kamakura, M. Matsuo, H. Ishii, F. Mitsuboshi, and Y. Miura, “Patterns of static prehension in normal hands,” The American Journal of Occupational Therapy, vol. 34, no. 7, pp. 437–445, 1980.
[3] M. R. Cutkosky, “On grasp choice, grasp models, and the design of hands for manufacturing tasks,” IEEE Transactions on Robotics and Automation, vol. 5, no. 3, pp. 269–279, 1989.
[4] S. J. Edwards, D. J. Buckland, and J. D. McCoy-Powlen, Developmental and Functional Hand Grasps. Slack Incorporated, 2002.
[5] T. Feix, R. Pawlik, H. Schmiedmayer, J. Romero, and D. Kragic, “A comprehensive grasp taxonomy,” in Robotics, Science and Systems: Workshop on Understanding the Human Hand for Advancing Robotic Manipulation, June 2009. [Online]. Available: http://grasp.xief.net
[6] S. Ekvall and D. Kragic, “Grasp recognition for programming by demonstration,” in IEEE International Conference on Robotics and Automation (ICRA), 2005, pp. 748–753.
[7] G. Heumer, H. Ben Amor, M. Weber, and B. Jung, “Grasp recognition with uncalibrated data gloves — a comparison of classification methods,” in IEEE Virtual Reality Conference (VR '07), 2007, pp. 19–26.
[8] A. M. Dollar and R. D. Howe, “The highly adaptive SDM hand: Design and performance evaluation,” The International Journal of Robotics Research, vol. 29, no. 5, pp. 585–597, 2010.
[9] S. B. Kang and K. Ikeuchi, “Toward automatic robot instruction from perception — mapping human grasps to manipulator grasps,” IEEE Transactions on Robotics and Automation, vol. 13, no. 1, pp. 81–95, 1997.
[10] R. Wimmer, “Grasp sensing for human-computer interaction,” in Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction. ACM, 2011, pp. 221–228.
[11] K. M. Lynch, “Toppling manipulation,” in IEEE International Conference on Robotics and Automation (ICRA), 1999, pp. 2551–2557.
[12] K. Hauser, V. Ng-Thow-Hing, and H. Gonzalez-Baños, “Multi-modal motion planning for a humanoid robot manipulation task,” in Robotics Research. Springer, 2011, pp. 307–317.
[13] M. T. Mason, “Mechanics and planning of manipulator pushing operations,” The International Journal of Robotics Research, vol. 5, no. 3, pp. 53–71, 1986.
[14] E. Yoshida, M. Poirier, J.-P. Laumond, O. Kanoun, F. Lamiraux, R. Alami, and K. Yokoi, “Pivoting based manipulation by a humanoid robot,” Autonomous Robots, vol. 28, no. 1, pp. 77–88, 2010.
[15] N. Hogan, “Impedance control: An approach to manipulation: Part II — Implementation,” Journal of Dynamic Systems, Measurement, and Control, vol. 107, no. 1, pp. 8–16, 1985.
[16] O. Khatib, “A unified approach for motion and force control of robot manipulators: The operational space formulation,” IEEE Journal of Robotics and Automation, vol. 3, no. 1, pp. 43–53, 1987.
[17] J. Liu, F. Feng, Y. Nakamura, and N. Pollard, “A taxonomy of everyday grasps in action,” June 2014. [Online]. Available: http://www.cs.cmu.edu/~jialiu1/database.html
[18] G. Schlesinger, “Der mechanische Aufbau der künstlichen Glieder,” in Ersatzglieder und Arbeitshilfen. Springer, 1919, pp. 321–661.
[19] J. R. Napier and R. Tuttle, Hands. Princeton University Press, 1993.
[20] I. A. Kapandji and L. H. Honoré, The Physiology of the Joints: Annotated Diagrams of the Mechanics of the Human Joints. E. & S. Livingstone, 1970, vol. 1.
[21] J. Z. Zheng, S. De La Rosa, and A. M. Dollar, “An investigation of grasp type and frequency in daily household and machine shop tasks,” in IEEE International Conference on Robotics and Automation (ICRA), 2011, pp. 4169–4175.
[22] L. Y. Chang and N. S. Pollard, “Video survey of pre-grasp interactions in natural hand activities,” June 2009.
[23] F. Worgotter, E. E. Aksoy, N. Kruger, J. Piater, A. Ude, and M. Tamosiunaite, “A simple ontology of manipulation actions based on hand-object relations,” IEEE Transactions on Autonomous Mental Development, vol. 5, no. 2, pp. 117–134, 2013.
[24] J. M. Elliott and K. Connolly, “A classification of manipulative hand movements,” Developmental Medicine & Child Neurology, vol. 26, no. 3, pp. 283–296, 1984.
[25] I. M. Bullock, R. R. Ma, and A. M. Dollar, “A hand-centric classification of human and robot dexterous manipulation,” IEEE Transactions on Haptics, vol. 6, no. 2, pp. 129–144, 2013.
[26] T. Torigoe, “Comparison of object manipulation among 74 species of non-human primates,” Primates, vol. 26, no. 2, pp. 182–194, 1985.
[27] B. Kopp, A. Kunkel, H. Flor, T. Platz, U. Rose, K.-H. Mauritz, K. Gresser, K. L. McCulloch, and E. Taub, “The Arm Motor Ability Test: reliability, validity, and sensitivity to change of an instrument for assessing disabilities in activities of daily living,” Archives of Physical Medicine and Rehabilitation, vol. 78, no. 6, pp. 615–620, 1997.
[28] C. Collin, D. Wade, S. Davies, and V. Horne, “The Barthel ADL Index: a reliability study,” Disability & Rehabilitation, vol. 10, no. 2, pp. 61–63, 1988.
[29] J. M. Linacre, A. W. Heinemann, B. D. Wright, C. V. Granger, and B. B. Hamilton, “The structure and stability of the Functional Independence Measure,” Archives of Physical Medicine and Rehabilitation, vol. 75, no. 2, pp. 127–132, 1994.
[30] N. Pollock, M. McColl, A. Carswell, and T. Sumsion, “The Canadian Occupational Performance Measure,” in Client-centred Practice in Occupational Therapy: A Guide to Implementation, 2006, pp. 103–114.
[31] Y. S. Choi, T. Deyle, T. Chen, J. D. Glass, and C. C. Kemp, “A list of household objects for robotic retrieval prioritized by people with ALS,” in IEEE International Conference on Rehabilitation Robotics (ICORR), 2009, pp. 510–517.
[32] J. R. Napier, “The prehensile movements of the human hand,” Journal of Bone and Joint Surgery, vol. 38, no. 4, pp. 902–913, 1956.
[33] A.-A. Samadani, S. Burton, R. Gorbet, and D. Kulic, “Laban effort and shape analysis of affective hand and arm movements,” in Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), 2013, pp. 343–348.