
10 Haptic Augmented Reality: Taxonomy, Research Status, and Challenges


Seokhee Jeon, Seungmoon Choi,
and Matthias Harders

CONTENTS
10.1 Introduction................................................................................................... 227
10.2 Taxonomies.................................................................................................... 229
10.2.1 Visuo-Haptic Reality–Virtuality Continuum.................................... 229
10.2.2 Artificial Recreation and Augmented Perception.............................. 232
10.2.3 Within- and Between-Property Augmentation.................................. 233
10.3 Components Required for Haptic AR........................................................... 234
10.3.1 Interface for Haptic AR..................................................................... 234
10.3.2 Registration between Real and Virtual Stimuli................................. 236
10.3.3 Rendering Algorithm for Augmentation........................................... 237
10.3.4 Models for Haptic AR....................................................................... 238
10.4 Stiffness Modulation..................................................................................... 239
10.4.1 Haptic AR Interface...........................................................................240
10.4.2 Stiffness Modulation in Single-Contact Interaction..........................240
10.4.3 Stiffness Modulation in Two-Contact Squeezing.............................. 243
10.5 Application: Palpating Virtual Inclusion in Phantom with Two Contacts......245
10.5.1 Rendering Algorithm......................................................................... 247
10.6 Friction Modulation.......................................................................................248
10.7 Open Research Topics................................................................................... 249
10.8 Conclusions.................................................................................................... 250
References............................................................................................................... 251

10.1 INTRODUCTION
This chapter introduces an emerging research field in augmented reality (AR), called
haptic AR. As AR enables a real space to be transformed to a semi-virtual space by
providing a user with the mixed sensations of real and virtual objects, haptic AR does
the same for the sense of touch; a user can touch a real object, a virtual object, or a real
object augmented with virtual touch. Visual AR is a relatively mature technology and
is being applied to diverse practical applications such as surgical training, industrial
manufacturing, and entertainment (Azuma et al. 2001). In contrast, the technology


for haptic AR is quite recent and poses a great number of new research problems
ranging from modeling to rendering in terms of both hardware and software.
Haptic AR promises great potential to enrich user interaction in various applications.
For example, suppose that a user is holding a pen-shaped magic tool in the hand, which
allows the user to touch and explore a virtual vase overlaid on a real table. In addition, the
user may draw a picture on the table with an augmented feel of using a paint brush on
a smooth piece of paper, or using a marker on a stiff white board. In a more practical
setting, medical students can practice cancer palpation skills by exploring a phantom
body while trying to find virtual tumors that are rendered inside the body. A consumer-
targeted application can be found in online stores. Consumers can see clothes displayed
on the touchscreen of a tablet computer and feel their textures with bare fingers, for
which the textural and frictional properties of the touchscreen are modulated to those
of the clothes. Another prominent example is augmentation or guidance of motor skills
by means of external haptic (force or vibrotactile) feedback, for example, shared con-
trol or motor learning of complex skills such as driving and calligraphy. Creating such
haptic modulations belongs to the realm of haptic AR. Although we have a long way to
go in order to realize all the envisioned applications of haptic AR, some representative
examples that have been developed in recent years are shown in Figure 10.1.


FIGURE 10.1 Representative applications of haptic AR. (a) AR-based open surgery simu-
lator. (From Harders, M. et al., IEEE Trans. Visual. Comput. Graph., 15, 138,
2009.) (b) Haptic AR breast tumor palpation system. (From Jeon, S. and Harders, M., IEEE
Trans. Haptics, 99, 1, 2014.) (c) Texture modeling and rendering based on contact accelera-
tion data. (Reprinted from Romano, J.M. and Kuchenbecker, K.J., IEEE Trans. Haptics, 5,
109, 2011. With permission.) (d) Conceptual illustration of the haptic AR drawing example.

In this chapter, we first address three taxonomies for haptic AR based on a com-
posite visuo-haptic reality–virtuality continuum, a functional aspect of haptic AR
applications, and the subject of augmentation (Section 10.2). A number of studies
related to haptic AR are reviewed and classified based on the three taxonomies.
Based on the review, associated research issues along with components needed for
a haptic AR system are elucidated in Section 10.3. Sections 10.4 through 10.6 intro-
duce our approach for the augmentation of real object stiffness and friction, in the
interaction with one or two contact points. A discussion of the open research issues
for haptic AR is provided in Section 10.7, followed by brief conclusions in Section
10.8. We hope that this chapter will prompt more research interest in this exciting,
yet largely unexplored, area of haptic AR.

10.2 TAXONOMIES
10.2.1 Visuo-Haptic Reality–Virtuality Continuum
General concepts associated with AR, or more generally, mixed reality (MR) were
defined earlier by Milgram and Colquhoun Jr. (1999) using the reality–virtuality
continuum shown in Figure 10.2a. The continuum includes all possible combinations
of purely real and virtual environments, with the intermediate area corresponding to
MR. Whether an environment is closer to reality or virtuality depends on the amount
of overlay or augmentation that the computer system needs to perform; the more aug-
mentation performed, the closer to virtuality. This criterion allows MR to be further
classified into AR (e.g., a heads-up display in an aircraft cockpit) and augmented
virtuality (e.g., a computer game employing a virtual dancer with the face image
of a famous actress). We, however, note that the current literature does not strictly
discriminate the two terms, and uses AR and MR interchangeably.
Extending the concept, we can define a similar reality–virtuality continuum for
the sense of touch and construct a visuo-haptic reality–virtuality continuum by com-
positing the two unimodal continua shown in Figure 10.2b. This continuum can be
valuable for building the taxonomy of haptic MR. In Figure 10.2b, the whole visuo-
haptic continuum is classified into nine categories, and each category is named in an
abbreviated form. The shaded regions belong to the realm of MR. In what follows, we
review the concepts and instances associated with each category, with more attention
to those of MR. Note that the continuum for touch includes all kinds of haptic feed-
back and does not depend on the specific types of haptic sensations (e.g., kinesthetic,
tactile, or thermal) or interaction paradigms (e.g., tool-mediated or bare-handed).
In the composite continuum, the left column has the three categories of haptic
reality, vR-hR, vMR-hR, and vV-hR, where the corresponding environments pro-
vide only real haptic sensations. Among them, the simplest category is vR-hR,
which represents purely real environments without any synthetic stimuli. The other
end, vV-hR, refers to the conventional visual virtual environments with real touch,
for example, using a tangible prop to interact with virtual objects. Environments
between the two ends belong to vMR-hR, in which a user sees mixed objects but
still touches real objects. A typical example is the so-called tangible AR that has
been actively studied in the visual AR community. In tangible AR, a real prop held
in the hand is usually used as a tangible interface for visually mixed environments
(e.g., the MagicBook in Billinghurst et al. 2001), and its haptic property is regarded
unimportant for the applications. Another example is the projection augmented
model. A computer-generated image is projected onto a real physical model to create
a realistic-looking object, and the model can be touched by the bare hand (e.g., see
Bennett and Stevens 2006). Since the material property (e.g., texture) of the real
object may not agree with its visually augmented model, haptic properties are usu-
ally incorrectly displayed in this application.

FIGURE 10.2 Reality–virtuality continuum extended to encompass touch. (Figures taken
from Jeon, S. and Choi, S., Presence Teleop. Virt. Environ., 18, 387, 2009. With permission.)
(a) Original reality–virtuality continuum, spanning from a real environment to a virtual
environment with mixed reality (augmented reality and augmented virtuality) in between.
(From Milgram, P. and Colquhoun, H. Jr., A taxonomy of real and virtual world display
integration, in Mixed Reality—Merging Real and Virtual Worlds, Y. Ohta and H. Tamura
(eds.), Springer-Verlag, Berlin, Germany, 1999, pp. 1–16.) (b) Composite visuo-haptic
reality–virtuality continuum, in which the degree of virtuality in touch (haptic reality, haptic
mixed reality, haptic virtuality) and the degree of virtuality in vision (visual reality, visual
mixed reality, visual virtuality) define nine categories: vR-hR, vR-hMR, vR-hV, vMR-hR,
vMR-hMR, vMR-hV, vV-hR, vV-hMR, and vV-hV. (Jeon, S. and Choi, S., Presence Teleop.
Virt. Environ., 18, 387, 2009.) Shaded areas in the composite continuum represent the realm
of mixed reality.

The categories in the right column of the composite continuum, vR-hV, vMR-hV,
and vV-hV, are for haptic virtuality, corresponding to environments with only virtual
haptic sensations, and have received the most attention from the haptics research
community. Robot-assisted motor rehabilitation can be an example of vR-hV where

synthetic haptic feedback is provided in a real visual environment, while an interac-
tive virtual simulator is an instance of vV-hV where the sensory information of both
modalities is virtual. In the intermediate category, vMR-hV, purely virtual haptic
objects are placed in a visually mixed environment, and are rendered using a hap-
tic interface on the basis of the conventional haptic rendering methods for virtual
objects. Earlier attempts in this category focused on how to integrate haptic render-
ing of virtual objects into the existing visual AR framework, and they identified
the precise registration between the haptic and the visual coordinate frame as a key
issue (Adcock et al. 2003, Vallino and Brown 1999). For this topic, Kim et al. (2006)
applied an adaptive low-pass filter to reduce the trembling error of a low-cost vision-
based tracker using ARToolKit, and upsampled the tracking data for use with 1 kHz
haptic rendering. Bianchi et al. further improved the registration
accuracy via intensive calibration of a vision-based object tracker (Bianchi et al.
2006a,b). Their latest work explored the potential of visuo-haptic AR technology
for medical training with their highly stable and accurate AR system (Harders et al.
2009). Ott et al. also applied the HMD-based visuo-haptic framework to training
processes in industry and demonstrated its potential (Ott et al. 2007). In applica-
tions, a half mirror was often used for constructing a visuo-haptic framework due to
the better collocation of visual and haptic feedback, for example, ImmersiveTouch
(Luciano et al. 2005), Reachin Display (Reachin Technology), PARIS display
(Johnson et al. 2000), and SenseGraphics 3D-IW (SenseGraphics). Such frameworks
were, for instance, applied to cranial implant design (Scharver et al. 2004) or MR
painting application (Sandor et al. 2007).
The last categories for haptic MR, vR-hMR, vMR-hMR, and vV-hMR, with which
the rest of this chapter is concerned, lie in the middle column of the composite con-
tinuum. A common characteristic of haptic MR is that synthetic haptic signals that
are generated by a haptic interface modulate or augment stimuli that occur due to a
contact between a real object and a haptic interface medium, that is, a tool or a body
part. The VisHap system (Ye et al. 2003) is an instance of vR-hMR that provides
mixed haptic sensations in a real environment. In this system, some properties of a
virtual object (e.g., shape and stiffness) are rendered by a haptic device, while others
(e.g., texture and friction) are supplied by a real prop attached at the end-effector of
the device. Other examples in this category are the SmartTool (Nojima et al. 2002)
and SmartTouch systems (Kajimoto et al. 2004). They utilized various sensors (opti-
cal and electrical conductivity sensors) to capture real signals that could hardly be
perceived by the bare hand, transformed the signals into haptic information, and then
delivered them to the user in order to facilitate certain tasks (e.g., peeling off the
white from the yolk in an egg). The MicroTactus system (Yao et al. 2004) is another
example of vR-hMR, which detects and magnifies acceleration signals caused by
the interaction of a pen-type probe with a real object. The system was shown to
improve the performance of tissue boundary detection in arthroscopic surgical train-
ing. A similar pen-type haptic AR system, Ubi-Pen (Kyung and Lee 2009), embed-
ded miniaturized texture and vibrotactile displays in the pen, adding realistic tactile
feedback for interaction with a touch screen in mobile devices.
On the other hand, environments in vV-hMR use synthetic visual stimuli. For exam-
ple, Borst et al. investigated the utility of haptic MR in a visual virtual environment

by adding synthetic force to a passive haptic response for a panel control task (Borst
and Volz 2005). Their results showed that mixed force feedback was better than syn-
thetic force alone in terms of task performance and user preference. In vMR-hMR,
both modalities rely on mixed stimuli. Ha et al. installed a vibrator in a real tangible
prop to produce virtual vibrotactile sensations in addition to the real haptic informa-
tion of the prop in a visually mixed environment (Ha et al. 2007). They demonstrated
that the virtual vibrotactile feedback enhances immersion for an AR-based handheld
game. Bayart et al. introduced a teleoperation framework where force measured at
the remote site is presented at the master side with additional virtual force and mixed
imagery (Bayart et al. 2007, 2008). In particular, they tried to modulate a certain real
haptic property with virtual force feedback for a hole-patching task and a painting
application, unlike most of the related studies introduced earlier.
Several remarks need to be made. First, the vast majority of related work, except
(Bayart et al. 2008, Borst and Volz 2005, Nojima et al. 2002), has used the term
haptic AR without distinguishing vMR-hV and hMR, although research issues asso-
ciated with the two categories are fundamentally different. Second, haptic MR can
be further classified into haptic AR and haptic augmented virtuality using the same
criterion as in visual MR. All of the research instances of hMR introduced earlier cor-
respond to haptic AR, since little knowledge regarding the environment is managed
by the computer for haptic augmentation. However, despite its potential, attempts to
develop systematic and general computational algorithms for haptic AR have been
scarce. An instance of haptic augmented virtuality can be haptic rendering systems
that use haptic signals captured from a real object (e.g., see Hoever et al. 2009,
Okamura et al. 2001, Pai et al. 2001, Romano and Kuchenbecker 2011) in addition
to virtual object rendering, although such a concept has not been formalized before.
Third, although the taxonomy is defined for composite visuo-haptic configurations,
a unimodal case (e.g., no haptic or visual feedback) can also be mapped to the cor-
responding 1D continuum on the axes in Figure 10.2b.

10.2.2 Artificial Recreation and Augmented Perception


The taxonomy described in the previous section is based on the visuo-haptic reality–
virtuality continuum, thereby elucidating the nature of stimuli provided to users
and associated research issues. Also useful is a taxonomy that specifies the aims of
augmentation. Hugues et al. (2011) defined two functional categories for visual AR:
artificial recreation (or environment) and augmented perception, which can also be
applied to the hMR category in Figure 10.2. This is in line with the terms used by Bayart
and Kheddar (2006)—haptic enhancing and enhanced haptics, respectively.
In artificial recreation, haptic augmentation is used to provide a realistic presentation
of physical entities by exploiting the crucial advantage of AR, that is, more efficient
and realistic construction of an immersive environment, compared to VR. Artificial
recreation can be further classified into two sub-categories. It can be either for real-
istic reproduction of a specific physical environment, for example, the texture display
example of clothes described in Section 10.1, or for creating a nonexisting environment,
for example, the tumor palpation example in Jeon et al. (2012). The latter is a particularly
important area for haptic AR, since it maximizes the advantages of both VR and AR.

In contrast, augmented perception aims at utilizing touch as an additional channel
for transferring useful information that can assist decision-making. Since realism is
no longer a concern, the form of virtual haptic stimuli in this category significantly
varies depending on the target usage. For example, one of the simplest forms is
vibration alerts. Synthetic vibratory signals, while mixed with other haptic attri-
butes of the environment, are a powerful means of conveying timing information,
for example, mobile phone alarms, driving hazard warnings (Chun et al. 2013), and
rhythmic guidance (Lee et al. 2012b). Recently, many researchers also tried to use
vibration for spatial information (e.g., Lee and Choi 2014, Sreng et al. 2008) and dis-
crete categorical information (e.g., haptic icon [Rovers and van Essen 2004, Ternes
and MacLean 2008]).
Force feedback is another widely used form for augmentation in this category.
The most common example is virtual fixtures used for haptic guidance. They add
guiding or preventing forces to the operator’s movement while she/he performs a
motor task, in order to improve the safety, accuracy, and speed of task execution
(Abbott et al. 2007). The term was originally coined in Rosenberg (1993), and it
has been applied to various areas, for example, a manual milling tool (Zoran and
Paradiso 2012), the SmartTool (Nojima et al. 2002), or surgical assistance systems
(Li et al. 2007).
There have also been attempts that faithfully follow the original meaning of
augmentation of reality. The aforementioned MicroTactus system (Yao et al.
2004) is one example. Sometimes, augmentation is done by mapping nonhaptic
information into haptic cues for the purpose of data perceptualization, for exam-
ple, color information mapped to tactile stimuli (Kajimoto et al. 2004). Another
interesting concept is diminished reality, which hides reality, for example, remov-
ing the surface haptic texture of a physical object (Ochiai et al. 2014). This con-
cept of diminished reality can also be applied to hand tremor cancellation in
surgical operations (Gopal et al. 2013, Mitchell et al. 2007). Lastly, in a broad
sense, exoskeletal suits are also an example of augmentation through mixing real
and virtual force.

10.2.3 Within- and Between-Property Augmentation


Various physical properties, such as shape, stiffness, friction, viscosity, and surface
texture, contribute to haptic perception. Depending on the haptic AR scenario, some
object properties may remain intact while the rest may be subject to augmentation.
Here, the augmentation may occur within a property, for example, mixing real and
virtual stiffness for rendering harder virtual nodules inside a tissue phantom (Jeon
et al. 2012), or it may be between different properties, for example, adding virtual
stiffness to real surface textures (Yokokohji et al. 1999) or vice versa (Borst and
Volz 2005).
This distinction is particularly useful for gauging the degree, accuracy, and type
of registration needed for augmentation. Consequently, this taxonomy allows the
developer to quantify the amount of environment modeling necessary for registra-
tion in preprocessing and rendering steps. The next section further describes issues
and requirements for registration and environment modeling for haptic AR.

TABLE 10.1
Classification of Related Studies Using the Composite Taxonomy

Within-property augmentation
  Artificial recreation: Borst and Volz (2005); Jeon and Choi (2009); Jeon and Choi (2011);
  Jeon et al. (2012); Jeon et al. (2011); Jeon and Harders (2014); Solanki and Raja (2010);
  Gerling and Thomas (2005); Kurita et al. (2009); Hachisu et al. (2012)
  Augmented perception: Abbott and Okamura (2003); Bose et al. (1992); Gopal et al. (2013);
  Kajimoto et al. (2004); Mitchell et al. (2007); Nojima et al. (2002); Ochiai et al. (2014);
  Yao et al. (2004); Yang et al. (2008); Lee et al. (2012a)

Between-property augmentation
  Artificial recreation: Bayart et al. (2008); Bayart et al. (2007); Fukumoto and Sugimura
  (2001); Iwata et al. (2001); Minamizawa et al. (2007); Park et al. (2011); Ye et al. (2003);
  Yokokohji et al. (1999); Frey et al. (2006); Parkes et al. (2009); Ha et al. (2006); Romano
  and Kuchenbecker (2011)
  Augmented perception: Brewster and Brown (2004); Brown and Kaaresoja (2006); Kim and
  Kim (2012); Kyung and Lee (2009); Lee and Choi (2014); Powell and O'Malley (2011);
  Rosenberg (1993); Spence and Ho (2008); Zoran and Paradiso (2012); Grosshauser and
  Hermann (2009)

Further, the last two taxonomies are combined to construct a composite taxonomy,
and all relevant literature in the hMR category is classified using this taxonomy in
Table 10.1. Note that most of the haptic AR systems have both within- and between-
property characteristics to some degree. For clear classification, we only examined
key augmentation features in Table 10.1.

10.3 COMPONENTS REQUIRED FOR HAPTIC AR


10.3.1 Interface for Haptic AR
A haptic AR framework inherently involves interactions with real environments.
Therefore, three systems—a haptic interface, a human operator, and a real environment—
react to each other through an interaction tool, leading to tridirectional interaction as
shown in Figure 10.3.
During interaction, the interaction tool is coupled with the three components,
and this coupling is the core for the realization of haptic AR, that is, merging the
real and the virtual. Through this coupled tool, relevant physical signals from
both the real environment and the haptic interface are mixed and transmitted to
the user. Therefore, designing this feel-through tool is of substantial importance
in designing a haptic AR interface.

FIGURE 10.3 Tridirectional interaction in haptic AR. The interaction tool is coupled, when
in contact, with the object in the real environment, with the haptic rendering system (computer
and haptic interface) through sensing and actuation, and with the human user (brain and
sensorimotor system) through perception and action.

The feel-through can be either direct or indirect. Direct feel-through, analogous
to optical see-through in visual AR, transmits relevant physical signals directly to
the user via a mechanically coupled implement. In contrast, in indirect feel-through
(similar to video see-through), relevant physical signals are sensed, modeled, and
synthetically reconstructed for the user to feel, for example, in master–slave tele-
operation. In direct feel-through, preserving the realism of a real environment and
mixing real and virtual stimuli is relatively easy, but real signals must be compen-
sated for with great care for augmentation. To this end, the system may need to
employ very accurate real response estimation methods for active compensation
or special hardware for passive compensation, for example, using a ball bearing
tip to remove friction (Jeon and Choi 2010) and using a deformable tip to compen-
sate for real contact vibration (Hachisu et al. 2012). On the contrary, in indirect
feel-through, modulating real signals is easier since all the final stimuli are syn-
thesized, but more sophisticated hardware is required for transparent rendering of
virtual stimuli with high realism.
Different kinds of coupling may exist. Mechanical coupling is typical, for example,
a force-feedback haptic stylus instrumented with a contact tip (Jeon and
Choi 2011). Other forms such as thermal coupling and electric coupling are also pos-
sible depending on the target property. In between-property augmentation, coupling
may not be very tight, for example, only position data and timing are shared (Borst
and Volz 2005).
Haptic AR tools can come in many different forms. In addition to typical styli,
very thin sheath-type tools are also used, for example, sensors on one side and

actuators on the other side of a sheath (Nojima et al. 2002). Sometimes a real object
itself is a tool, for example, when both sensing and actuation modules are embedded
in a tangible marker (Ha et al. 2006).
A tool and its coupling for haptic AR need to be very carefully designed. Each of
the three components involved in the interaction requires a proper attachment to the
tool, appropriate sensing and actuation capability, and eventually, all of these should
be compactly integrated into the tool in a way that it can be appropriately used by
a user. To this end, the form factors of the sensors, attachment joints, and actuation
parts should be carefully designed to maximize the reliability of sensing and actua-
tion while maintaining a sufficient degree of freedom of movement.

10.3.2 Registration between Real and Virtual Stimuli


An AR system generally faces two registration problems between real and virtual
environments: spatial and temporal registrations. Virtual and real stimuli must be
spatially and temporally aligned with each other with high accuracy and robustness.
In visual AR, proper alignment of virtual graphics (usually in 3D) on real video
streams has been a major research issue (Feng et al. 2008). Tracking an AR camera,
a user, and real objects and localizing them in a world coordinate frame are the core
technical problems (Harders et al. 2009).
In haptic AR, virtual and real haptic stimuli also have to be spatially and
temporally aligned, for example, adding a virtual force at the right position and
at the right moment. While sharing the same principle, registration in haptic AR
sometimes has different technical requirements. In many haptic AR scenarios,
an area of interest for touching is very small (even one or a couple of points),
and touch usually occurs via a tool. Therefore, large area tracking used in visual
AR is not necessary, and tracking can be simplified, for example, detecting the
moment and location of contact between a haptic tool and a real object using a
mechanical tracker. However, tracking data are directly used for haptic render-
ing in many cases, so the update rate and accuracy of tracking should be care-
fully considered.
In addition to such basic position and timing registration, other forms of spatial
and temporal quantities related to the target augmentation property often require
adequate alignment. For example, in order to augment stiffness, the direction of
force for virtual stiffness must be aligned with the response force direction from
real stiffness. Another example is an AR pulse simulator where the frequency and
phase of a virtual heart beat should match with those of the real one. These align-
ments usually can be done by acquiring the real quantity through sophisticated real-
time sensing and/or estimation modules and setting corresponding virtual values
to them. Examining and designing such property-related registration is one of the
major research issues in developing haptic AR systems.
The requirements of this property-related registration largely depend on an applica-
tion area, a target augmentation property, and physical signals involved. However, the
within/between-property taxonomy can provide some clues for judging what kind of
registration is needed and how accurate it must be, as the taxonomy gives the degree of association

between virtual and real signals. In the case of within-property augmentation, mixing
happens in a single property, and thus virtual signals related to a target property need to
be exactly aligned with corresponding real signals for harmonious merging and smooth
transition along the line between real and virtual. This needs very sophisticated regis-
tration, often with the estimation of real properties based on sensors and environment
models (see Section 10.4 for how we have approached this issue). However, in between-
property augmentation, different properties are usually treated separately, and virtual
signals of one target property do not have to be closely associated with real signals of
the other properties. Thus, the registration may be of lesser accuracy in this case.

10.3.3 Rendering Algorithm for Augmentation


A rendering frame of an AR system consists of (1) sensing the real environment,
(2) real–virtual registration, (3) merging stimuli, and (4) displaying the stimuli. The
following paragraphs overview the steps for haptic AR. Steps 2 and 3 are the core
parts for haptic AR.

Step 1 prepares data for steps 2 and 3 by sensing variables from the real envi-
ronment. Signal processing can also be applied to the sensor values.
Step 2 conducts a registration process based on the sensed data and pre-­
identified models (see Section 10.3.4 for examples). This step usually esti-
mates the spatial and temporal state of the tool and the real environment and
then conducts the registration as indicated in Section 10.3.2, for example,
property-related registration and contact detection between the tool and real
objects. Depending on the result of this step, the system decides whether to
proceed to step 3 or go back to step 1 in this frame.
Step 3 is dedicated to the actual calculation of virtual feedback (in direct feel-
through) or mixed feedback (in indirect feel-through). Computational proce-
dures in this step largely depend on the categories of haptic AR (Table 10.1).
For artificial recreation, this step simulates the behaviors of the properties
involved in the rendering using physically based models. However, augmented
perception may need to derive the target signal based on purely sensed signals
and/or using simpler rules, for example, doubling the amplitude of measured
contact vibration (Yao et al. 2004). In addition, within-property augmenta-
tion often requires an estimation of the properties of a real object in order to
compensate for or augment it. For instance, modulating the feel of a brush
in the AR drawing example first needs the compensation of the real tension
and friction of the manipulandum. This estimation can be done either using a
model already identified in a preprocessing step or by real-time estimation of
the property using sensor values, or both (see Section 10.3.4 for more details).
In between-property augmentation, however, this estimation process is not
required in general, and providing virtual properties is simpler.
Step 4 sends commands to the haptic AR interface to display the feedback
calculated in Step 3. Sometimes we need techniques for controlling the
hardware for the precise delivery of stimuli.
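To make the four steps concrete, the following Python sketch outlines one frame of such a rendering loop. It is illustrative only: the sensor, registration, augmentation, and device objects and their method names (read_force, detect_contact, compute_force, send_force, and so on) are hypothetical placeholders for the modules described above, not the API of any particular system.

import numpy as np

def render_frame(sensor, registration, augmentation, device):
    # Step 1: sense the real environment and preprocess the raw signals.
    raw_force = sensor.read_force()            # 3-axis force reading (N)
    tool_pose = sensor.read_tool_pose()        # tool tip position/orientation
    force = registration.filter(raw_force)     # e.g., noise suppression

    # Step 2: spatial/temporal registration, including contact detection
    # between the tool and the real object, using pre-identified models.
    contact = registration.detect_contact(force, tool_pose)
    if not contact.in_contact:
        device.send_force(np.zeros(3))         # nothing to augment this frame
        return

    # Step 3: compute the virtual (direct feel-through) or mixed
    # (indirect feel-through) feedback for the target augmentation.
    feedback = augmentation.compute_force(force, tool_pose, contact)

    # Step 4: command the haptic interface to display the feedback.
    device.send_force(feedback)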

10.3.4 Models for Haptic AR


As aforementioned in Sections 10.3.2 and 10.3.3, haptic AR requires predefined
models for three different purposes. First, models are needed for simulating the
responses of the signals associated with rendering properties, which is the same for
haptic VR rendering. Such computational models have been extensively studied in
haptics and virtual reality. In most cases, they include some degree of simplification
to fulfill the real-time requirement of haptic rendering.
The second model is for real–virtual registration (Step 2 in Section 10.3.3). The
most common example is the geometry model of real objects for contact and sur-
face normal detection, which is usually built in preprocessing. Employing such a
geometry model makes rendering simpler since conventional rendering algorithms
for haptic VR can be readily applied. However, acquiring and using such models
should be minimized in order to fully utilize the general advantage of AR: efficient
construction of a realistic and immersive environment without extensive modeling.
Balancing the amount of modeling and the complexity of the rendering algorithm is impor-
tant. In addition to geometry models, property augmentation sometimes needs mod-
els for the estimation of real information. For example, in Section 10.4, we estimate
local geometry and local deformation near the contact point based on a simplified
environment model that is identified in preprocessing, in order to register the stiffness
direction.
The last model is for the estimation of real signals for the modulation in
Step 3 of the rendering (Section 10.3.3). The estimation often faces challenging
accuracy requirements while still having to be efficient enough for real-time performance.
For properties such as stiffness and friction, estimating physical responses has been
extensively studied in robotics and mechatronics for the purpose of environment
modeling and/or compensation. In general, there are two approaches for this: open-
loop model-based estimation and closed-loop sensor-based estimation. One of the
research issues is how to adapt those techniques for use in haptic AR, which has
the following requirements. The estimation should be perceptually accurate since
the errors in estimation can be directly fed into the final stimuli. The identification
process should also be feasible for the application, for example, very quick identi-
fication is mandatory for scenarios in which real objects for interaction frequently
change. Lastly, using the same hardware for both identification and rendering is pre-
ferred for the usability of the system.
Each category in Table 10.1 has different requirements for models. Systems in the
artificial recreation category may need more sophisticated models for both simula-
tion and estimation, while those in the augmented perception category may suffice
with simpler models for simulation. Furthermore, systems in the within-property
category may have to use very accurate registration and estimation models, while
merging between properties may not need models for registration and estimation.
In summary, Table 10.2 outlines the rendering and registration characteristics of
the categories in the two taxonomies.
In the following sections, we introduce example algorithms for haptic AR, target-
ing a system that can modulate stiffness and friction of a real object by systemati-
cally adding virtual stiffness and friction.

TABLE 10.2
Characteristics of the Categories

Within property vs. between property (registration and rendering)
  Within property: position and timing registration as well as property-related registration
  needed; rendering includes estimation and compensation of real signals and merging of
  them with virtual signals.
  Between property: only basic position and timing registration needed; rendering algorithms
  for haptic VR can be applied.

Artificial recreation vs. augmented perception (models required)
  Artificial recreation: models for physics simulation; sometimes models for registration and
  compensation.
  Augmented perception: models for registration and compensation.

Direct feel-through vs. indirect feel-through (rendering)
  Direct feel-through: real-time compensation of the real property needed.
  Indirect feel-through: transparent haptic rendering algorithm and interface needed.

10.4 STIFFNESS MODULATION


We initiated our endeavor toward haptic AR with the augmentation or modulation
of real object stiffness, which is one of the most important properties for rendering
the shape and hardness of an object. We summarize a series of our major results on
stiffness modulation (Jeon and Choi 2008, 2009, 2010, 2011; Jeon and Harders 2012)
in the following sections. This topic can be categorized into artificial recreation and
within-property augmentation.
We aim at providing a user with augmented stiffness by adding virtual force
feedback when interacting with real objects. We took two steps for this goal. The
first step was single-point interaction supporting typical exploratory patterns, such
as tapping, stroking, or contour following (Section 10.4.2). The second step extended
the first system to two-point manipulation, focusing on grasping and squeezing
(Section 10.4.3).
Our augmentation methods emphasize minimizing the need for prior knowledge
and preprocessing, for example, the geometric model of a real object, used for reg-
istration, while preserving plausible perceptual quality. Our system requires a mini-
mal amount of prior information such as the dynamics model of a real object. This
preserves a crucial advantage of AR; only models for the objects of interest, not the
entire environment, are required, which potentially leads to greater simplicity in
application development.
Our framework considers elastic objects with moderate stiffness for interaction.
Objects that are plastic (e.g., clay), brittle (e.g., glass), or highly stiff (e.g., steel)
are out of scope due to either complex material behavior or the per-
formance limitations of current haptic devices. In addition, homogeneous dynamic
material responses are assumed for real objects.

FIGURE 10.4 Haptic AR interface. (Reprinted from Jeon, S. and Harders, M., Extending
haptic augmented reality: Modulating stiffness during two-point squeezing, in Proceedings
of the Haptics Symposium, 2012, pp. 141–146. With permission.)

10.4.1 Haptic AR Interface


We constructed a haptic AR interface using two general impedance-type haptic
interfaces (Geomagic; PHANToM premium model 1.5), each of which has a custom-
designed tool for interaction with a real object (see Figure 10.4). The tool is instru-
mented with a 3D force/torque sensor (ATI Industrial Automation, Inc.; model Nano17)
attached between the tool tip and the gimbal joints at the last link of the PHANToM.
This allows the system to measure the reaction force from a real object that is equal
to the sum of the force from the haptic interface and the force from the user’s hand.

10.4.2 Stiffness Modulation in Single-Contact Interaction


Suppose that a user indents a real object with a probing tool. This makes the object
deform, and the user feels a reaction force. Let the apparent stiffness of the object at time
t be k(t). This is the stiffness that the user perceives when no additional virtual force is
rendered. The goal of stiffness augmentation is to systematically change the stiffness that
the user perceives, k(t), to a desired stiffness k̃(t) by providing virtual force to the user.
As shown in Figure 10.5, two force components, the force that the haptic device
exerts on the tool, fd(t), and the force from the user's hand, fh(t), deform the object
surface and result in a reaction force fr(t), such that

fr(t) = −{fh(t) + fd(t)}.   (10.1)

The reaction force fr(t) during contact can be decomposed into two orthogonal
force components, as shown in Figure 10.5:

fr ( t ) = frn ( t ) + frt ( t ) , (10.2)

where
frn (t ) is the result of object elasticity in the normal direction
frt (t ) is the frictional tangential force

FIGURE 10.5 Variables for single-contact stiffness modulation. (Reprinted from Jeon, S.
and Choi, S., Presence Teleop. Virt. Environ., 20, 337, 2011. With permission.)

Let x(t) be the displacement caused by the elastic force component, which represents
the distance between the haptic interface tool position, p(t), and the original non-
deformed position pc(t) of a contacted particle on the object surface. If we denote the
unit vector in the direction of frn(t) by un(t) and the target modulation stiffness by k̃(t),
the force that a user should feel is:

fh(t) = k̃(t) x(t) un(t).   (10.3)

Using (10.3), the force that the haptic device needs to exert is

f̃d(t) = −fr(t) − k̃(t) x(t) un(t).   (10.4)

This equation indicates the tasks that a stiffness modulation algorithm has to do in
every loop: (1) detection of the contact between the haptic tool and the real object for
spatial and temporal registration, (2) measurement of the reaction force fr(t), (3) esti-
mation of the direction un(t) and magnitude x(t) of the resulting deformation for stiff-
ness augmentation, and (4) control of the device-rendered force fd(t) to produce the
desired force f̃d(t). The following section describes how we address these four steps.
In Step 1, we use force sensor readings for contact detection since the entire
geometry of the real environment is not available. A collision is regarded to have
occurred when forces sensed during interaction exceed a threshold. To increase the
accuracy, we developed algorithms to suppress noise, as well as to compensate for
the weight and dynamic effects of the tool. See Jeon and Choi (2011) for details.
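A minimal sketch of such force-threshold contact detection is given below. It assumes a simple first-order low-pass filter and a static tool-weight estimate; the class interface, threshold, and filter coefficient are hypothetical, and the actual implementation in Jeon and Choi (2011) additionally compensates for dynamic effects of the tool.

import numpy as np

class ContactDetector:
    def __init__(self, threshold=0.15, alpha=0.05, tool_weight=(0.0, 0.0, 0.0)):
        self.threshold = threshold              # contact force threshold (N)
        self.alpha = alpha                      # low-pass filter coefficient
        self.tool_weight = np.asarray(tool_weight, dtype=float)
        self.filtered = np.zeros(3)

    def update(self, raw_force):
        # First-order low-pass filter to suppress sensor noise.
        self.filtered = ((1.0 - self.alpha) * self.filtered
                         + self.alpha * np.asarray(raw_force, dtype=float))
        # Subtract the (static) tool weight and apply the threshold.
        net = self.filtered - self.tool_weight
        return np.linalg.norm(net) > self.threshold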
Step 2 is also simply done with the force sensor attached to the probing tool.
Step 3 is the key process for stiffness modulation. We first identify the friction
and deformation dynamics of a real object in a preprocessing step, and use them later
during rendering to estimate the unknown variables for merging real and virtual forces.
The details of this process are summarized in the following section.
Before augmentation, we carry out two preprocessing steps. First, the friction
between the real object and the tool tip is identified using the Dahl friction model (Jeon
and Choi 2011). The original Dahl model is transformed to an equivalent discrete-time
difference equation, as described in Mahvash and Okamura (2006). It also includes
a velocity-dependent term to cope with viscous friction. The procedure for friction

identification adapts the divide-and-conquer strategy by performing identification


separately for the presliding and the sliding regime, which decouples the nonlinear
identification problem to two linear problems. Data for lateral displacement, velocity,
normal force, and friction force are collected during manual stroking, and then are
divided into presliding and sliding bins according to the lateral displacement. The
data bins for the presliding regime are used to identify the parameters that define
behavior at almost zero velocity, while the others are used for Coulomb and viscous
parameters.
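The sketch below illustrates this divide-and-conquer idea under simplifying assumptions: the presliding regime is fit by a single stiffness-like slope and the sliding regime by Coulomb plus viscous terms, each with ordinary linear least squares. The displacement threshold, parameter names, and model structure are assumptions for illustration; the actual procedure identifies the discrete-time Dahl formulation of Mahvash and Okamura (2006).

import numpy as np

def identify_friction(disp, vel, f_n, f_t, presliding_limit=0.5e-3):
    # disp, vel, f_n, f_t: 1D arrays of lateral displacement (m), lateral
    # velocity (m/s), normal force (N), and friction force (N) collected
    # during manual stroking.
    presliding = np.abs(disp) < presliding_limit
    sliding = ~presliding

    # Presliding regime: friction grows roughly linearly with the
    # micro-displacement, f_t ~ sigma0 * disp.
    A_pre = disp[presliding][:, None]
    sigma0 = np.linalg.lstsq(A_pre, f_t[presliding], rcond=None)[0][0]

    # Sliding regime: Coulomb plus viscous friction,
    # f_t ~ mu * f_n * sign(vel) + b * vel.
    A_sld = np.column_stack([f_n[sliding] * np.sign(vel[sliding]), vel[sliding]])
    mu, b = np.linalg.lstsq(A_sld, f_t[sliding], rcond=None)[0]
    return sigma0, mu, b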
The second preprocessing step is for identifying the deformation dynamics of the
real object. We use the Hunt–Crossley model (Hunt and Crossley 1975) to account
for nonlinearity. The model determines the response force magnitude given dis-
placement x(t) and velocity ẋ(t) by

f(t) = k x(t)^m + b x(t)^m ẋ(t),   (10.5)

where
k and b are stiffness and damping constants
m is a constant exponent (usually 1 < m < 2)

For identification, the data triples consisting of displacement, velocity, and reaction
force magnitude are collected through repeated presses and releases of a deformable
sample in the normal direction. The data are passed to a recursive least-squares algo-
rithm for an iterative estimation of the Hunt–Crossley model parameters (Haddadi
and Hashtrudi-Zaad 2008).
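A minimal batch version of this identification is sketched below. Instead of the recursive formulation of Haddadi and Hashtrudi-Zaad (2008), it exploits the fact that, for a fixed exponent m, the Hunt–Crossley model in (10.5) is linear in k and b, so it scans a grid of candidate exponents and solves a linear least-squares problem for each. The grid range, function name, and synthetic test data are assumptions for illustration only.

import numpy as np

def fit_hunt_crossley(x, x_dot, f, exponents=np.linspace(1.0, 2.0, 21)):
    # x, x_dot, f: displacement, velocity, and force magnitude recorded
    # during repeated presses and releases in the normal direction.
    best = None
    for m in exponents:
        A = np.column_stack([x**m, (x**m) * x_dot])   # regressors for k and b
        coeffs = np.linalg.lstsq(A, f, rcond=None)[0]
        err = np.sum((A @ coeffs - f) ** 2)
        if best is None or err < best[0]:
            best = (err, coeffs[0], coeffs[1], m)
    _, k, b, m = best
    return k, b, m

# Synthetic check with known parameters (k = 800, b = 20, m = 1.5).
t = np.linspace(0.0, 2.0, 400)
x = 0.01 * (1.0 - np.cos(np.pi * t))     # one press-release cycle, 0..0.02 m
x_dot = np.gradient(x, t)
f = 800.0 * x**1.5 + 20.0 * x**1.5 * x_dot
print(fit_hunt_crossley(x, x_dot, f))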
For rendering, the following computational process is executed in every haptic ren-
dering frame. First, two variables, the deformation direction un(t) and the magnitude
of the deformation x(t) are estimated. The former is derived as follows. Equation 10.2
indicates that the response force fr(t) consists of two perpendicular force compo-
nents: frn (t ) and frt (t ). Since un(t) is the unit vector of frn (t ), un(t) becomes:

un(t) = (fr(t) − frt(t)) / ‖fr(t) − frt(t)‖.   (10.6)

The unknown variable in (10.6) is frt(t). The magnitude of frt(t) is estimated using the
identified Dahl model. Its direction is derived from the tangent vector at the current
contact point p(t), which is found by projecting Δp(t) onto un(t−Δt) and subtracting
it from Δp(t).
The next part is the estimation of x(t). The assumption of material homogeneity
allows us to directly approximate it from the inverse of the Hunt–Crossley model
identified previously. Finally, using the estimated un(t) and x(t), f̃d(t) is calculated
using (10.4), which is then sent to the haptic AR interface.
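The per-frame computation can be condensed into the short sketch below. It assumes the reaction force and an estimate of the tangential friction force are already available, and it inverts the Hunt–Crossley model with the damping term neglected (a quasi-static simplification of the procedure described above); the function name and the numbers in the usage example are illustrative only.

import numpy as np

def single_contact_force(f_r, f_rt, k_tilde, k, m):
    # f_r    : measured reaction force (3-vector)
    # f_rt   : estimated tangential friction force (3-vector), e.g., from the
    #          identified Dahl model
    # k_tilde: desired (target) stiffness to be displayed
    # k, m   : identified Hunt-Crossley elastic parameters of the real object
    f_rn = f_r - f_rt                          # normal force component, cf. Eq. (10.6)
    norm = np.linalg.norm(f_rn)
    if norm < 1e-6:
        return np.zeros(3)                     # no meaningful contact force
    u_n = f_rn / norm                          # estimated deformation direction
    x = (norm / k) ** (1.0 / m)                # deformation magnitude, damping neglected
    return -f_r - k_tilde * x * u_n            # Eq. (10.4)

# Illustrative call: 2 N normal force, small friction, target stiffness 300 N/m.
f = single_contact_force(np.array([0.0, 0.3, 2.0]), np.array([0.0, 0.3, 0.0]),
                         k_tilde=300.0, k=800.0, m=1.5)
print(f)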
In Jeon and Choi (2011), we assessed the physical performance of each compo-
nent and the perceptual performance of the final rendering result using various real
samples. In particular, the perceptual quality of modulated stiffness evaluated in a

psychophysical experiment showed that rendering errors were less than the human
discriminability of stiffness. This demonstrates that our system can provide percep-
tually convincing stiffness modulation.
10.4.3 Stiffness Modulation in Two-Contact Squeezing


After confirming the potential of the concept, we moved to a more challenging
scenario: stiffness modulation in two-contact squeezing (Jeon and Harders 2012).
We developed new algorithms to provide stiffness augmentation while grasping and
squeezing an object with two probing tools. In this system, we assume that the object
is fully lifted from the ground and the contact points do not change (i.e., no slip occurs).
We also do not take inertial effects into account.
While the object is lifted, an additional force due to the object weight, fw(t) in
Figure 10.6, is involved in the system. When the user applies forces fh,*(t) to hold and
squeeze the object (* is either 1 or 2 depending on the contact point) and the haptic
interfaces exert forces fd,* (t) for modulation, weight forces fw,* (t) are also present at
the two contact points. At each contact point, these three force components deform
the object and make reaction force fr,* (t):

fr ,* ( t ) = fh,* ( t ) + fd ,* ( t ) + fw,* ( t ) . (10.7)

fr,*(t) can be further decomposed into the pure weight fw,*(t) and a force component in
the squeezing direction fsqz,*(t), as shown in Figure 10.6, resulting in

fr ,* ( t ) = fsqz,* ( t ) + fw,* ( t ) . (10.8)

Since the displacement and the force along the squeezing direction contribute to stiffness
perception, the force component of interest is fsqz,* (t). Then, (10.7) can be rewritten as

fsqz,* ( t ) = fh,* ( t ) + fd ,* ( t ) . (10.9)


FIGURE 10.6 Variables for two-contact stiffness modulation. (Reprinted from Jeon, S.
and Harders, M., Extending haptic augmented reality: Modulating stiffness during two-point
squeezing, in Proceedings of the Haptics Symposium, 2012, pp. 141–146. With permission.)

To make the user feel the desired stiffness k̃(t),

fh,*(t) = k̃(t) x*(t) u*(t),   (10.10)
where x*(t) represents the displacement along the squeezing direction and u*(t)
is the unit vector toward the direction of that deformation. Combining (10.9) and
(10.10) results in the virtual force for the haptic interfaces to render for the desired
augmentation:

f̃d,*(t) = fsqz,*(t) − k̃(t) x*(t) u*(t).   (10.11)

Here again, (10.11) indicates that we need to estimate the displacement x*(t) and the
deformation direction u*(t) at each contact point. The known variables are the reac-
tion forces fr,*(t) and the tool tip positions p*(t). To this end, the following three obser-
vations about an object held in the steady state are utilized. First, the magnitudes of
the two squeezing forces fsqz,1(t) and fsqz,2(t) are the same, but the directions are the
opposite (fsqz,1(t) = −fsqz,2(t)). Second, each squeezing force falls on the line connect-
ing the two contact locations. Third, the total weight of the object is equal to the sum
of the two reaction force vectors:

fr ,1 ( t ) + fr ,2 ( t ) = fw,1 ( t ) + fw,2 ( t ) .

The first and second observations provide the directions of fsqz,*(t) (= u*(t), the unit
vector along p1(t)p2(t) or p2(t)p1(t); also see l(t) in Figure 10.6). The magnitude of
fsqz,*(t) is determined as follows. The sum of the reaction forces along the l(t) direction,
fr↓sqz(t) = |fr,1(t) · ul(t)| + |fr,2(t) · ul(t)|, includes not only the two squeezing forces, but
also the weight. Thus, fsqz(t) can be calculated by subtracting the effect of the weight
along l(t) from fr↓sqz(t):

fsqz ( t ) = fr ↓sqz ( t ) − fw↓sqz ( t ) , (10.12)

where f w↓sqz(t) can be derived based on the third observation such that

fw↓sqz(t) = |(fr,1(t) + fr,2(t)) · ul(t)|.   (10.13)

Then, the squeezing force at each contact point can be derived based on the first
observation:

fsqz,1 ( t ) = fsqz,2 ( t ) = 0.5 fsqz ( t ) . (10.14)


FIGURE 10.7 Example snapshot of visuo-haptic augmentation. Reaction force (dark gray
arrow), weight (gray arrow), and haptic device force (light gray arrow) are depicted. Examples
with increased stiffness (virtual forces oppose squeezing) and decreased stiffness (virtual
forces assist squeezing) are shown on left and right, respectively.

Steps for the estimation of the displacement x* (t) in (10.11) are as follows. Let
the distance between the two initial contact points on the non-deformed surface
(pc,1(t) and pc,2(t) in Figure 10.6) be d0. It is constant over time due to the no-slip
assumption. Assuming homogeneity, x1(t) is equal to x2(t), and the displacements can
be derived by

x1(t) = x2(t) = 0.5 (d0 − d(t)),   (10.15)

where d(t) is ‖p1(t) − p2(t)‖. All the unknown variables are now estimated and the final
virtual force can be calculated using (10.11).
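The estimation and force computation for two-contact squeezing can be summarized by the following sketch, which follows (10.11) through (10.15) for both contacts. It assumes the measured reaction forces and tool tip positions of the current frame, the initial contact distance d0, and a target stiffness are given; the function name and the assignment of the squeezing directions to the two contacts are assumptions for illustration.

import numpy as np

def two_contact_forces(f_r1, f_r2, p1, p2, d0, k_tilde):
    # f_r1, f_r2: measured reaction forces at the two contacts (3-vectors)
    # p1, p2    : tool tip positions at the two contacts (3-vectors)
    # d0        : distance between the initial, non-deformed contact points
    # k_tilde   : desired stiffness to be displayed while squeezing
    d = np.linalg.norm(p1 - p2)
    u_l = (p2 - p1) / d                         # direction of the line l(t)

    # Squeezing force magnitude at each contact, Eqs. (10.12)-(10.14).
    f_r_sqz = abs(np.dot(f_r1, u_l)) + abs(np.dot(f_r2, u_l))
    f_w_sqz = abs(np.dot(f_r1 + f_r2, u_l))     # weight component along l(t)
    f_sqz_each = 0.5 * (f_r_sqz - f_w_sqz)

    # Displacement along the squeezing direction, Eq. (10.15).
    x = 0.5 * (d0 - d)

    # Virtual forces for the two haptic interfaces, Eq. (10.11); u1 and u2 are
    # the (opposite) squeezing directions at the two contacts.
    u1, u2 = u_l, -u_l
    f_d1 = f_sqz_each * u1 - k_tilde * x * u1
    f_d2 = f_sqz_each * u2 - k_tilde * x * u2
    return f_d1, f_d2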
In Jeon and Harders (2012), we also evaluated the system performance through
simulations and a psychophysical experiment. Overall, the evaluation indicated that
our system can provide physically and perceptually sound stiffness augmentation.
In addition, the system has further been integrated with a visual AR framework
(Harders et al. 2009). To our knowledge, this is among the first systems that can
augment both visual and haptic sensations. We used the visual system to display
information related to haptic augmentation, such as the force vectors involved in the
algorithm. Figure 10.7 shows exemplar snapshots.

10.5 APPLICATION: PALPATING VIRTUAL INCLUSION IN PHANTOM WITH TWO CONTACTS
This section introduces an example of the applications of our stiffness modulation
framework, taken from Jeon and Harders (2014). We developed algorithms for ren-
dering a stiffer inclusion in a physical tissue phantom during manipulations at more
than one contact location. The basic concept is depicted in Figure 10.8. The goal of
the system is to render forces that give an illusion of a harder inclusion in the mock-up.
In Figure 10.8, fR,*(t) are the reaction forces from the real environment to which
the system adds virtual force feedback fT,*(t) stemming from the simulated tumor
with the consideration of the mutual effects between the contacts. The final com-
bined forces fH,*(t) enable a user to feel augmented sensations of the stiffer inclusion,
given as

fH,*(t) = fR,*(t) + fT,*(t).   (10.16)

FIGURE 10.8 Overall configuration of rendering a stiffer inclusion in a real mock-up: at each
contact, the reaction force fR,* of the silicone tissue mock-up is combined with the virtual
force fT,* of the simulated tumor to form the force fH,* felt by the user. (Reprinted from Jeon, S.
and Harders, M., IEEE Trans. Haptics, 99, 1, 2014. With permission.)

Here, estimating and simulating fT,*(t) is the key for creating a sound illusion. The
hardware setup we used is the same as the one shown in Figure 10.4.
A two-step, measurement-based approach is taken to model the dynamic behavior
of the inclusion. First, a contact dynamics model representing the pure response of the
inclusion is identified using the data captured while palpating a physical mock-up. Then,
another dynamics model is constructed to capture the movement characteristics of the
inclusion in response to external forces. Both models are then used in rendering to deter-
mine fT,* (t) in real-time. The procedures are detailed in the following paragraphs.
The first preprocessing step is for identifying the overall contact force resulting
purely from an inclusion (inclusion-only case) as a function of the distance between
the inclusion and the contact point. Our approach is to extract the difference between
the responses of a sample with a stiffer inclusion (inclusion-embedded) and a sam-
ple without it (no-inclusion). To this end, we first identify the Hunt–Crossley model
for the no-inclusion sample. We use the same identification procedure described in
Section 10.4.2. This model is denoted by f = HNT(x, ẋ). Then, we obtain the data from
the inclusion-embedded sample by manually poking along a line from pTs to pT0 (see
Figure 10.9 for the involved quantities). This time, we also record the position changes
of pT using a position tracking system (TrackIR; NaturalPoint, Inc.). This gives us the
state vector when palpating the tumor-embedded model, (xTE, ẋTE, fTE, pT, pH).
As depicted in Figure 10.8, the force fTE(t) can be decomposed into fR(t) and fT(t).
Since f = HNT(x, ẋ) represents the magnitude of fR(t), the magnitude of fT(t) can be
obtained by passing all data pairs (xTE, ẋTE) to HNT(x, ẋ) and by computing differ-
ences using

fT(xTE, ẋTE) = fTE − HNT(xTE, ẋTE).   (10.17)


FIGURE 10.9 Variables for inclusion model identification. (Reprinted from Jeon, S. and
Harders, M., IEEE Trans. Haptics, 99, 1, 2014. With permission.)

fT(t) can be expressed as a function of the distance between the inclusion and the tool
tip. Let the distance from pH(t) to pT(t) be lHT(t), and the initial distance from pTs to
pT0 be l0. Then, the difference, l(t) = l0 − lHT(t), becomes a relative displacement toward
the inclusion. By using the data triples (l, l̇, fT), a new response model with respect
to l(t) can be derived, which is denoted as HT(l, l̇). This represents the inclusion-only
force response at the single contact point pTs, poking in the direction of pT.
In the second step, the inclusion movement in response to external forces is
characterized. Nonlinear changes of d(t) with respect to an external force fT(t) can again
be approximated using the Hunt–Crossley model. After determining d(t) using a
position tracker and fT(t) using our rendering algorithm described in the next subsection,
vector triples (d, ḋ, fT) are employed to identify three Hunt–Crossley models
for the three Cartesian directions, denoted by Gx(dx, ḋx), Gy(dy, ḋy), and Gz(dz, ḋz).
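For illustration, a minimal Python sketch of this two-step identification is given below. The Hunt–Crossley form f = k x^m + b x^m ẋ follows Hunt and Crossley (1975), but the batch least-squares fit via scipy and all parameter names are stand-ins for exposition; the actual identification procedure of Section 10.4.2 may differ.

```python
# Sketch of the two-step, measurement-based model identification (assumed form).
import numpy as np
from scipy.optimize import curve_fit

def hunt_crossley(X, k, b, m):
    """Hunt-Crossley force f = k*x^m + b*x^m*x_dot for stacked (x, x_dot) samples."""
    x, x_dot = X
    x = np.maximum(x, 0.0)          # no restoring force without indentation
    return k * x**m + b * x**m * x_dot

def identify_hc(x, x_dot, f, p0=(200.0, 2.0, 1.5)):
    """Least-squares fit of (k, b, m) to measured force samples (illustrative only)."""
    params, _ = curve_fit(hunt_crossley, np.vstack([x, x_dot]), f, p0=p0, maxfev=20000)
    return params

# Step 1: inclusion-only response H_T.
#   x_te, xdot_te, f_te are recordings from the inclusion-embedded sample;
#   (k_nt, b_nt, m_nt) come from fitting the no-inclusion sample first.
#   f_t = f_te - hunt_crossley((x_te, xdot_te), k_nt, b_nt, m_nt)   # equation (10.17)
#   k_t, b_t, m_t = identify_hc(l, l_dot, f_t)                      # H_T over (l, l_dot, f_T)
#
# Step 2: inclusion displacement models G_x, G_y, G_z.
#   For each axis i, the same routine is applied to the triples (d_i, d_dot_i, f_T,i):
#   k_i, b_i, m_i = identify_hc(d_i, ddot_i, f_t_i)
```

The fitted HT then provides the inclusion-only force magnitude during rendering, while the per-axis fits provide Gx, Gy, and Gz.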

10.5.1 Rendering Algorithm


Rendering begins with making a contact with the no-inclusion model. Forces from
multiple contacts deform the model as shown in Figure 10.10 and displace the con-
tact point from pHs,* to pH,* (t) and the inclusion from pT0 to pT (t). The force caus-
ing this movement is the same as the inclusion response at the user's hand fT,*(t) in
(10.16). Therefore, the direction of fT,*(t) is from the inclusion position to the tool tip,
such that

fT,*(t) = fT,*(t) (pH,*(t) − pT(t)) / |pH,*(t) − pT(t)|.    (10.18)

Equation 10.18 indicates that the unknown values, fT,*(t) and pT(t), should be
approximated during the rendering.
fT,*(t) is derived based on HT. To this end, we first scale the current indentation
distance to match that during the recording:

l*(t) = (l0 / l0,*) (l0,* − lHT,*(t)).    (10.19)


FIGURE 10.10 Variables for inclusion augmentation rendering. (Reprinted from Jeon, S.
and Harders, M., IEEE Trans. Haptics, 99, 1, 2014. With permission.)

Then, we can obtain a linearly normalized indentation length along the line from pH,* to pT
with respect to the reference deformation. Finally, fT,*(t) is approximated by
fT,*(t) = HT(l*(t), l̇*(t)).
We take a similar approach for the update of d(t), and then eventually pT (t). Taking
the inverse of Gx,y,z allows us to approximate d(t) by

di(t) = [ (Σ_{*=1}^{n} fT,*,i(t)) / (k + b ḋ(t)) ]^{1/m},   i = x, y, z,    (10.20)

where
n is the number of contact points
m is the exponential parameter in the Hunt–Crossley model

Finally, fT,*(t) is determined using (10.18) and is directly sent to the haptic AR
interface.
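As an illustration of how these pieces fit together in one haptic update, the following Python sketch combines (10.18) through (10.20). The function signature, the use of the previous-frame inclusion position, the zero-velocity evaluation of HT, and the sign handling in (10.20) are simplifying assumptions rather than the exact implementation of Jeon and Harders (2014).

```python
# Simplified per-frame update of the inclusion augmentation (illustrative sketch).
import numpy as np

def render_step(p_h, p_t, p_t0, l0, l0_each, h_t, g, m, d_dot):
    """One haptic frame of the inclusion rendering.

    p_h     : (n, 3) current tool-tip positions p_H,*
    p_t     : (3,) inclusion position from the previous frame
    p_t0    : (3,) initial inclusion position
    l0      : reference distance from p_Ts to p_T0 used during recording
    l0_each : (n,) initial distances l_0,* for each contact
    h_t     : callable H_T(l, l_dot) -> inclusion-only force magnitude
    g       : list of per-axis (k, b) pairs for G_x, G_y, G_z
    m       : Hunt-Crossley exponent (assumed shared by the three G models)
    d_dot   : (3,) current estimate of the inclusion velocity
    Returns the virtual force vectors f_T,* and the updated inclusion position p_T.
    """
    n = p_h.shape[0]
    f_t = np.zeros((n, 3))
    for j in range(n):
        l_ht = np.linalg.norm(p_h[j] - p_t)                   # tip-to-inclusion distance
        l_star = (l0 / l0_each[j]) * (l0_each[j] - l_ht)      # equation (10.19)
        mag = max(h_t(max(l_star, 0.0), 0.0), 0.0)            # magnitude via H_T
        direction = (p_h[j] - p_t) / np.linalg.norm(p_h[j] - p_t)
        f_t[j] = mag * direction                              # equation (10.18)

    # Equation (10.20): inclusion displacement from the summed force components.
    d = np.zeros(3)
    for i in range(3):
        k_i, b_i = g[i]
        s = f_t[:, i].sum()
        denom = max(k_i + b_i * d_dot[i], 1e-6)               # guard against division by zero
        d[i] = np.sign(s) * (abs(s) / denom) ** (1.0 / m)
    return f_t, p_t0 + d
```

Note that the device is commanded with fT,*(t) only; the real reaction forces fR,*(t) in (10.16) are supplied physically by the mock-up itself.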
In Jeon and Harders (2014), we compared the simulation results of our algo-
rithm with actual measurement data recorded from eight different real mock-ups via
various interaction methods. Overall, inclusion movements and the mutual effects
between contacts are captured and simulated with reasonable accuracy; the force
simulation errors were less than the force perception thresholds in most cases.

10.6 FRICTION MODULATION


Our next target was the modulation of surface friction (Jeon et al. 2011). Here, we
introduce simple and effective algorithms for estimating the inherent friction between
a tool tip and a surface and altering it to a desired friction. We also use the same
hardware setup for friction modulation.
The specific goal of this work is to alter the real friction force freal(t) such that a user
perceives a modulated friction force ftarg(t) that mimics the response of a surface
made of a different desired target material when the user strokes the real surface


FIGURE 10.11 Variables for friction modulation. (Reprinted from Jeon, S. et al., Extensions
to haptic augmented reality: Modulating friction and weight, in Proceedings of the IEEE
World Haptics Conference (WHC), 2011, pp. 227–232. With permission.)

with a tool. As illustrated in Figure 10.11, this is done by adding a modulation friction
force fmod(t) to the real friction force:

fmod(t) = ftarg(t) − freal(t).    (10.21)

Thus, the task reduces to: (1) simulation of the desired friction response ftarg(t) and (2)
measurement of the real friction force freal(t).
For the simulation of the desired friction force ftarg(t) during rendering, we iden-
tify the modified Dahl model describing the friction of a target surface. For the Dahl
model parameter identification, a user repeatedly strokes the target surface with the
probe tip attached to the PHANToM. The identification procedure is the same as
that given in Section 10.4.2. The model is then used to calculate ftarg(t) using the tool
tip position and velocity and the normal contact force during augmented rendering.
freal(t) can be easily derived from force sensor readings after a noise reduction
process. Given the real friction and the target friction, the appropriate modulation
force that needs to be rendered by the device is finally computed using (10.21). The
modulation force is sent to the haptic interface for force control.
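The per-frame computation can be summarized in a short sketch. Here the classic Dahl update serves as a stand-in for the modified Dahl model of Jeon et al. (2011), whose exact parameterization is not reproduced in this chapter, and mu_target, sigma_target, and low_pass are illustrative placeholders.

```python
# Sketch of the friction modulation loop of (10.21); Dahl parameters are illustrative.
import numpy as np

def dahl_step(f_prev, v, dt, sigma, f_c):
    """One Euler step of the classic Dahl friction model:
    df/dt = sigma * (1 - (f/f_c) * sign(v)) * v."""
    return f_prev + sigma * (1.0 - (f_prev / f_c) * np.sign(v)) * v * dt

def modulation_force(f_targ, f_real):
    """Equation (10.21): the virtual force added on top of the real friction."""
    return f_targ - f_real

# Per haptic frame (hypothetical variable names):
#   f_c     = mu_target * f_normal                  # Coulomb level scaled by the normal force
#   f_targ  = dahl_step(f_targ, v_tangential, dt, sigma_target, f_c)
#   f_real  = low_pass(force_sensor_tangential())   # noise-reduced sensor reading
#   command = modulation_force(f_targ, f_real)      # sent to the haptic interface
```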
We tested the accuracy of our friction identification and modulation algorithms
with four distinctive surfaces (Jeon et al. 2011). The results showed that regardless of
the base surface, the friction was modulated to a target surface without perceptually
significant errors.

10.7 OPEN RESEARCH TOPICS


In spite of our and other groups' endeavors for haptic AR, this field is still young and
immature, awaiting persistent research on many intriguing and challenging topics.
For instance, our work regarding stiffness modulation has focused on homogeneous
soft real objects, meaning that the material characteristics of the objects are iden-
tical regardless of contact point. However, most natural deformable objects exhibit
inhomogeneity. Such objects show much more complicated deformation and friction
behaviors, and approaches that are based on more in-depth contact mechanics are
necessary for appropriate modeling and augmentation. This has been one direction
of our research, with an initial result that allows the user to model the shape of a soft
object using a haptic interface without the need for other devices (Yim and Choi 2012).
Our work has used a handheld tool for the exploration of real objects. This must
be extended to interactions that allow for the use of bare hands, or at least very similar
cases such as thin thimbles enclosing the fingertips. Such an extension will enlarge the
application area of haptic AR to a great extent, for example, palpation training on
a real phantom that includes virtual organs and lumps. To this end, we have begun to
examine the feasibility of sensing not only contact force but also contact pressure in
a compact device and its utility for haptic AR (Kim et al. 2014).
Another important topic is multi-finger interaction. This functionality
requires very complicated haptic interfaces that provide multiple, independent forces
with a large number of degrees of freedom (see Barbagli et al. 2005), as well as very
sophisticated deformable body rendering algorithms that take into account the inter-
play between multiple contacts. Research effort on this topic is still ongoing even for
haptic VR.
Regarding material properties, we need methods to augment friction, texture, and
temperature. Friction is expected to be relatively easy in both modeling and rendering
for haptic AR, as long as deformation is properly handled. Temperature modulation
is likely to be more challenging, especially due to the difficulty of integrating a
temperature display onto the fingertip that touches real objects. This functionality can
greatly improve the realism of AR applications.
The last critical topic we wish to mention is texture. Texture is one of the most
salient material properties and determines the identifying tactual characteristics of
an object (Katz 1925). As such, a great amount of research has been devoted to
haptic perception and rendering of surface texture. Texture is also one of the most
complex issues because of the multiple perceptual dimensions involved in texture
perception; surface microgeometry and the material's elasticity, viscosity, and friction
all play an important role (Hollins et al. 1993, 2000). See Choi and Tan (2004a,b,
2005, 2007) for a review of texture perception relevant to haptic rendering, Campion
and Hayward (2007) for passive rendering of virtual textures, and Fritz and Barner
(1996), Guruswamy et al. (2011), Lang and Andrews (2011), and Romano and
Kuchenbecker (2011) for various models. All of these studies pertained to haptic VR
rendering. Among these, the work of Kuchenbecker and her colleagues is the most
readily applicable to haptic AR; they have developed a high-quality texture
rendering system that overlays artificial vibrations on a touchscreen to deliver the
textures of real samples (Romano and Kuchenbecker 2011) and an open database of
textures (Culbertson et al. 2014). This research can be a cornerstone for the modeling
and augmentation of real textures.

10.8 CONCLUSIONS
This chapter overviewed the emerging AR paradigm for the sense of touch. We first
outlined the conceptual, functional, and technical aspects of this new paradigm with
three taxonomies and a thorough review of existing literature. Then, we moved to
recent attempts at realizing haptic AR, where hardware and algorithms for augmenting
the stiffness and friction of a real object were detailed. These frameworks were applied
to medical palpation training, where stiffer virtual inclusions are rendered in a real
tissue mock-up. Lastly, we elucidated several challenges and future research topics in
this area. We hope that the endeavors introduced in this chapter will pave the way to
more diverse and mature research in the exciting field of haptic AR.

REFERENCES
Abbott, J., P. Marayong, and A. Okamura. 2007. Haptic virtual fixtures for robot-assisted
manipulation. In Robotics Research, eds. S. Thrun, R. Brooks, and H. Durrant-Whyte,
pp. 49–64. Springer-Verlag: Berlin, Germany.
Abbott, J. and A. Okamura. 2003. Virtual fixture architectures for telemanipulation.
Proceedings of the IEEE International Conference on Robotics and Automation,
pp. 2798–2805. Taipei, Taiwan.
Adcock, M., M. Hutchins, and C. Gunn. 2003. Augmented reality haptics: Using ARToolKit
for display of haptic applications. Proceedings of Augmented Reality Toolkit Workshop,
pp. 1–2. Tokyo, Japan.
Azuma, R., Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre. 2001. Recent
advances in augmented reality. IEEE Computer Graphics & Applications 21 (6):34–47.
Barbagli, F., D. Prattichizzo, and K. Salisbury. 2005. A multirate approach to haptic interac-
tion with deformable objects single and multipoint contacts. International Journal of
Robotics Research 24 (9):703–716.
Bayart, B., J. Y. Didier, and A. Kheddar. 2008. Force feedback virtual painting on real
objects: A paradigm of augmented reality haptics. Lecture Notes in Computer Science
(EuroHaptics 2008) 5024:776–785.
Bayart, B., A. Drif, A. Kheddar, and J.-Y. Didier. 2007. Visuo-haptic blending applied to a
tele-touch-diagnosis application. Lecture Notes on Computer Science (Virtual Reality,
HCII 2007) 4563: 617–626.
Bayart, B. and A. Kheddar. 2006. Haptic augmented reality taxonomy: Haptic enhancing and
enhanced haptics. Proceedings of EuroHaptics, 641–644. Paris, France.
Bennett, E. and B. Stevens. 2006. The effect that the visual and haptic problems associ-
ated with touching a projection augmented model have on object-presence. Presence:
Teleoperators and Virtual Environments 15 (4):419–437.
Bianchi, G., C. Jung, B. Knoerlein, G. Szekely, and M. Harders. 2006a. High-fidelity visuo-
haptic interaction with virtual objects in multi-modal AR systems. Proceedings of the
IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 187–
196. Santa Barbara, USA.
Bianchi, G., B. Knoerlein, G. Szekely, and M. Harders. 2006b. High precision augmented real-
ity haptics. Proceedings of EuroHaptics, pp. 169–168. Paris, France.
Billinghurst, M., H. Kato, and I. Poupyrev. 2001. The MagicBook—Moving seamlessly
between reality and virtuality. IEEE Computer Graphics & Applications 21 (3):6–8.
Borst, C. W. and R. A. Volz. 2005. Evaluation of a haptic mixed reality system for interac-
tions with a virtual control panel. Presence: Teleoperators and Virtual Environments
14 (6):677–696.
Bose, B., A. K. Kalra, S. Thukral, A. Sood, S. K. Guha, and S. Anand. 1992. Tremor compen-
sation for robotics assisted microsurgery. Engineering in Medicine and Biology Society,
1992, 14th Annual International Conference of the IEEE, October 29, 1992–November
1 1992, pp. 1067–1068. Paris, France.
Brewster, S. and L. M. Brown. 2004. Tactons: Structured tactile messages for non-visual
information display. Proceedings of the Australasian User Interface Conference,
pp. 15–23. Dunedin, New Zealand.
Brown, L. M. and T. Kaaresoja. 2006. Feel who’s talking: Using tactons for mobile phone
alerts. Proceedings of the Annual SIGCHI Conference on Human Factors in Computing
Systems, pp. 604–609. Montréal, Canada.
Campion, G. and V. Hayward. 2007. On the synthesis of haptic textures. IEEE Transactions
on Robotics 24 (3):527–536.
Choi, S. and H. Z. Tan. 2004a. Perceived instability of virtual haptic texture. I. Experimental
studies. Presence: Teleoperators and Virtual Environment 13 (4):395–415.
Choi, S. and H. Z. Tan. 2004b. Toward realistic haptic rendering of surface textures. IEEE
Computer Graphics & Applications (Special Issue on Haptic Rendering—Beyond Visual
Computing) 24 (2):40–47.
Choi, S. and H. Z. Tan. 2005. Perceived instability of haptic virtual texture. II. Effects of
collision detection algorithm. Presence: Teleoperators and Virtual Environments
14 (4):463–481.
Choi, S. and H. Z. Tan. 2007. Perceived instability of virtual haptic texture. III. Effect of
update rate. Presence: Teleoperators and Virtual Environments 16 (3):263–278.
Chun, J., I. Lee, G. Park, J. Seo, S. Choi, and S. H. Han. 2013. Efficacy of haptic blind spot
warnings applied through a steering wheel or a seatbelt. Transportation Research Part
F: Traffic Psychology and Behaviour 21:231–241.
Culbertson, H., J. J. L. Delgado, and K. J. Kuchenbecker. 2014. One hundred data-driven hap-
tic texture models and open-source methods for rendering on 3D objects. Proceedings
of the IEEE Haptics Symposium, pp. 319–325. Houston, TX.
Feng, Z., H. B. L. Duh, and M. Billinghurst. 2008. Trends in augmented reality tracking, inter-
action and display: A review of ten years of ISMAR. Proceedings of the IEEE/ACM
International Symposium of Mixed and Augmented Reality, pp. 193–202. Cambridge, UK.
Frey, M., J. Hoogen, R. Burgkart, and R. Riener. 2006. Physical interaction with a virtual
knee joint—The 9 DOF haptic display of the Munich knee joint simulator. Presence:
Teleoperators and Virtual Environment 15 (5):570–587.
Fritz, J. P. and K. E. Barner. 1996. Stochastic models for haptic texture. Proceedings of
SPIE’s International Symposium on Intelligent Systems and Advanced Manufacturing—
Telemanipulator and Telepresence Technologies III, pp. 34–44. Boston, MA.
Fukumoto, M. and T. Sugimura. 2001. Active click: Tactile feedback for touch panels.
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems,
pp. 121–122. Seattle, WA.
Gerling, G. J. and G. W. Thomas. 2005. Augmented, pulsating tactile feedback facilitates sim-
ulator training of clinical breast examinations. Human Factors 47 (3):670–681.
Gopal, P., S. Kumar, S. Bachhal, and A. Kumar. 2013. Tremor acquisition and reduction for
robotic surgical applications. Proceedings of International Conference on Advanced
Electronic Systems, pp. 310–312. Pilani, India.
Grosshauser, T. and T. Hermann. 2009. Augmented haptics—An interactive feedback system
for musicians. Lecture Notes in Computer Science (HAID 2012) 5763:100–108.
Guruswamy, V. L., J. Lang, and W.-S. Lee. 2011. IIR filter models of haptic vibration textures.
IEEE Transactions on Instrumentation and Measurement 60 (1):93–103.
Ha, T., Y. Chang, and W. Woo. 2007. Usability test of immersion for augmented reality based
product design. Lecture Notes in Computer Science (Edutainment 2007) 4469:152–161.
Ha, T., Y. Kim, J. Ryu, and W. Woo. 2006. Enhancing immersiveness in AR-based product
design. Lecture Notes in Computer Science (ICAT 2006) 4282:207–216.
Hachisu, T., M. Sato, S. Fukushima, and H. Kajimoto. 2012. Augmentation of material prop-
erty by modulating vibration resulting from tapping. Lecture Notes in Computer Science
(EuroHaptics 2012) 7282:173–180.
Haddadi, A. and K. Hashtrudi-Zaad. 2008. A new method for online parameter estimation
of hunt-crossley environment dynamic models. Proceedings of the IEEE International
Conference on Intelligent Robots and Systems, pp. 981–986. Nice, France.
Harders, M., G. Bianchi, B. Knoerlein, and G. Szekely. 2009. Calibration, registration, and
synchronization for high precision augmented reality haptics. IEEE Transactions on
Visualization and Computer Graphics 15 (1):138–149.
Hoever, R., G. Kosa, G. Szekely, and M. Harders. 2009. Data-driven haptic rendering-from
viscous fluids to visco-elastic solids. IEEE Transactions on Haptics 2:15–27.
Hollins, M., S. J. Bensmäi, K. Karlof, and F. Young. 2000. Individual differences in percep-
tual space for tactile textures: Evidence from multidimensional scaling. Perception &
Psychophysics 62 (8):1534–1544.
Hollins, M., R. Faldowski, R. Rao, and F. Young. 1993. Perceptual dimensions of tactile surfaced
texture: A multidimensional scaling analysis. Perception & Psychophysics 54:697–705.
Hugues, O., P. Fuchs, and O. Nannipieri. 2011. New augmented reality taxonomy: Technologies
and features of augmented environment. In Handbook of Augmented Reality, ed.
B. Furht, pp. 47–63. Springer-Verlag: Berlin, Germany.
Hunt, K. and F. Crossley. 1975. Coefficient of restitution interpreted as damping in vibroim-
pact. ASME Journal of Applied Mechanics 42:440–445.
Iwata, H., H. Yano, F. Nakaizumi, and R. Kawamura. 2001. Project FEELEX: Adding haptic
surface to graphics. Proceedings of ACM SIGGRAPH, pp. 469–476. Los Angeles, CA.
Jeon, S. and S. Choi. 2008. Modulating real object stiffness for haptic augmented reality.
Lecture Notes on Computer Science (EuroHaptics 2008) 5024:609–618.
Jeon, S. and S. Choi. 2009. Haptic augmented reality: Taxonomy and an example of stiffness
modulation. Presence: Teleoperators and Virtual Environments 18 (5):387–408.
Jeon, S. and S. Choi. 2010. Stiffness modulation for haptic augmented reality: Extension to
3D interaction. Proceedings of the Haptics Symposium, pp. 273–280. Waltham, MA.
Jeon, S. and S. Choi. 2011. Real stiffness augmentation for haptic augmented reality. Presence:
Teleoperators and Virtual Environments 20 (4):337–370.
Jeon, S. and M. Harders. 2012. Extending haptic augmented reality: Modulating stiffness
during two-point squeezing. Proceedings of the Haptics Symposium, pp. 141–146.
Vancouver, Canada.
Jeon, S. and M. Harders. 2014. Haptic tumor augmentation: Exploring multi-point interaction.
IEEE Transactions on Haptics 99 (Preprints):1–1.
Jeon, S., M. Harders, and S. Choi. 2012. Rendering virtual tumors in real tissue mock-ups
using haptic augmented reality. IEEE Transactions on Haptics 5 (1):77–84.
Jeon, S., J.-C. Metzger, S. Choi, and M. Harders. 2011. Extensions to haptic augmented real-
ity: Modulating friction and weight. Proceedings of the IEEE World Haptics Conference
(WHC), pp. 227–232. Istanbul, Turkey.
Johnson, A., D. Sandin, G. Dawe, T. DeFanti, D. Pape, Z. Qiu, and D. P. S. Thongrong. 2000.
Developing the PARIS: Using the CAVE to prototype a new VR display. Proceedings of
the ACM Symposium on Immersive Projection Technology.
Kajimoto, H., N. Kawakami, S. Tachi, and M. Inami. 2004. SmartTouch: Electric skin to touch
the untouchable. IEEE Computer Graphics & Applications 24 (1):36–43.
Katz, D. 1925. The World of Touch. Hillsdale, NJ: Lawrence Erlbaum Associates.
Kim, H., S. Choi, and W. K. Chung. 2014. Contact force decomposition using tactile informa-
tion for haptic augmented reality. Proceedings of the IEEE/RSJ International Conference
on Robots and Systems, pp. 1242–1247. Chicago, IL.
Kim, S., J. Cha, J. Kim, J. Ryu, S. Eom, N. P. Mahalik, and B. Ahn. 2006. A novel test-bed for
immersive and interactive broadcasting production using augmented reality and haptics.
IEICE Transactions on Information and Systems E89-D (1):106–110.
Kim, S.-Y. and J. C. Kim. 2012. Vibrotactile rendering for a traveling vibrotactile wave based
on a haptic processor. IEEE Transactions on Haptics 5 (1):14–20.
Kurita, Y., A. Ikeda, T. Tamaki, T. Ogasawara, and K. Nagata. 2009. Haptic augmented real-
ity interface using the real force response of an object. Proceedings of the ACM Virtual
Reality Software and Technology, pp. 83–86. Kyoto, Japan.
Kyung, K.-U. and J.-Y. Lee. 2009. Ubi-Pen: A haptic interface with texture and vibrotactile
display. IEEE Computer Graphics and Applications 29 (1):24–32.
Lang, J. and S. Andrews. 2011. Measurement-based modeling of contact forces and textures
for haptic rendering. IEEE Transactions on Visualization and Computer Graphics 17
(3):380–391.
Lee, H., W. Kim, J. Han, and C. Han. 2012a. The technical trend of the exoskeleton robot
system for human power assistance. International Journal of Precision Engineering and
Manufacturing 13 (8):1491–1497.
Lee, I. and S. Choi. 2014. Vibrotactile guidance for drumming learning: Method and per-
ceptual assessment. Proceedings of the IEEE Haptics Symposium, pp. 147–152.
Houston, TX.
Lee, I., K. Hong, and S. Choi. 2012. Guidance methods for bimanual timing tasks. Proceedings
of IEEE Haptics Symposium, pp. 297–300. Vancouver, Canada.
Li, M., M. Ishii, and R. H. Taylor. 2007. Spatial motion constraints using virtual fixtures gener-
ated by anatomy. IEEE Transactions on Robotics 23 (1):4–19.
Luciano, C., P. Banerjee, L. Florea, and G. Dawe. 2005. Design of the ImmersiveTouch™:
A high-performance haptic augmented virtual reality system. Proceedings of
International Conference on Human-Computer Interaction. Las Vegas, NV.
Mahvash, M. and A. M. Okamura. 2006. Friction compensation for a force-feedback telero-
botic system. Proceedings of the IEEE International Conference on Robotics and
Automation, pp. 3268–3273. Orlando, FL.
Milgram, P. and H. Colquhoun, Jr. 1999. A taxonomy of real and virtual world display integra-
tion. In Mixed Reality—Merging Real and Virtual Worlds, eds. Y. Ohta and H. Tamura,
pp. 1–16. Springer-Verlag: Berlin, Germany.
Minamizawa, K., H. Kajimoto, N. Kawakami, and S. Tachi. 2007. Wearable haptic display to
present gravity sensation. Proceedings of the World Haptics Conference, pp. 133–138.
Tsukuba, Japan
Mitchell, B., J. Koo, M. Iordachita, P. Kazanzides, A. Kapoor, J. Handa, G. Hager, and
R. Taylor. 2007. Development and application of a new steady-hand manipulator for
retinal surgery. Proceedings of the IEEE International Conference on Robotics and
Automation, pp. 623–629. Rome, Italy.
Nojima, T., D. Sekiguchi, M. Inami, and S. Tachi. 2002. The SmartTool: A system for
augmented reality of haptics. Proceedings of the IEEE Virtual Reality Conference,
pp. 67–72. Orlando, FL.
Ochiai, Y., T. Hoshi, J. Rekimoto, and M. Takasaki. 2014. Diminished haptics: Towards digital
transformation of real world textures. Lecture Notes on Computer Science (Eurohaptics
2014, Part I) LNCS 8618: pp. 409–417.
Okamura, A. M., M. R. Cutkosky, and J. T. Dennerlein. 2001. Reality-based models for vibra-
tion feedback in virtual environments. IEEE/ASME Transactions on Mechatronics
6 (3):245–252.
Ott, R., D. Thalmann, and F. Vexo. 2007. Haptic feedback in mixed-reality environment. The
Visual Computer: International Journal of Computer Graphics 23 (9):843–849.
Pai, D. K., K. van den Doel, D. L. James, J. Lang, J. E. Lloyd, J. L. Richmond, and S. H. Yau. 2001.
Scanning physical interaction behavior of 3D objects. Proceedings of the Annual Conference
on ACM Computer Graphics and Interactive Techniques, pp. 87–96. Los Angeles, CA.
Park, G., S. Choi, K. Hwang, S. Kim, J. Sa, and M. Joung. 2011. Tactile effect design and
evaluation for virtual buttons on a mobile device Touchscreen. Proceedings of the
International Conference on Human-Computer Interaction with Mobile Devices and
Services (MobileHCI), pp. 11–20. Stockholm, Sweden.
Parkes, R., N. N. Forrest, and S. Baillie. 2009. A mixed reality simulator for feline abdominal
palpation training in veterinary medicine. Studies in Health Technology and Informatics
142:244–246.
Powell, D. and M. K. O’Malley. 2011. Efficacy of shared-control guidance paradigms for
robot-mediated training. Proceedings of the IEEE World Haptics Conference, pp. 427–
432. Istanbul, Turkey.
Reachin Technology. Reachin Display. http://www.reachin.se/. Accessed March 4, 2015.
Romano, J. M. and K. J. Kuchenbecker. 2011. Creating realistic virtual textures from contact
acceleration data. IEEE Transactions on Haptics 5 (2):109–119.
Rosenberg, L. B. 1993. Virtual fixtures: Perceptual tools for telerobotic manipulation.
Proceedings of the IEEE Virtual Reality Annual International Symposium, pp. 76–82.
Rovers, L. and H. van Essen. 2004. Design and evaluation of hapticons for enriched instant
messaging. Proceedings of Eurohaptics, pp. 498–503. Munich, Germany.
Sandor, C., S. Uchiyama, and H. Yamamoto. 2007. Visuo-haptic systems: Half-mirrors con-
sidered harmful. Proceedings of the World Haptics Conference, pp. 292–297. Tsukuba,
Japan.
Scharver, C., R. Evenhouse, A. Johnson, and J. Leigh. 2004. Designing cranial implants in a
haptic augmented reality environment. Communications of the ACM 47 (8):32–38.
SenseGraphics. 3D-IW. http://www.sensegraphics.se/. Accessed March 4, 2015.
Solanki, M. and V. Raja. 2010. Haptic based augmented reality simulator for training clinical
breast examination. Proceedings of the IEEE Conference on Biomedical Engineering
and Sciences, pp. 265–269. Kuala Lumpur, Malaysia.
Spence, C. and C. Ho. 2008. Tactile and multisensory spatial warning signals for drivers. IEEE
Transactions on Haptics 1 (2):121–129.
Sreng, J., A. Lecuyer, and C. Andriot. 2008. Using vibration patterns to provide impact posi-
tion information in haptic manipulation of virtual objects. Lecture Notes on Computer
Science (EuroHaptics 2008) 5024:589–598.
Ternes, D. and K. E. MacLean. 2008. Designing large sets of haptic icons with rhythm. Lecture
Notes on Computer Science (EuroHaptics 2008) 5024:199–208.
Vallino, J. R. and C. M. Brown. 1999. Haptics in augmented reality. Proceedings of the
IEEE International Conference on Multimedia Computing and Systems, pp. 195–200.
Florence, Italy.
Yang, C., J. Zhang, I. Chen, Y. Dong, and Y. Zhang. 2008. A review of exoskeleton-type sys-
tems and their key technologies. Proceedings of the Institution of Mechanical Engineers,
Part C: Journal of Mechanical Engineering Science 222 (8):1599–1612.
Yao, H.-Y., V. Hayward, and R. E. Ellis. 2004. A tactile magnification instrument for mini-
mally invasive surgery. Lecture Notes on Computer Science (MICCAI) 3217:89–96.
Ye, G., J. Corso, G. Hager, and A. Okamura. 2003. VisHap: Augmented reality combining
haptics and vision. Proceedings of the IEEE International Conference on Systems, Man
and Cybernetics, pp. 3425–3431. Washington, D.C.
Yim, S. and S. Choi. 2012. Shape modeling of soft real objects using force-feedback hap-
tic interface. Proceedings of the IEEE Haptics Symposium, pp. 479–484. Vancouver,
Canada.
Yokokohji, Y., R. L. Hollis, and T. Kanade. 1999. WYSIWYF display: A visual/haptic
interface to virtual environment. Presence: Teleoperators and Virtual Environments
8 (4):412–434.
Zoran, A. and J. A. Paradiso. 2012. The FreeD—A handheld digital milling device for craft
and fabrication. Proceedings of the ACM Symposium on User Interface Software and
Technology, pp. 3–4. Toronto, Canada.