Wearable Computing: Project Glass: An Extension of The Self
Project Glass:
An Extension of the Self
Thad Starner
"Our goal is to reduce the time between intention and action." The first time I heard Google CEO Larry Page say that phrase, we were talking about the then-secret Project Glass. However, this idea runs throughout Google's products.
As an example, go to google.com and type "weara" as if to start searching for "wearable computer." After just the first five letters, Google suggests possible completions (including "wearable shoe trees"). By the time the letters "wearable com" are entered, Google has matched "wearable computer" as the first completion result and has already loaded results in the page below the search box. Often, the information the user needs will be on the screen before the query is even finished. These techniques save seconds in the interaction, completing the intention and making the Google search experience more pleasant. On a mobile device, however, fast interaction can be the difference between a successful interface and a device that's left in the pocket.
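The completion behavior described above can be sketched as a simple prefix match over a ranked list of candidate queries. This is only a toy illustration: the candidate list and its ordering here are hypothetical, and a production system would rank millions of queries by popularity and personalize the results.

```python
def complete(prefix, candidates):
    """Return the candidate queries that start with the typed prefix.

    Candidates are assumed to be pre-sorted by rank, so the best
    completion comes first; matching is case-insensitive.
    """
    p = prefix.lower()
    return [q for q in candidates if q.lower().startswith(p)]


# Hypothetical ranked candidate queries:
queries = ["wearable computer", "wearable shoe trees", "weather radar"]

# After five letters, both "wearable ..." queries still match:
print(complete("weara", queries))

# By "wearable com", the list has narrowed to a single completion,
# whose results could already be prefetched:
print(complete("wearable com", queries))
```

Each keystroke narrows the match set, so the interface can begin loading results for the top completion before the user finishes typing.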
Hooked on HUDs
For the past 20 years, I've been wearing my computer as an intelligent assistant in my daily life. Wearable computers and head-up displays (HUDs) have existed since the 1960s in some form, but as far as I know, nobody has been using them daily for as long as I have. Ever since my first month of use in 1993, I've been hooked.
The device enabled a compelling new
lifestyle in which I felt more powerful,
PERVASIVE computing, April–June 2013
Figure 2. A user on a hot-air balloon ride, wearing Glass. A clock is the default screen
for Glass, so the user sees the time in the upper right corner. (Figure courtesy of
Google.)
Microinteractions
Unfortunately, smartphones have very limited potential for microinteractions, which leads to users being mired head-down in their touch screens. Once the user has invested the 20 seconds needed to retrieve the device and navigate to the appropriate function of the user interface, she tends to devote even more attention to the device while she has it in hand.

For example, when reading an incoming SMS, which should be a microinteraction, the user is soon lured into a scan of the email inbox. To make matters worse, smartphone user interfaces require significant hand-eye coordination, with the user's eyes typically cast down toward the floor. The user's attention is tunneled away from the world around her, into the virtual. However, this attention-tunneling problem is alleviated once an interaction becomes "micro," with an interface designed to get the user in and out of the interaction as quickly as possible. Even better, a head-up interface, when used for a microinteraction, keeps the user's eyes pointed outward toward the world, which also helps avoid attention tunneling. Compared to current devices, wearable interfaces will help the user pay attention to the real world as opposed to retreating from it.