US 7,844,915 B2

The patent US 7,844,915 B2 describes application programming interfaces for scrolling operations, specifically focusing on user interface software interacting with software applications. It outlines methods for handling user input to create event objects, determine scrolling or gesture operations, and manage bounce effects during scrolling. The patent was filed by Andrew Platzer and Scott Herz, and is assigned to Apple Inc., with a patent date of November 30, 2010.

US007844915B2

(12) United States Patent                  (10) Patent No.:     US 7,844,915 B2
     Platzer et al.                        (45) Date of Patent:  Nov. 30, 2010

(54) APPLICATION PROGRAMMING INTERFACES FOR SCROLLING OPERATIONS

(75) Inventors: Andrew Platzer, Santa Clara, CA (US); Scott Herz, Santa Clara, CA (US)

(73) Assignee: Apple Inc., Cupertino, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 583 days.

(21) Appl. No.: 11/620,717

(22) Filed: Jan. 7, 2007

(65) Prior Publication Data
     US 2008/0168384 A1    Jul. 10, 2008

(51) Int. Cl.
     G06F 3/00   (2006.01)
     G06F 3/033  (2006.01)
     G06F 3/04   (2006.01)
     G06F 3/048  (2006.01)

(52) U.S. Cl. ............ 715/781; 715/784; 715/800; 345/173

(58) Field of Classification Search ............ 715/764, 765, 784, 786, 788, 800, 864, 866, 973, 974; 345/156, 157, 169, 173
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     5,534,893 A     7/1996   Hansen et al.
     5,903,902 A     5/1999   Orr et al.
     6,028,602 A     2/2000   Weidenfeller et al.
     6,486,896 B1   11/2002   Ubillos
     6,677,965 B1*   1/2004   Ullmann et al. ............. 715/786
     6,741,996 B1    5/2004   Brechner et al.
     6,839,721 B2    1/2005   Schwols
     6,903,927 B2    6/2005   Anlauff
     6,957,392 B2   10/2005   Simister et al.
     6,958,749 B1*  10/2005   Matsushita et al. .......... 345/175
     7,009,626 B2    3/2006   Anwar
     7,088,374 B2    8/2006   David et al.
     7,117,453 B2   10/2006   Drucker et al.
     7,173,623 B2    2/2007   Calkins et al.
     7,337,412 B2    2/2008   Guido et al.
     7,346,850 B2    3/2008   Swartz et al.

     (Continued)

     FOREIGN PATENT DOCUMENTS

     EP   1517228    3/2005

     (Continued)

     OTHER PUBLICATIONS

     Toshiyuki Masui et al., "Elastic Graphical Interfaces for Precise Data Manipulation", 1995, ACM, pp. 143-144.*

     (Continued)

Primary Examiner: Xiomara L. Bautista
(74) Attorney, Agent, or Firm: Blakely, Sokoloff, Taylor & Zafman LLP

(57) ABSTRACT

At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application. A method for operating through an application programming interface (API) in this environment includes transferring a set bounce call. The method further includes setting at least one of maximum and minimum bounce values. The set bounce call causes a bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of the scrolled region being visible in a display region at the end of the scroll.

21 Claims, 37 Drawing Sheets
[Representative drawing on the front page: FIG. 1, flow chart 100 of the method for responding to a user input; its blocks are reproduced with Sheet 1 below.]
US 7,844,915 B2
Page 2

U.S. PATENT DOCUMENTS

     7,561,159 B2        7/2009   Abel et al.
     7,576,732 B2*       8/2009   Lii ........................ 345/173
     2001/0045949 A1    11/2001   Chithambaram et al.
     2002/0194589 A1    12/2002   Cristofalo et al.
     2003/0095096 A1     5/2003   Robbin et al.
     2003/0122787 A1     7/2003   Zimmerman et al.
     2003/0132959 A1     7/2003   Simister et al.
     2003/0160832 A1     8/2003   Ridgley et al.
     2003/0174149 A1     9/2003   Fujisaki et al.
     2004/0021676 A1     2/2004   Chen et al.
     2004/0021698 A1     2/2004   Baldwin et al.
     2004/0100479 A1     5/2004   Nakano et al.
     2004/0215643 A1    10/2004   Brechner et al.
     2005/0057524 A1*    3/2005   Hill ....................... 345/173
     2006/0190833 A1     8/2006   SanGiovanni
     2006/0236263 A1*   10/2006   Bathiche et al. ............ 715/786
     2007/0174257 A1     7/2007   Howard
     2007/0288856 A1    12/2007   Butlin et al.
     2008/0005703 A1*    1/2008   Radivojevic et al. ......... 715/863
     2008/0016096 A1     1/2008   Wilding et al.
     2008/0034029 A1     2/2008   Fang et al.
     2008/0048978 A1     2/2008   Trent et al. ............... 345/157
     2008/0168395 A1*    7/2008   Ording et al. .............. 715/833
     2008/0231610 A1*    9/2008   Hotelling et al. ........... 345/173
     2009/0259969 A1*   10/2009   Pallakoff .................. 715/808
     [Several additional U.S. patent document entries on this page are illegible in the scan.]

FOREIGN PATENT DOCUMENTS

     GB   2319591 A        5/1998
     GB   2319591 A*       5/1998
     WO   WO-2006/067711   6/2006
     WO   WO 2008/085848 A1*   7/2008
     WO   WO 2008/085877 A1*   7/2008
     WO   WO-2008085848    7/2008
     WO   WO-2008085877    7/2008

OTHER PUBLICATIONS

Office Action, U.S. Appl. No. 11/620,723, mailed Apr. 1, 2009, 8 pages.
Office Action, U.S. Appl. No. 11/620,709, mailed Apr. 1, 2009, 8 pages.
Office Action, U.S. Appl. No. 11/620,720, mailed Jun. 23, 2009, 17 pages.
Office Action, U.S. Appl. No. 11/620,720, mailed Dec. 23, 2008, 18 pages.
PCT International Search Report and Written Opinion for PCT International Appln. No. PCT/US2008/000058, mailed Jul. 31, 2008 (10 pages).
PCT International Search Report and Written Opinion for PCT International Appln. No. PCT/US2008/000089, mailed Apr. 6, 2008 (14 pages).
PCT International Search Report and Written Opinion for PCT International Appln. No. PCT/US2008/000103, mailed Jun. 3, 2008 (15 pages).
PCT International Search Report and Written Opinion for PCT International Appln. No. PCT/US2008/000069, mailed May 2, 2008 (16 pages).
PCT International Search Report and Written Opinion for PCT International Appln. No. PCT/US2008/000060, mailed Apr. 22, 2008 (12 pages).
Office Action, U.S. Appl. No. 11/620,723, mailed Jun. 8, 2010, 7 pages.
Office Action, U.S. Appl. No. 11/620,709, mailed Jun. 9, 2010, 7 pages.
Final Office Action, U.S. Appl. No. 11/620,709, mailed Nov. 13, 2009, 8 pages.
Final Office Action, U.S. Appl. No. 11/620,723, mailed Nov. 17, 2009, 10 pages.
Office Action, U.S. Appl. No. 11/620,720, mailed Nov. 18, 2009, 17 pages.

* cited by examiner
U.S. Patent    Nov. 30, 2010    US 7,844,915 B2    Drawing Sheets 1-37

[Only the text recoverable from the drawing sheets is reproduced here; the drawings themselves are not part of this transcription.]

FIG. 1 (Sheet 1), method 100: receive a user input (102); create an event object in response to the user input (104); determine whether the event object invokes a scroll or gesture operation (106); issue at least one scroll or gesture call based on invoking the scroll or gesture operation (108); respond to at least one scroll call, if issued, by scrolling a window having a view associated with the event object based on an amount of a scroll, with the scroll stopped at a predetermined position in relation to the user input (110); respond to at least one gesture call, if issued, by changing a view associated with the event object based on receiving a plurality of input points in the form of the user input (112).

FIG. 2 (Sheet 2), method 200: transfer a set bounce call (202); set at least one of maximum and minimum bounce values (204); cause a slight bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of a content being visible in a display region at the end of the scroll (206).

FIG. 3 (Sheet 2), method 300: transfer a rubberband call to cause rubberbanding a scrolled region displayed within a display region (302); transfer an edge rubberband call to set displacement values for at least one edge of the display region (304).

FIG. 4 (Sheet 3): GUI display 408 of device 400 with window 410, information objects 412-1 to 412-4, and movement 414.

FIGS. 5A-5C (Sheets 4-6): GUI display 408 of device 400 with window 410, information object 412-1, velocities 510-1 and 510-2, distances 512-1 and 512-2, and terminus 514.

FIGS. 6A-6D (Sheets 7-10): portable multifunction device 100 displaying an email inbox (mailboxes 3502, inbox 3504, sender and date fields 3506/3508, gestures 3512/3514, emails 3530/3532/3534, and area 3536 beyond the terminus).

FIG. 7 (Sheet 11), method 700: transfer a directional scroll call to determine if directional scrolling is enabled (702); transfer a directional scroll angle call to set a scroll angle for locking the scrolling in at least one of a vertical or a horizontal direction (704); lock the scrolling in the horizontal direction if a user input forms an angle with a horizontal direction that is less than or equal to a first scroll angle (706); lock the scrolling in the vertical direction if a user input forms an angle with a vertical direction that is less than or equal to a second scroll angle (708).

FIG. 8 (Sheet 12): first and second scroll angles for locking a scroll in a horizontal or vertical direction.

FIG. 9 (Sheet 13), method 900: transfer a deceleration scroll call to set a deceleration factor for a drag user input that invokes a scroll (902); slow the scroll to a stop based on the speed of the drag user input (904).

FIG. 10 (Sheet 13), method 1000: transfer a scroll hysteresis call to determine whether the user input invokes a scroll (1002); set the hysteresis value for determining whether a user input invokes a scroll (1004).

FIG. 11 (Sheet 14), method 1100: transfer a scroll indicator call to determine whether at least one scroll indicator attaches to a scroll region or a window edge (1102); optionally attach scroll indicators to a scroll region based on the scroll indicator call (1104); optionally attach scroll indicators to a window edge based on the scroll indicator call (1106).

FIG. 12 (Sheet 14), method 1200: transfer an inadvertent user input call to determine whether the user input was inadvertent (1202); ignore the inadvertent user input based on the inadvertent user input call (1204).

FIG. 13 (Sheet 15), method 1300: transfer a handle gesture event call (1302); transfer a gesture change call in response to the handle gesture event call (1304).

FIG. 14 (Sheet 15), method 1400: transfer a scaling transform call to determine a scaling transform for a view associated with a user input having a plurality of input points (1402); transfer a scaling gesture start call (1404); transfer a scaling gesture progress call (1406); transfer a scaling gesture end call (1408).

FIGS. 15 and 16A-16C (Sheets 16-18): a display of a device having a scaling transform of a view, and a view shown at a first and a second scaling factor.

FIG. 17 (Sheet 19), method 1700: transfer a rotation transform call to determine a rotation transform for a view associated with a user input having a plurality of input points (1702); transfer a start rotation gesture call (1704); transfer a rotation gesture progress call (1706); transfer a rotation gesture end call (1708).

FIG. 18 (Sheet 20), flow 1800: any animation? (1802); calculate next progress state (1804, 1806); delegate (1808); notify delegate (1810).

FIG. 19 (Sheet 21), method 1900: start at least two animations (1902); determine the progress of each animation (1904); update each of at least two animations based on a single timer (1906).

FIG. 20 (Sheet 22), method 2000: provide a single animation timer (2002); animate a plurality of animations with the single animation timer (2004).

FIG. 21 (Sheet 23), method 2100: set attributes of views independently with each view being associated with a process; transfer a synchronization call to synchronize animations for the multiple views of the display (2104); transfer a synchronization confirmation message when a synchronization flag is enabled based on the list of the processes being synchronized (2106); update the attributes of the views independently (2108); transfer a start animation call to draw the requested animations (2110).

FIGS. 22A and 22B (Sheet 24): synchronizing the resizing of windows 2200 of a display.

FIG. 23 (Sheet 25), method 2300: construct a data structure having a hierarchy of layers with a layer being associated with a view and owning the view (2302); remove the layer from the data structure (2304); switch ownership of the view from the layer to the view (2306).

FIG. 24 (Sheet 25), method 2400: construct a data structure having a hierarchy of layers with a layer being associated with a view (2402); store the data structure in memory (2404); maintain a retained count of the number of references to the view from other objects (2406); deallocate the view from memory if the retained count is zero (2408).

FIGS. 25A and 25B (Sheet 26): a data structure having a hierarchy of layers with a layer being associated with a view.

FIG. 26 (Sheet 27), method 2600: construct a data structure having a hierarchy of layers associated with the user interface of the device (2602); determine whether each layer of the data structure is associated with media or non-media content (2604); detach media content from the data structure (2606); store media content in a first memory location (2608); store non-media content in a second memory location (2610); composite the media and non-media content for display on the device (2612).

FIG. 27 (Sheet 28): a data structure or layer tree having a hierarchy of layers.

FIGS. 28 and 29 (Sheets 29-30): perspective views of devices (2950, 2958) according to embodiments of the disclosure.

FIGS. 30A and 30B (Sheet 31): a device 3070 (elements 3084, 3092, 3098) according to one embodiment of the disclosure.

FIG. 31 (Sheet 32): block diagram of a system, including sensor(s), in which embodiments of the disclosure can be implemented.

FIG. 32 (Sheet 33): device 3200 with microprocessor 3202, memory 3204, bus 3206, optional cache 3208, display controller and display device 3212 (optional), I/O controller(s) 3216, I/O device(s) (e.g., keyboard, cursor control device, network interface), and sensor(s) for user activity.

FIGS. 33A-33C (Sheets 34-36): a device (3304, 3354, 3360, 3382) in laptop, transition, and tablet configurations.

FIG. 34 (Sheet 37), method 3400: construct a hierarchy of views operating on top of a hierarchy of layers (3402); provide access to the hierarchy of views without providing access to the hierarchy of layers (3404).
US 7,844,915 B2

APPLICATION PROGRAMMING INTERFACES FOR SCROLLING OPERATIONS

FIELD OF THE DISCLOSURE

This disclosure relates to application programming interfaces that provide scrolling operations.

COMPUTER PROGRAM LISTING

A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent & Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.

Applicant has submitted herewith Computer Program Listings which are included as Appendix A, attached.

BACKGROUND OF THE DISCLOSURE

An API is a source code interface that a computer system or program library provides in order to support requests for services from a software application. An API is specified in terms of a programming language that can be interpretative or compiled when an application is built, rather than an explicit low level description of how data is laid out in memory. The software that provides the functionality described by an API is said to be an implementation of the API.

Various devices such as electronic devices, computing systems, portable devices, and handheld devices have software applications. The API interfaces between the software applications and user interface software to provide a user of the device with certain features and operations. A user may desire certain operations such as scrolling, selecting, gesturing, and animating operations for a display of the device.

Scrolling is the act of sliding a directional (e.g., horizontal or vertical) presentation of content, such as text, drawings, or images, across a screen or display window. In a typical graphical user interface, scrolling is done with the help of a scrollbar or using keyboard shortcuts, often the arrow keys. Gesturing is a type of user input with two or more input points. Animating operations include changing content within a given time period.

The various types of devices may have a limited display size, user interface, software, API interface and/or processing capability which limits the ease of use of the devices. User interfaces of devices implement APIs in order to provide requested functionality and features. These user interfaces can have difficulty interpreting the various types of user inputs and providing the intended functionality associated with the user inputs.

SUMMARY OF THE DESCRIPTION

At least certain embodiments of the present disclosure include one or more application programming interfaces in an environment with user interface software interacting with a software application. Various function calls or messages are transferred via the application programming interfaces between the user interface software and software applications. Example application programming interfaces transfer function calls to implement scrolling, gesturing, and animating operations for a device.

At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application. A method for operating through an application programming interface (API) in this environment includes transferring a set bounce call. The method further includes setting at least one of maximum and minimum bounce values. The set bounce call causes a bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of the scrolled region being visible in a display region at the end of the scroll.

At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application. A method for operating through an application programming interface (API) in this environment includes transferring a rubberband call. Rubberbanding a scrolled region within a display region occurs by a predetermined maximum displacement when the scrolled region exceeds a display edge. The method further includes transferring an edge rubberband call to set displacement values for at least one edge of the display (e.g., top and bottom edges, left and right edges).

At least certain embodiments of the present disclosure include gesture operations for a display of a device. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.

At least certain embodiments of the present disclosure include a method for performing animations for a display of a device. The method includes starting at least one animation. The method further includes determining the progress of each animation. The method further includes completing each animation based on a single timer. The single timer can be based on a redraw interval of the display hardware.

Various devices which perform one or more of the foregoing methods, and machine readable media which, when executed by a processing system, cause the processing system to perform these methods, are also described.

Other methods, devices and machine readable media are also described.
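The calls named in this summary are described only as function calls transferred between user interface software and a software application; no concrete signatures appear in this excerpt (the actual listings are in Appendix A, which is not reproduced here). As a rough, non-authoritative illustration of what such an API surface could look like, the following Swift sketch is offered; every protocol, method, and parameter name in it is invented for this illustration and is not taken from the patent or Appendix A.

    // Hypothetical sketch only: one possible shape for the scrolling calls named
    // in the summary. None of these names come from the patent or Appendix A.
    protocol ScrollingCalls {
        /// "Set bounce call": enable bouncing and set maximum and minimum bounce values.
        func setBounce(enabled: Bool, minimumValue: Double, maximumValue: Double)

        /// "Edge rubberband call": displacement limits for the display edges.
        func setEdgeRubberband(top: Double, bottom: Double, left: Double, right: Double)

        /// Directional scroll angle call: lock scrolling to an axis when the drag
        /// stays within `lockAngleDegrees` of that axis.
        func setDirectionalScrollAngle(enabled: Bool, lockAngleDegrees: Double)

        /// Deceleration scroll call: factor applied to the release speed so the
        /// scroll coasts to a stop after the drag ends.
        func setScrollDeceleration(factor: Double)

        /// Scroll hysteresis call: distance a drag must cover before it counts as a scroll.
        func setScrollHysteresis(distance: Double)
    }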
BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure is described by way of example with reference to the accompanying drawings, wherein:

FIG. 1 is a flow chart of a method for responding to a user input of a data processing device;
FIG. 2 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 3 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 4 is a schematic diagram illustrating an embodiment of user interface of a portable electronic device 400 having a touch-sensitive display 408;
FIGS. 5A-5C illustrate at least some embodiments of user interface of a portable electronic device 400 having a touch-sensitive display;
FIGS. 6A-6D illustrate the scrolling of a list of items to a terminus of the list, at which point an area beyond the terminus is displayed and the list is then scrolled in an opposite direction until the area beyond the terminus is no longer displayed, in accordance with some embodiments;
FIG. 7 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 8 illustrates first and second scroll angles for locking a scroll of a display of a device in a horizontal or vertical direction according to certain teachings of the present disclosure;
FIG. 9 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 10 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 11 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 12 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 13 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 14 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 15 illustrates a display of a device having a scaling transform of a view;
FIGS. 16A and 16B illustrate a display of a device with a view having a first and a second scaling factor;
FIG. 16C illustrates changing a view from a scale factor of 2x to a scale factor of 1x in at least some embodiments of the present disclosure;
FIG. 17 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 18 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIG. 19 is a flow chart of a method for animating views displayed on a display of a device;
FIG. 20 is a flow chart of a method for animating views displayed on a display of a device;
FIG. 21 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure;
FIGS. 22A and 22B illustrate synchronizing the resizing of windows of a display of a device;
FIG. 23 illustrates a method for switching ownership of a view of an application displayed on a display of a data processing device;
FIG. 24 illustrates a method for memory management of a view of an application displayed on a display of a device;
FIGS. 25A and 25B illustrate a data structure having a hierarchy of layers with a layer being associated with a view;
FIG. 26 illustrates a method for compositing media and non-media content of user interface for display on a device;
FIG. 27 illustrates a data structure or layer tree having a hierarchy of layers;
FIG. 28 is a perspective view of a device in accordance with one embodiment of the present disclosure;
FIG. 29 is a perspective view of a device in accordance with one embodiment of the present disclosure;
FIGS. 30A and 30B illustrate a device 3070 according to one embodiment of the disclosure;
FIG. 31 is a block diagram of a system in which embodiments of the present disclosure can be implemented;
FIG. 32 shows another example of a device in accordance with one embodiment of the present disclosure;
FIG. 33A is a perspective view of a device in a first configuration (e.g., a laptop configuration) in accordance with one embodiment of the present disclosure;
FIG. 33B is a perspective view of the device of FIG. 33A in a second configuration (e.g., a transition configuration) in accordance with one embodiment of the present disclosure;
FIG. 33C is a perspective view of the device of FIG. 33A in a third configuration (e.g., a tablet configuration) in accordance with one embodiment of the present disclosure; and
FIG. 34 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure.

DETAILED DESCRIPTION

Various embodiments and aspects of the disclosure will be described with reference to details discussed below, and the accompanying drawings will illustrate the various embodiments. The following description and drawings are illustrative of the disclosure and are not to be construed as limiting the disclosure. Numerous specific details are described to provide a thorough understanding of various embodiments of the present disclosure. However, in certain instances, well known or conventional details are not described in order to provide a concise discussion of embodiments of the present disclosure.

Some portions of the detailed descriptions which follow are presented in terms of algorithms which include operations on data stored within a computer memory. An algorithm is generally a self-consistent sequence of operations leading to a desired result. The operations typically require or involve physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.

It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like can refer to the action and processes of a data processing system, or similar electronic device, that manipulates and transforms data represented as physical (electronic) quantities within the system's registers and memories into other data similarly represented as physical quantities within the system's memories or registers or other such information storage, transmission or display devices.

The present disclosure can relate to an apparatus for performing one or more of the operations described herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a machine (e.g., computer) readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read only memories (ROMs), random access memories (RAMs), erasable programmable ROMs (EPROMs), electrically erasable programmable ROMs (EEPROMs), flash memory, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a bus.

A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium includes read only memory ("ROM"); random access memory ("RAM"); magnetic disk storage media; optical storage media; flash memory devices; etc.

At least certain embodiments of the present disclosure include one or more application programming interfaces in an environment with user interface software interacting with a software application. Various function calls or messages are transferred via the application programming interfaces between the user interface software and software applications. Transferring the function calls or messages may include issuing, initiating, invoking or receiving the function calls or messages. Example application programming interfaces transfer function calls to implement scrolling, gesturing, and animating operations for a device having a display region. An API may also implement functions having parameters, variables, or pointers. An API may receive parameters as disclosed or other combinations of parameters. In addition to the APIs disclosed, other APIs individually or in combination can perform similar functionality as the disclosed APIs.

The display region is a form of a window. A window is a display region which may not have a border and may be the entire display region or area of a display. In some embodiments, a display region may have at least one window and/or at least one view (e.g., web, text, or image content). A window may have at least one view. The methods, systems, and apparatuses disclosed can be implemented with display regions, windows, and/or views.
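To make this terminology concrete, the containment just described — a display region that is a window, which may hold one or more views, each of which may hold further views — can be pictured with the small model below. This is a minimal sketch assuming nothing beyond the paragraph above; the types are hypothetical and are not the window or view classes of any particular framework.

    // Hypothetical model of the terms used above: a window occupies a display
    // region (possibly borderless, possibly the entire display) and may hold views.
    struct Region { var width: Double; var height: Double }

    class View {
        var region: Region
        var subviews: [View] = []      // a view may itself contain views
        init(region: Region) { self.region = region }
    }

    class Window {
        var displayRegion: Region      // the display region the window occupies
        var views: [View] = []         // a window may have at least one view
        init(displayRegion: Region) { self.displayRegion = displayRegion }
    }

    // Example: a full-screen window holding one content view (web, text, or image).
    let window = Window(displayRegion: Region(width: 320, height: 480))
    window.views.append(View(region: window.displayRegion))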
At least certain embodiments of the present disclosure include scrolling operations for scrolling a display of a device. The scrolling operations include bouncing a scrolled region in an opposite direction of a scroll when a scroll completes, rubberbanding a scrolled region by a predetermined maximum displacement when the scrolled region exceeds a display edge, and setting a scrolling angle that locks the scroll in a horizontal or vertical direction.

At least certain embodiments of the present disclosure include gesture operations for a display of a device. The gesture operations include performing a scaling transform such as a zoom in or zoom out in response to a user input having two or more input points. The gesture operations also include performing a rotation transform to rotate an image or view in response to a user input having two or more input points.

At least certain embodiments of the present disclosure include a method for performing animations for a display of a device. The method includes starting at least one animation. The method further includes determining the progress of each animation. The method further includes completing each animation based on a single timer. The single timer can be based on a redraw interval of the display hardware.

At least certain embodiments of the disclosure may be part of a digital media player, such as a portable music and/or video media player, which may include a media processing system to present the media, a storage device to store the media, and may further include a radio frequency (RF) transceiver (e.g., an RF transceiver for a cellular telephone) coupled with an antenna system and the media processing system. In certain embodiments, media stored on a remote storage device may be transmitted to the media player through the RF transceiver. The media may be, for example, one or more of music or other audio, still pictures, or motion pictures.

The portable media player may include a media selection device, such as a click wheel input device on an iPod® or iPod Nano® media player from Apple Computer, Inc. of Cupertino, Calif., a touch screen input device, pushbutton device, movable pointing input device or other input device. The media selection device may be used to select the media stored on the storage device and/or the remote storage device. The portable media player may, in at least certain embodiments, include a display device which is coupled to the media processing system to display titles or other indicators of media being selected through the input device and being presented, either through a speaker or earphone(s), or on the display device, or on both the display device and a speaker or earphone(s). In some embodiments, the display device and input device are integrated, while in other embodiments the display device and input device are separate devices. Examples of a portable media player are described in published U.S. patent application Nos. 2003/0095096 and 2004/0224638, both of which are incorporated by reference.

Embodiments of the disclosure described herein may be part of other types of data processing systems, such as, for example, entertainment systems or personal digital assistants (PDAs), or general purpose computer systems, or special purpose computer systems, or an embedded device within another device, or cellular telephones which do not include media players, or multi touch tablet devices, or other multi touch devices, or devices which combine aspects or functions of these devices (e.g., a media player, such as an iPod®, combined with a PDA, an entertainment system, and a cellular telephone in one device). In this disclosure, electronic devices and consumer devices are types of devices.

FIG. 1 is a flow chart of a method for responding to a user input of a device. The method 100 includes receiving a user input at block 102. The user input may be in the form of an input key, button, wheel, touch, or other means for interacting with the device. The method 100 further includes creating an event object in response to the user input at block 104. The method 100 further includes determining whether the event object invokes a scroll or gesture operation at block 106. For example, a single touch that drags a distance across a display of the device may be interpreted as a scroll operation. In one embodiment, a two or more finger touch of the display may be interpreted as a gesture operation. In certain embodiments, determining whether the event object invokes a scroll or gesture operation is based on receiving a drag user input for a certain time period. The method 100 further includes issuing at least one scroll or gesture call based on invoking the scroll or gesture operation at block 108. The method 100 further includes responding to at least one scroll call, if issued, by scrolling a window having a view (e.g., web, text, or image content) associated with the event object based on an amount of a scroll with the scroll stopped at a predetermined position in relation to the user input at block 110. For example, an input may end at a certain position on a display of the device. The scrolling may continue until reaching a predetermined position in relation to the last input received from the user. The method 100 further includes responding to at least one gesture call, if issued, by changing a view associated with the event object based on receiving a plurality of input points in the form of the user input at block 112.
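The decision sequence of blocks 102-112 — build an event object from the input, decide whether it invokes a scroll or a gesture, then issue and respond to the corresponding call — can be sketched as follows. This is only an illustration under stated assumptions: the event fields, the one-touch-drag versus two-touch rule, and the handler names are invented here and are not the patent's implementation.

    // Hypothetical event object created in response to a user input (block 104).
    struct EventObject {
        var touchCount: Int         // number of input points
        var dragDistance: Double    // how far the contact has moved
    }

    enum Operation { case scroll, gesture, none }

    // Block 106: decide whether the event object invokes a scroll or gesture operation.
    // Assumption: a single dragging touch means scroll; two or more touches mean gesture.
    func operation(for event: EventObject, hysteresis: Double = 10) -> Operation {
        if event.touchCount >= 2 { return .gesture }
        if event.touchCount == 1 && event.dragDistance > hysteresis { return .scroll }
        return .none
    }

    // Blocks 108-112: issue the scroll or gesture call and respond to it.
    func handle(_ event: EventObject) {
        switch operation(for: event) {
        case .scroll:  print("scroll call issued")    // scroll the window's view (block 110)
        case .gesture: print("gesture call issued")   // change the view, e.g. scale or rotate it (block 112)
        case .none:    break
        }
    }

    handle(EventObject(touchCount: 1, dragDistance: 42))   // prints "scroll call issued"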
In certain embodiments of the present disclosure, scroll operations include attaching scroll indicators to a content edge of a display. Alternatively, the scroll indicators can be attached to the display edge. In some embodiments, user input in the form of a mouse/finger down causes the scroll indicators to be displayed on the display edge, content edge, or window edge of the scrolled region. If a mouse/finger up is then detected, the scroll indicators are faded out from the display region, content edge, or window edge of the scrolled region.

In certain embodiments of the present disclosure, gesture operations include responding to at least one gesture call, if issued, by rotating a view associated with the event object based on receiving a plurality of input points in the form of the user input. Gesture operations may also include scaling a view associated with the event object by zooming in or zooming out based on receiving the user input.

In some embodiments, a device includes a display region having multiple views or windows. Each window may have multiple views, including superviews and subviews. It is necessary to determine which window, view, superview, or subview is contacted by a user input in the form of a mouse up, mouse down, or drag, etc. An API can set various modes for making this determination. In one embodiment, a pass mode sends mouse down, mouse up, and drag inputs to the nearest subview. In another embodiment, an intercept on drag mode sends a drag input to the superview while mouse up and down inputs are sent to the subview. In another embodiment, an intercept mode sends all drag, mouse up and down inputs to the superview. The superview may be scroller software operating as a subclass of a view software. The subview may be view software operating as a subclass of the user interface software.

FIG. 2 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a bounce operation. The method 200 for providing a bounce operation includes transferring a set bounce call at block 202. The method 200 further includes setting at least one of maximum and minimum bounce values at block 204. The minimum and maximum bounce values may be associated with at least one edge of a window that has received a user input. The method 200 further includes causing a bounce of a scrolled region in an opposite direction of a scroll based on a region past the scrolled region being visible in a display region at the end of the scroll at block 206. The scrolled region may be a content region.

In certain embodiments of the present disclosure, transferring the set bounce call is either one of issuing, initiating, invoking or receiving the set bounce call.

FIG. 3 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a rubberband operation. The method 300 for providing a rubberband operation includes transferring a rubberband call to cause rubberbanding a scrolled region displayed within a display at block 302. The method 300 further includes transferring an edge rubberband call to set displacement values for at least one edge of the display at block 304. In some embodiments, the displacement values are set for top and bottom edges, left and right edges, or all edges.

Rubberbanding a scrolled region according to the method 300 occurs by a predetermined maximum displacement value when the scrolled region exceeds a display edge of a display of a device based on the scroll. If a user scrolls content of the display making a region past the edge of the content visible in the display, then the displacement value limits the maximum amount for the region outside the content. At the end of the scroll, the content slides back, making the region outside of the content no longer visible on the display.

In certain embodiments of the present disclosure, transferring the rubberband call is either one of issuing, initiating, invoking or receiving the rubberband call.

FIG. 4 is a schematic diagram illustrating an embodiment of user interface of a portable electronic device 400 having a touch-sensitive display 408. The display 408 may include a window 410. The window 410 may include one or more displayed objects, such as information objects 412-1 to 412-4. In an exemplary embodiment, the information objects 412 may correspond to contact information for one or more individuals in a list of items. The displayed objects may be moved in response to detecting or determining movement 414 of a point of contact with the display, such as that associated with one or more digits 416 of a user (which are not drawn to scale in FIG. 4). In some embodiments, movement of the displayed objects may be accelerated in response to detecting or determining accelerated movement of the point of contact. While embodiment 400 includes one window 410, in other embodiments there may be two or more display windows. In addition, while embodiment 400 illustrates movement 414 in a particular direction, in other embodiments movement of the displayed objects may be in response to movement 414 in one or more other directions, or in response to a scalar (i.e., a determined or detected movement independent of the direction).

FIGS. 5A-5C illustrate the scrolling of a list of items on a device to a terminus of the list, at which point one or more displayed items at the end of the list smoothly bounce off the end of the display, reverse direction, and then optionally come to a stop. FIG. 5A is a schematic diagram illustrating an embodiment of user interface of a portable electronic device 400 having a touch-sensitive display. One or more displayed objects, such as information object 412-1, may be a distance 512-1 from a terminus 514 of the list of items, which is an edge of a scrolled region, and may be moving with a velocity 510-1 while the list is being scrolled. Note that the terminus 514 is a virtual boundary associated with the displayed objects, as opposed to a physical boundary associated with the window 410 and/or the display 408. As illustrated in FIG. 5B, when the one or more displayed objects, such as the information object 412-1, reach or intersect with the terminus 514, the movement corresponding to the scrolling may stop, i.e., the scrolling velocity may be zero at an instant in time. As illustrated in FIG. 5C, the one or more displayed objects, such as the information object 412-1, may subsequently reverse direction. At a time after the intersection with the terminus 514, the information object 412-1 may have velocity 510-2 and may be a distance 512-2 from the terminus 514. In some embodiments, the magnitude of velocity 510-2 may be less than the magnitude of velocity 510-1 when the distance 512-2 equals the distance 512-1, i.e., the motion of the one or more displayed objects is damped after the scrolling list reaches and "bounces" at its terminus.

In at least some embodiments of the present disclosure, the method 200 performs the bounce operations described in FIGS. 5A-5C. The bounce call transferred at block 202 determines whether a bounce operation is enabled. The maximum and minimum bounce values determine the amount of bouncing of the scrolled region in an opposite direction of the scroll.
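The bounce and rubberband behavior described for method 200 and FIGS. 5A-5C amounts to letting the scrolled content overshoot its terminus by a limited, damped amount and then slide back. One common way to produce that feel is to compress the overshoot toward a maximum displacement; the sketch below shows that idea only as an illustration — the damping rule and the constants are assumptions, not values taken from the patent or its Appendix A.

    // Illustrative rubberband/bounce damping: the further the scroll goes past the
    // terminus, the less extra displacement it produces, approaching `maxDisplacement`.
    func rubberBandOffset(overshoot: Double, maxDisplacement: Double) -> Double {
        guard overshoot > 0 else { return 0 }
        // One possible damping rule; approaches maxDisplacement as overshoot grows.
        return maxDisplacement * (1 - 1 / (overshoot / maxDisplacement + 1))
    }

    // Example: dragging 50 points past the edge with a 100-point limit shows about
    // 33 points of region beyond the content; at the end of the scroll the content
    // slides back so that region is no longer visible.
    print(rubberBandOffset(overshoot: 50, maxDisplacement: 100))   // prints ~33.3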
FIGS. 6A-6D illustrate the scrolling of a list of items to a terminus of the list, at which point an area beyond the terminus is displayed and the list is then scrolled in an opposite direction until the area beyond the terminus is no longer displayed, in accordance with some embodiments. The rubberband operation of method 300 is illustrated in the example of FIGS. 6A-6D with the listed items being email messages.

FIGS. 6A-6D illustrate an exemplary user interface 3500A for managing an inbox in accordance with some embodiments. An analogous user interface may be used to display and manage other mailboxes (e.g., drafts, sent, trash, personal, etc.). In addition, other types of lists are possible, including but not limited to lists of instant message conversations, favorite phone numbers, contact information, labels, email folders, email addresses, physical addresses, ringtones, or album names.

If the list of emails fills more than the allotted screen area, the user may scroll through the emails using vertically upward and/or vertically downward swipe gestures on the touch screen. In the example of FIG. 6A, a portion of a list of emails is displayed in the screen area, including a top displayed email 3530 from Bruce Walker and a bottom displayed email 3532 from Kim Brook. A user performs a vertically downward swipe gesture 3514 to scroll toward the top of the list. The vertically downward gesture 3514 need not be exactly vertical; a substantially vertical gesture is sufficient. In some embodiments, a gesture within a predetermined angle of being perfectly vertical results in vertical scrolling.

As a result of detecting the vertically downward gesture 3514, in FIG. 6B the displayed emails have shifted down, such that the previous bottom displayed email 3532 from Kim Brook is no longer displayed, the previous top displayed email 3530 from Bruce Walker is now second from the top, and the email 3534 from Aaron Jones, which was not displayed in FIG. 6A, is now displayed at the top of the list.

In this example, the email 3534 from Aaron Jones is the first email in the list and thus is the terminus of the list. Upon reaching this email 3534, in response to continued detection of the vertically downward gesture 3514, an area 3536 (FIG. 6C) above the first email 3534 (i.e., beyond the terminus of the list) is displayed. In some embodiments, the area displayed beyond the terminus of the list is visually indistinct from the background of the list. In FIG. 6C, both the area 3536 and the background of the emails (e.g., emails 3534 and 3530) are white and thus are visually indistinct.

Once vertically downward gesture 3514 is complete, such that a corresponding object is no longer detected on or near the touch screen display, the list is scrolled in an opposite direction until the area 3536 is no longer displayed. FIG. 6D illustrates the result of this scrolling in the opposite direction: the email 3534 from Aaron Jones is now displayed at the top of the screen area allotted to the list, and the area 3536 is not displayed.

In the example of FIGS. 6A-6D, a vertically downward gesture resulted in display of an area beyond the first item in the list. As described in FIG. 3, the values for the predetermined maximum displacement (e.g., display of an area beyond the first item in the list) are set at block 304 for top and bottom edges or at block 306 for all edges of the window.

Similarly, a vertically upward gesture may result in display of an area beyond the last item of the list, if the vertically upward gesture continues once the list has been scrolled to the last item. The last item may be considered a terminus of the list, similar to the first item. As discussed above, the gesture need not be exactly vertical to result in vertical scrolling; a gesture within a predefined range of angles from perfectly vertical is sufficient.

FIG. 7 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a directional scrolling operation. The method 700 for operating through an application programming interface (API) includes transferring a directional scroll angle call to determine if directional scrolling is enabled at block 702. The method 700 further includes transferring a directional scroll angle call to set a scroll angle for locking the scrolling in at least one of a vertical or a horizontal direction at block 704. The method 700 further includes locking the scrolling in the horizontal direction if a user input forms an angle with a horizontal direction that is less than or equal to a first scroll angle at block 706. The method 700 further includes locking the scrolling in the vertical direction if a user input forms an angle with the vertical direction that is less than or equal to a second scroll angle at block 708.

In certain embodiments, a user input in the form of a drag forms an angle with the horizontal direction that is less than the first scroll angle. In this case, the user presumably intends to scroll in the horizontal direction. The scrolling will be locked in the horizontal direction until the user input exceeds the first scroll angle. A second scroll angle may be used for locking the user input in the vertical direction. The second scroll angle may be set equal to the first scroll angle.

FIG. 8 illustrates first and second scroll angles for locking a scroll of a display of a device in a horizontal or vertical direction. The horizontal direction 802 and vertical direction 804 are in reference to a window or a display of a device. As discussed in the method 700, a user input such as a drag movement forming an angle with the horizontal direction 802 less than or equal to the first scrolling angle 806 or 808 will lock the user input in the horizontal direction. In a similar manner, a user input forming an angle with the vertical direction 804 less than or equal to the second scrolling angle 810 or 812 will lock the user input in the vertical direction. The first and second scrolling angles may be set at the same angle or at different angles as well. For example, the first and second scrolling angles may be set at 25 degrees. A user input less than or equal to 25 degrees with respect to the horizontal or vertical direction will lock the scrolling in the appropriate direction.

In some embodiments, the horizontal and vertical locking angles can be determined in part by the aspect of the content. For example, content in the form of a tall page may receive a larger vertical locking angle compared to the horizontal locking angle.

FIG. 9 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a deceleration scroll operation. The method 900 for providing the deceleration scroll operation includes transferring a deceleration scroll call to set a deceleration factor for a drag user input at block 902. The method 900 further includes slowing the scroll to a stop based on the speed of the drag user input and the deceleration factor at block 904.

In certain embodiments, a user input in the form of a drag invokes a scroll operation for a certain time period. The user input has a certain speed. The scroll of the scrolled region of a window or a display region of a display of a device will be stopped after the user input stops by applying a deceleration factor to the speed of the user input during the drag movement.
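A simple way to realize the behavior just described — the scroll keeps moving after the drag ends and is slowed to a stop by a deceleration factor applied to the release speed — is to decay the scroll velocity each frame. The sketch below is an assumed illustration of applying such a factor; the factor, frame interval, and stopping threshold are invented values, not ones specified by the patent.

    // Illustrative deceleration scroll: starting from the drag's release velocity
    // (points per second), multiply by the deceleration factor every frame until
    // the speed is negligible.
    func decelerationOffsets(releaseVelocity: Double,
                             decelerationFactor: Double = 0.95,
                             frameInterval: Double = 1.0 / 60.0) -> [Double] {
        var velocity = releaseVelocity
        var offsets: [Double] = []
        while abs(velocity) > 1 {                     // stop once the speed is negligible
            offsets.append(velocity * frameInterval)  // distance scrolled this frame
            velocity *= decelerationFactor            // apply the deceleration factor
        }
        return offsets
    }

    let perFrame = decelerationOffsets(releaseVelocity: 1200)
    print(perFrame.count, perFrame.reduce(0, +))   // frames until stop, total coasting distance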
11 12
hysteresis call to determine whether a user input invokes a or display. A mask may merely permit certain changes while
Scroll at block 1002. The method 1000 further includes set limiting or not permitting other changes. Events of all kinds
ting a hysteresis value for determining whether a user input come into the application via agraphics framework. They are
invokes a scroll at block 1004. enqueued, collaleced if necessary and dispatched. If the
In certain embodiments, a user input in the form of a drag 5 events are system level events (e.g., application should sus
over a certain distance across a display or window within a pend, device orientation has chanted, etc) they are routed to
display of a device invokes a scroll operation. The hysteresis the application having an instance of a class of the user
value determines the certain distance which the user input interface software. If the events are hand events based on a
must drag across the display or window prior to invoking a user input, the events are routed to the window they occurred
scroll operation. A user input that does not drag the certain 10 over. The window then routes these events to the appropriate
predetermined distance will not invoke a scroll operation and control by calling the instance's mouse and gesture methods.
may be considered a mouse up or down input or other type of The control that receives a mouse down or mouse entered
input. function will continue to get all future calls until the hand is
FIG. 11 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to attach a scroll indicator to a scroll region edge or a window edge of a device. In some embodiments, the scroll region edge is associated with a content edge. The window or display edge may be associated with the edge of a display region. The method 1100 for providing the scroll indicator includes transferring a scroll indicator call to determine whether at least one scroll indicator attaches to an edge of a scroll region or a window edge at block 1102. A scroll indicator may be displayed on any display edge, window edge or scroll region edge. The method 1100 further includes optionally attaching at least one scroll indicator to the edge of the scroll region based on the scroll indicator call at block 1104. Alternatively, the method 1100 further includes optionally attaching at least one scroll indicator to the window edge of the view based on the scroll indicator call at block 1106.
In some embodiments, the operations of method 1100 can be altered, modified, combined, or deleted. For example, block 1104 can be deleted. Likewise, block 1106 can be deleted from the method 1100. Alternatively, the order of block 1104 and block 1106 can be switched. Other methods having various operations that have been disclosed within the present disclosure can also be altered, modified, rearranged, collapsed, combined, or deleted.
In certain embodiments of the present disclosure, transferring the scroll indicator call is either one of issuing, initiating, invoking or receiving the scroll indicator call. For example, the user interface software (e.g., software kit or library) may receive the scroll indicator call from the software application.
FIG. 12 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to determine if an inadvertent user input contacts a view of a display of a device. The method 1200 includes transferring an inadvertent user input call to determine whether the user input was inadvertent at block 1202. The method 1200 further includes ignoring the inadvertent user input based on the determination of the inadvertent user input call at block 1204. In one embodiment, the inadvertent user input call comprises a thumb detection call to determine whether the user input was an inadvertent thumb.
In certain embodiments of the present disclosure, transferring the inadvertent user input call is either one of issuing, initiating, invoking or receiving the inadvertent user input call.
A gesture API provides an interface between an application and user software in order to handle gesturing. Gesturing may include scaling, rotating, or other changes to a view, window, or display. A mask may merely permit certain changes while limiting or not permitting other changes. Events of all kinds come into the application via a graphics framework. They are enqueued, coalesced if necessary and dispatched. If the events are system level events (e.g., application should suspend, device orientation has changed, etc.) they are routed to the application having an instance of a class of the user interface software. If the events are hand events based on a user input, the events are routed to the window they occurred over. The window then routes these events to the appropriate control by calling the instance's mouse and gesture methods. The control that receives a mouse down or mouse entered function will continue to get all future calls until the hand is lifted. If a second finger is detected, the gesture methods or functions are invoked. These functions may include start, change, and end gesture calls. The control that receives a start gesture call will be sent all future change gesture calls until the gesture ends.
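The routing just described can be sketched roughly as follows. The Window and Control types and their method names are hypothetical stand-ins for the window and control objects of the user interface software, not the actual classes.

    protocol Control: AnyObject {
        func mouseDown()
        func startGesture()
        func changeGesture()
        func endGesture()
    }

    final class Window {
        private(set) var activeControl: Control?
        private var gestureActive = false

        // First finger touches a control: that control keeps receiving calls
        // until the hand is lifted.
        func handDown(on control: Control) {
            activeControl = control
            control.mouseDown()
        }

        // A second finger switches the control over to the gesture calls.
        func fingerCountChanged(to count: Int) {
            guard let control = activeControl else { return }
            if count >= 2 && !gestureActive {
                gestureActive = true
                control.startGesture()
            } else if gestureActive {
                control.changeGesture()
            }
        }

        // All fingers lifted: end the gesture and release the control.
        func handLifted() {
            if gestureActive { activeControl?.endGesture() }
            activeControl = nil
            gestureActive = false
        }
    }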
FIG. 13 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a gesture operation. The method 1300 for providing the gesture operation includes transferring a handle gesture event call at block 1302. The method 1300 further includes transferring a gesture change call in response to the handle gesture event call at block 1304.
In certain embodiments, a user input in the form of two or more points is received by a display of a device. A multi-touch driver of the device receives the user input and packages the event into an event object. A window server receives the event object and determines whether the event object is a gesture event object. If the window server determines that a gesture event object has been received, then user interface software issues or transfers the handle gesture call at block 1302 to a software application associated with the view. The software application confirms that a gesture event has been received and passes the handle gesture call to a library of the user interface software. The window server also associates the gesture event object with the view that received the user input. The library responds by transferring a gesture change call in response to the handle gesture event call at block 1304.
In one embodiment, a window or view associated with the user input receives the change call in order to perform the gesture event. The user software that provides the view receives a gesture start event call, a gesture changed event call, a zoom to scale setting for the view, and a gesture end call. The gesture calls receive an input of a gesture event which may be a base event having a type such as a hand event, keyboard event, system event, etc. A delegate associated with the application receives a start gesture call, a gesture did change call, and a gesture did finish call. The user software is dynamically linked into the application during the run time of the gesture process.
In some embodiments, the gesture changed function call contains the following information about the gesture:
the number of fingers currently down;
the number of fingers initially down;
the rotation of the hand;
the scale of the hand;
the translation of the hand;
the position of the inner and outermost fingers; and
the pressure of the first finger.
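A minimal sketch of the kind of record such a gesture changed call might carry is shown below; the field names, types, and units are assumptions made for illustration, not the actual structure used by the user interface software.

    struct GestureChangeInfo {
        var fingersCurrentlyDown: Int
        var fingersInitiallyDown: Int
        var handRotation: Double                    // assumed to be in radians
        var handScale: Double
        var handTranslation: (dx: Double, dy: Double)
        var innermostFinger: (x: Double, y: Double)
        var outermostFinger: (x: Double, y: Double)
        var firstFingerPressure: Double
    }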
In other embodiments, more information about each finger down may be included as follows:
the stage of the finger (just touchdown, fully pressed, lifting off, etc.);
the position of the finger;
the proximity of the finger (how hard you're touching);
the orientation of the finger (what angle the ovoid is at);
the length of the major and minor axis;
the velocity of the finger; and
the eccentricity of the finger's ovoid.
A gesture event object may be a chord event object having a chord count (e.g., number of fingers contacted the view or display), a chord start event, a chord change event, and a chord end event. A chord change event may include a scaling or rotation transform.
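The per-finger information and the chord event described above might be represented as follows; again, every name, type, and unit here is an assumption made for this sketch.

    enum FingerStage { case touchdown, fullyPressed, lifting }

    struct FingerInfo {
        var stage: FingerStage
        var position: (x: Double, y: Double)
        var proximity: Double            // how hard the finger is pressing
        var orientation: Double          // angle of the contact ovoid
        var majorAxis: Double
        var minorAxis: Double
        var velocity: (dx: Double, dy: Double)
        var eccentricity: Double         // eccentricity of the contact ovoid
    }

    // A chord event groups the fingers currently on the view or display and is
    // reported as a start, a change (possibly carrying a scaling or rotation
    // transform), and an end event.
    enum ChordEvent {
        case start(chordCount: Int)
        case change(chordCount: Int, scale: Double, rotation: Double)
        case end
    }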
FIG. 14 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a scaling transform of a display region, window, or view of a display of a device. The method 1400 for providing the scaling transform includes transferring a scaling transform call to determine a scaling transform for a view associated with a user input having a plurality of input points at block 1402. The method 1400 further includes transferring a scaling gesture start call at block 1404. The method 1400 further includes transferring a scaling gesture progress call at block 1406. The method 1400 further includes transferring a scaling gesture end call at block 1408.
In certain embodiments, a user input in the form of two or more input points (e.g., fingers) moves together or apart to invoke a gesture event that performs a scaling transform on the view associated with the user input. A scale transform includes a minimum and maximum scale factor. FIG. 15 illustrates a display 1502 of a device having a scaling transform of a view. The view 1504 (e.g., web, text, or image content) has a first scale factor. A user input (e.g., two fingers moving apart) associated with the view 1504 is interpreted as a gesture event to zoom out from view 1504 to view 1508 having a second scale factor that exceeds the maximum scale factor of the view 1516. A snapback flag determines whether the zoom out can proceed past the maximum scale factor to view 1508 prior to snapping back to the maximum scale factor associated with view 1516.
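A small sketch of the clamping and snapback behavior described for the scale transform follows; the names (ScaleLimits, snapbackEnabled) and the sample factors are illustrative assumptions, not the disclosed API.

    struct ScaleLimits {
        var minimum: Double = 1.0
        var maximum: Double = 3.0
        var snapbackEnabled = true      // allow overshoot during the gesture, then snap back

        // Scale factor shown while the gesture is still in progress.
        func duringGesture(_ requested: Double) -> Double {
            return snapbackEnabled ? requested : min(max(requested, minimum), maximum)
        }

        // Scale factor the view settles at when the gesture ends.
        func atGestureEnd(_ requested: Double) -> Double {
            return min(max(requested, minimum), maximum)
        }
    }

    let limits = ScaleLimits()
    print(limits.duringGesture(3.4))   // 3.4: overshoot is visible while the fingers are down
    print(limits.atGestureEnd(3.4))    // 3.0: the view snaps back to the maximum scale factor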
FIG. 16A illustrates a display 1604 of a device having a first scaling factor of a view 1616. A user input (e.g., two fingers 1608 and 1610 moving together) associated with the view 1614 is interpreted as a gesture event to zoom in from view 1614 to view 1664 having a second scale factor as illustrated in FIG. 16B. The dashed regions 1602 and 1650 represent the total area of the content with the only content being displayed in the display area 1604 and 1652. In performing the scaling transform from FIG. 16A to FIG. 16B, the center of the gesture event, center 1612 for FIG. 16A and center 1660 for FIG. 16B, remains in the same position with respect to the display 1604. The scroll indicator 1606 shrinks to become scroll indicator 1654 during the transform to indicate that a smaller portion of the total content 1650 is being displayed on display 1604 as a result of the zoom in operation. The dashed region 1650 is larger than the dashed region 1602 to represent that a larger portion of content is not being displayed on display 1652 in FIG. 16B as a result of the zoom in operation.
In at least some embodiments of the present disclosure, a user desires to change a view 1670 from a scale factor of 2x to a scale factor of 1x as illustrated in FIG. 16C. A first set of user inputs 1672 and 1674 that move to the second set of user inputs 1676 and 1678 will decrease the scale factor from 2x to 1x. It may be desirable for the user to scale from 2x to 1x without having to move the user inputs a large distance across the view 1670. In an environment with user interface software interacting with a software application, a gesture scaling transform flag may be set in order to determine a scaling transform for a view associated with a user input having a plurality of input points. The scaling transform flag scales either from a current scale factor to a minimum scale factor or from the current scale factor to a maximum scale factor. For example, a flag may be set at the position associated with a 1.5x scale factor and a third set of user inputs 1680 and 1682. A user desiring to change the scale factor from 2x to 1x would only have to move his fingers, the user inputs, from the first set 1672 and 1674 to the third set 1680 and 1682 if the gesture scaling transform flag has been set at a scale factor of 1.5x.
FIG. 17 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to provide a rotation transform of a view, window, or display region of a display of a device. The method 1700 for providing the rotation transform includes transferring a rotation transform call to determine a rotation transform for a view associated with a user input having a plurality of input points at block 1702. The method 1700 further includes transferring a rotation gesture start call at block 1704. The method 1700 further includes transferring a rotation gesture progress call at block 1706. The method 1700 further includes transferring a rotation gesture end call at block 1708.
In certain embodiments, a user input in the form of two or more input points rotates to invoke a gesture event that performs a rotation transform on the view associated with the user input. The rotation transform includes a minimum and maximum degree of rotation for associated minimum and maximum rotation views. The user input may temporarily rotate a view past a maximum degree of rotation prior to the view snapping back to the maximum degree of rotation.
FIG. 18 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application in order to notify a delegate of at least one animation associated with a display region, window, or view of a display of a device. A delay in the animation may be specified by the API. Also, multiple animations may be assigned priority by the API. The method 1800 for notifying the delegate includes determining whether any animation occurs at block 1802. The method 1800 further includes checking the progress of an animation at block 1804. If progress has occurred, then the next state (e.g., position, opacity, or transform) of the animation can be calculated at block 1806. If progress has completed at block 1806, then at block 1808 it is determined whether the view associated with the completed animation is associated with a delegate. If so, a delegate call is transferred to notify the delegate of the animation for the view at block 1810. The delegate operating
under the control of the software application can change other views in response to the view being modified by the animation.
In certain embodiments, software invokes an animation that performs a scaling transform on the view associated with the user input. A display may include numerous views. The view being increased in size by the scaling transform may obstruct other views in which case the other views may need to be reduced in size. Alternatively, the view being decreased in size by the scaling transform may create additional area for other views to increase in size.
FIG. 19 is a flow chart of a method for animating a display region, windows, or views displayed on a display of a device. The method 1900 includes starting at least two animations at block 1902. The method 1900 further includes determining the progress of each animation at block 1904. The method 1900 further includes completing each animation based on a single timer at block 1906.
In certain embodiments of the present disclosure, the single timer includes a timer based on a redraw interval which is a time period between the display of a current frame and a next frame of the display of the device. In this case, changes in animation are updated to the display during the redraw interval in order to display the changes during the next frame of the display. The progress of each animation may be calculated periodically or based upon a progress call.
The method 1900 may further include determining whether each animation is associated with a delegate. The delegate is then notified of the animation. Other views not associated with an animation may be changed depending on the software application controlling the delegate.
FIG. 20 is a flow chart of a method for animating a display region, windows, or views displayed on a display of a device. The method 2000 includes providing a single animation timer at block 2002. The method 2000 further includes animating a plurality of animations with the single animation timer at block 2004. For example, a single timer may control all animations which occur simultaneously. The animations may include a transform, a frame, and an opacity animation. An animation transform may include a scaling or rotation transform. A frame animation may include resizing of a frame. An opacity animation changes the opacity from opaque to transparent or vice versa.
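The single-timer approach of methods 1900 and 2000 can be sketched as follows, assuming a hypothetical AnimationDriver type and a 60 Hz redraw interval; none of these names come from the actual implementation.

    import Foundation

    struct Animation {
        let duration: TimeInterval
        let apply: (Double) -> Void          // receives progress in the range 0...1
    }

    final class AnimationDriver {
        private var animations: [(start: Date, animation: Animation)] = []
        private var timer: Timer?

        func add(_ animation: Animation) {
            animations.append((Date(), animation))
        }

        // One timer, fired once per redraw interval, advances every running
        // animation (transform, frame, opacity, and so on) together.
        func start(redrawInterval: TimeInterval = 1.0 / 60.0) {
            timer = Timer.scheduledTimer(withTimeInterval: redrawInterval, repeats: true) { [weak self] _ in
                guard let self = self else { return }
                let now = Date()
                for (start, animation) in self.animations {
                    let progress = min(now.timeIntervalSince(start) / animation.duration, 1.0)
                    animation.apply(progress)
                }
                self.animations.removeAll { now.timeIntervalSince($0.start) >= $0.animation.duration }
            }
        }
    }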
FIG. 21 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with multiple software applications or processes in order to synchronize animations associated with multiple views or windows of a display of a device. The method 2100 for synchronizing the animations includes setting attributes of views independently with each view being associated with a process at block 2102. For example, an attribute or property of a view may include a position, size, opacity, etc. An animation alters one or more attributes from a first state to a second state. The method 2100 further includes transferring a synchronization call to synchronize animations for the multiple views of the display at block 2104. The synchronization call may include input parameters or arguments such as an identification of the synchronization of the processes and a list of the processes that are requesting animation of the multiple views. In one embodiment, the synchronization call includes the identification and the number of processes that are requesting animation. In one embodiment, each application or process sends a synchronization call at different times. The method 2100 further includes transferring a synchronization confirmation message when a synchronization flag is enabled at block 2106. The synchronization flag can be enabled when the processes to be synchronized have each sent messages to a window server operating the user interface software. The method 2100 further includes updating the attributes of the views from a first state to a second state independently at block 2108. In one embodiment, the window server receives the updated attributes from each process at different times. The method 2100 further includes transferring a start animation call to draw the requested animations when both processes have updated attributes associated with the second state at block 2110.
In some embodiments, a first data structure or layer tree represents a hierarchy of layers that correspond to the views or windows of the processes. A second data structure or render tree represents a similar copy of the layer tree. However, the render tree is not updated until the independent processes have completed their separate animations. At this time, the render tree updates and redraws the screen with the new animations.
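A rough sketch of the synchronization flow of method 2100 follows: the window server waits until every participating process has sent its synchronization call and has posted its second-state attributes before issuing the start animation call. The SyncGroup type and its method names are assumptions made for illustration.

    final class SyncGroup {
        private let expectedProcesses: Set<Int>
        private var requested: Set<Int> = []
        private var updated: Set<Int> = []
        private let startAnimations: () -> Void

        init(processes: Set<Int>, startAnimations: @escaping () -> Void) {
            self.expectedProcesses = processes
            self.startAnimations = startAnimations
        }

        // Each process sends its synchronization call at its own time.
        func requestSync(from pid: Int) {
            requested.insert(pid)
            if requested == expectedProcesses {
                // The synchronization flag is enabled once every process has
                // checked in; confirmation messages would be sent here.
            }
        }

        // Each process independently reports that its views reached the second state.
        func postUpdate(from pid: Int) {
            updated.insert(pid)
            if updated == expectedProcesses { startAnimations() }
        }
    }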
FIGS. 22A and 22B illustrate synchronizing the resizing of views or windows of a display of a device. For example, a window 2210 associated with a first process with a size attribute may increase in size by changing from a first state, window 2210 in FIG. 22A, to a second state window 2210 in FIG. 22B. At approximately the same time, a second window 2220 may decrease in size in proportion to the increase in size of the first window 2210. The method 2100 provides synchronization of the resizing of the windows 2210 and 2220 illustrated in FIGS. 22A and 22B. The animations in changing from the first state to the second state may occur incrementally and occur with the synchronization of method 2100.
FIG. 23 illustrates a method for switching ownership of a view of an application displayed on a display of a data processing device. The method 2300 includes constructing a data structure having a hierarchy of layers with a layer being associated with a view and owning the view at block 2302. The layers may be content, windows, video, images, text, media, or any other type of object for user interface of the application. The method 2300 further includes removing the layer from the data structure at block 2304. The method 2300 further includes switching ownership of the view from the layer to the view at block 2306.
In some embodiments, each layer from the data structure is associated with a view. The layer associated with the view sends a delegate function call to the view in order to generate content provided by the view. A first pointer reference points from the layer to the view. A second pointer reference points from the view to the layer. The number of references pointing to an object such as the view is defined as the retained count of the object. The view may receive notification that the layer will be removed from the data structure. Removing the layer from the data structure may occur based on the view associated with the layer being removed from the display of the device. When the layer is removed from the data structure or layer tree the pointer from the layer to the view will be removed. The view will have a retained count of zero and be deallocated or removed from memory if the ownership of the view is not reversed. The view will have a retained count of at least one if ownership is reversed.
FIG. 24 illustrates a method for memory management of a view of an application displayed on a display of a device. The method 2400 includes constructing a data structure having a hierarchy of layers with at least one layer being associated with the view at block 2402. The method 2400 further includes storing the data structure in memory at block 2404. The method 2400 further includes maintaining a retained count of the number of references to the view from other
objects at block 2406. The method 2400 further includes deallocating the view from memory if the retained count is zero at block 2408. As discussed above, the retained count of the view will be decremented if the layer is removed from the data structure. Removing the layer from the data structure may occur based on the view associated with the layer being removed from the display of the device.
FIGS. 25A and 25B illustrate a data structure having a hierarchy of layers with a layer being associated with a view. The data structure includes layers 2502, 2504, and 2506. Layer 2506 is associated with the view 2510. The layer 2506 associated with the view 2510 sends a delegate call to the view in order to generate content provided by the view. A first pointer reference 2508 points from the layer 2506 to the view 2510. A second pointer reference 2512 points from the view 2510 to the layer 2506. A third pointer reference 2532 may point from user interface (UI) controller 2530 to the view 2510. The UI controller 2530 may control operations associated with the view 2510 such as scrolling the view 2510 in response to a user input. The view 2510 in FIG. 25A has a retained count of two based on the pointer references 2508 and 2532.
If the layer 2506 is removed from the data structure as illustrated in FIG. 25B, then the pointer 2508 is removed. View 2510 will have a lower retained count as illustrated in FIG. 25B. If view 2510 has a retained count of zero, then the memory storing the view 2510 will be deallocated.
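The retained-count behavior of FIGS. 25A and 25B has a close analogue in automatic reference counting, sketched below with hypothetical View, Layer, and UIController classes; the view is deallocated only when no strong reference to it remains.

    final class View {
        deinit { print("view deallocated") }
    }
    final class Layer {
        var view: View?            // first pointer reference: layer to view
    }
    final class UIController {
        var view: View?            // third pointer reference: controller to view
    }

    var layer: Layer? = Layer()
    let controller = UIController()
    layer?.view = View()
    controller.view = layer?.view  // the view now has a retained count of two

    layer = nil                    // removing the layer drops one reference
    // The view survives because the controller still owns it; clearing
    // controller.view as well would bring the count to zero and the view
    // would be deallocated.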
FIG. 26 illustrates a method for compositing media and non-media content of user interface for display on a device. The method 2600 includes constructing a data structure having a hierarchy of layers associated with the user interface of the device at block 2602. The method 2600 further includes determining whether each layer of the data structure is associated with media or non-media content at block 2604. The data structure or layer tree is traversed in order to determine whether each of the layers of the data structure is associated with media or non-media content. The method 2600 further includes detaching a layer associated with media content from the data structure at block 2606. The method 2600 further includes storing media content in a first memory location at block 2606. The method 2600 further includes storing non-media content in a second memory location at block 2608. The method 2600 further includes compositing the media and non-media content for display on the device at block 2610.
In some embodiments, compositing the media and non-media content includes retrieving the media content from the first memory location, retrieving the non-media content from the second memory location, and scanning the media and non-media content directly to the display. The memory location can be any type of memory located in cache, main memory, a graphics processing unit, or other location within a device. The media content may include video, video plug-in, audio, image, or other time varying media. The media content may be in the form of a YUV model with the Y representing a luminance component (the brightness) and U and V representing chrominance (color) components. The media content may be scanned to the display at a rate of substantially twenty to forty frames per second. The media content may be scaled prior to being scanned to the display of the device.
The non-media content may include content, views, and images that do not require frequent updating. The non-media content may be in the form of a RGB model which is an additive model in which red, green, and blue (often used in additive light models) are combined in various ways to reproduce other colors. The non-media content may be scanned to the display at a slower rate compared to the media content.
FIG. 27 illustrates a data structure or layer tree having a hierarchy of layers. The layers can be associated with media and non-media content. For example, layer 2704 is associated with media content 2706 such as a video. Layer 2710 is associated with non-media content 2712 which may be a user interface view for the video. Layers 2720, 2730, and 2740 are associated with non-media content 2722, 2732, and 2742, respectively, that form the components of the non-media content 2712. The method 2600 will determine whether each layer of the data structure is associated with media or non-media content. Any layers associated with media content such as layer 2704 will be removed from the data structure and processed in a separate memory location.
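The traversal of method 2600 over a layer tree such as the one in FIG. 27 might be sketched as follows; the Layer type and its isMedia flag are assumptions made for this example.

    final class Layer {
        let name: String
        let isMedia: Bool
        var children: [Layer] = []
        init(_ name: String, isMedia: Bool = false) { self.name = name; self.isMedia = isMedia }
    }

    // Walk the layer tree, detaching media layers so they can be handled in a
    // separate memory location, and keeping non-media layers for normal compositing.
    func partition(_ layer: Layer, media: inout [Layer], nonMedia: inout [Layer]) {
        if layer.isMedia {
            media.append(layer)          // e.g., YUV video processed separately
        } else {
            nonMedia.append(layer)       // e.g., RGB user interface content
        }
        for child in layer.children {
            partition(child, media: &media, nonMedia: &nonMedia)
        }
    }

    // Mirroring FIG. 27: a video layer sits beside the non-media view layers.
    let root = Layer("root")
    root.children = [Layer("video", isMedia: true), Layer("controls")]
    var media: [Layer] = [], nonMedia: [Layer] = []
    partition(root, media: &media, nonMedia: &nonMedia)
    print(media.map { $0.name }, nonMedia.map { $0.name })   // ["video"] ["root", "controls"]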
In some embodiments, the methods, systems, and apparatuses of the present disclosure can be implemented in various devices including electronic devices, consumer devices, data processing devices, desktop computers, portable computers, wireless devices, cellular devices, tablet devices, handheld devices, multi touch devices, multi touch data processing devices, any combination of these devices, or other like devices. FIGS. 4-6 and 28-33 illustrate examples of a few of these devices.
FIG. 28 illustrates a device 2800 according to one embodiment of the disclosure. FIG. 28 shows a wireless device in a telephone configuration having a "candy-bar" style. In FIG. 28, the wireless device 2800 may include a housing 2832, a display device 2834, an input device 2836 which may be an alphanumeric keypad, a speaker 2838, a microphone 2840 and an antenna 2842. The wireless device 2800 also may include a proximity sensor 2844 and an accelerometer 2846. It will be appreciated that the embodiment of FIG. 28 may use more or fewer sensors and may have a different form factor from the form factor shown in FIG. 28.
The display device 2834 is shown positioned at an upper portion of the housing 2832, and the input device 2836 is shown positioned at a lower portion of the housing 2832. The antenna 2842 is shown extending from the housing 2832 at an upper portion of the housing 2832. The speaker 2838 is also shown at an upper portion of the housing 2832 above the display device 2834. The microphone 2840 is shown at a lower portion of the housing 2832, below the input device 2836. It will be appreciated that the speaker 2838 and microphone 2840 can be positioned at any location on the housing, but are typically positioned in accordance with a user's ear and mouth, respectively.
The display device 2834 may be, for example, a liquid crystal display (LCD) which does not include the ability to accept inputs or a touch input screen which also includes an LCD. The input device 2836 may include, for example, buttons, switches, dials, sliders, keys or keypad, navigation pad, touch pad, touch screen, and the like. Any well-known speaker, microphone and antenna can be used for speaker 2838, microphone 2840 and antenna 2842, respectively.
The data acquired from the proximity sensor 2844 and the accelerometer 2846 can be combined together, or used alone, to gather information about the user's activities. The data from the proximity sensor 2844, the accelerometer 2846 or both can be used, for example, to activate/deactivate a display backlight, initiate commands, make selections, control scrolling, gesturing, animating or other movement in a display, control input device settings, or to make other changes to one or more settings of the device. In certain embodiments of the present disclosure, the device 2800 can be used to implement at least some of the methods discussed in the present disclosure.
FIG. 29 shows a device 2950 in accordance with one embodiment of the disclosure. The device 2950 may include a housing 2952, a display/input device 2954, a speaker 2956, a microphone 2958 and an optional antenna 2960 (which may be visible on the exterior of the housing or may be concealed within the housing). The device 2950 also may include a proximity sensor 2962 and an accelerometer 2964. The device 2950 may be a cellular telephone or a device which is an integrated PDA and a cellular telephone or a device which is an integrated media player and a cellular telephone or a device which is both an entertainment system (e.g. for playing games) and a cellular telephone, or the device 2950 may be other types of devices described herein. In one particular embodiment, the device 2950 may include a cellular telephone and a media player and a PDA, all contained within the housing 2952. The device 2950 may have a form factor which is small enough that it fits within the hand of a normal adult and is light enough that it can be carried in one hand by an adult. It will be appreciated that the term "portable" means the device can be easily held in an adult user's hands (one or both); for example, a laptop computer and an iPod are portable devices.
In one embodiment, the display/input device 2954 may include a multi-point touch input screen in addition to being a display, such as an LCD. In one embodiment, the multi-point touch screen is a capacitive sensing medium configured to detect multiple touches (e.g., blobs on the display from a user's face or multiple fingers concurrently touching or nearly touching the display) or near touches (e.g., blobs on the display) that occur at the same time and at distinct locations in the plane of the touch panel and to produce distinct signals representative of the location of the touches on the plane of the touch panel for each of the multiple touches.
In certain embodiments of the present disclosure, the device 2800 can be used to implement at least some of the methods discussed in the present disclosure.
FIGS. 30A and 30B illustrate a device 3070 according to one embodiment of the disclosure. The device 3070 may be a cellular telephone which includes a hinge 3087 that couples a display housing 3089 to a keypad housing 3091. The hinge 3087 allows a user to open and close the cellular telephone so that it can be placed in at least one of two different configurations shown in FIGS. 30A and 30B. In one particular embodiment, the hinge 3087 may rotatably couple the display housing to the keypad housing. In particular, a user can open the cellular telephone to place it in the open configuration shown in FIG. 30A and can close the cellular telephone to place it in the closed configuration shown in FIG. 30B. The keypad housing 3091 may include a keypad 3095 which receives inputs (e.g. telephone number inputs or other alphanumeric inputs) from a user and a microphone 3097 which receives voice input from the user. The display housing 3089 may include, on its interior surface, a display 3093 (e.g. an LCD) and a speaker 3098 and a proximity sensor 3084; on its exterior surface, the display housing 3089 may include a speaker 3096, a temperature sensor 3094, a display 3088 (e.g. another LCD), an ambient light sensor 3092, and a proximity sensor 3084A. Hence, in this embodiment, the display housing 3089 may include a first proximity sensor on its interior surface and a second proximity sensor on its exterior surface.
In at least certain embodiments, the device 3070 may contain components which provide one or more of the functions of a wireless communication device such as a cellular telephone, a media player, an entertainment system, a PDA, or other types of devices described herein. In one implementation of an embodiment, the device 3070 may be a cellular telephone integrated with a media player which plays MP3 files, such as MP3 music files.
Each of the devices shown in FIGS. 4, 5A, 5B, 5C, 6A, 6B, 6C, 6D, 28, 29, 30A and 30B may be a wireless communication device, such as a cellular telephone, and may include a plurality of components which provide a capability for wireless communication. FIG. 31 shows an embodiment of a wireless device 3070 which includes the capability for wireless communication. The wireless device 3070 may be included in any one of the devices shown in FIGS. 4, 5A, 5B, 5C, 6A, 6B, 6C, 6D, 28, 29, 30A and 30B, although alternative embodiments of those devices of FIGS. 4, 5A, 5B, 5C, 6A, 6B, 6C, 6D, 28, 29, 30A and 30B may include more or fewer components than the wireless device 3070.
Wireless device 3070 may include an antenna system 3101. Wireless device 3070 may also include a digital and/or analog radio frequency (RF) transceiver 3102, coupled to the antenna system 3101, to transmit and/or receive voice, digital data and/or media signals through antenna system 3101.
Wireless device 3070 may also include a digital processing system 3103 to control the digital RF transceiver and to manage the voice, digital data and/or media signals. Digital processing system 3103 may be a general purpose processing device, such as a microprocessor or controller for example. Digital processing system 3103 may also be a special purpose processing device, such as an ASIC (application specific integrated circuit), FPGA (field-programmable gate array) or DSP (digital signal processor). Digital processing system 3103 may also include other devices, as are known in the art, to interface with other components of wireless device 3070. For example, digital processing system 3103 may include analog-to-digital and digital-to-analog converters to interface with other components of wireless device 3070. Digital processing system 3103 may include a media processing system 3109, which may also include a general purpose or special purpose processing device to manage media, such as files of audio data.
Wireless device 3070 may also include a storage device 3104, coupled to the digital processing system, to store data and/or operating programs for the wireless device 3070. Storage device 3104 may be, for example, any type of solid-state or magnetic memory device.
Wireless device 3070 may also include one or more input devices 3105, coupled to the digital processing system 3103, to accept user inputs (e.g., telephone numbers, names, addresses, media selections, etc.). Input device 3105 may be, for example, one or more of a keypad, a touchpad, a touch screen, a pointing device in combination with a display device or similar input device.
Wireless device 3070 may also include at least one display device 3106, coupled to the digital processing system 3103, to display information such as messages, telephone call information, contact information, pictures, movies and/or titles or other indicators of media being selected via the input device 3105. Display device 3106 may be, for example, an LCD display device. In one embodiment, display device 3106 and input device 3105 may be integrated together in the same device (e.g., a touch screen LCD such as a multi-touch input panel which is integrated with a display device, such as an LCD display device). The display device 3106 may include a backlight 3106A to illuminate the display device 3106 under certain circumstances. It will be appreciated that the wireless device 3070 may include multiple displays.
Wireless device 3070 may also include a battery 3107 to supply operating power to components of the system including digital RF transceiver 3102, digital processing system 3103, storage device 3104, input device 3105, microphone
3105A, audio transducer 3108, media processing system 3109, sensor(s) 3110, and display device 3106. Battery 3107 may be, for example, a rechargeable or non-rechargeable lithium or nickel metal hydride battery. Wireless device 3070 may also include audio transducers 3108, which may include one or more speakers, and at least one microphone 3105A. In certain embodiments of the present disclosure, the wireless device 3070 can be used to implement at least some of the methods discussed in the present disclosure.
FIG. 32 shows another example of a device according to an embodiment of the disclosure. This device 3200 may include a processor, such as microprocessor 3202, and a memory 3204, which are coupled to each other through a bus 3206. The device 3200 may optionally include a cache 3208 which is coupled to the microprocessor 3202. This device may also optionally include a display controller and display device 3210 which is coupled to the other components through the bus 3206. One or more input/output controllers 3212 are also coupled to the bus 3206 to provide an interface for input/output devices 3214 and to provide an interface for one or more sensors 3216 which are for sensing user activity. The bus 3206 may include one or more buses connected to each other through various bridges, controllers, and/or adapters as is well known in the art. The input/output devices 3214 may include a keypad or keyboard or a cursor control device such as a touch input panel. Furthermore, the input/output devices 3214 may include a network interface which is either for a wired network or a wireless network (e.g. an RF transceiver). The sensors 3216 may be any one of the sensors described herein including, for example, a proximity sensor or an ambient light sensor. In at least certain implementations of the device 3200, the microprocessor 3202 may receive data from one or more sensors 3216 and may perform the analysis of that data in the manner described herein. For example, the data may be analyzed through an artificial intelligence process or in the other ways described herein. As a result of that analysis, the microprocessor 3202 may then automatically cause an adjustment in one or more settings of the device.
In certain embodiments of the present disclosure, the device 3200 can be used to implement at least some of the methods discussed in the present disclosure.
FIGS. 33A-C show another example of a device according to at least certain embodiments of the disclosure. FIG. 33A illustrates a laptop device 3300 with a keyboard 3302, a body 3304, a display frame 3306, and a display 3308. The laptop device 3300 can be converted into a tablet device as illustrated in FIG. 33B and FIG. 33C. FIG. 33B illustrates the conversion of the laptop device into a tablet device. An edge of a display frame 3356 containing a display 3358 is slid within the body 3354 across the top of a keyboard 3352 until forming a tablet device as illustrated in FIG. 33C. The tablet device with a display 2362 and a display frame 3366 rests on top of a body 3360.
In certain embodiments of the present disclosure, the laptop device 3300 can be used to implement at least some of the methods discussed in the present disclosure.
FIG. 34 illustrates details of an application programming interface in flow chart form according to certain teachings of the present disclosure. The application programming interface operates in an environment with user interface software interacting with a software application. In some embodiments, a hierarchy of views operates on top of a hierarchy of layers within the user interface software. The API operates as illustrated in method 3400 that includes constructing a hierarchy of views operating on top of a hierarchy of layers at block 3402. The method 3400 further includes providing access to the hierarchy of views without providing access to the hierarchy of layers at block 3404. An application may interact with the hierarchy of views via the API without accessing the hierarchy of layers operating below the hierarchy of views.
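The idea of method 3400, exposing a view hierarchy while keeping the underlying layer hierarchy inaccessible, can be sketched as follows with hypothetical View and Layer types; the names and structure are assumptions made for this example.

    final class Layer {
        var sublayers: [Layer] = []
    }

    final class View {
        private let layer = Layer()            // backing layer, never exposed by the API
        private(set) var subviews: [View] = []

        func addSubview(_ view: View) {
            subviews.append(view)              // view hierarchy, visible to the application
            layer.sublayers.append(view.layer) // layer hierarchy, maintained underneath
        }
    }

    // An application builds and walks only the view hierarchy.
    let root = View()
    root.addSubview(View())
    print(root.subviews.count)   // 1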
In some embodiments, a platform provides various scrolling, gesturing, and animating operations. The platform includes hardware components and an operating system. The hardware components may include a processing unit coupled to an input panel and a memory coupled to the processor. The operating system includes one or more programs that are stored in the memory and configured to be executed by the processing unit. One or more programs include various instructions for transferring function calls or messages through an application programming interface in order to perform various scrolling, gesturing, and animating operations.
In an embodiment, the one or more programs include instructions for transferring a bounce call through an API to cause a bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of the scrolled region being visible in a display region at the end of the scroll. In an embodiment, the one or more programs include instructions for transferring a rubberband call through an API to cause a rubberband effect on a scrolled region by a predetermined maximum displacement when the scrolled region exceeds a display edge based on a scroll. In an embodiment, the one or more programs include instructions for transferring a directional scroll call through an API to set a scroll angle for locking the scrolling in at least one of a vertical or a horizontal direction.
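Purely as a hedged illustration, such calls might be surfaced to an application through an interface along the following lines, together with a toy rubberband rule; none of these names, signatures, or constants are taken from the actual API.

    protocol ScrollingAPI {
        // Bounce the scrolled region back when a region past its edge is visible
        // at the end of the scroll.
        func setBounce(maximum: Double, minimum: Double)
        // Limit how far the scrolled region may be pulled past a display edge.
        func setRubberband(maximumDisplacement: Double)
        // Lock scrolling to a vertical or horizontal direction based on a scroll angle.
        func setDirectionalScrollAngle(_ degrees: Double)
    }

    // Toy rubberband rule: displacement past the edge is damped and capped.
    func rubberbandOffset(overscroll: Double, maximumDisplacement: Double) -> Double {
        let resistance = 0.55                              // assumed damping constant
        return min(overscroll * resistance, maximumDisplacement)
    }

    print(rubberbandOffset(overscroll: 120, maximumDisplacement: 50))   // 50.0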
In an embodiment, the one or more programs include instructions for transferring a scroll hysteresis call through an API to determine whether a user input invokes a scroll. In an embodiment, the one or more programs include instructions for transferring a deceleration scroll call through an API to set a deceleration factor for a user input based on the user input invoking a scroll. In an embodiment, the one or more programs include instructions for transferring a scroll indicator call through an API to determine whether at least one scroll indicator attaches to a content edge or a display edge of a display region.
In some embodiments, the platform includes a framework containing a library of software code. The framework interacts with the programs of the platform to provide application programming interfaces for performing various scrolling, gesturing, and animating operations. The framework also includes associated resources (e.g., images, text, etc.) that are stored in a single directory.
In an embodiment, the library of the framework provides an API for specifying a bounce operation to cause a bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of the scrolled region being visible in a display region at the end of the scroll. In an embodiment, the library of the framework provides an API for specifying a rubberband operation that has a rubberband effect on a scrolled region by a predetermined maximum displacement when the scrolled region exceeds a display edge based on a scroll. In an embodiment, the library of the framework provides an API for specifying a directional scroll operation to set a scroll angle for locking the scrolling in at least one of a vertical or a horizontal direction.
In an embodiment, the library of the framework provides an API for specifying a scroll hysteresis operation to determine whether a user input invokes a scroll. In an embodiment, the library of the framework provides an API for specifying a deceleration scroll operation to set a deceleration factor for a user input based on the user input invoking a scroll. In an
embodiment, the library of the framework provides an API for specifying a scroll indicator operation to determine whether at least one scroll indicator attaches to a content edge or a display edge of a display region.
In the foregoing specification, the disclosure has been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the disclosure as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
What is claimed is:
1. A machine implemented method for scrolling on a touch-sensitive display of a device comprising:
receiving a user input, the user input is one or more input points applied to the touch-sensitive display that is integrated with the device;
creating an event object in response to the user input;
determining whether the event object invokes a scroll or gesture operation by distinguishing between a single input point applied to the touch-sensitive display that is interpreted as the scroll operation and two or more input points applied to the touch-sensitive display that are interpreted as the gesture operation;
issuing at least one scroll or gesture call based on invoking the scroll or gesture operation;
responding to at least one scroll call, if issued, by scrolling a window having a view associated with the event object based on an amount of a scroll with the scroll stopped at a predetermined position in relation to the user input; and
responding to at least one gesture call, if issued, by scaling the view associated with the event object based on receiving the two or more input points in the form of the user input.
2. The method as in claim 1, further comprising:
rubberbanding a scrolling region displayed within the window by a predetermined maximum displacement when the scrolling region exceeds a window edge based on the scroll.
3. The method as in claim 1, further comprising:
attaching scroll indicators to a content edge of the window.
4. The method as in claim 1, further comprising:
attaching scroll indicators to the window edge.
5. The method as in claim 1, wherein determining whether the event object invokes a scroll or gesture operation is based on receiving a drag user input for a certain time period.
6. The method as in claim 1, further comprising:
responding to at least one gesture call, if issued, by rotating a view associated with the event object based on receiving a plurality of input points in the form of the user input.
7. The method as in claim 1, wherein the device is one of: a data processing device, a portable device, a portable data processing device, a multi touch device, a multi touch portable device, a wireless device, and a cell phone.
8. A machine readable storage medium storing executable program instructions which when executed cause a data processing system to perform a method comprising:
receiving a user input, the user input is one or more input points applied to a touch-sensitive display that is integrated with the data processing system;
creating an event object in response to the user input;
determining whether the event object invokes a scroll or gesture operation by distinguishing between a single input point applied to the touch-sensitive display that is interpreted as the scroll operation and two or more input points applied to the touch-sensitive display that are interpreted as the gesture operation;
issuing at least one scroll or gesture call based on invoking the scroll or gesture operation;
responding to at least one scroll call, if issued, by scrolling a window having a view associated with the event object; and
responding to at least one gesture call, if issued, by scaling the view associated with the event object based on receiving the two or more input points in the form of the user input.
9. The medium as in claim 8, further comprising:
rubberbanding a scrolling region displayed within the window by a predetermined maximum displacement when the scrolled region exceeds a window edge based on the scroll.
10. The medium as in claim 8, further comprising:
attaching scroll indicators to a content edge of the view.
11. The medium as in claim 8, further comprising:
attaching scroll indicators to a window edge of the view.
12. The medium as in claim 8, wherein determining whether the event object invokes a scroll or gesture operation is based on receiving a drag user input for a certain time period.
13. The medium as in claim 8, further comprising:
responding to at least one gesture call, if issued, by rotating a view associated with the event object based on receiving a plurality of input points in the form of the user input.
14. The medium as in claim 8, wherein the data processing system is one of a data processing device, a portable device, a portable data processing device, a multi touch device, a multi touch portable device, a wireless device, and a cell phone.
15. An apparatus, comprising:
means for receiving, through a hardware device, a user input on a touch-sensitive display of the apparatus, the user input is one or more input points applied to the touch-sensitive display that is integrated with the apparatus;
means for creating an event object in response to the user input;
means for determining whether the event object invokes a scroll or gesture operation by distinguishing between a single input point applied to the touch-sensitive display that is interpreted as the scroll operation and two or more input points applied to the touch-sensitive display that are interpreted as the gesture operation;
means for issuing at least one scroll or gesture call based on invoking the scroll or gesture operation;
means for responding to at least one scroll call, if issued, by scrolling a window having a view associated with the event object; and
means for responding to at least one gesture call, if issued, by scaling the view associated with the event object based on receiving the two or more input points in the form of the user input.
16. The apparatus as in claim 15, further comprising:
means for rubberbanding a scrolling region displayed within the window by a predetermined maximum displacement when the scrolling region exceeds a window edge based on the scroll.
17. The apparatus as in claim 15, further comprising:
means for attaching scroll indicators to a content edge of the window.
18. The apparatus as in claim 15, further comprising:
means for attaching scroll indicators to the window edge.
19. The apparatus as in claim 15, wherein determining whether the event object invokes a scroll or gesture operation is based on receiving a drag user input for a certain time period.
20. The apparatus as in claim 15, further comprising:
means for responding to at least one gesture call, if issued, by rotating a view associated with the event object based on receiving a plurality of input points in the form of the user input.
21. The apparatus as in claim 15, wherein the apparatus is one of: a data processing device, a portable device, a portable data processing device, a multi touch device, a multi touch portable device, a wireless device, and a cell phone.
* * * * *
