
INDEX

CHAPTER   CONTENTS

          ABSTRACT
          LIST OF FIGURES
1         TOUCH SCREEN TECHNOLOGY
1.1       INTRODUCTION
2         HISTORY OF TOUCH SCREEN TECHNOLOGY
3         WORKING OF TOUCH SCREEN TECHNOLOGY
3.1       TOUCH SENSOR
3.2       CONTROLLER
3.3       DRIVER
4         ADVANTAGES OF TOUCH SCREEN
5         DISADVANTAGES OF TOUCH SCREEN
6         INTRODUCTION TO TOUCHLESS TOUCHSCREEN TECHNOLOGY
7         TOUCHLESS MONITOR
8         TOUCH WALL
9         WORKING OF TOUCHLESS TOUCHSCREEN
10        MINORITY REPORT INSPIRED TOUCHLESS TECHNOLOGY
10.1      TOBII REX
10.2      ELLIPTIC LABS
10.3      AIR WRITING
10.4      EYE SIGHT
10.5      MAUZ
10.6      POINT GRAB
10.7      LEAP MOTION
10.8      MICROSOFT KINECT
11        CONCLUSION
ABSTRACT

Touch screens initially created a great furore, and touchscreen
displays are now ubiquitous worldwide. Frequent touching of a touch
screen display with a pointing device such as a finger can gradually
de-sensitize the screen to input and ultimately lead to its failure.
To avoid this, a simple user interface for touchless control of
electrically operated equipment has been developed. Unlike other
systems, which depend on the distance to the sensor or on finger
motions in a certain direction, this device is based on optical
pattern recognition.

INTRODUCTION TO TOUCHSCREEN
A touchscreen is both an input and an output device, normally layered
on top of the electronic visual display of an information processing
system. A user can give
input or control the information processing system through simple or
multi-touch gestures by touching the screen with a special stylus
and/or one or more fingers. Some touchscreens use ordinary or
specially coated gloves to work while others use a special stylus/pen
only. The user can use the touchscreen to react to what is displayed
and to control how it is displayed; for example, zooming to increase
the text size. The touchscreen enables the user to interact directly
with what is displayed, rather than using a mouse, touchpad, or any
other intermediate device (other than a stylus, which is optional for
most modern touchscreens). Touchscreens are common in devices such as
game consoles, personal computers, tablet computers, electronic
voting machines, point of sale systems, and smartphones. They can also
be attached to computers or, as terminals, to networks. They also
play a prominent role in the design of digital appliances such as
personal digital assistants (PDAs) and some e-readers. The popularity
of smartphones, tablets, and many types of information appliances is
driving the demand and acceptance of common touchscreens for portable
and functional electronics. Touchscreens are found in the medical
field and in heavy industry, as well as for automated teller machines
(ATMs), and kiosks such as museum displays or room automation, where
keyboard and mouse systems do not allow a suitably intuitive, rapid,
or accurate interaction by the user with the display's content.
Historically, the touchscreen sensor and its accompanying controller-
based firmware have been made available by a wide array of after-
market system integrators, and not by display, chip, or
motherboard manufacturers. Display manufacturers and chip
manufacturers worldwide have acknowledged the trend toward
acceptance of touchscreens as a highly desirable user interface
component and have begun to integrate touchscreens into the
fundamental design of their products.

HISTORY OF TOUCHSCREEN

E.A. Johnson of the Royal Radar Establishment, Malvern, described his
work on capacitive touchscreens in a short article published in 1965
and then more fully, with photographs and diagrams, in an article
published in 1967. The applicability of touch technology for air
traffic control was described in an article published in 1968. Frank
Beck and Bent Stumpe, engineers from CERN, developed a transparent
touchscreen in the early 1970s, based on Stumpe's work at a
television factory in the early 1960s. Then manufactured by CERN, it
was put to use in 1973. A resistive touchscreen was developed by
American inventor George Samuel Hurst, who received US patent
#3,911,215 on October 7, 1975. The first version was produced in 1982.
In 1972, a group at the University of Illinois filed for a patent on
an optical touchscreen that became a standard part of the Magnavox
Plato IV Student Terminal. Thousands were built for the PLATO IV
system. These touchscreens had a crossed array of 16 by 16 infrared
position sensors, each composed of an LED on one edge of the screen
and a matched phototransistor on the other edge, all mounted in front
of a monochrome plasma display panel. This arrangement can sense
any fingertip-sized opaque object in close proximity to the screen. A
similar touchscreen was used on the HP-150 starting in 1983; this was
one of the world's earliest commercial touchscreen computers. HP
mounted their infrared transmitters and receivers around the
bezel of a 9" Sony Cathode Ray Tube (CRT). In 1984, Fujitsu released a
touch pad for the Micro 16, to deal with the complexity of kanji
characters, which were stored as tiled graphics. In 1985, Sega
released the Terebi Oekaki, also known as the Sega Graphic Board, for
the SG-1000 video game console and SC-3000 home computer. It
consisted of a plastic pen and a plastic board with a transparent
window where the pen presses are detected. It was used primarily for a
drawing software application. A graphic touch tablet was released
for the Sega AI Computer in 1986. Touch-sensitive Control-Display
Units (CDUs) were evaluated for commercial aircraft flight decks in
the early 1980s. Initial research showed that a touch interface
would reduce pilot workload as the crew could then select waypoints,
functions and actions, rather than be "head down" typing in latitudes,
longitudes, and waypoint codes on a keyboard. An effective integration
of this technology was aimed at helping flight crews maintain
a high level of situational awareness of all major aspects of the
vehicle operations, including its flight path, the functioning of
various aircraft systems, and moment-to-moment human interactions.
In the early 1980s, General Motors tasked its Delco Electronics
division with a project aimed at replacing an automobile's non-
essential functions (i.e. other than throttle, transmission,
braking and steering) from mechanical or electro-mechanical systems
with solid state alternatives wherever possible. The finished device
was dubbed the ECC for "Electronic Control Centre", a digital computer
and software control system hardwired to various peripheral
sensors, servos, solenoids, antenna and a monochrome CRT
touchscreen that functioned both as display and sole method of input.
[19] The ECC replaced the traditional mechanical stereo, fan, heater
and air conditioner controls and displays, and was capable of
providing very detailed and specific information about the vehicle's
cumulative and current operating status in real time. The ECC was
standard equipment on the 1985–89 Buick Riviera and later the 1988–
89 Buick Reatta, but was unpopular with consumers, partly due to the
technophobia of some traditional Buick customers, but mostly because
of costly-to-repair technical problems suffered by the ECC's
touchscreen which, being the sole access method, would render
climate control or stereo operation impossible. Multi-touch technology
began in 1982, when the University of Toronto's Input Research Group
developed the first human-input multi-touch system, using a frosted-
glass panel with a camera placed behind the glass. In 1985, the
University of Toronto group including Bill Buxton developed a multi-
touch tablet that used capacitance rather than bulky camera-based
optical sensing systems (see History of multi-touch). In 1986,
the first graphical point of sale software was demonstrated on the 16-
bit Atari 520ST colour computer. It featured a colour touchscreen
widget-driven interface.[21] The View Touch point of sale software was
first shown by its developer, Gene Mosher, at Fall Comdex, 1986, in
Las Vegas, Nevada to visitors at the Atari Computer demonstration area
and was the first commercially available POS system with a widget-
driven colour graphic touchscreen interface. In 1987, Casio launched
the Casio PB-1000 pocket computer with a touchscreen consisting of a
4x4 matrix, resulting in 16 touch areas in its small LCD graphic
screen. Until 1988 touchscreens had the bad reputation of being
imprecise. Most user interface books would state that touchscreen
selections were limited to targets larger than the average finger. At
the time, selections were done in such a way that a target was
selected as soon as the finger came over it, and the
corresponding action was performed immediately. Errors were
common, due to parallax or calibration problems, leading to
frustration. A new strategy called "lift-off strategy" was
introduced by researchers at the University of Maryland
Human–Computer Interaction Lab and is still used today. As users touch the
screen, feedback is provided as to what will be selected, users
can adjust the position of the finger, and the action takes place only
when the finger is lifted off the screen. This allowed the
selection of small targets, down to a single pixel on a VGA screen
(the best standard of the time). Sears et al. (1990) gave a review of
academic research on single and multi-touch human–computer
interaction of the time, describing gestures such as rotating
knobs, adjusting sliders, and swiping the screen to activate
a switch (or a U-shaped gesture for a toggle switch). The University
of Maryland Human–Computer Interaction Lab team developed and
studied small touchscreen keyboards (including a study that
showed that users could type at 25 wpm for a touchscreen keyboard
compared with 58 wpm for a standard keyboard), thereby paving the way
for the touchscreen keyboards on mobile devices. They also designed
and implemented multitouch gestures such as selecting a range of a
line, connecting objects, and a "tap-click" gesture to select
while maintaining location with another finger.


WORKING OF TOUCHSCREEN

A resistive touchscreen panel comprises several layers, the most
important of which are two thin, transparent, electrically resistive
layers separated by a thin space. These layers face each other
with a thin gap between. The top screen (the screen that is touched)
has a coating on the underside surface of the screen. Just beneath it
is a similar resistive layer on top of its substrate. One layer
has conductive connections along its sides, the other along top
and bottom. A voltage is applied to one layer, and sensed by the
other. When an object, such as a fingertip or stylus tip, presses
down onto the outer surface, the two layers touch to become
connected at that point: The panel then behaves as a pair of voltage
dividers, one axis at a time. By rapidly switching between each layer,
the position of a pressure on the screen can be read. A capacitive
touchscreen panel consists of an insulator such as glass, coated with
a transparent conductor such as indium tin oxide (ITO).[32] As the
human body is also an electrical conductor, touching the surface of
the screen results in a distortion of the screen's electrostatic
field, measurable as a change in capacitance. Different
technologies may be used to determine the location of the touch.
The location is then sent to the controller for processing.
Unlike a resistive touchscreen, one cannot use a capacitive
touchscreen through most types of electrically insulating material,
such as gloves. This disadvantage especially affects usability in
consumer electronics, such as touch tablet PCs and capacitive
smartphones in cold weather. It can be overcome with a special
capacitive stylus, or a special-application glove with an embroidered
patch of conductive thread passing through it and contacting the
user's fingertip.
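
To illustrate the voltage-divider principle described above, here is a
minimal sketch in Python of how a 4-wire resistive panel could be read.
The drive_layer() and adc_read() callbacks are hypothetical stand-ins
for board-specific hardware access, not a real library API, and the
resolutions are assumptions.

# Minimal sketch of 4-wire resistive touch sensing (hypothetical helpers).
# The panel behaves as two voltage dividers, one per axis, read alternately.

SCREEN_W, SCREEN_H = 640, 480   # target display resolution in pixels (assumed)
ADC_MAX = 1023                  # 10-bit ADC full-scale reading (assumed)

def read_touch(drive_layer, adc_read):
    """Return (x, y) in pixels, or None if the screen is not being touched.

    drive_layer(axis): applies the drive voltage across the 'x' or 'y' layer.
    adc_read(axis):    samples the opposite layer and returns 0..ADC_MAX.
    Both are hypothetical board-specific callbacks.
    """
    # Drive the X layer and sense the voltage on the Y layer: the contact
    # point divides the X layer's resistance, so the reading encodes X.
    drive_layer('x')
    raw_x = adc_read('y')

    # Swap roles to read the Y coordinate the same way.
    drive_layer('y')
    raw_y = adc_read('x')

    # With no touch the two layers are not connected, so the sense line is
    # assumed here to read near zero; treat that as "no touch".
    if raw_x < 5 or raw_y < 5:
        return None

    # Scale the divider ratios to screen coordinates.
    x = raw_x * SCREEN_W // ADC_MAX
    y = raw_y * SCREEN_H // ADC_MAX
    return (x, y)

By rapidly alternating which layer is driven and which is sensed, the
controller reads one axis at a time, exactly as the paragraph above
describes.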

TOUCH SENSOR

A touch screen sensor is a clear glass panel with a touch-responsive
surface. The sensor generally has an electrical current or signal
going through it and touching the screen causes a voltage or signal
change.

Figure: 3.1- Touch Sensor


CONTROLLER

The controller is a small PC card that connects between the touch
sensor and the PC. The controller determines what type of
interface/connection you will need on the PC.

Figure: 3.2- Diagrammatic working of touchscreen
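
As a rough sketch of the controller's role described above, the loop
below samples the sensor and forwards coordinates to the PC for the
driver to interpret. The callback names and the report layout are
illustrative assumptions, not any real controller's protocol.

# Illustrative sketch of a touch controller loop (assumed helpers):
# sample the sensor, convert the reading to coordinates, send them to the PC.

import struct
import time

def controller_loop(read_touch, send_to_pc, poll_hz=100):
    """Poll the touch sensor and forward touch events to the host PC.

    read_touch():      returns (x, y) in pixels or None (see earlier sketch).
    send_to_pc(data):  writes a report over the controller's link (USB/serial).
    Both callbacks are hypothetical stand-ins for board-specific code.
    """
    touching = False
    while True:
        pos = read_touch()
        if pos is not None:
            x, y = pos
            # Pack a simple report: 1 = "touch down/move", then 16-bit x and y.
            send_to_pc(struct.pack('<BHH', 1, x, y))
            touching = True
        elif touching:
            # Finger lifted: report a single "touch up" event.
            send_to_pc(struct.pack('<BHH', 0, 0, 0))
            touching = False
        time.sleep(1.0 / poll_hz)

On the PC side, the driver translates these reports into the pointer
events the operating system expects, which is the interface choice the
controller determines.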

ADVANTAGES OF TOUCHSCREEN

Direct pointing to objects.

Fast.

Finger or pen is usable (no cable required).

No keyboard necessary.

Suited to novices and to applications such as information retrieval.


DISADVANTAGES OF TOUCHSCREEN

Low precision when using a finger.

The user has to sit or stand closer to the screen.

More of the screen may be covered by the hand.

No direct activation of the selected function.



INTRODUCTION TO TOUCHLESS TOUCHSCREEN

Touchless control of electrically operated equipment is being
developed by Elliptic Labs. This system depends on hand or finger
motions, such as a hand wave in a certain direction. The sensor can be
placed either on the screen or near the screen.
Figure: 6.1- Touchless Touchscreen


TOUCHLESS MONITOR

This monitor is made by TouchKo. With a touchless touch screen, your
hand does not have to come in contact with the screen at all; it
works by detecting your hand movements in front of it.

Figure: 7.2- Doctor using Touchless Touchscreen


TOUCHWALL

Touch Wall is a multi-touch product from Microsoft. It refers to the
touch screen hardware setup itself, while the accompanying software is
called Plex.
Figure: 8.1- Touch Wall

Touch Wall consists of three infrared lasers that scan a surface. By
using a projector, entire walls can easily be turned into a multi-touch
user interface.

Figure: 8.2- Kinect used for Touch Wall


WORKING OF TOUCHLESS TOUCHSCREEN

The system is capable of detecting movements in three dimensions
without ever having to put your fingers on the screen. Sensors are
mounted around the screen that is being used; by interacting in the
line of sight of these sensors, the motion is detected and interpreted
into on-screen movements. The device is based on optical pattern
recognition, so gestures can be detected from a distance (a few
feet away), and you can manipulate objects in 3D.

Figure: 9.3- 3D Object
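
To make the line-of-sight idea above concrete, here is a small sketch
of one simplified reading of it: a grid of infrared beams around the
screen edges, where the interrupted columns and rows give the hand's
on-screen position. The sensor layout and the beam_blocked() helper are
assumptions for illustration, not the actual device's design.

# Illustrative sketch of touchless position sensing with crossed infrared
# beams around the screen edges (beam_blocked() is an assumed hardware call).

def locate_hand(beam_blocked, cols=16, rows=16):
    """Estimate the (column, row) cell a hand is hovering over, or None.

    beam_blocked(axis, index): True if the infrared beam for that column
    ('x') or row ('y') is interrupted by an object in its line of sight.
    """
    blocked_cols = [c for c in range(cols) if beam_blocked('x', c)]
    blocked_rows = [r for r in range(rows) if beam_blocked('y', r)]
    if not blocked_cols or not blocked_rows:
        return None  # nothing is in front of the screen

    # Use the centre of the interrupted beams as the hand position.
    col = sum(blocked_cols) // len(blocked_cols)
    row = sum(blocked_rows) // len(blocked_rows)
    return (col, row)

Tracking how this position changes from frame to frame is what lets the
system interpret hand motion as on-screen movements.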


GBUI (GESTURE-BASED GRAPHICAL USER INTERFACE)

A gesture is a movement of part of the body, especially a hand or the
head, used to express an idea or meaning; a gesture-based graphical
user interface (GBUI) uses such movements as input.
We have seen the futuristic user interfaces of movies like Minority
Report and the Matrix Revolutions where people wave their hand in 3
dimensions and the computer understands what the user wants and
shifts and sorts data with precision.


TOUCHLESS UI

The basic idea described in the patent is that there would be sensors
arrayed around the perimeter of the device capable of sensing finger
movements in 3-D space.


MINORITY REPORT INSPIRED TOUCHLESS TECHNOLOGY

There are eight types of Minority Report inspired touchless
technology. These are as follows:

Figure: 12.2.1- Gesture Suit


Airwriting
Airwriting is a technology that allows you to write text messages or
compose emails by writing in the air.

Figure: 12.3.1- Airwriting

Eyesight

EyeSight is a gesture technology which allows you to navigate through
your devices by just pointing at them.
Figure: 12.4.1- Gesture of hand moving


MAUZ

Mauz is a third party device that turns your iPhone into a trackpad or
mouse.

Figure: 12.5.1-MAUZ Device

POINT GRAB

Point Grab is similar to EyeSight, in that it enables users to
navigate on their computer just by pointing at it.
Figure: 12.6.1-Use of Point Grab


LEAP MOTION

Leap Motion is a motion sensor device that recognizes the user's
fingers with its infrared LEDs and cameras.

Figure: 12.8.1- Third party device for Touch Wall


CONCLUSION

Touchless technology is still developing and has many future aspects.

The touchless touch screen user interface can be used effectively in
computers, cell phones, webcams and laptops.

A few years down the line, our body may be transformed into a virtual
mouse and virtual keyboard, turning the body itself into an input
device.

REFERENCES
www.google.com
www.wikipedia.org
www.studymafia.org

