SEMINAR PRESENTATION
ON
AUGMENTED REALITY VS VIRTUAL REALITY
PRESENTED BY:
UDOFA, GOODLUCK ENOBONG
Reg. No: CG/2023/HD/CS/025
Level: HND2
SUBMITTED TO:
MR. VICTOR OKE (PhD)
H.O.D/ Course Lecturer
MAY, 2025
ABSTRACT
Augmented Reality (AR) and Virtual Reality (VR) are transformative technologies
that have significantly reshaped how we interact with digital content and perceive
the world around us. This seminar explores the applications, benefits, and
implications of AR and VR across various sectors, with a special focus on
education, healthcare, entertainment, and industrial training. AR overlays digital
information onto the real-world environment, enhancing user perception without
replacing reality, while VR immerses users in a fully virtual environment,
simulating real-life experiences. The convergence of these technologies is fostering
new dimensions of experiential learning, remote collaboration, and immersive
simulation.
This paper examines the technical foundations of AR and VR, the hardware and
software ecosystems, as well as current trends driving adoption globally. It also
addresses the challenges such as high development costs, limited accessibility in
developing regions, and user health concerns related to prolonged use. By
analyzing real-world case studies and recent innovations, this seminar aims to
highlight the potential of AR and VR to bridge knowledge gaps, support mental
health therapy, streamline design and engineering workflows, and create engaging
user experiences. Ultimately, AR and VR are not just entertainment tools, but key
drivers in the digital transformation of modern society.
TABLE OF CONTENTS
Cover page
Abstract
Introduction
History
AR glasses
Conclusion
References
INTRODUCTION
Augmented reality (AR), also known as mixed reality (MR), is a technology that
overlays real-time 3D-rendered computer graphics onto a portion of the real world
through a display, such as a handheld device or head-mounted display. This
experience is seamlessly interwoven with the physical world such that it is
perceived as an immersive aspect of the real environment. In this way, augmented
reality alters one's ongoing perception of a real-world environment, compared
to virtual reality, which aims to completely replace the user's real-world
environment with a simulated one. Augmented reality is typically visual, but can
span multiple sensory modalities, including auditory, haptic, and somatosensory.
Augmented reality can be used to enhance natural environments or situations and
offers perceptually enriched experiences. With the help of advanced AR
technologies (e.g. adding computer vision, incorporating AR cameras into
smartphone applications, and object recognition) the information about the
surrounding real world of the user becomes interactive and digitally
manipulated. Information about the environment and its objects is overlaid on the
real world. This information can be virtual or real, e.g. seeing other real sensed or
measured information such as electromagnetic radio waves overlaid in exact
alignment with where they actually are in space. Augmented reality also has a lot
of potential in the gathering and sharing of tacit knowledge. Immersive perceptual
information is sometimes combined with supplemental information like scores
over a live video feed of a sporting event. This combines the benefits of both
augmented reality technology and heads up display technology (HUD).
Augmented reality frameworks include ARKit and ARCore. Commercial
augmented reality headsets include the Magic Leap 1 and HoloLens. A number of
companies have promoted the concept of smartglasses that have augmented reality
capability.
Augmented reality can be defined as a system that incorporates three basic
features: a combination of real and virtual worlds, real-time interaction, and
accurate 3D registration of virtual and real objects. The overlaid sensory
information can be constructive (i.e. additive to the natural environment), or
destructive (i.e. masking of the natural environment). As such, it is one of the key
technologies in the reality-virtuality continuum. Augmented reality is largely
synonymous with mixed reality. There is also overlap in terminology
with extended reality and computer-mediated reality.
HISTORY
Virtual Fixtures – first AR system, U.S. Air Force, Wright-Patterson Air Force
Base (1992)
1992: Louis Rosenberg developed one of the first functioning AR systems,
called Virtual Fixtures, at the U.S. Air Force's Armstrong Laboratory, and
demonstrated its benefits to human perception.
1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present an early
paper on an AR system prototype, KARMA, at the Graphics Interface
conference.
1993: Mike Abernathy, et al., report the first use of augmented reality in
identifying space debris using Rockwell WorldView by overlaying satellite
geographic trajectories on live telescope video.
1993: A widely cited version of the paper above is published
in Communications of the ACM – Special issue on computer augmented
environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold.
1993: Loral WDL, with sponsorship from STRICOM, performed the first
demonstration combining live AR-equipped vehicles and manned
simulators. Unpublished paper, J. Barrilleaux, "Experiences and
Observations in Applying Augmented Reality to Live Training", 1999.
1995: S. Ravela et al. at University of Massachusetts introduce a vision-
based system using monocular cameras to track objects (engine blocks)
across views for augmented reality.
1996: General Electric develops a system for projecting information from 3D
CAD models onto real-world instances of those models.
1998: Spatial augmented reality introduced at University of North
Carolina at Chapel Hill by Ramesh Raskar, Greg Welch, Henry Fuchs.
1999: Frank Delgado, Mike Abernathy et al. report successful flight test of
LandForm software video map overlay from a helicopter at Army Yuma
Proving Ground overlaying video with runways, taxiways, roads and road
names.
1999: The US Naval Research Laboratory embarks on a decade-long research
program called the Battlefield Augmented Reality System (BARS) to prototype
some of the early wearable systems for dismounted soldiers operating in urban
environments, supporting situation awareness and training.
1999: NASA X-38 flown using LandForm software video map overlays
at Dryden Flight Research Center.
2000: Rockwell International Science Center demonstrates tetherless
wearable augmented reality systems receiving analog video and 3D audio
over radio-frequency wireless channels. The systems incorporate outdoor
navigation capabilities, with digital horizon silhouettes from a terrain
database overlain in real time on the live outdoor scene, allowing
visualization of terrain made invisible by clouds and fog.
2004: An outdoor helmet-mounted AR system was demonstrated by Trimble
Navigation and the Human Interface Technology Laboratory (HIT lab).
2006: Outland Research develops an AR media player that overlays virtual
content onto a user's view of the real world synchronously with playing
music, thereby providing an immersive AR entertainment experience.
2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the G1
Android phone.
2009: ARToolkit was ported to Adobe Flash (FLARToolkit) by Saqoosha,
bringing augmented reality to the web browser.
2012: Launch of Lyteshot, an interactive AR gaming platform that utilizes
smart glasses for game data.
2013: Niantic releases "Ingress", an augmented reality mobile game
for iOS and Android operating systems (and a predecessor of Pokémon Go).
2015: Microsoft announced the HoloLens augmented reality headset, which
uses various sensors and a processing unit to display virtual imagery over
the real world.
2016: Niantic released Pokémon Go for iOS and Android in July 2016. The
game quickly became one of the most popular smartphone applications and in
turn spiked the popularity of augmented reality games.
2018: Magic Leap launched the Magic Leap One augmented reality
headset. Leap Motion announced the Project North Star augmented reality
headset, and later released it under an open source license.
2019: Microsoft announced HoloLens 2 with significant improvements in
terms of field of view and ergonomics.
2022: Magic Leap launched the Magic Leap 2 headset.
2024: Meta Platforms revealed the Orion AR glasses prototype.
HARDWARE
Augmented reality requires a head-mounted display or a handheld device, which
includes a processor, display, sensors, and one or more input devices.
Modern mobile computing devices like smartphones and tablet computers contain
these elements, which often include a camera and microelectromechanical systems
(MEMS) sensors such as an accelerometer, GPS, and solid state compass.
Various technologies can be used to display augmented reality, including optical
see-through head mounted displays, monitors, and handheld devices. Two of the
display technologies used in augmented reality are diffractive waveguides and
reflective waveguides.
Head-mounted displays
A head-mounted display (HMD) is a display device worn on the head, often as
part of a harness or helmet. HMDs place images of virtual objects over the user's
field of view. Augmented reality HMDs are either optical see-through or video
passthrough. Modern HMDs often employ sensors for six degrees of
freedom monitoring that allow the system to align virtual information to the
physical world and adjust accordingly with the user's head movements. When
using AR technology, the HMDs only require relatively small displays. In this
situation, liquid crystal on silicon (LCOS) and micro-OLED (organic light-emitting
diodes) are commonly used. HMDs can provide VR users with mobile and
collaborative experiences. Specific providers, such as uSens and Gestigon,
include gesture controls for full virtual immersion.
AR headsets typically have a field of view of about 30 to 50 degrees per eye.
Near-eye augmented reality devices can be used as portable head-up displays as
they can show data, information, and images while the user views the real world.
This is basically what a head-up display does; however, practically speaking,
augmented reality is expected to include registration and tracking between the
superimposed perceptions, sensations, information, data, and images and some
portion of the real world.
AR glasses
AR displays can be rendered on devices resembling eyeglasses. Versions include
eyewear that employs cameras to intercept the real world view and re-display its
augmented view through the eyepieces and devices in which the AR imagery is
projected through or reflected off the surfaces of the eyewear lens pieces.
The EyeTap (also known as Generation-2 Glass) captures rays of light that would
otherwise pass through the center of the lens of the wearer's eye, and substitutes
synthetic computer-controlled light for each ray of real light.
An example of an AR glasses product is the Snap Spectacles.
Handheld
An augmented reality app on a smartphone using GPS and a solid state compass
A handheld display employs a small display that fits in a user's hand. All handheld
AR solutions to date opt for video passthrough. Initially handheld AR
employed fiducial markers, and later GPS units and MEMS sensors such as digital
compasses and six degrees of freedom accelerometer–gyroscope.
Today simultaneous localization and mapping (SLAM) markerless trackers such as
PTAM (parallel tracking and mapping) are starting to come into use. Handheld
display AR promises to be the first commercial success for AR technologies. The
two main advantages of handheld AR are the portable nature of handheld devices
and the ubiquitous nature of camera phones. The disadvantages are the physical
constraints of the user having to hold the handheld device out in front of them at all
times, as well as the distorting effect of classically wide-angled mobile phone
cameras when compared to the real world as viewed through the eye.
Contact lenses
Contact lenses that display AR imaging are in development. These bionic contact
lenses might contain the elements for display embedded into the lens including
integrated circuitry, LEDs and an antenna for wireless communication.
The first contact lens display was patented in 1999 by Steve Mann and was
intended to work in combination with AR spectacles; the project was abandoned
and then revisited roughly 11 years later, in 2010–2011. Another version of
contact lenses, in
development for the U.S. military, is designed to function with AR spectacles,
allowing soldiers to focus on close-to-the-eye AR images on the spectacles and
distant real world objects at the same time.
Projection mapping
Projection mapping augments real-world objects and scenes without the use of
special displays such as monitors, head-mounted displays or hand-held devices.
Projection mapping makes use of digital projectors to display graphical
information onto physical objects. The key difference in projection mapping is that
the display is separated from the users of the system. Since the displays are not
associated with each user, projection mapping scales naturally up to groups of
users, allowing for collocated collaboration between users.
Examples include shader lamps, mobile projectors, virtual tables, and smart
projectors. Shader lamps mimic and augment reality by projecting imagery onto
neutral objects. This provides the opportunity to enhance the object's appearance
with materials of a simple unit—a projector, camera, and sensor.
Other applications include table and wall projections. Virtual showcases, which
employ beam splitter mirrors together with multiple graphics displays, provide an
interactive means of simultaneously engaging with the virtual and the real.
A projection mapping system can display on any number of surfaces in an indoor
setting at once. Projection mapping supports both a graphical visualization and
passive haptic sensation for the end users. Users are able to touch physical objects
in a process that provides passive haptic sensation.
3D tracking
3D tracking is an integral part of augmented reality, as it allows a headset
and controllers to be tracked in the user's environment. Tracking is often
camera-based, using cameras mounted on the device.
Mobile augmented-reality systems use one or more of the following motion
tracking technologies: digital cameras and/or other optical sensors, accelerometers,
GPS, gyroscopes, solid state compasses, radio-frequency identification (RFID).
These technologies offer varying levels of accuracy and precision. These
technologies are implemented in the ARKit API by Apple and ARCore API
by Google to allow tracking for their respective mobile device platforms.
CMOS camera sensors are widely used for camera-based tracking in AR
technology.
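To make the idea of fusing these motion-tracking sensors concrete, the sketch below shows a complementary filter, one common way to combine a drifting gyroscope with a noisy but absolute accelerometer to estimate device tilt. It is a simplified, hypothetical illustration, not the method used by any particular AR platform; the function name and parameter values are invented for this example.

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse one gyro/accelerometer sample into the current tilt estimate.

    angle     -- previous tilt estimate (radians)
    gyro_rate -- angular velocity from the gyroscope (rad/s)
    accel_x/z -- accelerometer components used to infer tilt from gravity
    dt        -- time step (seconds)
    alpha     -- weight given to the gyro; the accelerometer corrects drift
    """
    gyro_angle = angle + gyro_rate * dt          # integrate gyro (drifts)
    accel_angle = math.atan2(accel_x, accel_z)   # absolute but noisy
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# A level, stationary device whose gyro reports a small spurious rotation:
# the accelerometer term keeps the estimate from drifting away.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.01,
                                 accel_x=0.0, accel_z=9.81, dt=0.01)
print(round(angle, 4))  # stays close to zero despite the gyro bias
```

In practice the same weighted blend runs per axis at the sensor sample rate, which is why inertial tracking stays responsive while camera or accelerometer data corrects long-term drift.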
Camera-based tracking
Comparison of fiducial markers used for 3D tracking in augmented reality
Augmented reality systems must realistically integrate virtual imagery into the real
world. The software must derive real-world coordinates, independent of the
camera, from camera images. That process is called image registration, and uses
different
methods of computer vision, mostly related to video tracking. Many computer
vision methods of augmented reality are inherited from visual odometry.
Usually those methods consist of two parts. The first stage is to detect interest
points, fiducial markers or optical flow in the camera images. This step can
use feature detection methods like corner detection, blob detection, edge
detection or thresholding, and other image processing methods. The second stage
restores a real world coordinate system from the data obtained in the first stage.
Some methods assume objects with known geometry (or fiducial markers) are
present in the scene. In some of those cases the scene 3D structure should be
calculated beforehand. If part of the scene is unknown, simultaneous localization
and mapping (SLAM) can map relative positions. If no information about scene
geometry is available, structure from motion methods like bundle adjustment are
used. Mathematical methods used in the second stage include: projective
(epipolar) geometry, Kalman and particle filters, nonlinear optimization, and
robust statistics.
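The two-stage process above can be illustrated with a toy version of the second stage: recovering a coordinate transform from features found in the first stage. Real systems solve a full 3D camera pose (for example with perspective-n-point methods); this sketch instead fits a 2D similarity transform (rotation, scale, translation) from two marker-corner correspondences, representing 2D points as complex numbers. The function name and coordinates are illustrative only.

```python
def fit_similarity(world, image):
    """Solve z -> a*z + b mapping two world points onto two image points.

    2D points are complex numbers; `a` encodes rotation and scale,
    `b` the translation.
    """
    (w1, w2), (p1, p2) = world, image
    a = (p2 - p1) / (w2 - w1)
    b = p1 - a * w1
    return a, b

# Two corners of a fiducial marker: known world coordinates (metres)
# and the pixel positions where a detector found them in the image.
world = (0 + 0j, 1 + 0j)
image = (100 + 200j, 100 + 300j)   # marker appears rotated 90 degrees
a, b = fit_similarity(world, image)

# Predict where a third world point should land in the image.
pred = a * (0.5 + 0.5j) + b
print(pred)  # → (50+250j)
```

Once such a transform (or, in 3D, a full camera pose) is known, virtual imagery can be drawn at the predicted pixel positions so it appears registered to the real scene.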
Input devices
Techniques include gesture recognition systems that interpret a user's body
movements by visual detection or from sensors embedded in a peripheral device
such as a wand, stylus, pointer, glove or other body wear. Products which are
trying to serve as a controller of AR headsets include Wave by Seebright Inc. and
Nimble by Intugine Technologies.
Processing
Computers are responsible for graphics and for processing 3D tracking data in
augmented reality. For camera-based 3D tracking methods, a computer analyzes
the sensed visual and other data to synthesize and position virtual objects. As
computing hardware improves, augmented reality will increasingly change how
people perceive the real world.
The computer is the core of an augmented reality system. It receives data from
sensors that determine the relative position of an object's surface, generates
images or video from the scanned environment, and outputs them to the display
for the observer to see. Fixed marks on an object's surface are stored in the
computer's memory, which the system draws on to present images realistically
to the onlooker.
Software
Augmented Reality Markup Language (ARML) is a data standard developed
within the Open Geospatial Consortium (OGC), which consists of Extensible
Markup Language (XML) grammar to describe the location and appearance of
virtual objects in the scene, as well as ECMAScript bindings to allow dynamic
access to properties of virtual objects.
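As a rough sketch of this grammar, an ARML document anchoring a single named feature to a geographic point might look like the following. The structure follows the published ARML 2.0 examples, but it is abbreviated and not schema-validated, so treat element details as approximate.

```xml
<arml xmlns="http://www.opengis.net/arml/2.0"
      xmlns:gml="http://www.opengis.net/gml/3.2">
  <ARElements>
    <!-- A Feature represents a real-world object of interest -->
    <Feature id="sample-poi">
      <name>Sample point of interest</name>
      <anchors>
        <!-- A Geometry anchor ties the feature to a location -->
        <Geometry>
          <gml:Point gml:id="poi-location">
            <gml:pos>47.0707 15.4395</gml:pos>
          </gml:Point>
        </Geometry>
      </anchors>
    </Feature>
  </ARElements>
</arml>
```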
To enable rapid development of augmented reality applications, software
development applications have emerged, including Lens Studio from Snapchat and
Spark AR from Facebook. Augmented reality Software Development Kits (SDKs)
have been launched by Apple and Google: Apple's ARKit and Google's ARCore.
A 2017 Time article predicted that within about 15 to 20 years, augmented
reality and virtual reality would become the primary method of computer
interaction.
Rendering
Software that renders onto displays can create a realistic view by using occlusion,
which hides parts of virtual objects behind parts of the real world. Having accurate
occlusion creates a much more realistic view of virtual objects integrated into the
real world.
Design
AR systems rely heavily on the immersion of the user. The following lists some
considerations for designing augmented reality applications:
Environmental/context design
Context design focuses on the end-user's physical surroundings, available
space, and accessibility considerations that may play a role when using the AR
system. Designers should be aware of the possible physical scenarios the
end-user may be in, such as:
Public, in which the users use their whole body to interact with the software
Personal, in which the user uses a smartphone in a public space
Intimate, in which the user is sitting with a desktop and is not really moving
Private, in which the user is wearing a device.
By evaluating each physical scenario, potential safety hazards can be avoided
and changes can be made to further improve the end-user's immersion. UX
designers will have to define user journeys for the relevant physical scenarios and
define how the interface reacts to each.
Another aspect of context design involves the design of the system's functionality
and its ability to accommodate user preferences. While accessibility tools are
common in basic application design, some consideration should be made when
designing time-limited prompts (to prevent unintentional operations), audio cues
and overall engagement time. In some situations, the application's
functionality may hinder the user's abilities. For example, applications used
for driving should reduce the amount of user interaction and use audio cues
instead.
Visual design
In some augmented reality applications that use a 2D device as an interactive
surface, the 2D control environment does not translate well in 3D space, which can
make users hesitant to explore their surroundings. To solve this issue, designers
should apply visual cues to assist and encourage users to explore their
surroundings.
Architecture
AR can aid in visualizing building projects. Computer-generated images of a
structure can be superimposed onto a real-life local view of a property before the
physical building is constructed there; this was demonstrated publicly by Trimble
Navigation in 2004. AR can also be employed within an architect's workspace,
rendering animated 3D visualizations of their 2D drawings. Architectural
sightseeing can be enhanced with AR applications, allowing users viewing a
building's exterior to virtually see through its walls, viewing its interior
objects and layout.
With continual improvements to GPS accuracy, businesses are able to use
augmented reality to visualize georeferenced models of construction sites,
underground structures, cables and pipes using mobile devices. Augmented reality
is applied to present new projects, to solve on-site construction challenges, and to
enhance promotional materials. Examples include the Daqri Smart Helmet, an
Android-powered hard hat used to create augmented reality for the industrial
worker, including visual instructions, real-time alerts, and 3D mapping.
Fitness
AR hardware and software for use in fitness includes smart glasses made for biking
and running, with performance analytics and map navigation projected onto the
user's field of vision, and boxing, martial arts, and tennis, where users remain
aware of their physical environment for safety. Fitness-related games and software
include Pokémon Go and Jurassic World Alive.
In media
The futuristic short film Sight features contact lens-like augmented reality devices.
Related technologies and concepts include:
ARTag – fiducial marker system
Augmented reality-based testing – a type of software testing
WebAR – web-based augmented reality
Automotive head-up display – advanced driver assistance system
Bionic contact lens – proposed device to display information
Computer-mediated reality – the ability to manipulate one's perception of reality through the use of a computer
Cyborg – a being with both organic and biomechatronic body parts
Wearable computer – a small computing device worn on the body
CONCLUSION
In conclusion, Augmented Reality (AR) and Virtual Reality (VR) have evolved
from emerging technologies into powerful tools that are actively shaping numerous
sectors of society. Their ability to create immersive, interactive, and engaging
experiences holds great promise in transforming how we learn, work, heal, and
interact. From revolutionizing education through virtual classrooms to enhancing
medical procedures with real-time visual overlays, AR and VR offer innovative
solutions to complex real-world problems.