
Indoor Localization and Navigation Using Smartphone Augmented Reality and Inertial Tracking

Buti Al Delail, Luis Weruaga, M. Jamal Zemerly
Electrical and Computer Engineering Department
Khalifa University of Science, Technology and Research
United Arab Emirates
{[Link], [Link], [Link]}@[Link]

Jason W. P. Ng
Etisalat BT Innovation Center
BT Innovate & Design
Abu Dhabi, UAE
[Link]@[Link]

Abstract—In recent years, indoor localization and navigation have become a hot topic. With the increasing number of large buildings, indoor positioning and navigation have turned out to be as important as their outdoor counterparts. In the literature, many papers discuss wireless-based indoor positioning systems, essentially based on Wireless Fidelity (Wi-Fi), Bluetooth, Radio Frequency Identification (RFID) or other solutions that rely on the measurement of radio signals. In this paper, we evaluate an indoor image-based positioning system that takes advantage of smartphone augmented reality (AR) and inertial tracking. The excellent computing capabilities of today's high-end phones, in combination with their sensor resources, such as the Global Positioning System (GPS), inertial sensors, camera and wireless receivers, are powering the mobile application sector to the extent of making it the fastest-growing segment of data communication technologies. AR, as an emerging technology, has the potential to create new types of indoor location-based services in the near future. Here, we show some of the AR capabilities, combined with inertial tracking, for localization and navigation.

Index Terms—Indoor Localization, Augmented Reality, Inertial Tracking, Indoor Navigation

I. INTRODUCTION

With the rapid advancements in smartphone technology, the ability to obtain the position of devices and persons is the driving core of many applications. GPS is now available in every smartphone and is used by all kinds of applications, not only the traditional maps application. Nowadays the location is used for calculating the accurate local time, fetching current weather data or providing customized information that is useful to the individual user. GPS is also the best existing solution for outdoor navigation, with a precision of up to 1 meter [1]. In indoor environments, however, the radio signal quickly weakens because of roof thickness, walls and nearby objects [2]. Knowledge of the orientation and location delivered by current handheld devices allows for context-aware information services tailored to individual users according to their preferences. In that regard, the excellent performance of mobile visual computing [3] permits the detection and tracking of markers and planar objects in real time. This information can be used for mobile visual location recognition [4] in dense outdoor or urban areas where GPS measurements are either unavailable or inaccurate.

On the other hand, localization in indoor scenarios is dominated by wireless-based positioning techniques in combination with inertial sensors, among others [5], [6]. The combination of computer vision and inertial measurements has also been explored in indoor environments [7], but it requires uncomfortable wearable computing equipment. A breakthrough application derived from mobile visual computing is augmented reality (AR), namely, the display of virtual 3D objects that merge seamlessly with the actual video scene captured by the device camera. Since AR is founded on the detection of a known marker, as well as its tilt in 3D coordinates [8], when the marker is in a permanent static location, de facto localization is accomplished.

II. CONCEPT AND RELATED WORK

The work in [9] was the first step towards mobile and wearable augmented reality. Its hardware design had many disadvantages, such as being difficult to carry and unlikely to be used by everyday users. After that, hand-held augmented reality came in the form of PDAs and laptops with frontally mounted cameras; then, when smartphones emerged with enhanced computational power, they became a convenient hardware platform for AR systems, primarily because of their ease of use for inexperienced users [10].

Following this concept, this article proposes a scalable mobile-based system for indoor location detection and tracking using a combination of image marker recognition and inertial measurements. This localization service is the core for delivering a context-aware information system built around an augmented reality software layer. The AR layer informs the user of nearby points of interest, overlaying self-explanatory 3D virtual objects related to the location on the real-time video capture. The nearby points of interest are also shown in a 360-degree fashion supported by the device compass readings. Any AR virtual object is "clickable" or "touchable", such that the user obtains information related to it. Recently there has been similar work on vision-based indoor localization, navigation and the use of augmented reality. For instance, the authors in [11] proposed a vision-based mobile indoor localization and navigation system with Augmented or Virtual Reality (VR) interfaces. In this paper, we use AR only, since the visual localization and inertial tracking is fairly

978-1-4799-2452-3/13/$31.00 ©2013 IEEE 929


accurate for indoor use, ensuring that the graphical contents will augment the video stream as intended. Moreover, the research from the same authors showed that users preferred the AR interface over the VR interface. Accordingly, the system [12] was initially developed for the university campus where the authors work. Another use of this approach is the work in [13], which leverages the smartphone's AR capabilities for treasure hunt games. Also, the work in [14] uses Near Field Communication (NFC) tags and Quick Response (QR) codes for indoor localization and map guidance.

III. VISION-BASED LOCALIZATION

Using vision as a basis for localization has a number of advantages over other real-time localization methods. It does not rely on external electronic devices (e.g., Bluetooth or Wi-Fi access points) and takes advantage of the camera and processing power found together in every smartphone. Vision-based tracking is currently the most accurate type of tracking, with accuracy measured in pixels. Much progress has been made in the visual tracking area, where approaches are gradually moving from marker-based tracking to Simultaneous Localization and Mapping (SLAM) techniques [15], [16]. Using markers is one way of telling the computer how, what and where to augment the real-world view. Many visual techniques are based on tracking high-level image key-points (such as Speeded Up Robust Features (SURF), Binary Robust Invariant Scalable Keypoints (BRISK) and Fast Retina Keypoint (FREAK) [17]) instead of raw pixels. These key-points are used not just for tracking, but also for object and image detection on embedded devices with low memory and computational power, enabling faster object recognition and AR registration on low-end devices.

In order to get the initial location, the smartphone user must catch sight of an image marker using the camera. Each marker has a permanent indoor location that is stored in the database. When a marker is detected (see Fig. 1 for an illustration of the feature, or fiducial, points of a marker), its unique identifier is used to search the database for the location of that marker. Since the user is near the marker, this can be used as the user's location. Furthermore, the visual tracking can be combined with the inertial sensors (compass, accelerometer and gyroscope) for better tracking ability; some promising results in visual-inertial sensor fusion can be seen in [18].

The challenge here is that vision-based localization relies heavily on the recognition of known images in the environment, and when the environment is large, the number of images increases. The smartphone cannot handle processing a large database of images. A feasible solution is to move the image recognition processing to a remote workstation. Cloud-based recognition (available in Vuforia [19]) enables the smartphone to recognize a vast number of images in real time. Additionally, keeping the image database in the cloud allows easy updates whenever changes occur in the environment.

IV. AUGMENTED REALITY INTERFACE

The concept of AR makes it usable for countless applications; the advantage here is that the AR interface fits naturally with vision-based localization. While the image is being tracked and the user's position is known, informative computer-generated graphical objects augment parts of the screen. Fig. 2 shows a snapshot of the 3D AR objects superimposed on the original view captured by the camera. The data on the detected marker (shown in Fig. 1) delivered by Vuforia, corresponding to the distance and tilt of the scene, is used to render the 3D object in the "correct" place. Moreover, "touchable" virtual buttons, also provided by the Vuforia SDK [19], allow visual interaction with the identified image and are used for information retrieval. The system becomes portable and easily configurable by allowing 3D object data to be downloaded from the cloud.

Fig. 1. Target image features and fiducial points.

Fig. 2. Augmented reality: 3D virtual object displayed on the video scene; "clickable" virtual buttons (in red line); when a virtual button is pressed, the related information (faculty name) is displayed.

V. INERTIAL TRACKING

Inertial tracking overrides the visual SLAM described in Section III when the tracked image goes out of frame. There are many inertial tracking systems that detect the user's footsteps in order to perform SLAM [20], but only recently have these techniques been explored for use in smartphones. Despite the challenge of dealing with measurement errors and

dead reckoning from the accelerometer, recent methods in signal recognition, position prediction and correction look promising. Given the availability of inertial sensors and processing power in smartphones, and the fact that the smartphone is the most common device that people carry, it is the most favorable hardware for inertial tracking.

The method we use detects the user's footsteps from the accelerometer readings (see Fig. 3) and advances the user's location 2 ft in the direction read from the compass. This implementation is reasonably accurate, provided that the user knows how to use it.
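The step-and-advance rule above can be sketched in a few lines (an illustrative Python sketch, not the authors' implementation; the peak threshold, the sample signal and the compass convention are assumptions):

```python
import math

STEP_LENGTH_FT = 2.0   # per the method above: advance 2 ft per detected footstep
THRESHOLD = 11.0       # m/s^2, assumed peak threshold just above gravity (~9.8)

def detect_steps(accel_magnitudes):
    """Count footsteps as upward crossings of a peak threshold.

    Each cycle in the accelerometer-magnitude signal (cf. Fig. 3)
    corresponds to one footstep.
    """
    steps, above = 0, False
    for a in accel_magnitudes:
        if a > THRESHOLD and not above:
            steps += 1
            above = True
        elif a < THRESHOLD:
            above = False
    return steps

def dead_reckon(position, heading_deg, num_steps):
    """Advance the (x, y) position num_steps * 2 ft along the compass heading."""
    x, y = position
    d = num_steps * STEP_LENGTH_FT
    rad = math.radians(heading_deg)
    # Compass convention: 0 deg = north (+y), 90 deg = east (+x)
    return (x + d * math.sin(rad), y + d * math.cos(rad))

# Two walking cycles in the magnitude signal -> two detected steps:
samples = [9.8, 10.5, 12.0, 10.1, 9.2, 10.8, 12.3, 9.9]
n = detect_steps(samples)
pos = dead_reckon((0.0, 0.0), 90.0, n)  # heading east: x increases by 4 ft
```

A real implementation would read the magnitudes from the device accelerometer stream and low-pass filter them before thresholding.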

Fig. 3. Magnitude of the accelerometer readings for a walking user: every cycle in the signal corresponds to a user footstep.

Fig. 4. Trace (red) obtained from the inertial location tracking module for a walking path length of about 300 ft. The left snapshot shows the original estimated walking path; the right snapshot shows the corrected path.

The result of the initial approach is shown in Fig. 4, where the walking path is more than 300 ft long, resulting in a localization error of less than 3 meters at the destination. However, it can be seen that the error at certain intermediate points is larger. Furthermore, the compass heading is not very reliable, and using it alone with no calibration may produce errors as high as 90°. Therefore, to improve the accuracy we calibrate the compass and add constraints on the estimates to ensure that the estimated location lies within the path. The error correction mechanism uses the compass direction to produce a field of view, in which the estimated heading is taken towards the nearest node within view, thus not relying on the exact heading degree for the direction. The improvement can be seen in Table I for the same path; the heading error with the correction mechanism depends on the user location, the walking path and the node in front.
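A minimal sketch of that correction step follows (illustrative Python, not the authors' code; the field-of-view width, the node coordinates and the bearing convention are assumptions):

```python
import math

FOV_DEG = 45.0  # assumed half-angle of the compass field of view

def bearing(origin, node):
    """Compass-style bearing from origin to node: 0 deg = north (+y)."""
    dx, dy = node[0] - origin[0], node[1] - origin[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def angular_diff(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def corrected_heading(position, compass_deg, path_nodes):
    """Snap a noisy compass heading to the nearest path node within view.

    Nodes outside the field of view are ignored, so the exact compass
    degree is never trusted directly, mirroring the mechanism above.
    """
    in_view = [n for n in path_nodes
               if angular_diff(bearing(position, n), compass_deg) <= FOV_DEG]
    if not in_view:
        return compass_deg  # no node in view: fall back to the raw compass
    nearest = min(in_view, key=lambda n: math.dist(position, n))
    return bearing(position, nearest)

# A 30-deg compass reading with a path node due north snaps to 0 deg:
h = corrected_heading((0.0, 0.0), 30.0, [(0.0, 10.0), (10.0, 0.0)])
```

Constraining the heading to graph nodes in this way also keeps the estimated location on the known walking path.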
TABLE I
INERTIAL TRACKING WITH AND WITHOUT CORRECTION

Comparison              | No Correction | With Correction
Maximum Heading Error   | ±5°           | ±2.3°
Final Position Accuracy | 3 meter       | 1 meter
Walking Path Length     | 306.7 meter   | 309.4 meter

Fig. 5. Magnitude of the accelerometer readings and the direction of motion.

Further efforts are being put into tracking the location using inertial sensors without the need to hold the device in a fixed position. Fig. 5 shows a sample of the accelerometer readings captured while the user is walking with the device inside a pocket. Principal Component Analysis (PCA) is used to identify the direction of motion. The principal component is calculated over a window of 1 second (±0.2 sec) to remove the signal noise from the instantaneous acceleration and obtain a high-level vector of gravity and motion.

VI. INDOOR NAVIGATION

The localization system can be used efficiently to obtain the user's location and to search for other places. Determining the path from the user's location to the destination is the key to providing indoor navigation. In order for the system to search for a path, intersections within the building are added to the location database, with links that identify which locations are directly connected. This data can be used to search for the shortest-distance path between any two locations.
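As a sketch, the shortest-path search over such a location graph might look like this (hypothetical node names, coordinates and links; a heapq-based A* with straight-line distance as the heuristic):

```python
import heapq
import math

# Hypothetical location graph: node -> (x, y) position in metres,
# plus links between locations that are directly connected.
coords = {"lobby": (0, 0), "hall": (10, 0), "lab": (10, 8), "office": (20, 8)}
links = {"lobby": ["hall"], "hall": ["lobby", "lab"],
         "lab": ["hall", "office"], "office": ["lab"]}

def a_star(start, goal):
    """A* search over the location graph.

    The straight-line distance to the goal is an admissible heuristic,
    so the first time the goal is popped, the path is shortest.
    """
    dist = lambda a, b: math.dist(coords[a], coords[b])
    # Frontier entries: (f = g + heuristic, g = cost so far, node, path)
    frontier = [(dist(start, goal), 0.0, start, [start])]
    seen = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt in links[node]:
            if nxt not in seen:
                g2 = g + dist(node, nxt)
                heapq.heappush(frontier,
                               (g2 + dist(nxt, goal), g2, nxt, [*path, nxt]))
    return None  # goal unreachable

route = a_star("lobby", "office")  # -> ["lobby", "hall", "lab", "office"]
```

The heuristic never overestimates the remaining cost, which is what lets A* avoid exhaustively visiting every location.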

Therefore, we use the A* algorithm [21] to find the shortest path. The algorithm ensures finding the shortest path in the fastest possible way, without exhaustively visiting or testing all locations.

Moreover, navigation is not only about finding areas and paths, but also about guiding and monitoring the movement of the user. Here AR plays another role in the interface: instead of looking at a 2D map while navigating, AR can be used to show virtual guiding objects in the user's view. The previously described SLAM allows computer graphics (e.g., signs and directions) to be mapped and updated simultaneously with the camera view. For instance, points of interest are displayed in AR in Fig. 6. This method relies on the compass together with the gyroscope to produce a 360-degree scene with the points of interest (PoI) in the vicinity of the current mobile user location. As proof of concept and for the sake of simplicity, the PoIs are presented with a simple clickable button.

Fig. 6. Indoor 360-degree points-of-interest view.

VII. CONCLUSION AND FUTURE WORK

This paper presents a new augmented reality method for indoor localization and navigation, allowing users to be aware of their locations and making it easier to find places with reasonable accuracy. The use of image recognition indoors enables the system to run on any campus, provided that a database exists for recognizing the markers, obtaining the location and displaying the information in AR. Also, the results show that inertial navigation provides a suitable approach to tracking the user's location indoors.

Moreover, smartphone-based augmented reality separates augmented reality from wearable computing, as the hardware of initial AR systems consisted of a head-mounted display (HMD), which had to be worn. Wearable AR glasses have therefore been a research and development project at major companies (such as Google's Glass project [22]), and they may not just be the next-generation mobile AR hardware platform but also replace smartphones. Although there is a gap in the technology needed to make AR a part of people's daily lives, there is no doubt that AR is the future. The system in this paper can be deployed on any hardware with the resources and sensors available in any modern smartphone.

REFERENCES

[1] H. Koyuncu and S. H. Yang, "A survey of indoor positioning and object locating systems," IJCSNS International Journal of Computer Science and Network Security, vol. 10, no. 5, pp. 121-128, 2010.
[2] E. Coca, V. Popa, and G. Buta, "An indoor location system performance evaluation and electromagnetic measurements," IJCSNS International Journal of Computer Science and Network Security, vol. 10, no. 5, pp. 121-128, 2010.
[3] B. Girod, et al., "Mobile visual search," IEEE Signal Processing Mag., vol. 28, no. 4, pp. 61-76, Jul. 2011.
[4] G. Schroth, et al., "Mobile visual location recognition," IEEE Signal Processing Mag., vol. 28, no. 4, pp. 77-89, Jul. 2011.
[5] L. Klingbeil, et al., "A modular and mobile system for indoor localization," Intl. Conf. Indoor Positioning and Indoor Navigation, 2010, pp. 1-10.
[6] S. Lee, B. Kim, H. Kim, R. Ha, and H. Cha, "Inertial sensor-based indoor pedestrian localization with minimum 802.15.4a configuration," IEEE Trans. Industrial Informatics, vol. 7, no. 3, pp. 455-466, Aug. 2011.
[7] D. Chdid, R. Oueis, H. Khoury, D. Asmar, and I. Elhajj, "Inertial-vision sensor fusion for pedestrian localization," IEEE Intl. Conf. Robotics and Biomimetics, 2011, pp. 1695-1701.
[8] D. Wagner, et al., "Real-time detection and tracking for augmented reality on mobile phones," IEEE Trans. Visual. Comp. Graphics, vol. 16, no. 3, pp. 355-368, May/Jun. 2010.
[9] S. Feiner, B. MacIntyre, and T. Hollerer, "Wearing it out: First steps toward mobile augmented reality systems," Intl. Symp. Mixed Reality (ISMR '99), pp. 363-377, 1999.
[10] D. Schmalstieg and D. Wagner, "Experiences with handheld augmented reality," Proc. 6th IEEE and ACM Intl. Symp. Mixed and Augmented Reality, 2007.
[11] A. Möller, M. Kranz, R. Huitl, S. Diewald, and L. Roalter, "A mobile indoor navigation system interface adapted to vision-based localization," 11th Intl. Conf. Mobile and Ubiquitous Multimedia, ACM, pp. 4-14, 2012.
[12] B. Al Delail, L. Weruaga, and M. J. Zemerly, "CAViAR: Context Aware Visual indoor Augmented Reality for a University Campus," IEEE Intl. Conf. Web Intelligence and Intelligent Agent Technology, pp. 286-290, Dec. 2012.
[13] Z. Balint, B. Kiss, B. Magyari, and K. Simon, "Augmented reality and image recognition based framework for treasure hunt games," 10th Jubilee Intl. Symp. Intelligent Systems and Informatics, 2012, pp. 148-152.
[14] O. Al Hammadi, A. Al Hebsi, M. J. Zemerly, and J. W. P. Ng, "Indoor localization and guidance using portable smartphones," IEEE Intl. Conf. Web Intelligence and Intelligent Agent Technology, pp. 337-341, Dec. 2012.
[15] D. Stricker and G. Bleser, "From interactive to adaptive augmented reality," Intl. Symp. Ubiquitous Virtual Reality (ISUVR), pp. 18-21, Aug. 2012.
[16] G. Klein and D. Murray, "Parallel tracking and mapping for small AR workspaces," 6th IEEE and ACM Intl. Symp. Mixed and Augmented Reality (ISMAR 2007), pp. 225-234, Nov. 2007.
[17] A. Alahi, R. Ortiz, and P. Vandergheynst, "FREAK: Fast Retina Keypoint," IEEE Conf. Computer Vision and Pattern Recognition (CVPR), 2012, pp. 510-517.
[18] G. Bleser and D. Stricker, "Advanced tracking through efficient image processing and visual-inertial sensor fusion," IEEE Virtual Reality 2008, pp. 137-144, 2008.
[19] Vuforia SDK, Qualcomm Inc., [Link], Apr. 2013.
[20] P. Robertson, M. Angermann, and B. Krach, "Simultaneous localization and mapping for pedestrians using only foot-mounted inertial sensors," Proc. 11th Intl. Conf. Ubiquitous Computing (Ubicomp '09), ACM, pp. 93-96, 2009.
[21] P. Hart, N. Nilsson, and B. Raphael, "A formal basis for the heuristic determination of minimum cost paths," IEEE Trans. Systems Science and Cybernetics, pp. 100-107, 1968.
[22] "Google gets in your face [2013 Tech To Watch]," IEEE Spectrum, vol. 50, no. 1, pp. 26-29, 2013.

