Tracking

This document discusses various technologies for tracking objects in 3D space, including both stationary and mobile sensor systems. It covers coordinate systems, calibration techniques, and the measurement principles of technologies like electromagnetic tracking, ultrasonic tracking, GPS, inertial sensors, and optical tracking. Key aspects like measurement characteristics, errors, and temporal properties are described for different sensor types.


Agenda
• Tracking, Calibration, and Registration
• Coordinate Systems
• Characteristics
• Stationary Tracking Systems
• Mobile Sensors
• Optical Tracking
• Sensor Fusion
Tracking, Calibration, and Registration
• Registration = alignment of spatial properties
• Calibration = offline adjustment of measurements
• Spatial calibration yields static registration
• Offline: once in lifetime or once at startup
• Alternative: autocalibration
• Tracking = dynamic sensing and measuring of spatial properties
• Tracking yields dynamic registration
• Tracking in AR/VR always means “in 3D”!

[Diagram: calibration yields static registration of the spatial parameters of tracking devices; tracking yields dynamic registration]
Coordinate Systems
[Diagram: local object coordinates → (model transformation) → global world coordinates → (view transformation) → eye coordinates → (perspective transformation)]
Perspective transformation
• Calibrate offline
• For both camera and display
Model transformation
• Track for moving objects, if there are static objects as well
View transformation
• Track for moving objects, if there are no static objects
• Track for moving observer
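The transformation chain above can be written out with plain 4×4 matrices. The following is a minimal sketch (not part of the original slides), assuming column vectors, translation-only model and view transforms, and a simplified pinhole projection; all numeric values are made up for illustration.

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

def pinhole_projection(fx, fy, cx, cy):
    """Simplified perspective projection (eye coordinates -> image plane);
    clipping and depth normalization are ignored in this sketch."""
    return np.array([[fx, 0.0, cx, 0.0],
                     [0.0, fy, cy, 0.0],
                     [0.0, 0.0, 1.0, 0.0]])

model = translation(1.0, 0.0, 0.0)              # local object -> global world
view = translation(0.0, 0.0, -5.0)              # global world -> eye
proj = pinhole_projection(800, 800, 320, 240)   # eye -> image (calibrated offline)

p_object = np.array([0.0, 0.2, 0.0, 1.0])       # point in local object coordinates
p_homog = proj @ view @ model @ p_object
print(p_homog[:2] / p_homog[2])                 # perspective divide -> pixel coordinates
```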
Frames of Reference
• World-stabilized
• E.g., billboard or signpost
• Body-stabilized
• E.g., virtual tool-belt
• Screen-stabilized
• Heads-up display (HUD)
[Illustration: world-stabilized signpost (“This way”), body-stabilized item list, screen-stabilized HUD]
Measurement Coordinates
• Global vs. local measurements
• Global → larger (or unlimited) workspace
• Local → better accuracy
• Absolute vs. relative measurements
• Absolute → coordinate system defined in advance
• Relative → incremental sensing
Physical Phenomena
• Electromagnetic radiation
• Visible light
• Infrared light
• Laser light
• Radio signals
• Magnetic flux
• Sound
• Physical linkage
• Gravity
• Inertia
Measurement Principle
• Signal strength
• Signal direction
• Time of flight
• Absolute time
• Signal phase
• Requires synchronized clocks
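As a small illustration of the time-of-flight principle (my own sketch, not from the slides): distance is propagation speed times measured travel time, so any clock offset between sender and receiver turns directly into a distance error, which is why absolute time of flight requires synchronized clocks.

```python
SPEED_OF_LIGHT = 299_792_458.0   # m/s (radio, light)
SPEED_OF_SOUND = 343.0           # m/s (air at about 20 °C)

def tof_distance(travel_time_s, speed):
    """Distance covered by a signal travelling for travel_time_s seconds."""
    return speed * travel_time_s

print(tof_distance(10e-9, SPEED_OF_LIGHT))   # 10 ns of radio travel -> ~3 m
print(tof_distance(10e-3, SPEED_OF_SOUND))   # 10 ms of sound travel -> ~3.4 m
# A 1 microsecond clock offset corrupts a radio measurement by ~300 m:
print(tof_distance(1e-6, SPEED_OF_LIGHT))
```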
Degrees of Freedom (DOF)
• DOF = independent dimension of measurement
• Full tracking requires 6DOF
• 3DOF position (x, y, z)
• 3DOF orientation (roll, pitch, yaw)
• Some sensors deliver only a subset
• E.g., gyroscope → 3DOF orientation only
• E.g., tracked LED → 3DOF position only
• E.g., mouse → 2DOF position only
Measured Geometric Property
• Trilateration: 3 distances (d1, d2, d3 from known points P1, P2, P3 to the unknown point M; see the sketch below)
• Triangulation: 2 angles and 1 distance (angles α1, α2 measured at P1, P2 over the known baseline d12)
[Diagram: trilateration (left) and triangulation (right) of point M]
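A minimal 2D trilateration sketch (my own illustration, not from the slides): subtracting the circle equations |M − Pi|² = di² pairwise leaves a small linear system for the unknown point M. The beacon layout and distances below are hypothetical.

```python
import numpy as np

def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Position M from distances d1, d2, d3 to known 2D points p1, p2, p3."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    A = 2.0 * np.array([p2 - p1, p3 - p1])
    b = np.array([d1**2 - d2**2 + p2 @ p2 - p1 @ p1,
                  d1**2 - d3**2 + p3 @ p3 - p1 @ p1])
    return np.linalg.solve(A, b)

# Hypothetical beacons and measured distances
print(trilaterate_2d((0.0, 0.0), (4.0, 0.0), (0.0, 3.0), d1=5.0, d2=3.0, d3=4.0))
# -> approximately [4. 3.]
```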
Sensor Arrangement
• Multiple sensors in rigid geometric configuration
• E.g., stereo camera rig
• Sparse or dense sensors
• E.g., a digital camera is a dense 2D array of intensity sensors with known angles
• Advanced technical issues
• Sensor synchronization
• Sensor fusion
Sensor Group Arrangement
• Outside-in
• Stationary mounted sensors
• Good position, poor orientation
• Inside-out
• Mobile sensor(s)
• Good orientation, poor position
Signal Sources
• Passive sources
• Natural signals
• E.g., natural light, Earth’s magnetic field
• Active sources
• Electronic components producing physical signal
• Can be direct or indirect (reflected)
• E.g., acoustic, optical, radiowaves
• Most forms require open line of sight
• No sources
• Most important example: inertia
Measurement Error
• Accuracy
• How close is measurement to true value
• Affected by systematic errors
• Can be improved with better calibration
• Precision
• How closely do multiple measurements agree (random error, noise)
• Varies per type of sensor
• Varies per degree of freedom
• Can be improved with filtering (more computation, more latency)
• Resolution
• Minimum difference that can be discriminated between two measurements
• Cannot be reached in practice because of noise
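A small numeric illustration of accuracy versus precision (my own sketch; the readings are made up): the bias of repeated measurements against a known ground truth reflects (in)accuracy, their spread reflects precision, and noise in turn limits the usable resolution.

```python
import statistics

true_value = 10.00                                   # known reference distance in cm
readings = [10.31, 10.28, 10.33, 10.29, 10.30]       # hypothetical repeated readings

bias = statistics.mean(readings) - true_value        # systematic error -> accuracy
noise = statistics.stdev(readings)                   # random error     -> precision

print(f"bias  = {bias:+.3f} cm (systematic; better calibration could remove it)")
print(f"noise = {noise:.3f} cm (random; filtering helps, at the cost of latency)")
```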
Temporal Characteristics
• Update rate:
• Number of measurements per time
interval

• Measurement latency
• Time it takes from occurrence of physical
event to data becoming available
• End-to-end latency
• Time it takes from occurrence of physical
event to presentation of a stimulus
Stationary Tracking Systems
• Mechanical Tracking
• Electromagnetic Tracking
• Ultrasonic Tracking
Mechanical Tracking
• Track end-effector of articulated arm
• Joints with 2 or 3 DOF
• Rotary encoders or potentiometers
• High precision
• Fast
• Freedom of operation limited
Images: CyberGrasp, Fakespace BOOM
Electromagnetic Tracking
• Stationary source produces three orthogonal magnetic fields
• Current induced in sensor coils
• Measurement of strength and phase of signal
• Signal strength falls off quadratically with distance
• Working range: half-sphere with 1-3m radius
• Problems with electromagnetic interference

Razer Hydra
Ultrasonic Tracking
• Measures time of flight of sound pulse
• Trilateration of 3 measurements
• Requires synchronized time (cables) or more than 3 measurements
• Low update rate (10-50Hz) due to slow speed of sound
• Possible fusion with fast inertial sensors (e.g., InterSense IS-600)
• Requires open line of sight
• Suffers from noise and changes in temperature
• Wide-area configuration, e.g., AT&T BAT system
• Microphones mounted in ceiling

Image: Joseph Newman
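Since the range follows directly from the time of flight and the temperature-dependent speed of sound, the conversion can be sketched as below (my own illustration, using the common linear approximation for the speed of sound in air; the pulse time is made up).

```python
def speed_of_sound(temp_celsius):
    """Approximate speed of sound in air (m/s), linear in temperature."""
    return 331.3 + 0.606 * temp_celsius

def ultrasonic_range(tof_seconds, temp_celsius=20.0):
    """Emitter-to-microphone distance from a one-way time of flight."""
    return speed_of_sound(temp_celsius) * tof_seconds

tof = 5.0e-3  # hypothetical 5 ms pulse travel time
print(ultrasonic_range(tof, temp_celsius=20.0))   # ~1.72 m
print(ultrasonic_range(tof, temp_celsius=35.0))   # ~1.76 m -> why temperature matters
```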


Mobile Sensors
• Global Positioning System
• Wireless Networks
• Magnetometer
• Gyroscope
• Linear Accelerometer
• Odometer
Global Positioning System
• Planet-scale outside-in radiowave time-of-flight
• Requires clock synchronization
• Must receive signals from at least 4 satellites
[Diagram: GPS receiver measuring signals from multiple satellites]
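The position fix can be viewed as a least-squares problem: each satellite provides a pseudorange that mixes the true range with the receiver clock offset, so four satellites give four equations in (x, y, z, clock bias). Below is a minimal Gauss-Newton sketch of my own with made-up satellite positions and simulated pseudoranges; real receivers apply many further corrections (orbits, ionosphere, etc.).

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_gps(sat_positions, pseudoranges, iterations=10):
    """Gauss-Newton estimate of receiver position and clock bias,
    where each pseudorange is |satellite - receiver| + C * clock_bias."""
    state = np.zeros(4)                       # x, y, z (m) and clock bias (s)
    for _ in range(iterations):
        pos, bias = state[:3], state[3]
        ranges = np.linalg.norm(sat_positions - pos, axis=1)
        residual = pseudoranges - (ranges + C * bias)
        J = np.hstack([-(sat_positions - pos) / ranges[:, None],   # Jacobian w.r.t. position
                       np.full((len(ranges), 1), C)])              # Jacobian w.r.t. clock bias
        state += np.linalg.lstsq(J, residual, rcond=None)[0]
    return state

# Hypothetical satellite positions (m) and simulated pseudoranges
sats = np.array([[15600e3, 7540e3, 20140e3],
                 [18760e3, 2750e3, 18610e3],
                 [17610e3, 14630e3, 13480e3],
                 [19170e3, 610e3, 18390e3]])
receiver_truth = np.array([3.9e6, 3.9e6, 3.3e6])
clock_bias = 1e-4                                   # 100 microseconds
rho = np.linalg.norm(sats - receiver_truth, axis=1) + C * clock_bias
print(solve_gps(sats, rho))   # should recover the position and the clock bias
```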
Differential GPS
• Compensate for atmospheric distortion
• Receive correction signal from base station via network
• Real-Time Kinematics (RTK) Differential GPS also uses signal phase
satellites

GPS receiver base station


correction
signal
Wireless Networks
• Measure signal strength from WiFi, Bluetooth, mobile phone towers
• Potential trilateration/triangulation
• Mostly only good for coarse location (e.g., based on WiFi SSID)
• Fingerprinting: carefully map the signal reception in a given area
• Recent use: Bluetooth iBeacon in department stores
• Assisted GPS: accelerate GPS initialization using WiFi or GSM id
• Skyhook, Google, Broadcom, etc.
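Fingerprinting can be illustrated as a nearest-neighbour lookup over previously surveyed signal-strength vectors. A minimal sketch of my own; the access-point names, RSSI values, and survey positions are all made up.

```python
# Hypothetical fingerprint database: surveyed position -> RSSI (dBm) per access point
fingerprints = {
    (0.0, 0.0): {"ap1": -40, "ap2": -70, "ap3": -80},
    (5.0, 0.0): {"ap1": -55, "ap2": -50, "ap3": -75},
    (0.0, 5.0): {"ap1": -60, "ap2": -72, "ap3": -45},
}

def locate(observed):
    """Surveyed position whose fingerprint is closest (in RSSI space) to the scan."""
    def mismatch(fp):
        return sum((fp.get(ap, -100) - rssi) ** 2 for ap, rssi in observed.items())
    return min(fingerprints, key=lambda pos: mismatch(fingerprints[pos]))

# A scan taken somewhere near the second survey point
print(locate({"ap1": -53, "ap2": -52, "ap3": -77}))   # -> (5.0, 0.0)
```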
Magnetometer
• Electronic compass
• Measures direction of the Earth’s magnetic field in 3D (see the sketch below)
• Principle: magnetoresistance or Hall effect
• Often very distorted measurements
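Assuming the device is held level, a compass heading follows from the horizontal field components. A minimal sketch of my own (no tilt compensation, no hard-/soft-iron calibration, axis conventions glossed over); the field values are made up.

```python
import math

def field_angle_degrees(mx, my):
    """Angle of the horizontal magnetic field relative to the sensor x-axis.
    Mapping this to compass north depends on the device's axis conventions."""
    return (math.degrees(math.atan2(my, mx)) + 360.0) % 360.0

# Hypothetical magnetometer reading (microtesla)
print(field_angle_degrees(mx=20.0, my=20.0))   # 45 degrees
```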
Gyroscopes
• Determines rotational velocity
• Electronic gyro
• Measures Coriolis force on a small vibrating object
• Micro-electromechanical system (MEMS)
• High update rate (1 kHz)
• Only relative measurements
• Must integrate once to determine orientation → drift (see the sketch below)
• Laser gyro (fiber-optic gyro)
• Measures angular velocity based on light interference
• Large, expensive, used in aviation
[Diagram: radial and Coriolis movement of the vibrating mass under rotation; Image: Hideyuki Tamura]
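Because a gyroscope only reports angular velocity, orientation comes from integration, and any constant sensor bias grows into an unbounded orientation error. A minimal single-axis sketch of my own, with a made-up bias:

```python
# Integrate one gyro axis at rest: the true rate is zero, but a small constant
# bias (deg/s) accumulates into a growing heading error -> drift.
dt = 0.001            # 1 kHz update rate
bias = 0.02           # hypothetical gyro bias in deg/s
angle = 0.0

for _ in range(60 * 1000):            # one minute of samples
    measured_rate = 0.0 + bias        # sensor at rest still reports its bias
    angle += measured_rate * dt       # numerical integration

print(f"heading error after 1 minute: {angle:.2f} deg")   # ~1.2 deg
```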
Linear Accelerometer
• MEMS device
• Displacement of a small mass suspended on springs
• Measures
• Change of electric capacity, or
• Piezoresistive effect of bending
• Subtract gravity (the difficult part!)
• Integrate twice numerically to get position (see the sketch below)
• Drift problems
• Combine linear accelerometer, gyro, and compass into an inertial measurement unit (IMU)
[Diagram: proof mass suspended by springs]
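Double integration amplifies even tiny errors left over after gravity subtraction, which is why accelerometer-only position drifts quickly. A minimal sketch of my own with a made-up residual bias:

```python
# Position of an accelerometer at rest: a small residual acceleration left after
# gravity subtraction is integrated twice, so the position error grows
# quadratically with time -> drift.
dt = 0.01                 # 100 Hz samples
residual_bias = 0.05      # hypothetical leftover acceleration (m/s^2)
velocity = 0.0
position = 0.0

for _ in range(10 * 100):             # ten seconds of samples
    velocity += residual_bias * dt    # first integration
    position += velocity * dt         # second integration

print(f"position error after 10 s: {position:.2f} m")   # ~2.5 m
```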
Odometer
• Mechanical or opto-electrical wheel encoder
• E.g., traditional ball mouse
Optical Tracking
• Optical sensors
• Model-Based versus Model-Free Tracking
• Illumination
• Markers versus Natural Features
• Target Identification
Optical Sensors
• Digital cameras are cheap and powerful
• CCD (charge coupled devices) – professional photography
• CMOS (complementary metal oxide semiconductor) – fast and cheap
• Computer vision techniques improve with Moore’s law
• Lenses are becoming the most limiting part

Bayer pattern
Model-Based versus Model-Free Tracking
• Model-based
• A tracking model representing the 3D world is available
• Compare the model to observations in the images
• Model-free
• At start-up, no tracking model is available
• Most build a temporary tracking model while tracking
• Measurements only relative to starting point
ARTTrack

Illumination
• Passive illumination
• Natural (or existing) light sources
• Visible spectrum 380-780nm
• Cannot track when it is too dark (mostly indoors)
• Active illumination
• Often infrared spectrum
• LED beacons
• Camera with infrared filter delivers high contrast
• Not suitable with sunlight
• Structured light
• Project a known pattern into the scene (e.g., Microsoft Kinect V1)
• Projector with regular light or laser
• Laser ranging
• Measure time of flight taken by laser pulse
• Steerable MEMS mirror for scanning laser
• LIDAR (light radar): long range laser used in surveying
Leap Motion
• 2 cameras, 3 infrared LEDs
• Short-distance reflection
of the hands
Valve/HTC Vive
• “Lighthouses” = two scanning infrared lasers
• Photodiodes on head pick up lasers
Markers vs Natural Features
• Fiducial markers
• Artificial tracking targets
• Square shapes yield 4 points (enough for pose; see the PnP sketch below)
• Circular shapes yield only 1 point
• Digital marker model exists first, marker manufactured second (e.g., printing)
Image: Daniel Wagner
• Natural feature tracking
• Existing visual features in the environment
• Physical features exist first, tracking model reconstructed second

Image: Andrei State, UNC Chapel Hill
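Recovering the 6DOF pose from the four corner points of a square marker is a perspective-n-point problem. A minimal sketch using OpenCV's solvePnP; the marker size, camera intrinsics, and detected corner pixels are made up (in practice the corners come from a marker detector and the intrinsics from offline calibration).

```python
import numpy as np
import cv2

marker_size = 0.10   # hypothetical 10 cm square marker

# Marker corners in the marker's own coordinate system (z = 0 plane)
object_points = np.array([[-1, -1, 0], [1, -1, 0], [1, 1, 0], [-1, 1, 0]],
                         dtype=np.float64) * (marker_size / 2.0)

# Hypothetical detected corner positions in the image (pixels)
image_points = np.array([[300, 260], [340, 262], [338, 302], [298, 300]],
                        dtype=np.float64)

# Hypothetical pinhole intrinsics from offline calibration, no lens distortion
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs)
print(ok, rvec.ravel(), tvec.ravel())   # rotation (Rodrigues vector) and translation
```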


Flat Marker Designs

Image: Daniel Wagner


Retro-Reflective Ball Markers
• Light reflected towards light-source
• Illuminate with infrared LED flash
• Infrared camera observes bright blobs
• 4 or more spheres in known configuration
to recover 6DOF pose
• Multiple targets distinguished by their
geometric configuration

Advanced Realtime Tracking GmbH


Image: Martin Hirzer
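Given the measured 3D blob positions and the known sphere configuration of a target, the 6DOF pose is the rigid transform that best maps one point set onto the other. A minimal sketch of my own using the SVD-based (Kabsch/Horn) alignment; the target geometry and measurements are made up.

```python
import numpy as np

def rigid_transform(model_pts, measured_pts):
    """Best-fit rotation R and translation t with measured ~= R @ model + t."""
    cm, cd = model_pts.mean(axis=0), measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cm

# Hypothetical 4-sphere target (model) and its measured positions after a motion
model = np.array([[0.0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]])
true_R = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])    # 90 degrees about z
measured = model @ true_R.T + np.array([0.5, 0.2, 1.0])

R, t = rigid_transform(model, measured)
print(np.round(R, 3), t)   # recovers the rotation and translation
```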

Natural Features
• Detect salient interest points in image
• Must be easily found
• Location in image should remain stable
when viewpoint changes
• Requires textured surfaces
• Alternative: can use edge features (less discriminative)
• Match interest points to tracking model database
• Database filled with results of 3D reconstruction
• Matching entire (sub-)images is too costly
• Typically interest points are compiled into “descriptors”

Image: Gerhard Reitmayr
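As an illustration of the descriptor idea, interest points can be detected and encoded with a binary descriptor such as ORB and matched by descriptor distance. A minimal OpenCV sketch of my own; the two images are synthetic stand-ins for a database view and the current camera frame.

```python
import numpy as np
import cv2

# Synthetic textured image and a shifted copy (stand-ins for reference and frame)
rng = np.random.default_rng(1)
reference = cv2.resize(rng.integers(0, 256, size=(60, 80), dtype=np.uint8),
                       (640, 480), interpolation=cv2.INTER_CUBIC)
frame = np.roll(reference, shift=(10, 25), axis=(0, 1))

orb = cv2.ORB_create(nfeatures=500)                 # detector + binary descriptor
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

# Brute-force matching on Hamming distance, keeping only cross-checked matches
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)
print(f"{len(matches)} tentative correspondences")
```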


Marker Target Identification
• More targets or features → more easily confused
• Must be as unique as possible
Image: Mark Fiala

• Square markers
• 2D barcodes with error correction
• E.g., 6x6=36 bits (2 orientation, 6-12 payload,
rest for error correction)
• Marker tapestries
• Spherical targets
• 5 spheres in different geometric configurations
• Can distinguish 10-20 targets
• Pulsed LEDs
Image: Greg Welch, UNC Chapel Hill
Natural Feature Target Identification
• Individual natural interest points
too easily confused
• Rely on co-occurrence of interest points for detection
• Probabilistic search methods used to deal with errors
• Vocabulary trees
• Random sample consensus (RANSAC)
Image: Martin Hirzer
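Random sample consensus can be illustrated by fitting a geometric model (here a planar homography) to tentative correspondences and keeping only the inliers. A minimal OpenCV sketch of my own on synthetic correspondences with deliberately corrupted outliers:

```python
import numpy as np
import cv2

rng = np.random.default_rng(0)

# 100 synthetic correspondences related by a similarity transform,
# with the last 20 replaced by gross outliers ("confused" matches).
pts_ref = rng.uniform(0, 640, size=(100, 2)).astype(np.float32)
angle, scale, t = np.deg2rad(10), 1.1, np.array([30.0, -20.0])
R = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])
pts_frm = (scale * pts_ref @ R.T + t).astype(np.float32)
pts_frm[80:] = rng.uniform(0, 640, size=(20, 2))

# RANSAC: fit homographies to random minimal subsets, keep the hypothesis with
# the most inliers (reprojection error below 3 pixels).
H, inlier_mask = cv2.findHomography(pts_ref, pts_frm, cv2.RANSAC, 3.0)
print(f"{int(inlier_mask.sum())} of {len(pts_ref)} correspondences kept as inliers")
```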
Complementary Sensor Fusion
• Combining sensors with different degrees of freedom
• Sensors must be synchronized (or requires inter-/extrapolation)
• E.g., combine position-only and orientation-only sensor
• E.g., orthogonal 1D sensors in gyro or magnetometer are
complementary
Competitive Sensor Fusion
• Different sensor types measure the same degree of freedom
• Redundant sensor fusion
• Use worse sensor only if better sensor is unavailable
• E.g., GPS + pedometer
• Statistical sensor fusion
Statistical Sensor Fusion
• Important form of competitive fusion for higher quality
• Combine measurement to improve quality
• Establish statistical estimate of the true system state
• Predict future system state → correct from observation (measurement)
• Extended Kalman filter for Gaussian error distributions
• Unscented Kalman filter for highly non-linear systems
• Particle filter for systems with multiple state hypotheses
• E.g., maintain estimate with fast IMU + update when slow computer
vision results come in
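A minimal predict/correct sketch of my own (a 1D constant-velocity Kalman filter with made-up noise values): the state is propagated at the fast IMU rate and corrected whenever a slower vision measurement arrives.

```python
import numpy as np

dt = 0.01                                 # 100 Hz prediction rate
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity state transition
H = np.array([[1.0, 0.0]])                # vision measures position only
Q = np.diag([1e-4, 1e-2])                 # process noise (hypothetical)
R = np.array([[1e-2]])                    # vision measurement noise (hypothetical)

x = np.zeros((2, 1))                      # state: position, velocity
P = np.eye(2)                             # state covariance

def predict():
    global x, P
    x = F @ x
    P = F @ P @ F.T + Q

def correct(z):
    global x, P
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Predict at every IMU step; correct only when a (slow) vision result arrives.
for step in range(1, 101):
    predict()
    if step % 10 == 0:                                 # vision at 10 Hz
        correct(np.array([[0.05 * step * dt]]))        # hypothetical position fix

print(x.ravel())   # fused estimate of position and velocity
```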
Cooperative Sensor Fusion
• Primary sensor relies on information from secondary
sensor to obtain its measurements
• E.g., A-GPS combines celltower + GPS
• Combination of inside-out + outside-in
• Stereo cameras with known epipolar geometry
• Non-overlapping cameras (e.g., 360°)
PointGrey LadyBug
• Indirect sensing (cont’d)

Image: Georg Klein


Cooperative Sensor Fusion for Indirect Sensing
• “Track around the corner”
[Diagram: indirect tracking via two cameras C1 and C2]
