
Global Localization for Urban AR

Authors: Thomas Calloway, Dalila B. Megherbi, Hongsheng Zhang

2017

Reliably localizing and tracking displays moving relative to content in the physical world is one of the primary technical challenges facing all augmented reality systems. While significant progress has been made in recent years, all approaches remain limited to functioning only in certain environments and situations. Attempts to improve solution generality via additional sensors (e.g., depth sensors, multiple cameras) add significant size, weight, and power to wearable solutions that are sensitive to these attributes. In this work, we propose an approach to tracking and localization using a single camera and inertial chip. Through a combination of visual-inertial navigation, point cloud mapping, and dynamic correlation of building faces and edges with sparse OpenStreetMap datasets, we achieved a typical global localization precision of less than 0.25 meters and 1 degree of heading relative to the map. All motion tracking calculations are performed on the local mobile device with less than 10 milliseconds of latency, while global localization and drift correction are performed remotely.
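
To illustrate the map-correlation idea, the sketch below shows one simple way a heading correction could be derived by matching the bearing of a building edge observed by the tracker against the edge orientations of an OpenStreetMap building footprint. This is a minimal, hypothetical example written for clarity, not the implementation described in the paper; the function names and the sample footprint are assumptions made for the example.

```python
import math

# Hypothetical sketch (not the paper's implementation): estimate a heading
# correction by aligning an observed building-edge bearing with the edge
# orientations of an OpenStreetMap building footprint.

def edge_bearings(footprint):
    """Bearings (radians) of consecutive edges of a closed building footprint,
    given as a list of (x, y) map coordinates."""
    bearings = []
    for (x0, y0), (x1, y1) in zip(footprint, footprint[1:] + footprint[:1]):
        bearings.append(math.atan2(y1 - y0, x1 - x0))
    return bearings

def heading_correction(observed_bearing, footprint):
    """Smallest angular offset that aligns the observed edge bearing with some
    map edge, treating edges as undirected (180-degree ambiguity)."""
    best = None
    for b in edge_bearings(footprint):
        # Wrap the difference into (-90, 90] degrees so opposite edge
        # directions are treated as the same orientation.
        diff = (observed_bearing - b + math.pi / 2) % math.pi - math.pi / 2
        if best is None or abs(diff) < abs(best):
            best = diff
    return best

# Example: a square building aligned with the map axes, with an observed
# edge bearing about 2 degrees off due to accumulated heading drift.
building = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(math.degrees(heading_correction(math.radians(2.0), building)))  # ~2.0
```

In a full system, a correction like this would be computed remotely against the map and then applied to the locally integrated visual-inertial pose, consistent with the split between on-device tracking and remote drift correction described above.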


