
Global Localization for Urban AR

Authors: Thomas Calloway, Dalila B. Megherbi, Hongsheng Zhang

2017

Reliably localizing and tracking displays as they move relative to content in the physical world is one of the primary technical challenges facing all augmented reality systems. While significant progress has been made in recent years, existing approaches remain limited to certain environments and situations. Attempts to improve generality with additional sensors (e.g., depth sensors, multiple cameras) add significant size, weight, and power to wearable systems that are sensitive to these attributes. In this work, we propose an approach to tracking and localization that uses a single camera and an inertial chip. Through a combination of visual-inertial navigation, point cloud mapping, and dynamic correlation of building faces and edges with sparse OpenStreetMap datasets, we achieved a typical global localization precision of less than 0.25 meters in position and 1 degree in heading relative to the map. All motion tracking calculations are performed on the local mobile device with less than 10 milliseconds of latency, while global localization and drift correction are performed remotely.
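
The abstract only sketches the map-correlation step. As a rough illustration of the general idea (not the authors' implementation), the Python sketch below aligns building edges recovered from a local map with OpenStreetMap building footprints in 2D: a circular correlation of edge-heading histograms estimates the heading offset, and a simple nearest-segment median offset stands in for a point-to-line refinement of the translation. The function names, the histogram correlation, and the toy data are illustrative assumptions.

```python
# Illustrative sketch only: aligning locally mapped building edges with
# OSM building footprints in 2D (heading via histogram correlation,
# translation via nearest-segment median offset).
import numpy as np

def edge_headings(polylines):
    """Heading (radians, folded to [0, pi)) of each segment in 2D polylines."""
    headings = []
    for pts in polylines:
        d = np.diff(np.asarray(pts, dtype=float), axis=0)
        headings.extend(np.arctan2(d[:, 1], d[:, 0]) % np.pi)
    return np.asarray(headings)

def estimate_heading_offset(local_polylines, osm_polylines, bins=360):
    """Circularly correlate edge-heading histograms to find the rotation
    (mod pi) of the local map relative to the OSM footprints."""
    h_local, _ = np.histogram(edge_headings(local_polylines), bins=bins, range=(0.0, np.pi))
    h_osm, _ = np.histogram(edge_headings(osm_polylines), bins=bins, range=(0.0, np.pi))
    scores = [np.dot(np.roll(h_osm, s), h_local) for s in range(bins)]
    shift = int(np.argmax(scores))
    if shift > bins // 2:                      # fold to (-pi/2, pi/2]
        shift -= bins
    return shift * np.pi / bins

def _project_to_segment(p, a, b):
    """Closest point to p on segment ab."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    return a + t * ab

def estimate_translation(local_points, osm_segments):
    """Median offset from locally mapped edge points to the nearest OSM
    footprint segment (a crude stand-in for a 2D point-to-line refinement)."""
    offsets = []
    for p in local_points:
        candidates = [_project_to_segment(p, a, b) for a, b in osm_segments]
        nearest = min(candidates, key=lambda q: np.linalg.norm(q - p))
        offsets.append(nearest - p)
    return np.median(np.asarray(offsets), axis=0)

# Toy usage: a locally mapped building corner rotated 2.1 degrees and shifted
# (0.3, -0.2) m relative to its OSM footprint.
osm = [np.array([[0.0, 0.0], [20.0, 0.0], [20.0, 10.0]])]
theta = np.radians(2.1)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
local = [osm[0] @ R.T + np.array([0.3, -0.2])]

dtheta = estimate_heading_offset(local, osm)        # ~0.035 rad
Rc = np.array([[np.cos(-dtheta), -np.sin(-dtheta)],
               [np.sin(-dtheta),  np.cos(-dtheta)]])
derotated = np.vstack(local) @ Rc.T                 # undo the estimated heading error
osm_segments = [(osm[0][i], osm[0][i + 1]) for i in range(len(osm[0]) - 1)]
print(dtheta, estimate_translation(derotated, osm_segments))   # ~[-0.3, 0.2]
```

A real system would of course operate on many noisy edges, weight them by observation quality, and fold the resulting correction back into the visual-inertial filter rather than applying it as a one-shot rigid alignment.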
