Volume: 31 | Article ID: art00010
Autonomous navigation using localization priors, sensor fusion, and terrain classification
DOI: 10.2352/ISSN.2470-1173.2019.15.AVM-040 | Published Online: January 2019

Autonomous robots and self-driving vehicles require agents to learn and maintain accurate maps for safe and reliable operation. We use a variant of pose-graph Simultaneous Localization and Mapping (SLAM) to integrate multiple sensors for autonomous navigation in an urban environment. Our methods efficiently and accurately localize the agent across a stack of maps generated from different sensors across different periods of time. To incorporate a priori localization data, we account for the discrepancies between LiDAR observations and publicly available building geometry. We fuse data derived from heterogeneous sensor modalities to increase invariance to dynamic environmental factors, such as weather, luminance, and occlusions. To discriminate traversable terrain, we employ a deep segmentation network whose predictions increase the confidence of a LiDAR-generated cost map. Path planning is accomplished using the Timed Elastic Band algorithm on the persistent map created through SLAM. We evaluate our method in varying environmental conditions on a large university campus and show the efficacy of the sensor and map fusion.
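The terrain-classification step described above can be sketched as a simple cost-map blend. This is an illustrative reconstruction, not the authors' implementation: the function name, the linear blending rule, and the `alpha` weight are all assumptions; the paper states only that segmentation predictions increase the confidence of the LiDAR-generated cost map.

```python
import numpy as np

def fuse_cost_map(lidar_cost, seg_traversable_prob, alpha=0.5):
    """Blend a LiDAR-derived cost map with terrain-segmentation confidence.

    lidar_cost           -- 2-D array in [0, 1], where 1 means untraversable
    seg_traversable_prob -- 2-D array in [0, 1], per-cell probability that the
                            terrain is traversable (from a segmentation network)
    alpha                -- weight on the LiDAR term (hypothetical parameter)
    """
    # Convert traversability probability to a cost: confident terrain -> low cost.
    seg_cost = 1.0 - seg_traversable_prob
    # Convex combination of the two cost sources, clipped to a valid cost range.
    fused = alpha * lidar_cost + (1.0 - alpha) * seg_cost
    return np.clip(fused, 0.0, 1.0)
```

Under this sketch, a cell the LiDAR marks as ambiguous but the network confidently labels as pavement ends up with a lower fused cost, which is the qualitative behavior the abstract describes.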

  Cite this article 

Zachariah Carmichael, Benjamin Glasstone, Frank Cwitkowitz, Kenneth Alexopoulos, Robert Relyea, Raymond Ptucha, "Autonomous navigation using localization priors, sensor fusion, and terrain classification," in Proc. IS&T Int'l. Symp. on Electronic Imaging: Autonomous Vehicles and Machines Conference, 2019, pp. 40-1 - 40-7.

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2019
Electronic Imaging
Society for Imaging Science and Technology
7003 Kilworth Lane, Springfield, VA 22151 USA