The United States has an estimated 84,000 dams, of which approximately 15,500 were rated as high-risk as of 2016. Recurrent geological and structural health changes require that dam assets undergo continuous structural monitoring, assessment, and restoration. The developed system aims to evaluate the feasibility of standardizing remote, digital inspections of the outflow works of such assets as a replacement for human visual inspections. This work proposes both a mobile inspection platform and an image processing pipeline that reconstructs 3D models of the outflow tunnel and gates of dams for structural defect identification. We begin by presenting the imaging system, with attention to lighting conditions and acquisition strategies. We then propose and formulate global optimization constraints that refine system poses and geometric estimates of the environment. Following that, we present a RANSAC framework that fits geometric cylinder primitives for texture projection and geometric-deviation analysis, as well as an interactive annotation framework for 3D anomaly marking. Results of the system and processing pipeline are demonstrated at the Blue Mountain Dam, Arkansas, and the F.E. Walter Dam, Pennsylvania.
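To make the cylinder-fitting step concrete, the following is a minimal sketch of a RANSAC loop that fits a circular cross-section to a tunnel point cloud, assuming the tunnel axis direction is approximately known in advance; the function names, thresholds, and three-point circle parameterization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fit_circle_3pts(p):
    """Circle (center, radius) through three 2D points via the circumcenter formula."""
    a, b, c = p
    d = 2.0 * (a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1]) + c[0] * (a[1] - b[1]))
    if abs(d) < 1e-9:                      # nearly collinear sample, reject
        return None
    ux = ((a @ a) * (b[1] - c[1]) + (b @ b) * (c[1] - a[1]) + (c @ c) * (a[1] - b[1])) / d
    uy = ((a @ a) * (c[0] - b[0]) + (b @ b) * (a[0] - c[0]) + (c @ c) * (b[0] - a[0])) / d
    center = np.array([ux, uy])
    return center, np.linalg.norm(a - center)

def ransac_cylinder(points, axis_dir, iters=500, tol=0.02, rng=None):
    """RANSAC fit of a cylinder whose axis direction is (approximately) known.

    points: Nx3 tunnel point cloud, axis_dir: 3-vector along the tunnel axis.
    Returns (center_2d, radius, inlier_mask); residuals against the fitted
    surface give the per-point geometric deviation used for defect screening.
    """
    rng = rng or np.random.default_rng(0)
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    # Orthonormal basis (u, v) spanning the cross-section plane perpendicular to the axis.
    u = np.cross(axis_dir, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-6:
        u = np.cross(axis_dir, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(axis_dir, u)
    proj = points @ np.stack([u, v], axis=1)   # Nx2 projection onto the cross-section
    best = (None, None, np.zeros(len(points), dtype=bool))
    for _ in range(iters):
        sample = proj[rng.choice(len(proj), 3, replace=False)]
        fit = fit_circle_3pts(sample)
        if fit is None:
            continue
        center, radius = fit
        residual = np.abs(np.linalg.norm(proj - center, axis=1) - radius)
        inliers = residual < tol
        if inliers.sum() > best[2].sum():
            best = (center, radius, inliers)
    return best
```

In a full pipeline the inlier set would be refined by least squares, and points with large residuals flagged as candidate surface anomalies before texture projection.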
Simultaneous Localization and Mapping (SLAM) addresses the computational problem of estimating a robot's location and a map of its environment. SLAM is widely used in navigation, odometry, and mobile robot mapping. However, the performance and efficiency of small industrial mobile robots and unmanned aerial vehicles (UAVs) are highly constrained by battery capacity. Therefore, a mobile robot, and especially a UAV, requires low power consumption while maintaining high performance. This paper presents holistic, quantitative performance evaluations of embedded computing devices in the Nvidia Jetson family. The evaluations are based on the execution of two state-of-the-art Visual SLAM systems, ORB-SLAM2 and OpenVSLAM, on the Nvidia Jetson Nano, Jetson TX2, and Jetson Xavier.
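A minimal sketch of the timing side of such an evaluation is shown below; `track_frame` is a hypothetical wrapper around the ORB-SLAM2 or OpenVSLAM per-frame tracking call, and the warm-up count and percentile choice are assumptions rather than the paper's protocol.

```python
import time
import statistics

def benchmark_tracking(track_frame, frames, warmup=10):
    """Measure per-frame tracking latency of a Visual SLAM front end.

    track_frame: callable taking one image (hypothetical wrapper around the
                 SLAM system's tracking entry point); frames: iterable of images.
    Returns mean and 95th-percentile latency in milliseconds plus effective FPS.
    """
    latencies = []
    for i, frame in enumerate(frames):
        t0 = time.perf_counter()
        track_frame(frame)
        dt_ms = (time.perf_counter() - t0) * 1e3
        if i >= warmup:                      # discard warm-up frames (caches, feature DB)
            latencies.append(dt_ms)
    latencies.sort()
    mean_ms = statistics.mean(latencies)
    p95_ms = latencies[int(0.95 * (len(latencies) - 1))]
    return {"mean_ms": mean_ms, "p95_ms": p95_ms, "fps": 1000.0 / mean_ms}
```

Power draw would be logged separately on the Jetson boards, for example with the tegrastats utility, and aligned with these per-frame timings to compare energy efficiency across devices.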
Autonomous robots and self-driving vehicles require agents to learn and maintain accurate maps for safe and reliable operation. We use a variant of pose-graph Simultaneous Localization and Mapping (SLAM) to integrate multiple sensors for autonomous navigation in an urban environment. Our method efficiently and accurately localizes the agent across a stack of maps generated from different sensors over different periods of time. To incorporate a priori localization data, we account for discrepancies between LiDAR observations and publicly available building geometry. We fuse data from heterogeneous sensor modalities to increase invariance to dynamic environmental factors such as weather, luminance, and occlusions. To discriminate traversable terrain, we employ a deep segmentation network whose predictions increase the confidence of a LiDAR-generated cost map. Path planning is performed with the Timed Elastic Band algorithm on the persistent map created through SLAM. We evaluate our method in varying environmental conditions on a large university campus and show the efficacy of the sensor and map fusion.
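As a rough illustration of how segmentation predictions could raise the confidence of a LiDAR cost map, the sketch below blends a per-cell traversability probability into an occupancy-style cost grid; the fusion rule, weights, and array names are assumptions for illustration and not the paper's formulation.

```python
import numpy as np

def fuse_costmap(lidar_cost, traversable_prob, beta=0.5):
    """Blend a LiDAR-derived cost map with a segmentation-derived traversability map.

    lidar_cost:        HxW array in [0, 1], where 1 marks a lethal obstacle cell.
    traversable_prob:  HxW array in [0, 1], probability that the cell is drivable
                       (segmentation output projected into the grid frame).
    beta:              weight given to the segmentation evidence.

    Cells the network considers non-traversable have their cost raised, confidently
    drivable cells are discounted, and lethal LiDAR cells are never reduced.
    """
    seg_cost = 1.0 - traversable_prob                         # probability -> cost
    fused = (1.0 - beta) * lidar_cost + beta * seg_cost
    lethal = np.where(lidar_cost >= 0.99, lidar_cost, 0.0)    # preserve hard obstacles
    return np.clip(np.maximum(fused, lethal), 0.0, 1.0)
```

The fused grid would then serve as the cost input to the Timed Elastic Band planner, which optimizes the trajectory around high-cost cells.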
Performing reliable and computationally efficient loop closure detection in real-world environments remains a challenging problem. In this paper, we propose a novel method for efficient loop closure detection at different times of day. An illumination-invariant color transform is applied to the images, which are then represented by a whole-image descriptor named PALM. The efficiency of our method stems both from the place description and from the image matching, in which FLANN is used for fast nearest-neighbor search. With this approach, search time is reduced by a factor of about 70 compared to standard brute-force search, with no significant loss of accuracy. In experiments on real-world datasets, the proposed method successfully detects loops under varied illumination conditions with high accuracy and allows real-time operation for long-term localization and mapping.
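The pipeline can be sketched roughly as follows: an illumination-invariant grayscale image is obtained from a log-chromaticity transform, reduced to a compact whole-image descriptor (a simple downsampled patch stands in here for PALM, whose construction is not detailed above), and matched against the keyframe database with OpenCV's FLANN-based matcher. The camera-dependent alpha constant, descriptor size, and ratio threshold are illustrative assumptions.

```python
import cv2
import numpy as np

ALPHA = 0.48  # camera-dependent constant of the illumination-invariant transform (assumed)

def illumination_invariant(bgr):
    """Log-chromaticity illumination-invariant image computed from a BGR frame."""
    img = bgr.astype(np.float32) / 255.0 + 1e-6
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    ii = 0.5 + np.log(g) - ALPHA * np.log(b) - (1.0 - ALPHA) * np.log(r)
    return cv2.normalize(ii, None, 0.0, 1.0, cv2.NORM_MINMAX)

def whole_image_descriptor(bgr, size=(32, 24)):
    """Compact whole-image descriptor: downsampled, zero-mean invariant image
    (a stand-in for PALM, used only to illustrate the matching stage)."""
    patch = cv2.resize(illumination_invariant(bgr), size, interpolation=cv2.INTER_AREA)
    vec = patch.flatten().astype(np.float32)
    return (vec - vec.mean()) / (vec.std() + 1e-6)

def find_loop_candidate(query_desc, db_descs, ratio=0.8):
    """FLANN (kd-tree) nearest-neighbor search over the database of place descriptors."""
    index_params = dict(algorithm=1, trees=4)   # FLANN_INDEX_KDTREE
    search_params = dict(checks=32)
    matcher = cv2.FlannBasedMatcher(index_params, search_params)
    matches = matcher.knnMatch(query_desc.reshape(1, -1), db_descs, k=2)[0]
    if len(matches) == 2 and matches[0].distance < ratio * matches[1].distance:
        return matches[0].trainIdx              # index of the matched place
    return None                                 # ambiguous or no loop candidate
```

In practice one descriptor is stored per keyframe, `db_descs` stacks them into an N x D float32 matrix, and the ratio test filters ambiguous candidates before any geometric verification of the loop closure.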