Work Presented at ICCSCT22 - 4th International Conference on Computer Systems and Communication Technology
Volume: 67 | Article ID: 040409
An Intelligent Material Handling System for Hybrid Robot based on Visual Navigation
DOI: 10.2352/J.ImagingSci.Technol.2023.67.4.040409 | Published Online: July 2023
Abstract

A hybrid robot fully integrates the merits of the automated guided vehicle (AGV) and the industrial manipulator. With the aid of computer vision algorithms, the camera mounted on the AGV serves as the eyes of the robot, making it highly intelligent. To promote the industrial application of the hybrid robot, it is necessary to enhance the navigation accuracy of the AGV and its ability to automatically handle materials in any pose. Therefore, this paper presents a fully automatic, high-precision grasping system for the hybrid robot, which integrates high-precision visual positioning with automatic grasping. Both two-dimensional (2D) and three-dimensional (3D) features were employed to recognize the target. Specifically, the local features of the target were matched with those of the point cloud segment of the scene, and the pose transformation matrix between the scene point cloud segment and the target was obtained, completing the recognition and positioning of the target. Experimental results show that the proposed method achieved a recognition rate of 97.6% in simple scenes and 87.2% in complex, occluded scenes, while reducing the recognition time to merely 402.3 ms. These results promote the application of the hybrid robot in industrial operations such as automatic grasping, spraying, and stacking.
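
The abstract alone does not disclose the exact matching and pose-estimation formulation used in the paper. As a rough, hypothetical illustration only (not the authors' implementation), the Python sketch below shows how a 4x4 pose transformation matrix can be recovered once local-feature matching has produced point correspondences between the target model and a scene point cloud segment, using the standard Kabsch/SVD least-squares solution; every name in it is invented for the example.

import numpy as np

def estimate_pose(model_pts, scene_pts):
    """Estimate the rigid transform (4x4 homogeneous matrix) that maps
    model_pts onto scene_pts, given N matched 3D correspondences
    (both N x 3 arrays), via the Kabsch/SVD least-squares solution."""
    # Center both point sets on their centroids.
    mu_m = model_pts.mean(axis=0)
    mu_s = scene_pts.mean(axis=0)
    M = model_pts - mu_m
    S = scene_pts - mu_s

    # The SVD of the cross-covariance matrix gives the optimal rotation.
    H = M.T @ S
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Reflection correction keeps R a proper rotation (det = +1).
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    # Translation aligns the model centroid with the scene centroid.
    t = mu_s - R @ mu_m

    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

if __name__ == "__main__":
    # Synthetic check: rotate and translate a random model, then recover the pose.
    rng = np.random.default_rng(0)
    model = rng.random((50, 3))
    angle = np.deg2rad(30.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.1, -0.2, 0.3])
    scene = model @ R_true.T + t_true
    T = estimate_pose(model, scene)
    print(np.allclose(T[:3, :3], R_true), np.allclose(T[:3, 3], t_true))

In practice, correspondences obtained from local 3D features contain outliers, so a closed-form step like this is typically wrapped in a RANSAC loop and refined (for example with ICP) before the estimated grasp pose is passed to the manipulator.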

  Cite this article 

Xiao-Fang Zhao, Xue-Fang Chen, "An Intelligent Material Handling System for Hybrid Robot based on Visual Navigation," in Journal of Imaging Science and Technology, 2023, pp. 1-7, https://doi.org/10.2352/J.ImagingSci.Technol.2023.67.4.040409

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2023
  Article timeline 
  • Received: March 2023
  • Accepted: May 2023
  • Published: July 2023
