Design of Multi-target Medicine Box Grasping System Based on YOLOv5 and U-NET

YUAN Bin, LANG Yujian, CHEN Lingpeng, LI Chen

Packaging Engineering ›› 2024 ›› Issue (9): 141-149. DOI: 10.19554/j.cnki.1001-3563.2024.09.018

Abstract

This work proposes a deep-learning-based grasp recognition and positioning system to address the poor target-positioning accuracy caused by inaccurate segmentation of multiple targets and complex backgrounds in traditional machine-vision robotic grasping systems. A hardware platform consisting of a Delta robot arm, a PC host computer, and a binocular camera was built, and the YOLO-series algorithms commonly used in industrial deployment were compared. YOLOv5 and U-NET were combined for target detection and segmentation: after the pixel regions belonging to the targets and the background were separated, the edge and center position information was computed, the three-dimensional position was recovered by stereo vision and transformed into the world coordinate system, and the PC guided the robot arm to complete the grasp. In complex-background and multi-target scenes, the system combining deep-learning detection with image segmentation achieved better target-positioning accuracy than the same algorithm without segmentation. The positioning and grasping method combining YOLOv5 and U-NET is highly robust and meets the grasping requirements of parallel robot arms. It can also be applied to other multi-degree-of-freedom robot arms and has good application value.
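The pipeline summarized in the abstract (segment the target, compute its pixel centroid, recover depth by stereo vision, and transform into the world coordinate system) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the rectified pinhole-stereo model and all parameter names (`fx`, `fy`, `cx`, `cy`, `baseline`, `R`, `t`) are assumptions for the sake of the example.

```python
import numpy as np

def mask_centroid(mask):
    """Pixel centroid (u, v) of a binary segmentation mask
    (e.g. a U-NET mask cropped to a YOLOv5 bounding box)."""
    vs, us = np.nonzero(mask)          # row (v) and column (u) indices
    return us.mean(), vs.mean()

def triangulate(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a pixel with stereo disparity to camera coordinates.

    Assumes a rectified stereo pair: Z = fx * baseline / disparity,
    then the pinhole model gives X and Y.
    """
    Z = fx * baseline / disparity
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return np.array([X, Y, Z])

def camera_to_world(p_cam, R, t):
    """Rigid transform from the camera frame to the world (robot-base)
    frame, using a hand-eye calibration result (R, t)."""
    return R @ p_cam + t

if __name__ == "__main__":
    # Toy 10x10 mask with a 2x2 target region.
    mask = np.zeros((10, 10), dtype=bool)
    mask[4:6, 4:6] = True
    u, v = mask_centroid(mask)                      # (4.5, 4.5)

    # Hypothetical intrinsics and a 0.1 m stereo baseline.
    p_cam = triangulate(u, v, disparity=50.0,
                        fx=500.0, fy=500.0, cx=320.0, cy=240.0,
                        baseline=0.1)               # Z = 500*0.1/50 = 1.0 m

    # Identity rotation, camera mounted 0.5 m above the robot base.
    p_world = camera_to_world(p_cam, np.eye(3), np.array([0.0, 0.0, 0.5]))
```

The grasp point sent to the robot controller would then be `p_world`; in practice the intrinsics come from camera calibration and `(R, t)` from hand-eye calibration.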

Cite this article

YUAN Bin, LANG Yujian, CHEN Lingpeng, LI Chen. Design of Multi-target Medicine Box Grasping System Based on YOLOv5 and U-NET[J]. Packaging Engineering, 2024(9): 141-149. https://doi.org/10.19554/j.cnki.1001-3563.2024.09.018