Dynamically Grasping Technology Based on Vision and Industrial Robot

HUANG Jin-suo, SHEN Zheng-hua

Packaging Engineering ›› 2019 ›› Issue (11) : 177-182. DOI: 10.19554/j.cnki.1001-3563.2019.11.027

Abstract

The work aims to propose a pose data calculation method based on an intelligent camera and encoder measurement, to solve the problem of calculating position and orientation data when a robot dynamically grasps a product on a conveyor. Firstly, a dynamic grasping system model based on an intelligent camera and an encoder was built to introduce the composition and working principle of the grasping system. Secondly, the coordinate systems of the grasping system were created to describe the transformation relationships, matrix models and the data requiring calibration among the coordinate systems. Thirdly, the calculation process and formulas for the calibration were designed. Finally, an experimental prototype consisting of a MITSUBISHI industrial robot, an OMRON intelligent camera and an encoder was developed, on which the above calculation method for the position and orientation data was tested and verified. The results show that the error between the calculated value and the measured value of the product's position is less than 0.4 mm. The dynamic grasping precision based on vision and encoder measurement satisfies the requirements of engineering applications.
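The abstract describes a pose chain in which the camera reports the product pose in the vision frame at a trigger instant and the encoder measures how far the conveyor has moved since then. The sketch below is not taken from the paper; it is a minimal illustration of that idea under simplifying assumptions (a planar conveyor, a pre-calibrated camera-to-base transform, and illustrative function and parameter names).

```python
# Hypothetical sketch of combining a camera-detected pose with encoder-measured
# conveyor travel to estimate the grasp pose in the robot base frame.
# Frame names, the planar assumption and all parameters are illustrative only.
import numpy as np

def se2(x, y, theta):
    """Homogeneous transform for a planar pose in the conveyor plane."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def grasp_pose_in_base(T_base_cam, x_img, y_img, theta_img,
                       counts_now, counts_at_trigger, mm_per_count,
                       conveyor_dir_in_base=(1.0, 0.0)):
    """Estimate the product pose in the robot base frame at grasp time.

    T_base_cam           : 3x3 camera-to-base transform from a prior calibration
    x_img, y_img         : product position reported by the camera (mm, camera frame)
    theta_img            : product orientation reported by the camera (rad)
    counts_now/at_trigger: encoder readings now and at the camera trigger
    mm_per_count         : calibrated conveyor travel per encoder count
    conveyor_dir_in_base : unit vector of conveyor motion in the base frame
    """
    # Product pose at the trigger instant, mapped into the base frame.
    T_base_prod = T_base_cam @ se2(x_img, y_img, theta_img)

    # Shift along the conveyor by the distance accumulated since the trigger.
    d = (counts_now - counts_at_trigger) * mm_per_count
    ux, uy = conveyor_dir_in_base
    T_base_prod[0, 2] += d * ux
    T_base_prod[1, 2] += d * uy
    return T_base_prod
```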

Cite this article

HUANG Jin-suo, SHEN Zheng-hua. Dynamically Grasping Technology Based on Vision and Industrial Robot[J]. Packaging Engineering, 2019(11): 177-182. https://doi.org/10.19554/j.cnki.1001-3563.2019.11.027