
Research Article
Research on Depth-Adaptive Dual-Arm Collaborative Grasping Method
@INPROCEEDINGS{10.1007/978-3-031-24386-8_15, author={Hao Zhang and Pengfei Yi and Rui Liu and Jing Dong and Qiang Zhang and Dongsheng Zhou}, title={Research on Depth-Adaptive Dual-Arm Collaborative Grasping Method}, proceedings={Collaborative Computing: Networking, Applications and Worksharing. 18th EAI International Conference, CollaborateCom 2022, Hangzhou, China, October 15-16, 2022, Proceedings, Part II}, proceedings_a={COLLABORATECOM PART 2}, year={2023}, month={1}, keywords={Dual-arm collaboration; Target localization; Robotics}, doi={10.1007/978-3-031-24386-8_15} }
Hao Zhang
Pengfei Yi
Rui Liu
Jing Dong
Qiang Zhang
Dongsheng Zhou
Year: 2023
Research on Depth-Adaptive Dual-Arm Collaborative Grasping Method
COLLABORATECOM PART 2
Springer
DOI: 10.1007/978-3-031-24386-8_15
Abstract
Among existing dual-arm cooperative grasping methods, those based on RGB cameras are the mainstream intelligent approach. However, these methods often require a predefined depth and are difficult to adapt to changes in depth without modification. To address this problem, this paper proposes a dual-arm cooperative grasping method based on an RGB camera that is suitable for scenes with variable depth, increasing the adaptability of dual-arm cooperation. First, we build a mathematical model of the RGB camera and use markers attached to the target to obtain its depth information. Then, the 3D pose of the target in the robot world coordinate system is obtained by combining the depth information with the pixel information. Finally, the task is assigned to the left and right robotic arms, and the target grasping task is carried out under main-auxiliary control. The proposed approach is validated in multiple experiments on a Baxter robot under different conditions.
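The abstract does not give the paper's exact camera model, so the following is only a minimal sketch of the general idea it describes: recovering depth from a marker of known physical size under a standard pinhole model, then back-projecting the target pixel and mapping it into the robot world frame through a calibrated camera-to-robot transform. All function names, intrinsic values, and the transform here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_depth_from_marker(marker_pixel_width, marker_real_width, fx):
    """Estimate depth (m) from the apparent size of a marker with known physical size.

    Under the pinhole model: pixel_width = fx * real_width / depth.
    """
    return fx * marker_real_width / marker_pixel_width

def pixel_to_robot_frame(u, v, depth, K, T_robot_cam):
    """Back-project pixel (u, v) with known depth into the robot world frame."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    # 3D point in the camera frame (homogeneous coordinates)
    p_cam = np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth, 1.0])
    # Map into the robot world frame via the calibrated camera-to-robot transform
    return (T_robot_cam @ p_cam)[:3]

# Example usage with assumed intrinsics and an identity camera-to-robot transform
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
T_robot_cam = np.eye(4)

depth = estimate_depth_from_marker(marker_pixel_width=80.0,
                                   marker_real_width=0.05,  # 5 cm marker
                                   fx=K[0, 0])
target_xyz = pixel_to_robot_frame(330.0, 250.0, depth, K, T_robot_cam)
```

Once the target position is expressed in the robot world frame, it can be handed to the dual-arm controller, with one arm acting as the main arm and the other following under the paper's main-auxiliary control scheme.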