Human-to-Robot Handovers Based on Visual Data for Optimisation of Industrial Tasks: Handovers and Grasp Generation
Student thesis: Master Thesis and HD Thesis

- Jan Kjær Jørgensen
- Rune Grønhøj
4th semester, Robotics, M.Sc. (Master Programme)
This project revolves around human-to-robot handovers, with a focus on robust real-time grasping. The system is developed with the Little Helper 7 dual-arm UR5 platform in mind. The project explores multiple state-of-the-art grasp generation methods and applies them to real-time grasping in handover scenarios. A standardised conversion between grasp representations is presented for visualisation of grasp predictions using ROS. Multiple viewpoints are investigated using a custom grasp rectangle dataset with 50 scenes captured from three different views. Two pixel-wise real-time grasp generation methods (GG-CNN and GR-ConvNet) are explored and tuned, including adjusting the batch size, optimiser, and training dataset. In particular, the novel GraspNet-1Billion dataset is investigated to improve the performance of the existing models, as it differs considerably from previously available datasets. The proposed grasp generation methods show promise during evaluation. Especially the use of GraspNet appears to improve invariance to viewpoint, which is essential in a handover scenario. However, to achieve human-to-robot handovers, integration with other systems is needed, mainly hand detection and integration with the previous systems of the Little Helper 7.
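As a rough illustration of the grasp-representation conversion mentioned above, the sketch below shows one plausible way to turn the pixel-wise output of a GG-CNN / GR-ConvNet style network (quality, angle and width maps) into a 3D grasp pose in the camera frame, which could then be published to ROS for visualisation. The function names, intrinsics and frame conventions are illustrative assumptions, not the thesis' actual pipeline.

```python
# Hypothetical sketch: pick the best grasp from pixel-wise network output
# (quality, angle, width maps) and back-project it into a 3D pose in the
# camera frame. All names and conventions here are assumptions for
# illustration only.
import numpy as np

def grasp_from_maps(quality, angle, width, depth, fx, fy, cx, cy):
    """Select the highest-quality pixel and back-project it to 3D.

    quality, angle, width : HxW arrays predicted by the network
    depth                 : HxW depth image in metres, aligned with the maps
    fx, fy, cx, cy        : pinhole camera intrinsics
    Returns (position, yaw, gripper_width) expressed in the camera frame.
    """
    v, u = np.unravel_index(np.argmax(quality), quality.shape)
    z = depth[v, u]
    # Pinhole back-projection of pixel (u, v) at depth z.
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z]), float(angle[v, u]), float(width[v, u])

def yaw_to_quaternion(yaw):
    """Rotation about the camera z-axis as an (x, y, z, w) quaternion."""
    return np.array([0.0, 0.0, np.sin(yaw / 2.0), np.cos(yaw / 2.0)])
```

In a ROS setup, the returned position and quaternion could be packed into a geometry_msgs/PoseStamped and displayed in RViz, while the predicted width would map to the opening of the gripper on the UR5 arm.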
| Language | English |
| --- | --- |
| Publication date | 2 Jun 2021 |
| Number of pages | 48 |