
It Takes Two: Learning to Plan for Human-Robot Cooperative Carrying


Collaborative table-carrying is a complex task due to the continuous nature of the action and state spaces, the multimodality of strategies, the existence of obstacles in the environment, and the need for instantaneous adaptation to other agents. In this work, we present a method for predicting realistic motion plans for cooperative human-robot teams on a table-carrying task. Using a Variational Recurrent Neural Network (VRNN) to model the variation in the trajectory of a human-robot team across time, we are able to capture the distribution over the team’s future states while leveraging information from interaction history. The key to our approach is the model’s ability to leverage human demonstration data and generate trajectories that synergize well with humans at test time. We show that the model generates more human-like motion than a baseline centralized sampling-based planner, Rapidly-exploring Random Trees (RRT). Furthermore, we evaluate the VRNN planner with a human partner and show that it both generates more human-like paths and achieves a higher task success rate than RRT when planning with a human. Finally, we demonstrate that a LoCoBot using the VRNN planner can complete the task successfully with a human controlling another LoCoBot.
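To make the modeling idea concrete, below is a minimal sketch of a single VRNN step (in the style of Chung et al., 2015) in PyTorch: a prior and an approximate posterior over a latent variable conditioned on the recurrent hidden state, a decoder over the next team state, and a recurrence that carries interaction history forward. The dimensions, layer sizes, and state encoding here are illustrative assumptions, not the authors' exact architecture.

# Minimal VRNN cell sketch (assumed architecture, for illustration only).
import torch
import torch.nn as nn

class VRNNCell(nn.Module):
    def __init__(self, x_dim=4, z_dim=16, h_dim=64):
        super().__init__()
        # Feature extractors for the observed team state x_t and latent z_t.
        self.phi_x = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.phi_z = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU())
        # Prior p(z_t | h_{t-1}) and approximate posterior q(z_t | x_t, h_{t-1}).
        self.prior = nn.Linear(h_dim, 2 * z_dim)
        self.posterior = nn.Linear(2 * h_dim, 2 * z_dim)
        # Decoder p(x_t | z_t, h_{t-1}): mean and log-variance of the next state.
        self.decoder = nn.Linear(2 * h_dim, 2 * x_dim)
        # Recurrence that accumulates interaction history.
        self.rnn = nn.GRUCell(2 * h_dim, h_dim)

    def forward(self, x_t, h):
        fx = self.phi_x(x_t)
        prior_mu, prior_logvar = self.prior(h).chunk(2, dim=-1)
        post_mu, post_logvar = self.posterior(torch.cat([fx, h], dim=-1)).chunk(2, dim=-1)
        # Reparameterized sample of the latent strategy variable.
        z_t = post_mu + torch.randn_like(post_mu) * (0.5 * post_logvar).exp()
        fz = self.phi_z(z_t)
        dec_mu, dec_logvar = self.decoder(torch.cat([fz, h], dim=-1)).chunk(2, dim=-1)
        h_next = self.rnn(torch.cat([fx, fz], dim=-1), h)
        return (dec_mu, dec_logvar), (prior_mu, prior_logvar), (post_mu, post_logvar), h_next

Rolling such a cell over a demonstrated trajectory gives a per-step ELBO (reconstruction term plus KL between posterior and prior) for training; at test time, sampling the latent from the prior alone generates candidate future waypoints for the human-robot team conditioned on the interaction history in h.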

@INPROCEEDINGS{10161386,
  author={Ng, Eley and Liu, Ziang and Kennedy, Monroe},
  booktitle={2023 IEEE International Conference on Robotics and Automation (ICRA)},
  title={It Takes Two: Learning to Plan for Human-Robot Cooperative Carrying},
  year={2023},
  pages={7526-7532},
  doi={10.1109/ICRA48891.2023.10161386}}

Author(s)
Eley Ng
Ziang Liu
Monroe Kennedy
Publisher
2023 IEEE International Conference on Robotics and Automation (ICRA)
Publication Date
May 29, 2023