
AR-Based Contact Task Demonstrations


YouTube Demonstration Video

About

This is a novel end-to-end system that captures a single manipulation task demonstration through an augmented reality (AR) head-mounted display (HMD), computes an affordance primitive (AP) representation of the task, and sends the task parameters to a mobile manipulator for execution in real time. The system is robust, generalizable, and mobile, and is designed so that non-expert users can define manipulator contact tasks on the fly in unknown, unstructured environments without modifying the environment. To learn more, watch the demonstration video linked above or our presentation at the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Detroit, MI.

This code is currently closed source. Updates will be made to this page if the code is released as open source.
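
Since the implementation is closed source, the sketch below only illustrates the affordance primitive idea described above. It assumes, as in the paper's screw-based formulation, that a contact task reduces to a screw axis plus motion and force targets, and it generates end-effector waypoints by exponentiating the corresponding twist. All class and function names here (AffordancePrimitive, screw_waypoints) are hypothetical and are not part of the actual system.

import numpy as np
from dataclasses import dataclass

@dataclass
class AffordancePrimitive:
    """Hypothetical container for the task parameters recovered from a
    single AR demonstration; not the authors' actual data structure."""
    axis: np.ndarray       # unit direction of the screw axis, shape (3,)
    point: np.ndarray      # a point on the axis in the robot frame, shape (3,)
    pitch: float           # translation per radian (0 for pure rotation)
    target_angle: float    # commanded rotation about the axis (rad)
    target_force: float    # contact force to regulate during motion (N)

def screw_waypoints(ap: AffordancePrimitive, start: np.ndarray, n: int = 10):
    """Generate 4x4 end-effector poses by advancing `start` along the screw."""
    w = ap.axis / np.linalg.norm(ap.axis)
    v = -np.cross(w, ap.point) + ap.pitch * w          # linear part of the twist
    W = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])                 # skew-symmetric [w]
    poses = []
    for theta in np.linspace(0.0, ap.target_angle, n):
        # Rotation block of exp([S] * theta) via Rodrigues' formula
        R = np.eye(3) + np.sin(theta) * W + (1.0 - np.cos(theta)) * (W @ W)
        # Translation block: G(theta) @ v, from the twist matrix exponential
        G = (np.eye(3) * theta + (1.0 - np.cos(theta)) * W
             + (theta - np.sin(theta)) * (W @ W))
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = G @ v
        poses.append(T @ start)                        # screw applied in the space frame
    return poses

# Example: a quarter-turn valve task, axis vertical through (0.5, 0, 0.8)
valve = AffordancePrimitive(axis=np.array([0.0, 0.0, 1.0]),
                            point=np.array([0.5, 0.0, 0.8]),
                            pitch=0.0, target_angle=np.pi / 2,
                            target_force=10.0)
for T in screw_waypoints(valve, start=np.eye(4), n=5):
    print(np.round(T[:3, 3], 3))   # waypoint positions along the arc

In the real system such waypoints would feed a force-controlled Cartesian motion controller rather than a print loop; target_force is included only to show where the contact-force parameter of the primitive would live.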

System Setup

Citation

@INPROCEEDINGS{10342493,
  author={Regal, Frank and Pettinger, Adam and Duncan, John A. and Parra, Fabian and Akita, Emmanuel and Navarro, Alex and Pryor, Mitch},
  booktitle={2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)}, 
  title={Using Single Demonstrations to Define Autonomous Manipulation Contact Tasks in Unstructured Environments via Object Affordances}, 
  year={2023},
pages={3664--3671},
  keywords={Three-dimensional displays;Service robots;Affordances;Robot kinematics;Wheels;Resists;Manipulators},
  doi={10.1109/IROS55552.2023.10342493}
}