Safe Motion Planning for Human-Robot Interaction

Jae Sung Park, Biao Jia, Mohit Bansal and Dinesh Manocha
University of North Carolina at Chapel Hill


Abstract

We present an algorithm that combines natural language processing (NLP) with runtime robot motion planning to automatically generate safe robot movements. We present a novel method to map complex natural language commands to appropriate cost functions and constraint parameters for optimization-based motion planning. Given an NLP command, we generate a factor graph called the Dynamic Grounding Graph (DGG). The coefficients of this factor graph are learned using conditional random fields and are used to dynamically generate the constraints for motion planning. We directly map the cost function to the parameters of the motion planner to compute collision-free and smooth paths in complex scenes with moving obstacles. We highlight the performance of our approach in a simulated environment as well as with a human interacting with a 7-DOF Fetch robot using complex NLP commands.
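To illustrate the idea of dynamic constraint mapping, the sketch below shows how grounded command attributes could be translated into planner cost weights and constraints. It is a minimal, hypothetical example, not the released implementation: the names PlannerParams, ATTRIBUTE_MAP, and all numeric values are assumptions for illustration, standing in for the parameters a CRF-based grounding step would produce.

```python
# Hypothetical sketch: mapping grounded NLP attributes to motion-planning
# parameters. All names and values are illustrative, not from the paper.
from dataclasses import dataclass


@dataclass
class PlannerParams:
    """Cost weights and constraints passed to an optimization-based planner."""
    max_velocity: float = 1.0        # joint-space velocity scale
    min_clearance: float = 0.05      # required distance to obstacles (m)
    smoothness_weight: float = 1.0   # weight on trajectory-smoothness cost
    obstacle_weight: float = 1.0     # weight on collision/clearance cost


# Attribute groundings (as a grounding step might output) mapped to
# parameter updates; the table entries are made up for this example.
ATTRIBUTE_MAP = {
    "slowly":    {"max_velocity": 0.3},
    "quickly":   {"max_velocity": 1.5},
    "carefully": {"min_clearance": 0.20, "obstacle_weight": 5.0},
    "smoothly":  {"smoothness_weight": 5.0},
}


def params_from_attributes(attributes):
    """Combine grounded attributes into a single PlannerParams instance."""
    params = PlannerParams()
    for attr in attributes:
        for name, value in ATTRIBUTE_MAP.get(attr, {}).items():
            setattr(params, name, value)
    return params


if __name__ == "__main__":
    # e.g. a grounding for "move the cup slowly and carefully"
    grounded = ["slowly", "carefully"]
    print(params_from_attributes(grounded))
```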

Papers

Generating Realtime Motion Plans from Attribute-Based Natural Language Instructions Using Dynamic Constraint Mapping (arXiv)
arXiv Tech Report, 2017

Results

Safe Motion Planning with NLP Algorithm

Related Links

GAMMA Research Group
Motion Planning Research at GAMMA

Incremental Trajectory Optimization for Motion Planning (ITOMP)