User experience of the CoSTAR system for instruction of collaborative robots

Chris Paxton, Felix Jonathan, Andrew Hundt, Bilge Mutlu, Gregory D. Hager

Research output: Contribution to journal › Article › peer-review

Abstract

How can we enable novice users to create effective task plans for collaborative robots? Must there be a tradeoff between generalizability and ease of use? To answer these questions, we conducted a user study with the CoSTAR system, which integrates perception and reasoning into a Behavior Tree-based task plan editor. In our study, we asked novice users to perform simple pick-and-place assembly tasks under varying perception and planning capabilities. Our study shows that users found Behavior Trees to be an effective way of specifying task plans. Furthermore, with the addition of CoSTAR's planning, perception, and reasoning capabilities, users were able to author task plans more quickly, more effectively, and with greater generality. Despite these improvements, users rated the concepts associated with these capabilities as less usable, and our results suggest directions for further refinement.
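To make the Behavior Tree idea referenced in the abstract concrete, the following is a minimal, self-contained Python sketch of a pick-and-place task expressed as a Sequence of action leaves. It is an illustrative assumption only: the class names (Status, Node, Action, Sequence) and the placeholder skills (detect_object, move_to_object, grasp, move_to_goal, release) are not CoSTAR's actual API, and real skills would call perception and motion-planning services rather than print messages.

# Minimal sketch of a Behavior Tree for a pick-and-place task,
# in the spirit of a Behavior Tree-based task plan editor.
# All names here are illustrative assumptions, not CoSTAR's API.
from enum import Enum
from typing import Callable, List

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2

class Node:
    def tick(self) -> Status:
        raise NotImplementedError

class Action(Node):
    """Leaf node wrapping a robot skill (perception or motion primitive)."""
    def __init__(self, name: str, skill: Callable[[], bool]):
        self.name = name
        self.skill = skill
    def tick(self) -> Status:
        return Status.SUCCESS if self.skill() else Status.FAILURE

class Sequence(Node):
    """Ticks children in order; fails as soon as any child fails."""
    def __init__(self, children: List[Node]):
        self.children = children
    def tick(self) -> Status:
        for child in self.children:
            if child.tick() != Status.SUCCESS:
                return Status.FAILURE
        return Status.SUCCESS

# Placeholder skills standing in for perception and motion primitives.
def detect_object() -> bool:
    print("perception: locating the part"); return True
def move_to_object() -> bool:
    print("planning: moving arm above the part"); return True
def grasp() -> bool:
    print("gripper: closing on the part"); return True
def move_to_goal() -> bool:
    print("planning: moving arm to the assembly location"); return True
def release() -> bool:
    print("gripper: releasing the part"); return True

if __name__ == "__main__":
    pick_and_place = Sequence([
        Action("detect", detect_object),
        Action("approach", move_to_object),
        Action("grasp", grasp),
        Action("transfer", move_to_goal),
        Action("release", release),
    ])
    print("task result:", pick_and_place.tick())

Running the script ticks each leaf in order and reports SUCCESS only if every step succeeds, which is the property that makes Sequence nodes a natural fit for step-by-step assembly instructions authored by novice users.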

Original language: English (US)
Journal: Unknown Journal
State: Published - Mar 22, 2017

ASJC Scopus subject areas

  • General
