Reducing goal state divergence with environment design
Abstract
At the core of most successful human-robot collaborations is alignment between a robot's behavior and a human's expectations. Achieving this alignment is often difficult, however, because without careful specification a robot may misinterpret a human's goals, causing it to perform actions with unexpected, if not dangerous, side effects. To avoid this, I propose a new metric called Goal State Divergence (GSD), which represents the difference between the final goal state achieved by a robot and the one a human user expected. In cases where GSD cannot be computed directly, I show how it can be approximated using maximal and minimal bounds. I then leverage GSD in my novel human-robot goal alignment design (HRGAD) problem, which identifies a minimal set of environment modifications that reduces such mismatches. Finally, to demonstrate the effectiveness of my method at reducing goal state divergence, I empirically evaluate it on several standard planning benchmarks.
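The abstract defines GSD only informally. As a rough illustration (not the thesis's actual formulation), the sketch below treats goal states as sets of ground fluents, measures divergence as the size of their symmetric difference, and computes maximal and minimal bounds over a candidate set of expected states when the human's intended goal state is not known exactly. The fluent names and the specific distance measure are illustrative assumptions.

```python
# Illustrative sketch only: goal states are modeled as sets of ground fluents,
# and divergence is taken to be the size of their symmetric difference. The
# thesis may define GSD and its bounds differently.
from typing import FrozenSet, Iterable, Tuple

Fluent = str
State = FrozenSet[Fluent]


def goal_state_divergence(achieved: State, expected: State) -> int:
    """Count the fluents on which the achieved and expected final states disagree."""
    return len(achieved ^ expected)


def gsd_bounds(achieved: State, expected_candidates: Iterable[State]) -> Tuple[int, int]:
    """When the expected goal state is only known up to a set of candidates,
    return (minimal, maximal) bounds on the divergence."""
    divergences = [goal_state_divergence(achieved, s) for s in expected_candidates]
    return min(divergences), max(divergences)


if __name__ == "__main__":
    achieved = frozenset({"at(robot, kitchen)", "holding(cup)"})
    expected = frozenset({"at(robot, kitchen)", "door(closed)"})
    print(goal_state_divergence(achieved, expected))  # -> 2 (disagree on two fluents)
```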
Subject
classical planning
goal recognition
planning and scheduling
environment design
automated planning
human-robot interaction