To be effective team members, robots must understand the high-level behaviors of collocated humans. This is a challenging perceptual task when both the robots and the people are in motion. In this paper, we describe an event-based model that enables multiple robots to automatically measure the synchronous joint action of a group while both the robots and co-present humans are moving. We validated our model through an experiment in which two people marched both synchronously and asynchronously while being followed by two mobile robots. Our results suggest that our model accurately identifies synchronous motion, which can enable more adept human-robot collaboration.
Tariq Iqbal, Michael J. Gonzales, Laurel D. Riek
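To make the idea of measuring synchronous joint action concrete, here is a minimal illustrative sketch, not the authors' event-based model. It assumes each person's motion has been reduced to a sequence of event timestamps (e.g., detected footfalls), and scores synchrony as the fraction of events in one stream that have a counterpart in the other within a tolerance window; the function name `synchrony_index` and the tolerance value are hypothetical choices for illustration.

```python
def synchrony_index(events_a, events_b, tol=0.1):
    """Score how synchronized two event streams are, in [0, 1].

    events_a, events_b: lists of event timestamps in seconds
    (e.g., footfall detections for two marchers).
    tol: maximum time offset (seconds) for two events to count as aligned.
    Returns the average, over both streams, of the fraction of events
    that have a matching event in the other stream within `tol`.
    """
    if not events_a or not events_b:
        return 0.0

    def matched_fraction(src, dst):
        # Count events in src that land within tol of some event in dst.
        hits = sum(1 for t in src if any(abs(t - u) <= tol for u in dst))
        return hits / len(src)

    # Symmetric average so neither stream dominates the score.
    return (matched_fraction(events_a, events_b)
            + matched_fraction(events_b, events_a)) / 2

# Synchronous marching: footfalls nearly coincide.
sync_score = synchrony_index([0.0, 1.0, 2.0], [0.02, 1.03, 1.98])

# Asynchronous marching: footfalls offset by half a period.
async_score = synchrony_index([0.0, 1.0, 2.0], [0.5, 1.5, 2.5])
```

Under this toy metric, the synchronous pair scores 1.0 and the half-period-offset pair scores 0.0; a real system would additionally have to detect the events robustly from moving-robot sensor data, which is the harder perceptual problem the paper addresses.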