Assistive robotic devices are increasingly used with the aim of improving people's independence and quality of life. Since the main objective of assistive robots is to serve and support human needs, they must be able to communicate and interact with people in a natural and collaborative way. This requires robots to perceive the interaction environment and understand the underlying abstract knowledge, and to reason on that knowledge to make decisions that are eventually transformed into physical interactions with the environment, such as grasping, navigation, speech, and postures, that are both legible and socially acceptable to humans.

Importantly, perceiving and understanding the interaction environment includes not only modeling the surroundings but also analyzing the basic principles of human verbal and nonverbal communication. The cognitive and social psychology literature shows that a large part of human interaction takes place nonverbally (and often implicitly) during an explicit exchange of thoughts, attitudes, concerns, and feelings. In particular, nonverbal signals, e.g., gaze, facial expressions, gestures, and vocal behavior, can carry significant information about the status of the human interactors, including their emotions, engagement, intentions, action goals, and focus of attention, as well as personal traits over the long term. Such rich nonverbal information, together with explicit verbal expressions, completes the human feedback to assistive robots and helps them learn and adapt toward more natural decision making and action planning.

This workshop is dedicated to developing computational approaches for natural and social human-robot interaction in assistive robotics technology. We call for innovative research ideas, novel theories and models related to socially assistive robotics, cutting-edge computational methods, whether traditional statistical or deep-learning-based, as well as new simulation tools, datasets, benchmarks, and evaluation protocols.

Particular attention is given to cognitive interaction analysis, i.e., the interpretation of human interactions with an emphasis on nonverbal signals, in order to facilitate more natural and social interaction. We also welcome research on decision making and physical interaction that can benefit from an understanding of the cognitive interactions between humans and robots.

Topics addressed in the workshop include:

  • Recognition of individual and/or social group activity for social (assistive) robots
  • (Real-time) recognition of human social signals, e.g., gaze, gesture, vocal activity, etc.
  • Inference of human intentions for social (assistive) robots
  • Mechanisms of social cognition (joint attention, spatial perspective taking, action prediction, theory of mind) in human-robot interaction
  • Cognitive architectures for assistive social robots
  • Affective human-robot interaction (individuals, dyads and groups)
  • Developing emotion-aware and/or personality-aware robots
  • Interactive/active learning for social robotics
  • Transfer learning, meta-learning, reinforcement learning and learning from demonstration for human-robot social interactions
  • Multi-sensor fusion for social (assistive) robots
  • Human-robot engagement prediction
  • Applications of socially assistive human-robot interaction (e.g., health, education, entertainment, business)
  • Ambient assisted living
  • Human-robot interaction and cognitive impairments
  • Social acceptance of robots based on human nonverbal behavioral cues (e.g., eye gaze, facial expressions, group formations)
  • Interactional synchrony between humans and social robots
  • Human-robot interaction styles
  • Robot navigation in social interaction scenarios (e.g., adaptive planning)