Workshop on Real World Physical and Social Human-Robot Interaction

Humanoids 2024


As robots increasingly enter everyday settings, from homes to workplaces, the need for sophisticated human-robot interaction (HRI) capabilities becomes paramount. Traditional HRI systems often rely on a single mode of interaction, which can limit a robot's ability to understand and respond to human nuances effectively. Multimodal HRI seeks to overcome these limitations by integrating various sensory inputs such as visual, auditory, and tactile feedback, enabling robots to interpret and adapt to complex human behaviors and environments. However, integrating these modalities presents significant challenges, including sensor fusion, context-aware computing, and the development of adaptive, user-centered interfaces that can handle diverse human expressions and intentions.

This workshop aims to convene leading scholars and practitioners to explore the integration of multiple modalities in robotic systems for enhanced human-robot interaction. It will highlight recent advancements in tactile feedback, visual recognition, interaction patterns, and social dynamics to create robots that can engage more naturally and effectively with humans in diverse environments. Building on the success of previous related workshops at renowned conferences such as ICRA and IROS, this session is anticipated to attract a broad audience, ranging from academic researchers and industrial practitioners to educators and policymakers. Participants will engage in a series of keynote presentations, interactive panels, and hands-on demonstrations, gaining both foundational insights and innovative approaches to multimodal interaction. The workshop also features a call for papers, inviting contributions that address theoretical models, empirical studies, or state-of-the-art applications in human-robot interaction. Through this comprehensive format, the workshop will foster an inclusive dialogue aimed at shaping future directions of research and development in the field.

Program Schedule

Time Activity
09:00 - 09:10 Introduction
09:10 - 09:50 Invited Speaker: Katja Mombaur (30 min + 10 min Q/A)
09:50 - 10:30 Invited Speaker: Dr. Eiichi Yoshida, "Human Model for Physical Human-Robot Interaction" (30 min + 10 min Q/A)
10:30 - 11:00 Coffee Break and Poster Presentation
11:00 - 11:30 Spotlight Talks:
  • "Failure Communication in Human-Robot Collaboration with Multimodal AI and Large Language Models"
  • "Event-Based Visual Servoing for Human-Robot Navigation using Reinforcement Learning"
11:30 - 12:30 Panel Discussion: Why Should Physical and Social HRI Researchers Listen to Each Other More? (40 min + 20 min Q/A)
12:30 - 13:30 Lunch
13:30 - 14:10 Invited Speaker: Quentin Rouxel and Dionis Totsila, "LLMs, Diffusion and Humanoid Robots: Natural Language and Imitation Learning for Contact Interaction" (30 min + 10 min Q/A)
14:10 - 14:50 Invited Speaker (30 min + 10 min Q/A)
14:50 - 15:20 Spotlight Talks:
  • "Feasibility Study on a Multi-Device Dexterous Hand Teleoperation System for Daily Activity Performance in HRI"
  • "Automated Gaze Labelling for Measuring Emotional and Cognitive Engagement in School-Age Children During Storytelling Activities with NAO Robot"
15:20 - 16:00 Coffee Break and Poster Presentation
16:00 - 17:00 Panel Discussion: How to Reconcile Academia and Industry's Approach to HRI? (40 min + 20 min Q/A)
17:00 - 17:40 Invited Speaker (30 min + 10 min Q/A)
17:40 - 17:50 Concluding Remarks

Speakers

Meet our esteemed speakers from academia and industry.

Katja Mombaur

Professor, Karlsruhe Institute of Technology, Germany, and University of Waterloo, Canada

Presentation: TBD

Eiichi Yoshida

Professor, Tokyo University of Science, Japan

Presentation: TBD

Enrico Mingo Hoffman

ISFP Researcher, Centre Inria de l'Université de Lorraine & Loria, Nancy, France

Presentation: "OpenSoT: A Software Tool for Advanced Whole-Body Control"

Pal Robotics

Presentation: TBD

Quentin Rouxel

Postdoctoral Researcher, Inria Nancy - Grand Est, CNRS, Université de Lorraine, Villers-lès-Nancy, France

Presentation: "LLMs, Diffusion and Humanoid Robots: Natural Language and Imitation Learning for Contact Interaction"

Call for Contributions

Date: November 22, 2024 (Full-day workshop)
Location: Hybrid (Nancy, France and online), as part of the 2024 IEEE-RAS International Conference on Humanoid Robots (IEEE-Humanoids 2024).
Submission Instructions: Email your contributions to whsop.realworld.hri@gmail.com
Contact for submissions: paragk@kth.se, e.yadollahi@lancaster.ac.uk

IMPORTANT DATES

All deadlines are at 23:59 Anywhere on Earth time.


SUBMISSION GUIDELINES

Manuscripts should be written in English and will undergo a single-blind review by the organizing committee. Submissions should be 2-4 pages long, excluding references. We welcome contributions including work in progress, preliminary results, technical reports, case studies, surveys, and state-of-the-art research. Position papers are also welcome and should be at least 2 pages excluding references; these may be research project proposals or plans without results. Authors must use the Humanoids templates provided, formatted for US Letter. The templates can be downloaded below.

Manuscript Templates: LaTeX, Word

Contact

Organizers:

Elmira Yadollahi

Assistant Professor, Lancaster University, United Kingdom: e.yadollahi@lancaster.ac.uk

Parag Khanna

Doctoral Student, KTH Royal Institute of Technology, Sweden: paragk@kth.se

Ziwei Wang

Assistant Professor, Lancaster University, United Kingdom: z.wang82@lancaster.ac.uk

Angela Faragasso

Lead Research Engineer, Finger Vision Inc., Japan: angela.faragaso@fingervision.biz

Barbara Bruno

Junior Professor, Karlsruhe Institute of Technology, Germany: barbara.bruno@kit.edu

Christian Smith

Associate Professor, KTH Royal Institute of Technology, Sweden: ccs@kth.se