Human Robot Interaction

The mutual understanding between human and robot enables proficient interaction

Perception of humans in HRI

Robot perception models are tuned to detect the presence, inner state, and intentionality of human partners

Communicative Robot Actions

Control models that enable robots to perform meaningful actions, both explicitly and implicitly directed at the human partner

Synchronization in HRI

The profound synchronization between human and robot is the result of deep mutual understanding

Trust and rapport in HRI

How robots build rapport and trust with human partners in realistic scenarios

Video: View Invariant Robot Adaptation to Human Action Timing

The humanoid robot iCub perceives the auditory landscape through microphones in its head. The iCub infers the location of speakers and optimises the estimate with specific head motor actions.
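
As a rough illustration of the underlying idea (a minimal sketch, not the actual iCub software), the azimuth of a sound source can be estimated from the interaural time difference between the two head microphones; the microphone spacing and all names below are assumptions.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s
    MIC_DISTANCE = 0.14     # assumed spacing of the two head microphones, in metres

    def estimate_itd(left, right, fs):
        """Delay of the right channel relative to the left (seconds), via cross-correlation."""
        corr = np.correlate(right, left, mode="full")
        lag = np.argmax(corr) - (len(left) - 1)  # lag in samples
        return lag / fs

    def itd_to_azimuth(itd):
        """Convert an interaural time difference into a source azimuth (radians), far-field model."""
        ratio = np.clip(itd * SPEED_OF_SOUND / MIC_DISTANCE, -1.0, 1.0)
        return np.arcsin(ratio)  # positive azimuth = source towards the robot's left here

    # Toy example: broadband noise reaching the right microphone 0.2 ms later,
    # i.e. a source somewhat to the robot's left.
    fs = 44100
    rng = np.random.default_rng(0)
    left = rng.standard_normal(int(0.1 * fs))
    right = np.roll(left, int(0.0002 * fs))

    azimuth = itd_to_azimuth(estimate_itd(left, right, fs))
    print(f"estimated azimuth: {np.degrees(azimuth):.1f} deg")

A single interaural time difference is ambiguous between front and back, which is one reason why turning the head, as in the video, helps disambiguate and refine the estimate.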

Video: Can a Humanoid Robot Spot a Liar?

This work investigates the possibility of detecting deception in a human-humanoid interaction by monitoring behavioural cues proven to be significantly affected by lying in the presence of a human interviewer.
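
As a hedged sketch of the general approach (not the study's actual cues or pipeline), one can train a simple classifier on hand-crafted behavioural features; the feature names and the tiny synthetic dataset below are purely illustrative.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Each row: hypothetical behavioural cues measured during one answer
    # [response latency (s), gaze shifts, pupil dilation change, answer duration (s)]
    # Synthetic toy values, not data from the study.
    X = np.array([
        [0.8, 3, 0.02, 2.1],
        [1.9, 7, 0.11, 4.5],
        [0.7, 2, 0.01, 1.8],
        [2.2, 9, 0.14, 5.0],
        [1.0, 4, 0.03, 2.4],
        [2.0, 8, 0.12, 4.2],
    ])
    y = np.array([0, 1, 0, 1, 0, 1])  # 0 = truthful answer, 1 = deceptive answer

    clf = LogisticRegression()
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=3).mean())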

Video: Do You See the Magic

Video: Biological movement detector enhances the attention skills of humanoid robot iCub

This work investigates the possibility of biasing the visual attention system with biological markers of human movement, in order to improve attentive skills in human-robot collaboration.
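
A minimal sketch of how such biasing could work, assuming a bottom-up saliency map and a biological-motion likelihood map are already computed; the weighting scheme and names are illustrative assumptions, not the model used in the paper.

    import numpy as np

    def bias_attention(saliency, bio_motion, weight=0.6):
        """Blend a bottom-up saliency map with a biological-motion likelihood map.

        Both inputs are 2-D arrays normalised to [0, 1]; `weight` (assumed value)
        controls how strongly biological motion biases the final attention map.
        """
        attention = (1 - weight) * saliency + weight * bio_motion
        return attention / (attention.max() + 1e-9)

    def next_fixation(attention):
        """Return the (row, col) of the most attention-grabbing location."""
        return np.unravel_index(np.argmax(attention), attention.shape)

    # Toy example: random saliency plus a region whose motion 'looks biological'
    rng = np.random.default_rng(1)
    saliency = rng.random((48, 64))
    bio = np.zeros((48, 64))
    bio[20:30, 40:50] = 1.0

    print(next_fixation(bias_attention(saliency, bio)))  # lands inside the biological region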

Publication Selection

Main publications in this research field resulting from state-of-the-art research and technological breakthroughs

2019 Can a Humanoid Robot Spot a Liar?

Aroyo A.M., Gonzalez-Billandon J., Tonelli A., Sciutti A., Gori M., Sandini G., Rea F. 2019, in IEEE-RAS International Conference on Humanoid Robots

2018 View-invariant robot adaptation to human action timing

Noceti N., Rea F., Sciutti A., Odone F., Sandini G. 2018, in IEEE Technically Sponsored Intelligent Systems Conference (IntelliSys)

2016 Biological movement detector enhances the attentive skills of humanoid robot iCub

Vignolo A., Rea F., Noceti N., Sciutti A., Odone F., Sandini G. 2016, in IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids)

2020 Your Eyes Never Lie: a Robot Magician Can Tell if You Are Lying

Pasquali D., Aroyo A.M., Gonzalez-Billandon J., Rea F., Sandini G., Sciutti A. 2020, in 15th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI)

Collaborations in this specific research field

Collaborations activated thanks to the research carried out in this field

Products and Draft Patents

AuditoryAI

The invention is a new process to register the existence of sound objects, localize them, prioritize them, and direct the behaviour of an agent (e.g. a robot) towards them.
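
As a hedged sketch of the kind of pipeline this description suggests (the priority rule and all names below are assumptions, not the patented process), registered sound objects can be kept in a priority queue and the most relevant one used to direct the agent's behaviour.

    import heapq
    import time
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class SoundObject:
        priority: float
        azimuth_deg: float = field(compare=False)  # estimated direction of the source
        label: str = field(compare=False)          # e.g. "speech", "doorbell"
        created: float = field(compare=False, default_factory=time.time)

    def register(registry, label, azimuth_deg, loudness_db, is_speech):
        # Illustrative priority rule: louder and speech-like sources are attended first.
        priority = -(loudness_db + (20.0 if is_speech else 0.0))  # heapq is a min-heap
        heapq.heappush(registry, SoundObject(priority, azimuth_deg, label))

    registry = []  # the agent's current set of registered sound objects
    register(registry, "speech", azimuth_deg=-30.0, loudness_db=55.0, is_speech=True)
    register(registry, "fan", azimuth_deg=90.0, loudness_db=60.0, is_speech=False)

    target = heapq.heappop(registry)
    print(f"orient behaviour towards {target.label} at {target.azimuth_deg:.0f} deg")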

gazeTracking

The product endows an intelligent system equipped with a single camera with the ability to infer gaze direction from head orientation and iris position.
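
A minimal sketch of the underlying idea, assuming head pose and a normalised iris offset have already been extracted from the single camera image; the additive combination and the eye field-of-view values are illustrative assumptions, not the product's actual algorithm.

    def estimate_gaze(head_yaw_deg, head_pitch_deg, iris_offset_x, iris_offset_y,
                      eye_range_deg=(50.0, 35.0)):
        """Combine head orientation and in-eye iris displacement into a gaze direction.

        head_yaw_deg / head_pitch_deg : head orientation from a head-pose estimator
        iris_offset_x / iris_offset_y : iris-centre offset within the eye region,
                                        normalised to [-1, 1] (0 = looking straight ahead)
        eye_range_deg                 : assumed full range of eye-in-head rotation (yaw, pitch)
        """
        gaze_yaw = head_yaw_deg + iris_offset_x * eye_range_deg[0] / 2.0
        gaze_pitch = head_pitch_deg + iris_offset_y * eye_range_deg[1] / 2.0
        return gaze_yaw, gaze_pitch

    # Head turned 15 deg to the right, irises pushed half of their range further right
    print(estimate_gaze(15.0, 0.0, 0.5, 0.0))  # -> (27.5, 0.0)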

Testimonials

Prof. Samia Nefti-Meziani, Robotics and Automation, Salford University

Prof. Davide Brugali, Robotics, Università degli Studi di Bergamo

Partners

In the quest to enable functional robots in our lives, trusted partners have shared the road towards impactful solutions.

Phone

t: +39 010 71781420