Research Areas
Human Communication and Perception
Robots are helping researchers understand humans better
Social robots enable researchers to run controlled experiments to understand human communication, social behaviour, perception of speech and language, and psychology at large.
Robots activate our communication faculties in much the same way as other humans do, yet they are far easier to control and to run experiments with. With Furhat, researchers around the world are studying how humans perceive and behave, using the robot as a controlled interaction partner.
Scientists use the robot in experimental setups to study human perception of facial expressions, emotions, and culture, as well as the visual perception of lip movements, eye movements, and multimodal signals. Because the robot's face, voice, and sensors work together out of the box, researchers do not need to invest substantial resources in integrating these technologies themselves.
Key benefits
- Easy to control eyes, gestures, and neck (see the sketch after this list)
- Wizard of Oz interface for remote control
- Ability to connect to other standard software
- Logging and analytics tools for user data
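The programmatic control listed above is exposed through the Furhat SDK's Kotlin flow DSL. The following is a minimal sketch of a skill that greets a detected user, smiles, and turns its gaze and neck toward them; the skill and state names are illustrative, and the imports, trigger names, and built-in gesture identifiers reflect the public SDK documentation as we recall it, so they should be verified against the current docs.

```kotlin
import furhatos.flow.kotlin.*
import furhatos.gestures.Gestures
import furhatos.skills.Skill

// Illustrative state: smile and greet on entry, attend to any user who appears.
val Greeting: State = state {
    onEntry {
        furhat.gesture(Gestures.Smile)      // built-in facial gesture
        furhat.say("Hi there, welcome to the lab!")
    }
    onUserEnter {                           // fires when a new user is detected
        furhat.attend(it)                   // shift gaze and neck toward that user
        furhat.gesture(Gestures.Nod)
    }
}

// Standard skill scaffolding: run the flow starting from the Greeting state.
class GreetingSkill : Skill() {
    override fun start() {
        Flow().run(Greeting)
    }
}

fun main(args: Array<String>) {
    Skill.main(args)
}
```

Gestures and utterances like these can also be triggered remotely through the Wizard of Oz interface mentioned above, letting an experimenter operate the robot as a controlled interaction partner during a study.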
Customer spotlight
Perception of facial expressions and their impact on cultural perception
At the Centre for Social, Cognitive and Affective Neuroscience (cSCAN) at the University of Glasgow in the UK, Dr Rachael Jack and her fellow researchers are using the Furhat robot to investigate cultural differences in the use of facial expressions for social communication, by studying how the brain reacts to different facial expressions designed and programmed into Furhat.
Social robots must be able to communicate effectively using a host of different modalities, including human-like facial expressions. Whilst humans can effortlessly understand and produce the non-verbal language of facial expressions, social robots, which are already being deployed commercially in educational and care settings, cannot yet do this. Nor is it straightforward to design gestures that are culturally universal, given our limited understanding of the relationship between facial expressions and culture. To address this limitation, Dr Jack and her team have been formally characterising the language of facial expressions, mapping its structure, syntax, and semantics, in order to develop mathematical models that will equip social robots with this most human of abilities.
To that end, the customisability and fine-grained control of Furhat's face and facial expressions make it an ideal platform for this line of research.
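As an illustration of that fine-grained control, the SDK also lets researchers define their own parameterised facial expressions rather than relying only on the built-in gestures. The sketch below is an assumption-heavy example using the SDK's gesture-definition builder and basic face parameters; the gesture name is hypothetical, and the exact builder syntax, parameter names, and timing semantics should be checked against the official documentation.

```kotlin
import furhatos.gestures.BasicParams
import furhatos.gestures.defineGesture

// Hypothetical custom expression: a brief, subtle smile with slightly raised brows.
// Builder syntax and parameter names follow the Furhat SDK gesture API as we
// recall it; verify against the official docs before use.
val SubtleSmile = defineGesture("SubtleSmile") {
    frame(0.3, 1.0) {                       // ramp in by 0.3 s, hold until 1.0 s
        BasicParams.SMILE_CLOSED to 0.6
        BasicParams.BROW_UP_LEFT to 0.4
        BasicParams.BROW_UP_RIGHT to 0.4
    }
    reset(1.5)                              // return the face to neutral at 1.5 s
}
```

A skill can then play the expression with furhat.gesture(SubtleSmile), just as it would a built-in gesture, which makes it practical to generate and compare the model-driven expressions used in this kind of research.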