Using robots to probe how people react to simple behaviours


As a cognitive neuroscientist, I’ve always been fascinated by the human brain. As the photograph shows, I work with robots — at the intersection of neuroscience and robotics. I’m interested in how the brain processes social signals from robots and humans.

Robotics researchers at the Italian Institute of Technology in Genoa, where my lab is based, developed the iCub, depicted here, to support research in embodied artificial intelligence. That involves equipping software with a physical ‘body’ and exploring how that body fits into the real world. The iCub can move its eyes, head, arms, hands, waist and legs, and can ‘hear’ through its sensors. We can also program a behaviour into it, such as turning its head towards a stimulus.

For our research, we place a cap wired with electrodes on a person’s head and measure how their brain responds to the robot. We use a robot rather than another human because that makes it easier to keep emotional expressions, facial micro-movements and gaze direction constant across many trials.

It’s important to design robots that exhibit behaviours that humans can easily read and respond to, because social and assistive robots might be able to help in caring for elderly people or children with special needs.

Children with autism spectrum disorder (ASD) are often keen on interacting with robots, possibly because a robot’s face has fewer complex expressions and is therefore less intimidating than a human’s.

A key question for us is what sort of feelings or thoughts robots evoke in people. I must say that I do feel attached to the robot sometimes. It’s almost impossible for me to stop myself from interpreting its actions in an anthropomorphic way when, for example, the robot hands me an object or makes a sad face. Very often, I think, “Oh, my robot is sad, annoyed or bored right now.”


