[Image: touching broken glass]

Researchers: Patric Bach and Rob Ellis

Funding: ESRC, £270,569 (January 2013 to December 2015)

Humans are masters at predicting others’ behaviour. By watching our child’s facial expression, we know exactly which toy she will go for. When we see someone frown at an open window, we are not surprised when they get up and close it. Conversely, a breakdown of these predictions might be one reason why social interactions are so confusing to those with autism.

This project tests, using behavioural and psychophysical measures, whether there is a sophisticated mechanism in our brains that ‘knows’ which cues signal the intentions of others (e.g., looking at something signals interest; a smile signals a tendency to approach) and uses this knowledge to predict their actions.

The first aim is to demonstrate that predictions of others’ behaviour are indeed generated while we watch them. We will test whether the perception of different social cues is automatically converted into predictions of people’s future actions.

A second aim is to find out how these predictions come about, and specifically whether they rely on our own action knowledge. A third and final aim is to establish whether such predictions are crucial for social interaction, and whether their breakdown is related to the social difficulties experienced by individuals with autism spectrum disorder.