Camera spots your hidden prejudices from your body language
By Aviva Rutkin
ARE your hidden biases soon to be revealed? A computer program can unmask them by scrutinising people’s body language for signs of prejudice.
Algorithms can already accurately read people’s emotions from their facial expressions or speech patterns. So a team of researchers in Italy wondered if they could be used to uncover people’s hidden racial biases.
First, they asked 32 white college students to fill out two questionnaires. One was designed to suss out their explicit biases, while the second, an Implicit Association Test, aimed to uncover their subconscious racial biases.
Then, each participated in two filmed conversations: one with a white person, and one with a black person. The pair spent three minutes discussing a neutral subject, then another three on a more sensitive topic, such as immigration. A GoPro camera and a Microsoft Kinect captured their movements, while sensors nearby estimated their heart rate and skin response.
An algorithm written by computer scientists at the University of Modena and Reggio Emilia searched for correlations between the participants’ questionnaire responses and their non-verbal behaviour during the filmed conversations. For example, it found that those who showed strong hidden racial biases kept a bigger distance between themselves and their black conversational partners. Conversely, those who were comfortable in the conversation seemed to pause more and to use their hands more when they spoke.
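The team has not published its code, but the gist of this step can be illustrated with a minimal sketch: correlate per-participant non-verbal measures with implicit-bias scores. The feature names, values and scores below are hypothetical placeholders, not the study's data or the researchers' actual pipeline.

```python
# Hypothetical sketch: correlate non-verbal behaviour features with
# Implicit Association Test (IAT) scores. All numbers are made up.
import numpy as np
from scipy.stats import pearsonr

# One row per participant: [mean interpersonal distance (m),
#                           pauses per minute, gestures per minute]
features = np.array([
    [1.10, 4.2, 10.5],
    [0.85, 6.8, 14.1],
    [1.25, 3.1,  8.0],
    [0.90, 5.9, 12.7],
])
iat_scores = np.array([0.62, 0.18, 0.75, 0.25])  # higher = stronger implicit bias

# Correlate each behavioural feature with the bias score
for name, column in zip(["distance", "pauses", "gestures"], features.T):
    r, p = pearsonr(column, iat_scores)
    print(f"{name}: r = {r:+.2f} (p = {p:.3f})")
```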
Then the computer tested its new-found insights by looking back at the same data and trying to predict who had scored high or low on the implicit bias test. It was correct 82 per cent of the time. The team presented its results at the International Joint Conference on Pervasive and Ubiquitous Computing in Heidelberg, Germany, last month.
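The article does not say which model or validation scheme the team used, so the sketch below is only a generic illustration of the prediction step: a simple classifier trained on the same hypothetical features as above, evaluated with leave-one-out cross-validation. The 82 per cent figure is the team's result on its own data, not something this example reproduces.

```python
# Hypothetical sketch of the prediction step: classify participants as
# high or low implicit bias from non-verbal features (made-up data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

features = np.array([
    [1.10, 4.2, 10.5],
    [0.85, 6.8, 14.1],
    [1.25, 3.1,  8.0],
    [0.90, 5.9, 12.7],
])
iat_scores = np.array([0.62, 0.18, 0.75, 0.25])

# Label each participant high (1) or low (0) bias via a median split
labels = (iat_scores > np.median(iat_scores)).astype(int)

# Leave-one-out accuracy of a simple logistic-regression classifier
accuracy = cross_val_score(LogisticRegression(), features, labels,
                           cv=LeaveOneOut()).mean()
print(f"Leave-one-out accuracy: {accuracy:.0%}")
```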
“The software could suss out someone’s biases and gently nudge people to act differently”
The team has already started working on follow-up experiments. One focuses on hidden biases towards people who are HIV-positive, while another examines the behaviour of children.
Such technology may lead to a “kind of evolution” in how researchers study interactions between people, says team member Loris Vezzali, a psychologist at the University of Modena and Reggio Emilia. “These new measures can really provide objective information. This way, you can monitor the interaction moment by moment, second by second.” The software can also “be used to make novel theories that were not even thinkable with previous methods”, he says.
There might also be unusual new applications for the software – perhaps in a device that susses out someone’s hidden prejudices or gently nudges them to act differently.
But this study alone may not be sufficient to conclude that all the behavioural differences are related to skin colour.
“We are always biased, and bias is not based just on the colour of the skin,” says Hatice Gunes at the University of Cambridge. For example, we might change how we talk to someone according to their appearance, personal traits or even the context of the conversation. Volunteers in this study might have been responding to one of these myriad other differences, she says.
This article appeared in print under the headline “Your hidden prejudices are now on show”