A camera can spot your hidden prejudices … from your body language. Plus … a “smile-detection” system.


09 October 2016 – I end the summer at DLD Tel Aviv. It’s Israel’s biggest tech conference, though really much more than a conference; it is properly billed as an “Innovation Festival.” It’s where the Big Dogs like Facebook, Google, IBM, Intel, Microsoft … even Coke, Disney and L.L. Bean … unveil some of their “work-in-progress” funky tech.

It’s technology you’ll read about first in MIT Technology Review and in arXiv papers before it hits the mainstream tech media. And it’s a place to meet folks like Yossi Matias, the managing director of Google’s R&D Center in Israel and Senior Director of Google’s Search organization, and Aya Soffer, an IBM director behind Watson and IBM’s cognitive analytics programs.

Entrepreneurs and investors come from all over the world, too, because the event features hundreds of startups, VCs, angel investors and leading multinationals scouting for the latest tech.

Over the next few weeks I will comment on a few things that caught my eye.

Your hidden prejudices, now on show

Is your hidden bias about to be revealed? There is a computer program out there that can scrutinize people’s body language for signs of prejudice. Algorithms can already read people’s emotions fairly accurately from their facial expressions or speech patterns, so a team of researchers in Italy wondered whether the same techniques could uncover hidden racial biases.

First, they asked 32 white college students to fill out two questionnaires. One was designed to suss out their explicit biases, while the second, an Implicit Association Test, aimed to uncover their subconscious racial biases.
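
For the curious, the Implicit Association Test is typically scored with a “D-score”: the difference in average response latency between the incompatible and compatible pairing blocks, scaled by the pooled standard deviation. Here is a minimal Python sketch of that idea; the latency values are invented for illustration.

```python
# Minimal sketch of the IAT "D-score": the latency difference between
# incompatible and compatible pairing blocks, scaled by the pooled
# standard deviation of all latencies. All values are illustrative.
from statistics import mean, stdev

def iat_d_score(compatible_ms, incompatible_ms):
    """Larger positive scores suggest a stronger implicit bias."""
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

# Hypothetical per-trial reaction times, in milliseconds
compatible = [612, 580, 650, 601, 635]
incompatible = [802, 760, 845, 790, 820]
print(round(iat_d_score(compatible, incompatible), 2))  # 1.83
```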

Then, each participated in two filmed conversations: one with a white person, and one with a black person. The pair spent three minutes discussing a neutral subject, then another three on a more sensitive topic, such as immigration. A GoPro camera and a Microsoft Kinect captured their movements, while sensors nearby estimated their heart rate and skin response.

An algorithm written by computer scientists at the University of Modena and Reggio Emilia searched for correlations between the participants’ questionnaire responses and their non-verbal behaviour during the filmed conversations. For example, it found that those who showed strong hidden racial biases kept a bigger distance between themselves and their black conversational partners. Conversely, those who were comfortable in the conversation seemed to pause more and to use their hands more when they spoke.
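
To give a flavour of what “searching for correlations” means here, the sketch below extracts one nonverbal feature from hypothetical Kinect skeleton data (average torso-to-torso distance) and correlates it with implicit-bias scores. The feature echoes the distance finding above, but the data and pipeline are illustrative, not the team’s actual code.

```python
# Hedged sketch: correlate one nonverbal feature (mean interpersonal
# distance, derived from Kinect torso positions) with each participant's
# implicit-bias score. Data values are invented for illustration.
import math

def mean_interpersonal_distance(frames):
    """frames: list of ((x, y, z), (x, y, z)) torso positions in metres
    for the participant and their conversation partner."""
    dists = [math.dist(p, q) for p, q in frames]
    return sum(dists) / len(dists)

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant values: IAT scores and the average
# distance each kept from their black conversation partner.
iat_scores     = [0.2, 0.9, 1.4, 0.1, 1.1, 0.6]
avg_distance_m = [1.1, 1.5, 1.7, 1.0, 1.6, 1.3]
print(round(pearson_r(iat_scores, avg_distance_m), 2))  # strongly positive
```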

Then the computer tested its new-found insights by looking back at the same data and trying to predict who had scored high or low on the hidden-bias test. It was correct 82 per cent of the time. The team presented its results at the International Joint Conference on Pervasive and Ubiquitous Computing in Heidelberg, Germany, last month, and gave a demo at DLD.
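
With only 32 participants, the natural way to run that kind of test is a supervised classifier validated one participant at a time (leave-one-out). Here is a sketch of that setup, with scikit-learn standing in for whatever model the team actually used, and the feature values invented:

```python
# Sketch of the prediction step: classify participants as high or low
# implicit bias from nonverbal features, with leave-one-out validation.
# scikit-learn is a stand-in; the team's actual model may differ.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Hypothetical per-participant features:
# [avg distance (m), pauses per minute, hand gestures per minute]
X = np.array([
    [1.1, 8.0, 12.0], [1.6, 4.0, 5.0], [1.0, 9.0, 14.0],
    [1.7, 3.0, 4.0], [1.2, 7.0, 11.0], [1.5, 5.0, 6.0],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = scored high on the IAT

scores = cross_val_score(LogisticRegression(), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.0%}")
```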

The team has already started working on follow-up experiments. One focuses on hidden biases towards people who are HIV-positive, while another examines the behavior of children.

Such technology may lead to a “kind of evolution” in how researchers study interactions between people, according to team member Loris Vezzali, a psychologist at the University of Modena and Reggio Emilia:

“These new measures can really provide objective information. This way, you can monitor the interaction moment by moment, second by second. The software can also be used to make novel theories that were not even thinkable with previous methods. There might also be unusual new applications for the software – perhaps in a device that susses out someone’s hidden prejudices or gently nudges them to act differently.”

But he was quick to note that this study alone may not be sufficient to conclude that all the behavioral differences were related to skin color. We are always biased, and bias is not based on skin color alone. We might, for example, change how we talk to someone according to their appearance, personal traits or even the context of the conversation. Volunteers in this study might have been responding to any of these myriad other differences.

NOTE: two years ago I wrote about a research paper, with Vezzali as lead author, entitled “The greatest magic of Harry Potter: Reducing prejudice”. Quoting from the abstract:

Recent research shows that extended contact via story reading is a powerful strategy to improve out-group attitudes. We conducted three studies to test whether extended contact through reading the popular best-selling books of Harry Potter improves attitudes toward stigmatized groups (immigrants, homosexuals, refugees). Results from one experimental intervention with elementary school children and from two cross-sectional studies with high school and university students (in Italy and United Kingdom) supported our main hypothesis. Identification with the main character (i.e., Harry Potter) and disidentification from the negative character (i.e., Voldemort) moderated the effect. Perspective taking emerged as the process allowing attitude improvement. Theoretical and practical implications of the findings are discussed in the context of extended intergroup contact and social cognitive theory.

Smile-detection systems

Researchers at the University of Regensburg in Germany have developed a “smile-detection” system that lets smartphones know when they’re making you laugh. The software rates the rib-tickling value of images by how much of a grin they elicit in the viewer. But while the results were judged to be fairly accurate, some of those using the system “were uncomfortable with the idea of being observed by their smartphone”.
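
The Regensburg implementation isn’t public, but smile detection of this sort can be prototyped with OpenCV’s stock Haar cascades, sampling the webcam while an image is on screen. A rough sketch:

```python
# A minimal smile-detection sketch using OpenCV's bundled Haar
# cascades -- a stand-in for the Regensburg system, whose actual
# implementation is not described in this post.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def smile_score(frame_bgr):
    """Return a crude 0/1 'grin' signal for the first detected face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        smiles = smile_cascade.detectMultiScale(roi, 1.7, 20)
        return 1 if len(smiles) > 0 else 0
    return 0  # no face found

# Averaging smile_score over the frames captured while an image is
# displayed gives that image a rough "rib-tickling" rating.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("smiling" if smile_score(frame) else "not smiling")
cap.release()
```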

On a more serious note: thanks to the rapid development of computer hardware and software, new human-computer interaction systems, from the multi-touch technology of the Apple iPhone to the touch-screen support of Windows 10, are catching more and more attention. In medical treatment, there are already some very sophisticated eye-gaze tracking systems developed for cerebral palsy and multiple sclerosis patients, enabling the user to control a computer mouse simply by looking at a word or picture shown on the monitor. I am writing a more detailed post on this technology as part of my artificial intelligence series.
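
In the meantime, the core trick in those assistive systems is dwell-based selection: the cursor follows the gaze, and holding it steady on a target for a moment counts as a click. A hedged sketch, where get_gaze(), move_cursor() and click() are hypothetical stand-ins for a real eye-tracker and OS API:

```python
# Sketch of dwell-based gaze control: move the cursor with the gaze and
# "click" once the gaze has stayed within a small region long enough.
# get_gaze, move_cursor and click are hypothetical callbacks.
import time

SCREEN_W, SCREEN_H = 1920, 1080
DWELL_SECONDS = 1.0   # how long a steady gaze counts as a click
TOLERANCE_PX = 40     # how far the gaze may wander while dwelling

def gaze_to_cursor(gx, gy):
    """Map normalised gaze coordinates (0..1, 0..1) to screen pixels."""
    return int(gx * SCREEN_W), int(gy * SCREEN_H)

def dwell_loop(get_gaze, move_cursor, click):
    anchor, t0 = None, None
    while True:
        x, y = gaze_to_cursor(*get_gaze())
        move_cursor(x, y)
        if anchor and abs(x - anchor[0]) < TOLERANCE_PX \
                  and abs(y - anchor[1]) < TOLERANCE_PX:
            if time.monotonic() - t0 >= DWELL_SECONDS:
                click(x, y)          # gaze held steady: select target
                anchor, t0 = None, None
        else:
            anchor, t0 = (x, y), time.monotonic()
```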

Commercial eye-tracking software already features tailored tools for research in education, linguistics, neuroscience and psychology. And companies are pairing eye-tracking technology with processor-packed tablets that react based on how you’re looking at text (where you pause, how you stare, where you stop reading altogether) in the best (worst?) application of the Observer Effect.
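
A rough idea of how such a reading-reactive tablet might work: bucket gaze samples by the word they land on and flag unusually long dwells. All the values below (word boxes, sample rate, threshold) are invented for illustration.

```python
# Illustrative sketch: assign gaze samples to word bounding boxes and
# flag words the reader dwells on unusually long. All values invented.
SAMPLE_MS = 16        # ~60 Hz eye tracker
DWELL_FLAG_MS = 600   # an unusually long stare at one word

words = {             # word -> (x0, y0, x1, y1) screen box
    "the": (100, 200, 150, 220),
    "observer": (160, 200, 280, 220),
    "effect": (290, 200, 370, 220),
}

def word_at(x, y):
    for w, (x0, y0, x1, y1) in words.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return w
    return None

def dwell_per_word(gaze_samples):
    dwell = {}
    for x, y in gaze_samples:
        w = word_at(x, y)
        if w:
            dwell[w] = dwell.get(w, 0) + SAMPLE_MS
    return dwell

samples = [(200, 210)] * 50 + [(120, 210)] * 10  # a long stare at "observer"
for w, ms in dwell_per_word(samples).items():
    if ms >= DWELL_FLAG_MS:
        print(f"reader is stuck on '{w}' ({ms} ms)")
```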
