A leading research centre has called for new laws to regulate emotion-detecting technology.
The field is “built on markedly shaky foundations”, says the AI Now Institute.
Despite those shaky foundations, the technology is already in use and on sale for a variety of purposes, from setting insurance prices and testing criminal suspects for signs of deception to helping vet job applicants.
The institute has called for a ban on the use of such software in important decisions that affect people's lives or their access to opportunities.
The founder of a UK company developing its own emotional-response technologies has backed the US-based institute's call. However, the British firm also cautioned that any new laws or regulations should strike a balance between the beneficial uses of the technology and those that are potentially harmful.
In its annual report, AI Now refers to the technology by its formal name, affect recognition. The report estimates that the emotion-detecting industry is undergoing a period of significant growth and could be worth as much as $20bn.
“It claims to read, if you will, our inner-emotional states by interpreting the micro-expressions on our face, the tone of our voice or even the way that we walk,” explained co-founder Prof Kate Crawford.
“It’s being used everywhere, from how do you hire the perfect employee through to assessing patient pain, through to tracking which students seem to be paying attention in class. At the same time as these technologies are being rolled out, large numbers of studies are showing that there is… no substantial evidence that people have this consistent relationship between the emotion that you are feeling and the way that your face looks.”
Part of the problem, Prof Crawford suggested, was that some firms were basing their software on the work of Paul Ekman, a psychologist who proposed in the 1960s that there were only six basic emotions expressed through facial expressions.
However, later studies found far greater variability, both in the number of facial expressions and in the manner in which people display them.
“It changes across cultures, across situations, and even across a single day,” she said.
AI Now also gave examples of firms selling emotion-detecting products.
Its report highlighted Oxygen Forensics, a company that offers emotion-detecting software to the police. The company, however, defended its business. “The ability to detect emotions, such as anger, stress, or anxiety, provide law-enforcement agencies additional insight when pursuing a large-scale investigation,” said the company’s chief operating officer, Lee Reiber. “Ultimately, we believe that responsible application of this technology will be a factor in making the world a safer place.”
AI Now also cited HireVue, which sells AI-driven video-based tools that recommend which candidates a company should interview. According to the institute, the firm uses third-party algorithms to detect “emotional engagement” in applicants’ micro-expressions when ranking candidates.
“Many job candidates have benefited from HireVue’s technology to help remove the very significant human bias in the existing hiring process,” company spokeswoman Kim Paone told Reuters news agency.
(Adapted from BBC.com)