Emotional Intelligence: How Cameras Prey on Our Reactions and Smiles

Gaston de Persigny - Reporter at The European Times News

Joy, sadness, puzzlement, fatigue – this is far from the full range of feelings that video analytics can detect. Gartner analysts predict that by 2022 one in ten personal gadgets will support emotion recognition technology. Not only technology giants but also small startups are investing in this area and finding new applications for it. Zaur Abutalimov, Executive Director of the Ivideon cloud video surveillance service, tells Hi-Tech how emotional AI emerged and why it watches for our smiles.

When cameras learned to recognize emotions and how it works now

Scientists began studying emotion markers long before the advent of artificial intelligence, back in the 1970s. Of course, there was no talk then of linking emotions to neural networks; emotions were the domain of psychologists. Paul Ekman and Wallace Friesen collected all possible facial movements into a single system, the Facial Action Coding System, with which any facial expression could be broken down into separate components: each emotion corresponded to a particular set of mimic units.

Modern algorithms work in a similar way: the system processes streaming images, and artificial intelligence marks the regions of the eyes, lips, nose, and eyebrows with landmark points. A neural network then analyzes the positions of these points and matches them against emotion templates. Over time, the network learns that rounded eyes correspond to surprise, a half-open mouth to fear, and lowered corners of the lips to fatigue or sadness.
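
To make that pipeline concrete, here is a minimal sketch of the template-matching step in Python. The landmark coordinates and thresholds are invented for illustration; a real system would take the points from a face-landmark detector running on each video frame and learn the decision boundaries rather than hard-coding them.

```python
# A minimal sketch of the landmark-to-emotion matching described above.
# All coordinates and thresholds here are hypothetical.

def extract_features(pts):
    """Turn raw landmark points into simple geometric measurements."""
    eye_openness = pts["eye_bottom"][1] - pts["eye_top"][1]
    mouth_openness = pts["lip_bottom"][1] - pts["lip_top"][1]
    # Positive droop = lip corner sits below the lip centre (y grows downward).
    lip_corner_droop = pts["lip_corner"][1] - pts["lip_center"][1]
    return {"eye_openness": eye_openness,
            "mouth_openness": mouth_openness,
            "lip_corner_droop": lip_corner_droop}

def classify(features):
    """Emotion 'templates': rules the network would effectively learn."""
    if features["eye_openness"] > 12:      # rounded, wide-open eyes
        return "surprise"
    if features["mouth_openness"] > 10:    # half-open mouth
        return "fear"
    if features["lip_corner_droop"] > 3:   # lowered lip corners
        return "sadness"
    return "neutral"

# Hypothetical landmarks for one face, in pixel coordinates (x, y).
landmarks = {"eye_top": (120, 80), "eye_bottom": (120, 95),
             "lip_top": (125, 150), "lip_bottom": (125, 156),
             "lip_center": (125, 153), "lip_corner": (145, 158)}

print(classify(extract_features(landmarks)))  # -> "surprise"
```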

The ability of neural networks to work with emotions was already being seriously discussed in the 2000s, when it became clear that the future belonged to face recognition systems. And although teaching a computer to read emotions is harder than simply finding a face, the field of Facial Emotion Recognition (FER) has taken a big step forward in a couple of decades. By 2020, Mordor Intelligence analysts valued the emotion recognition market at $19.9 billion.

Most of the FER market is concentrated in North America, home to the United States and Canada, where the largest retail markets with high demand for such solutions are located.

What is it for

The first solutions based on emotion recognition originated in the entertainment industry. For example, in 2015 Microsoft developed an application that guessed a person’s mood: the user “fed” it photos, and the program analyzed the emotion from facial microexpressions in the image. The algorithm scored eight basic states – contempt, anger, disgust, fear, happiness, sadness, surprise, and calmness – and distributed conditional scores among them.
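
Microsoft’s exact scoring was internal to its service, but a common way to turn a model’s raw outputs into such “conditional scores” is a softmax over the eight states, sketched here with made-up numbers:

```python
import math

EMOTIONS = ["contempt", "anger", "disgust", "fear",
            "happiness", "sadness", "surprise", "calmness"]

def score_distribution(logits):
    """Softmax: turn raw model outputs into scores that sum to 1."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return {e: round(v / total, 3) for e, v in zip(EMOTIONS, exps)}

# Hypothetical raw model outputs for one photo.
print(score_distribution([0.1, 0.0, 0.2, 0.5, 2.3, 0.3, 1.1, 0.9]))
# -> "happiness" receives the largest share of the score.
```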

Today the scope of such solutions is much wider than you might imagine. In retail, devices with facial recognition can assess not only a shopper’s gender, age, and ethnicity, but also their emotions at the time of purchase. In 2017, world retail leader Walmart took advantage of this by installing smart cameras at checkout lines to monitor shoppers’ moods: if the system spotted an upset customer, the store manager learned about it immediately from a notification.
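
Walmart’s system is proprietary, but the alerting logic described above plausibly reduces to a thresholded rule over per-frame detections. A hedged sketch, with invented labels and thresholds:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str       # which checkout line
    emotion: str         # label produced by the FER model
    confidence: float    # model confidence, 0..1

ALERT_EMOTIONS = {"sadness", "anger"}   # "upset" states worth flagging
THRESHOLD = 0.8                         # avoid alerting on noisy frames

def maybe_notify_manager(d: Detection) -> str | None:
    """Return a notification message if the detection warrants one."""
    if d.emotion in ALERT_EMOTIONS and d.confidence >= THRESHOLD:
        return f"Upset customer at {d.camera_id} ({d.emotion}, {d.confidence:.0%})"
    return None

print(maybe_notify_manager(Detection("checkout-3", "anger", 0.91)))
```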

Emotion recognition is not used only by industry giants like Walmart. Ivideon works with more than 1,000 retailers, including small supermarkets and brick-and-mortar stores. About 15% of them, even without high revenues or elaborate marketing campaigns, show interest in the Facial Emotion Recognition feature as a way to understand their customers better.

Recruiting

Another area where emotion detection technology has proven in demand is recruiting. Large companies are deploying artificial intelligence to monitor the behavior and psychological state of employees. Cameras with video analytics modules installed in an office can detect signs of stress in employees and alert the personnel department. For example, the Chinese company Taigusys has developed a program that analyzes a person’s facial expressions and, based on the data obtained, generates detailed reports on their mental state. Similar solutions are also being developed today by Limeade, Virgin Pulse, Glint, Ultimate Software, and other companies.

FER technologies also let corporations evaluate the effectiveness of television advertising. The American company Affectiva has created an application that scans video of viewers’ faces and builds a picture of their emotions while a commercial is playing. With artificial intelligence you can gauge the reaction to almost any product; Disney, for example, uses machine learning algorithms to understand whether viewers like the cartoons the company creates.
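
Affectiva’s SDK is not reproduced here, but the core of such ad testing is aggregating per-frame emotion scores into a reaction timeline. A minimal sketch with hypothetical scores:

```python
from collections import defaultdict

def reaction_timeline(frame_scores, fps=25):
    """Average per-frame 'happiness' scores into a per-second curve.

    frame_scores: list of (frame_index, score) pairs from the FER model.
    """
    buckets = defaultdict(list)
    for frame, score in frame_scores:
        buckets[frame // fps].append(score)
    return {sec: sum(v) / len(v) for sec, v in sorted(buckets.items())}

# Hypothetical scores: viewers warm up midway through the spot.
frames = [(i, 0.2 if i < 50 else 0.7) for i in range(100)]
print(reaction_timeline(frames))   # -> {0: 0.2, 1: 0.2, 2: 0.7, 3: 0.7}
```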

Medical institutions also use FER to determine the emotions of patients in waiting rooms. This helps doctors prioritize the patients who feel worst and call them in for an appointment sooner. There are more experimental applications as well: in 2018, scientists used emotion recognition technology in a new therapy for children with autism. With Google Glass augmented reality glasses and a special smartphone application, a child could find out what feelings the people around them were experiencing.
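
The triage use case is essentially a priority queue keyed on a distress score. A small sketch, assuming the scores come from an FER model watching the waiting room:

```python
import heapq

def triage_order(patients):
    """Yield waiting patients ordered by distress, highest first.

    patients: list of (name, distress_score) pairs, where the score is
    assumed to come from an FER model (0 = calm, 1 = very distressed).
    """
    heap = [(-score, name) for name, score in patients]  # max-heap via negation
    heapq.heapify(heap)
    while heap:
        neg_score, name = heapq.heappop(heap)
        yield name, -neg_score

waiting = [("patient A", 0.31), ("patient B", 0.82), ("patient C", 0.57)]
for name, distress in triage_order(waiting):
    print(name, distress)   # patient B first, then C, then A
```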

In schools and other educational institutions, “emotional” technologies are being introduced to monitor students’ psychological health and can serve as a basis for preventing violence and bullying. At airport customs, such systems help officers identify people with high levels of anxiety and nervousness, which is treated as a possible indicator of involvement in smuggling. In 2019, China began actively introducing intelligent systems for these purposes.

Written not only in the eyes: can AI scan a person’s thoughts

Although cameras may seem like an all-seeing eye, there are still things they cannot do: they cannot read minds or detect hidden emotions. But there is good news – hidden discontent and disappointment can be recognized by voice. Several programs already process audio and analyze paralinguistic signals: the intonation, tempo, and timbre of the speaker’s speech. Artificial intelligence then translates this data into the language of emotions. Major brands are already using such solutions in marketing research.
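
Commercial voice-emotion products are proprietary, but the paralinguistic features they start from – pitch (intonation), speech rate (tempo), and spectral shape (timbre) – can be sketched with the open-source librosa library. The classifier that maps these features to emotions is assumed, not shown:

```python
import librosa
import numpy as np

y, sr = librosa.load("call_recording.wav")   # hypothetical audio file

# Intonation: fundamental-frequency contour of the voiced segments.
f0, voiced_flag, voiced_probs = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"))
pitch_mean = np.nanmean(f0)   # NaN frames are unvoiced and skipped

# Tempo proxy: onset (syllable-like) events per second.
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
speech_rate = len(onsets) / (len(y) / sr)

# Timbre: mel-frequency cepstral coefficients, averaged over time.
timbre = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

# The combined feature vector would be fed to an emotion classifier.
features = np.concatenate([[pitch_mean, speech_rate], timbre])
print(features.shape)   # -> (15,)
```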

The mind-reading device is also a very real technology. Such devices are not yet in wide use, but scientific work in the area is underway. Meta (Facebook) has advanced in this direction: last year, together with the University of California, San Francisco (UCSF), it presented a brain-computer interface (BCI) that recognizes words and whole phrases a person intended to say – electrodes connected to the brain read signals and convert them into text.

In 2019, Elon Musk’s Neuralink presented a similar device, the Link. This microgadget, the size of a small coin, is meant to be “sewn” into the human brain with tiny wires thinner than a human hair. The device is expected to transmit information about the activity of brain centers and translate a person’s thoughts into text that can be sent to a computer or phone.

Feelings and reason: can artificial intelligence gain empathy

So far, smart devices only work with other people’s emotions. But there is already loud talk that the day is near when robots themselves will experience feelings. True, this is not about love and hate, but about empathy – the ability to sympathize and share another’s feelings.

At the biochemical level this is, of course, impossible, since artificial intelligence has no sense organs. But it is quite possible to train machines to respond to a person’s psychological state, and modern technology shows that, working from the numbers, machines can sometimes read emotions even better than people themselves.
