According to new research from Queen Mary University of London (UK), an artificial intelligence (AI) approach based on wireless signals can help reveal human emotions. The scientists aimed to find an “invisible” way of understanding emotions.
The study, published in the journal PLOS ONE, demonstrates that radio waves can be used to measure heart-rate and breathing signals, and that the method can predict a person’s emotions even in the absence of any other cues, such as facial expressions.
Heart rate and breathing rate can reveal information about people’s emotions.
Participants were first asked to watch a video selected by the researchers; the videos were chosen for their ability to evoke one of four basic emotions: anger, sadness, joy, or satisfaction. While the participants watched, the researchers emitted a harmless radio signal toward them and measured the echoes that bounced back.
By analyzing changes to these signals caused by slight body movement, researchers can reveal “hidden” information about a person’s heart rate and breathing rate.
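The idea of recovering vital signs from signal changes can be sketched in a few lines. This is purely illustrative, not the study's method: it simulates the phase of a reflected signal modulated by chest movement, then estimates breathing and heart rates from spectral peaks. All frequencies, amplitudes, and band limits below are assumed values.

```python
import numpy as np

fs = 50.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)   # 60-second observation window

# Simulated reflected-signal phase: large slow breathing motion,
# smaller faster cardiac motion, plus measurement noise.
breathing_hz, heart_hz = 0.25, 1.2
phase = (1.0 * np.sin(2 * np.pi * breathing_hz * t)
         + 0.1 * np.sin(2 * np.pi * heart_hz * t)
         + 0.05 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(phase - phase.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Breathing: strongest peak in 0.1-0.5 Hz; heartbeat: strongest in 0.8-2 Hz.
resp_band = (freqs > 0.1) & (freqs < 0.5)
card_band = (freqs > 0.8) & (freqs < 2.0)
breath_est = freqs[resp_band][np.argmax(spectrum[resp_band])]
heart_est = freqs[card_band][np.argmax(spectrum[card_band])]

print(f"breathing ~ {breath_est * 60:.0f} breaths/min")  # ~15
print(f"heart rate ~ {heart_est * 60:.0f} beats/min")    # ~72
```

The key point this sketch captures is that the body's periodic motions leave distinct spectral signatures in the reflected signal, which can be separated by frequency band.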
Previous research has used similar non-invasive, wireless emotion-detection methods. In those studies, however, the data were analyzed with classical machine-learning approaches, in which an algorithm identifies and classifies emotional states using hand-crafted features. For this study, the scientists instead used deep learning, in which an artificial neural network learns its own features from raw, time-dependent data. The results show that this approach can detect emotions more accurately than traditional methods.
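The contrast described above can be illustrated with a minimal forward pass of a 1D convolutional network over a raw signal window. This is not the study's architecture: the shapes, filter sizes, and random weights are all assumptions chosen only to show how learned filters replace hand-crafted features.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid-mode 1D convolution: (L,) x (n_filters, k) -> (L-k+1, n_filters)."""
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)
    return windows @ kernels.T  # each filter scans the raw signal

def forward(x, kernels, w_out):
    h = np.maximum(conv1d(x, kernels), 0.0)  # ReLU feature maps
    pooled = h.mean(axis=0)                  # global average pooling per filter
    logits = pooled @ w_out                  # map features to 4 emotion classes
    e = np.exp(logits - logits.max())
    return e / e.sum()                       # softmax class probabilities

# Hypothetical input: a 10 s raw signal window sampled at 50 Hz.
x = rng.standard_normal(500)
kernels = rng.standard_normal((8, 25)) * 0.1  # in training, these are learned
w_out = rng.standard_normal((8, 4)) * 0.1
probs = forward(x, kernels, w_out)
print(probs)  # four probabilities, one per emotion class
```

In a classical pipeline, the rows of `kernels` would be replaced by fixed, manually designed features (e.g., peak counts or rate statistics); in deep learning, they are adjusted during training so the network discovers which patterns in the raw data matter.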
Achintha Avin Ihalage, a PhD student at Queen Mary University of London, said: “Deep learning allows us to evaluate data in a similar way to how the human brain would work, looking at different layers of information and making connections between them. Most of the published literature that uses machine learning measures emotions in a subject-dependent way. That method captures cues from a particular individual and uses these signals to predict their emotions at a later stage.”
With deep learning, however, the method was shown to measure emotions accurately in a subject-independent manner, that is, without needing training data from the specific individual being assessed. Traditionally, emotion detection has relied on assessing visible cues such as facial expressions, speech, body gestures, or eye movements.
However, these methods can be unreliable because they do not effectively capture an individual’s inner feelings. As a result, researchers are turning to “invisible” cues to understand emotions.