The software recognizes human emotions from facial expressions, using a principle similar to FaceID: it reads the facial muscles. It detects stimuli (groups of facial muscles) and the intensity of their activity (how tense or relaxed they are), and by analyzing these stimuli together it identifies the emotions a person is currently experiencing.
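The idea of matching stimulus intensities against per-emotion muscle patterns can be sketched as follows. This is an illustrative toy, not the NovSU implementation; the muscle names, threshold values, and scoring rule are all assumptions.

```python
# Toy sketch: map facial-muscle "stimulus" intensities (0..1) to an
# emotion by comparing them with per-emotion activation patterns.
# All names and numbers below are hypothetical.

STIMULUS_PATTERNS = {
    "happiness": {"cheek_raiser": 0.6, "lip_corner_puller": 0.7},
    "surprise":  {"brow_raiser": 0.8, "jaw_drop": 0.5},
    "anger":     {"brow_lowerer": 0.7, "lid_tightener": 0.6},
}

def score_emotion(intensities: dict, pattern: dict) -> float:
    """Average how closely observed intensities satisfy a pattern (0..1)."""
    matches = [min(intensities.get(muscle, 0.0) / required, 1.0)
               for muscle, required in pattern.items()]
    return sum(matches) / len(matches)

def classify(intensities: dict) -> str:
    """Return the emotion whose pattern best matches the observed stimuli."""
    return max(STIMULUS_PATTERNS,
               key=lambda e: score_emotion(intensities, STIMULUS_PATTERNS[e]))
```

A real system would derive such patterns from a coding scheme like FACS rather than hand-written thresholds, but the shape of the computation is the same: per-muscle measurements in, a best-matching emotion out.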
This development will enable companies and individual users to automatically recognize a person's emotional state when processing large volumes of input data.
“The software identifies seven basic emotions through various recognition algorithms that operate in parallel,” commented Danila Ilyin, a NovSU student majoring in Computer Science and Engineering and one of the program's developers. “This includes the FACS algorithm (a system for identifying emotions using a scientific coding system for facial movements), several mathematical models that determine stimuli and their intensity, and artificial intelligence that can also read and analyze facial expressions and draw conclusions based on the data obtained.”
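Running several recognizers in parallel and fusing their answers might look like the following sketch. The detector functions are placeholders standing in for the FACS rules, the mathematical models, and the neural network mentioned above; the majority-vote fusion is an assumption, since the article does not say how the outputs are combined.

```python
# Hedged sketch: run independent emotion recognizers concurrently and
# combine their votes. Detector internals are placeholders.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def facs_detector(frame):        # stand-in for a FACS-based rule system
    return "happiness"

def math_model_detector(frame):  # stand-in for a stimulus-intensity model
    return "happiness"

def neural_detector(frame):      # stand-in for a trained neural network
    return "surprise"

DETECTORS = [facs_detector, math_model_detector, neural_detector]

def recognize(frame) -> str:
    """Run all detectors in parallel and return the majority vote."""
    with ThreadPoolExecutor() as pool:
        votes = list(pool.map(lambda d: d(frame), DETECTORS))
    return Counter(votes).most_common(1)[0][0]
```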
The advancement of technology in the fields of artificial intelligence and big data has led to the creation of tools capable of analyzing and interpreting human emotional states. The program developed at NovSU can be effectively utilized in marketing, education, healthcare, entertainment, and other sectors. For instance, it can help marketers better understand consumer reactions to their products and campaigns.
For educational institutions, it can enhance learning efficiency and maintain students' psychological comfort. For medical facilities and professionals in psychology and psychiatry, it can assist in diagnosing and monitoring patients' emotional states. For app developers, it offers the ability to integrate emotion recognition functionality into their products to improve user experience. Individual users can track their emotional states as well.
“During certain illnesses (such as a stroke), the intensity coefficients of stimuli may not indicate a specific emotion but rather the most likely one. In such cases, doctors can use this product to assess the degree to which a patient's facial muscle function has recovered,” explained Danila Ilyin, illustrating the software's application in medicine.
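The "most likely emotion" reading can be thought of as normalizing the intensity coefficients into a distribution and reporting the top candidate with its confidence. The function below is a minimal illustration of that idea; the coefficient values and the normalization scheme are assumptions, not the product's actual scoring.

```python
# Minimal sketch: turn per-emotion intensity coefficients into a
# (most-likely-emotion, confidence) pair. Values are illustrative.

def most_likely_emotion(scores: dict) -> tuple:
    """Normalize coefficients and return the top emotion with its share."""
    total = sum(scores.values())
    if total == 0:
        return ("neutral", 0.0)  # no muscle activity detected
    emotion = max(scores, key=scores.get)
    return (emotion, scores[emotion] / total)
```

A low confidence for the winning emotion is exactly the situation the quote describes: the coefficients suggest a likely emotion rather than pinpointing one, which is informative in itself when tracking recovery.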
As the developers added, similar IT products for emotion recognition are already successfully used worldwide, for example EmoDetect, Visage Technologies, and FaceReader. The new software matches their quality (using the FACS recognition algorithm), is quickly and easily integrated into the client's ecosystem via an API, and is better adapted to the Russian language.
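Integration over an HTTP API typically amounts to sending an encoded image and reading back emotion labels. The sketch below only builds such a request; the endpoint URL, field names, and language parameter are hypothetical, since the article does not document the actual interface.

```python
# Hypothetical sketch of preparing a request for an emotion-analysis
# HTTP API. The endpoint and payload schema are assumptions for
# illustration, not a documented interface.
import base64
import json

API_URL = "https://example.org/emotions/v1/analyze"  # placeholder endpoint

def build_request(image_bytes: bytes, lang: str = "ru") -> dict:
    """Package an image as a JSON body for the (assumed) endpoint."""
    return {
        "url": API_URL,
        "body": json.dumps({
            "image": base64.b64encode(image_bytes).decode("ascii"),
            "lang": lang,  # e.g. request Russian-language emotion labels
        }),
    }
```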
A beta version of the program has now been developed and successfully tested. The next phase of the project will focus on training the artificial intelligence to recognize an even greater number of emotions. This will be made possible through a grant from the “Student Startup” program, which the development team won in 2023 with their project.
The team consists of project supervisor Mikhail Yuryevich Lukov, a psychologist and neuroscience researcher; desktop developer Kirill Vasilyev; and technical lead Danila Ilyin, all university students.