System for classifying emotions in video game dialogues based on eye tracking
A system for classifying a user's emotions in video game dialogue scenes, based on analysis of the player's gaze, has been developed by scientists from the Faculty of Technical Physics, Information Technology and Applied Mathematics at Lodz University of Technology. It relies on an eye tracker, a device that allows a computer to be controlled with the user's eyes.
"Our solution can significantly influence future forms of human-computer interaction, as well as the information machines obtain about users in video games" - says co-author of the solution, Konrad Jadwiszczak from Lodz University of Technology.
Currently, thanks to the rapid development of technology and the growing computing power of machines, new solutions are entering the games market that let computers and consoles be operated through the user's body movements, senses and emotional states.
"They make it possible to obtain not only binary, zero-one input, as from a mouse and keyboard, but also additional information about the user. Such solutions, introduced into multimedia entertainment, significantly strengthen the phenomenon of immersion, the user's deep engagement in virtual reality" - explained the co-author of the system.
One such solution is eye tracking, which allows a computer to be controlled with eye movement. But it can also serve other purposes. The system created at the Faculty of Technical Physics, Information Technology and Applied Mathematics at Lodz University of Technology uses eye tracking to classify user emotions in video game dialogue scenes based on analysis of the player's gaze.
According to the creators of the system, features of the user's eyes, such as pupil size or blink frequency, make it possible to determine the person's current emotional state. To bring this into multimedia entertainment, Konrad Jadwiszczak and Dr. Jarosław Andrzejczak created a classifier that recognizes the current emotional state of the examined person from the values read by the eye tracker.
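The article does not describe the classifier itself, only that it maps eye-tracker readings such as pupil size and blink frequency to an emotion label. A minimal sketch of one way such a mapping could work is a nearest-centroid classifier; the feature values, emotion labels and centroids below are purely illustrative assumptions, not details from the study:

```python
import math

# Hypothetical per-emotion centroids in the feature space
# (pupil diameter in mm, blinks per minute). In a real system
# these would be fitted from calibration recordings; the numbers
# here are illustrative only.
CENTROIDS = {
    "neutral":  (3.0, 15.0),
    "surprise": (4.5, 8.0),
    "stress":   (4.0, 25.0),
}

def classify_emotion(pupil_mm: float, blinks_per_min: float) -> str:
    """Return the emotion whose centroid is nearest to the reading."""
    sample = (pupil_mm, blinks_per_min)
    return min(CENTROIDS, key=lambda label: math.dist(sample, CENTROIDS[label]))

print(classify_emotion(4.6, 9.0))  # nearest centroid: surprise
```

In practice the published system may use a very different model and feature set; the sketch only shows how continuous eye-tracker values can be reduced to a discrete emotion label that a game can act on.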
This information can then be used in the virtual world; in this case, the detected emotional state determines the course of a dialogue scene in a video game.
"The implemented algorithm determines the further course of the conversation based not only on the options the user selects with a traditional controller, but also on the accompanying emotions, often triggered by the course of the dialogue itself" - explains Dr. Jarosław Andrzejczak, the other co-author of the solution.
The scientist gives an example of the system in action: if a statement by an NPC (a non-player character in a video game) surprises the player, the classifier will recognize this emotion.
"As a result, the character can ask the player why the statement surprised them. Similar reactions can be assigned to other recognized emotions, making video game dialogues more natural and improving the sense of realism" - argues Dr. Andrzejczak.
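The branching idea described above can be sketched as a dialogue graph keyed on both the player's chosen option and the detected emotion. The node names, options and emotion labels below are hypothetical examples, not the actual data structure used by the researchers:

```python
# Hypothetical dialogue table: (current node, chosen option, detected
# emotion) -> next node. Entries with "neutral" act as the default
# branch when no emotion-specific line is authored.
DIALOGUE = {
    ("npc_reveal", "ask_more", "surprise"): "npc_asks_why_surprised",
    ("npc_reveal", "ask_more", "neutral"):  "npc_continues_story",
    ("npc_reveal", "leave",    "neutral"):  "scene_end",
}

def next_node(node: str, option: str, emotion: str) -> str:
    # Prefer an emotion-specific branch; otherwise fall back to the
    # neutral branch, then to ending the scene.
    return DIALOGUE.get(
        (node, option, emotion),
        DIALOGUE.get((node, option, "neutral"), "scene_end"),
    )

print(next_node("npc_reveal", "ask_more", "surprise"))  # emotion-specific branch
print(next_node("npc_reveal", "ask_more", "anger"))     # falls back to neutral branch
```

The design choice worth noting is the fallback: the game plays normally from controller input alone, and the eye-tracker signal only adds extra branches where writers have authored them.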
Research has shown a significant increase in the user's immersion in a dialogue scene when the eye tracker is used, compared to the same scene without the device, the researchers add.
PAP - Science in Poland
szu/ agt/ kap/