Emotion Recognition AI by Eyeris. Eyeris was founded in 2013 to develop EmoVu, its deep learning-based emotion recognition software that reads facial micro-expressions.
Equipping a car with EmoVu software and in-cabin cameras improves its human-machine interaction (HMI): the system can monitor the driver's emotions and alertness while the car is on the road.
EmoVu plays an active role
At intermediate levels of automation, control passes back and forth between the automated system and the driver. Before handing control back, the system must verify that the driver can safely operate the vehicle.
If the system judges that the driver is too drowsy or distracted to drive safely, it can retain control, or take over and drive the car itself.
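The handover logic described above can be sketched as a simple state check. This is an illustrative example, not Eyeris's or any automaker's actual implementation; the class names, fields, and thresholds are all assumptions.

```python
# Hypothetical handover check between automated and manual driving.
# All names and thresholds are illustrative, not a real vehicle API.
from dataclasses import dataclass


@dataclass
class DriverState:
    alertness: float    # 0.0 (asleep) .. 1.0 (fully alert)
    eyes_on_road: bool  # gaze currently directed at the roadway


def can_hand_back_control(state: DriverState, min_alertness: float = 0.6) -> bool:
    """Return True only if the driver can safely resume manual driving."""
    return state.alertness >= min_alertness and state.eyes_on_road


def next_mode(current_mode: str, state: DriverState) -> str:
    """Hand control back when the driver is fit; take over when they are not."""
    if current_mode == "automated" and can_hand_back_control(state):
        return "manual"      # safe to return control to the driver
    if current_mode == "manual" and not can_hand_back_control(state):
        return "automated"   # system takes over and drives
    return current_mode
```

For example, a driver with low alertness in manual mode would trigger a takeover, while an alert driver with eyes on the road would be allowed to resume driving.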
Improving the in-vehicle technology of self-driving cars: In a fully automated car, no one needs to drive, so occupants are free to work or take naps.
That makes in-cabin comfort and productivity a design priority. One way to address it is to use EmoVu technology to understand how occupants are feeling and adapt the driving experience accordingly.
Toyota's Mobility Teammate Concept pursues advanced autonomous driving technologies in which humans and cars cooperate with one another; the approach also features in the Toyota Concept-i vehicles.
Hence, Toyota recognizes that technology like EmoVu matters when designing the HMI for its cars.
EmoVu: Software that recognizes human emotions
Eyeris developed software that reads facial expressions through a camera mounted in the car's dashboard. It scans the driver's face and estimates both emotional state, such as happiness or sadness, and level of alertness.
Eyeris collected 3 million facial-expression samples from people while driving. The data spanned five races, four age groups, both male and female drivers, 10 lighting conditions, and 13 head poses, as well as accessories such as eyeglasses, hats, and sunglasses, and a variety of facial occlusions, for example a driver holding a cellphone to the ear.
Basic emotions are innate: they appear on the face involuntarily, and people of any age, gender, or race express them in similar ways. This is what allows the EmoVu system to read a driver's feelings, classifying the expression as happiness, surprise, sadness, disgust, fear, or anger.
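The six-way classification above can be illustrated with a small sketch. The scores here are placeholders; a real system like EmoVu would produce them from a face image via a deep network, and the function name is an assumption for illustration.

```python
# Mapping per-emotion classifier scores to the six universal emotions.
# The score vector is a placeholder for a deep network's output.
EMOTIONS = ["happiness", "surprise", "sadness", "disgust", "fear", "anger"]


def dominant_emotion(scores: list[float]) -> str:
    """Return the emotion whose score is highest."""
    if len(scores) != len(EMOTIONS):
        raise ValueError("expected one score per emotion")
    # Pick the index of the maximum score and look up its label.
    best_index = max(range(len(scores)), key=scores.__getitem__)
    return EMOTIONS[best_index]
```

A score vector weighted toward the second position, for instance, would be labeled "surprise".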
Emotions show mainly through facial expressions, but EmoVu also monitors head position, posture, and movement, along with the eye region (openness of the eyes, gaze direction, drowsiness, blinking) and more, to judge whether a driver is alert. Cues such as drooping eyelids or yawning reveal drowsiness even when the facial expression itself gives little away.
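Combining cues like these into an alertness estimate might look like the following sketch. The weights, cue thresholds, and function names are invented for illustration and are not Eyeris's actual model.

```python
# Toy alertness score from the cues named above: eye openness,
# blink rate, and head pitch. All weights/thresholds are assumptions.
def alertness_score(eye_openness: float, blinks_per_min: float,
                    head_pitch_deg: float) -> float:
    """Combine cues into a 0..1 alertness estimate (1.0 = fully alert)."""
    score = 1.0
    score -= 0.5 * (1.0 - eye_openness)   # drooping eyelids lower alertness
    if blinks_per_min > 25:               # unusually rapid blinking
        score -= 0.2
    if abs(head_pitch_deg) > 20:          # head nodding or drooping
        score -= 0.3
    return max(0.0, min(1.0, score))      # clamp to [0, 1]


def is_drowsy(eye_openness: float, blinks_per_min: float,
              head_pitch_deg: float, threshold: float = 0.5) -> bool:
    """Flag the driver as drowsy when the combined score falls too low."""
    return alertness_score(eye_openness, blinks_per_min, head_pitch_deg) < threshold
```

Fully open eyes, a normal blink rate, and a level head would score as alert; half-closed eyes with rapid blinking and a nodding head would be flagged as drowsy.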
A computer must recognize three things about a driver's condition: Driver Inattention, Cognitive Awareness, and Emotional Distraction. Humans perceive these naturally, but until now they have been difficult for computers.
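The three signals named above can be modeled as a small data structure. The field names, scales, and fitness rule below are illustrative assumptions, not how EmoVu actually represents driver condition.

```python
# Hypothetical container for the three driver-condition signals the
# text names; the 0..1 scales and the fitness rule are assumptions.
from dataclasses import dataclass


@dataclass
class DriverCondition:
    inattention: float            # 0 = eyes on road, 1 = fully inattentive
    cognitive_awareness: float    # 0 = unaware, 1 = fully aware
    emotional_distraction: float  # 0 = calm, 1 = highly distracted

    def fit_to_drive(self) -> bool:
        """All three signals must be within safe bounds at once."""
        return (self.inattention < 0.5
                and self.cognitive_awareness > 0.5
                and self.emotional_distraction < 0.5)
```

Requiring all three bounds at once reflects the point in the text: a driver who is attentive but emotionally distracted, or aware but inattentive, is still not fit to drive.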
Eyeris has been working on ways to make cars safer. Its EmoVu software recognizes human emotions from facial expressions and head positioning; by gauging a driver's alertness, it can detect whether their attention has drifted from the road ahead to something else.