
Fujitsu says it has developed an artificial intelligence (AI)-based technology that tracks changes in users' facial expressions, including states such as nervousness or confusion.
ZDNet reported that Fujitsu recently announced it had successfully developed a solution for tracking user emotions more accurately using facial recognition technology.
According to Fujitsu, the technology, developed at the company's laboratories, can detect the changes in expression that signal states such as nervousness or confusion.

Fujitsu's new facial recognition technology can identify many complex emotional states. (Photo: ZDNet).
Before Fujitsu, companies such as Microsoft had built similar facial recognition tools. However, most of today's tools are limited to detecting eight basic states: joy, sadness, fear, disgust, anger, surprise, hope and trust.
Fujitsu's new technology, by contrast, works by identifying action units (AUs): specific facial movements that can be associated with particular emotions.
For example, if the system detects the AUs "cheeks lift" and "corners of the lips stretch" occurring at the same time, the AI concludes that the user being analyzed is in a happy state.
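The rule described above can be sketched as a simple lookup from co-occurring AUs to an emotion label. This is a minimal illustration of the idea, not Fujitsu's actual model; the AU names and the rule table are assumptions made up for the example.

```python
# Hypothetical mapping from sets of detected action units (AUs) to emotions.
# AU names and rules are illustrative only.
AU_CHEEK_RAISER = "cheek_raiser"            # cheeks lift
AU_LIP_CORNER_PULLER = "lip_corner_puller"  # lip corners stretch
AU_BROW_LOWERER = "brow_lowerer"

EMOTION_RULES = {
    frozenset({AU_CHEEK_RAISER, AU_LIP_CORNER_PULLER}): "happy",
    frozenset({AU_BROW_LOWERER}): "confused",
}

def classify(detected_aus):
    """Return the first emotion whose required AUs are all present."""
    detected = set(detected_aus)
    for required, emotion in EMOTION_RULES.items():
        if required <= detected:  # all required AUs were detected
            return emotion
    return "neutral"

print(classify([AU_CHEEK_RAISER, AU_LIP_CORNER_PULLER]))  # happy
print(classify([]))                                       # neutral
```

A production system would of course score AUs probabilistically rather than with hard rules, but the lookup captures the basic AU-to-emotion pipeline the article describes.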
“The problem with current technology is that the AI needs to be trained on huge data sets for every AU. It needs to learn to recognize each AU from every possible angle and position, but we do not have enough stock images for that, so accuracy suffers,” Fujitsu representatives told ZDNet.
However, Fujitsu claims to have solved this problem. Instead of collecting many more images for AI training, the company found a way to extract more data from each photo.
The key is what it provisionally calls a "standardization process", which converts an image taken from any angle into a frontal image. After appropriate scaling and rotation, the newly generated frontal image allows the AI to detect AUs far more easily and accurately.
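One common way to implement this kind of frontalization step is to estimate a geometric transform that maps facial landmarks detected in an off-angle photo onto canonical frontal positions. The sketch below uses a plain least-squares affine fit with NumPy; the landmark coordinates are made-up assumptions, and Fujitsu's actual process is not public.

```python
# Illustrative "standardization" step: warp detected landmarks toward
# canonical frontal positions with an affine transform (assumed approach).
import numpy as np

# Canonical frontal landmark positions (e.g. two eyes and nose tip), in pixels.
frontal = np.array([[30.0, 40.0], [70.0, 40.0], [50.0, 70.0]])
# The same landmarks as detected in a rotated, differently scaled photo.
detected = np.array([[35.0, 50.0], [75.0, 45.0], [58.0, 78.0]])

# Solve detected @ A + b ≈ frontal in least squares; the ones column
# folds the translation b into the unknown parameter matrix.
X = np.hstack([detected, np.ones((3, 1))])
params, *_ = np.linalg.lstsq(X, frontal, rcond=None)

normalized = X @ params  # landmarks mapped to frontal positions
print(np.allclose(normalized, frontal))  # True: 3 points fix an affine map exactly
```

In practice the same estimated transform would then be applied to the whole image (and more than three landmarks would be used), so that AU detectors trained mostly on frontal faces can be run on the normalized result.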
“With the same limited data set, we can detect more AUs, even in photos taken from an oblique angle, for example. And with more AUs, we can identify more complex and subtle emotions,” a Fujitsu spokesperson said.
Mr. Nerd