Reading faces; reading minds 

By Dawn Wiseman

This detail is from the cover of a new book presenting Concordia researchers’ digital reading of facial expressions designed to improve security in high traffic areas.

Who knows what evil lurks in the hearts of men? Prabir Bhattacharya, Canada Research Chair in Information Systems, might. Or at least his computer might.

For the last three years, Bhattacharya (Concordia Institute for Information Systems Engineering) and graduate student Abu Sayeed Sohail have been working to develop a computer vision system that detects and classifies human facial expressions.

The results of their research to date were recently published by Verlag Dr. Müller in Classification of Human Facial Expression: A Prospective Application of Image Processing and Machine Learning (2008).

Why would you want to teach a machine to tell whether someone is happy or sad?
As Bhattacharya explained, Japanese banks have been using computers to judge customer mood for quite some time.

Small digital cameras snap photos of each face entering the bank, sending them through an internal network to a computer where they are compared with a database of known facial expressions. By examining details like the lift of the lips and the position of the eyebrows, the computer can make a pretty good guess about whether the customer is happy, tired, angry or sad.

“Tellers are then alerted to customers’ moods and can serve the client accordingly,” explained Bhattacharya.
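The banks’ systems are not described in detail here, but the pattern the article sketches (compare a new face against a database of known expressions and pick the closest match) can be illustrated in a few lines of Python. The feature values and the nearest-neighbour matching rule below are illustrative assumptions, not the actual software used by the banks or the Concordia researchers.

import numpy as np

# Hypothetical database: one feature vector per known expression. In a real
# system the numbers would encode measurements such as the lift of the lips
# or the position of the eyebrows; here they are placeholders.
EXPRESSION_DB = {
    "happy": np.array([0.80, 0.10, 0.65]),
    "tired": np.array([0.20, 0.05, 0.30]),
    "angry": np.array([0.15, 0.90, 0.25]),
    "sad":   np.array([0.10, 0.30, 0.20]),
}

def classify_expression(features):
    """Return the stored expression whose feature vector is closest to the query."""
    return min(EXPRESSION_DB,
               key=lambda label: np.linalg.norm(features - EXPRESSION_DB[label]))

# A query face whose measurements fall closest to the "happy" template.
print(classify_expression(np.array([0.78, 0.12, 0.60])))  # happy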

While understanding a client’s state of mind is good for business, most people who work with the public could likely match a facial expression to a mood in a very short time. In fact, Bhattacharya pointed out that humans are very good at reading expressions.

“This type of application has great potential in high traffic areas where security is a primary concern.” Here computers have an advantage over humans solely because of their capacity to process large quantities of information quickly.

“Imagine an airport,” Bhattacharya said. Each day thousands and thousands of people proceed through airports on their way to or from somewhere else. As we know all too well, not all of them are traveling for business or pleasure.

“If we could take random photos of the crowd and process them fast enough, there is the potential to identify those people who might be problematic before they become so,” he explained. One computer could do the work of hundreds of security agents.

Facial expressions do not actually involve the entire face, but rather specific sets of muscles near the eyes, nose and mouth. Bhattacharya and Sohail’s system measures 15 key points on the face and compares these measurements against images of identifiable facial expressions.
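The book lays out the exact measurements and learning methods; the sketch below only illustrates the general recipe described in this article: locate 15 key points on a face, turn them into a feature vector, and compare that vector against examples of known expressions. The pairwise-distance features and the k-nearest-neighbour classifier are assumptions made for illustration, not the authors’ published method.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

N_KEYPOINTS = 15  # the system measures 15 key points per face
# Seven basic expression labels (names assumed for illustration).
EXPRESSIONS = ["anger", "disgust", "fear", "happiness",
               "neutral", "sadness", "surprise"]

def keypoints_to_features(points):
    """Convert a (15, 2) array of key-point coordinates into a vector of
    pairwise distances, a simple stand-in for measurements such as the lift
    of the lips or the position of the eyebrows."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    rows, cols = np.triu_indices(N_KEYPOINTS, k=1)
    return dists[rows, cols]

# Train on labelled example faces (random placeholders stand in for real data).
rng = np.random.default_rng(0)
train_points = rng.random((70, N_KEYPOINTS, 2))
train_labels = [EXPRESSIONS[i % len(EXPRESSIONS)] for i in range(70)]
features = np.array([keypoints_to_features(p) for p in train_points])
classifier = KNeighborsClassifier(n_neighbors=3).fit(features, train_labels)

# Classify the key points measured on a new face image.
new_face = rng.random((N_KEYPOINTS, 2))
print(classifier.predict([keypoints_to_features(new_face)])[0])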

Although there is great variety in expression across both individuals and cultures, they have identified seven basic expressions that seem to be relatively universal, and for which their system is fairly accurate.

“We tested using query images against stock databases and the system did quite well, even against the Japanese Female Facial Expression Database (JAFFE).”

According to Bhattacharya, Japanese women are the litmus test for all facial expression recognition. “They, and actually Chinese women, generally have less variation in their expressions than most other people on the planet, including Japanese men. If you do well against JAFFE, the program can be applied with good result just about anywhere.”
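Testing against a labelled collection such as JAFFE usually comes down to checking how often the predicted expression matches the known label. A minimal accuracy check, reusing the hypothetical classifier and keypoints_to_features helper from the sketch above, might look like this:

def accuracy(classifier, test_points, test_labels):
    """Fraction of test faces whose predicted expression matches its label."""
    correct = sum(
        classifier.predict([keypoints_to_features(points)])[0] == label
        for points, label in zip(test_points, test_labels)
    )
    return correct / len(test_labels)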

 
