Emotion recognition for good or evil?

In the 1970s, Paul Ekman pioneered the study of facial expressions by cataloging more than 5,000 muscle movements to show how expressions reveal hidden emotions (1). Today, AI and machine learning are teaching machines to understand and react to our emotions.

Vaish and Kumari define emotion as “a psycho-physiological process triggered by conscious and unconscious perception of an object or situation and […] can be expressed either verbally or by non-verbal cues such as tone of voice, facial expression, gesture and physiological behavior” (2).

It is argued that emotions will play a key role in enhancing AI experiences through more natural human-computer interaction (3, 4). Emotive analytics is a blend of psychology and technology (5). There are many ways to analyze human emotion, including facial expression recognition (6), emotion classification from EEG data (7), and speech emotion recognition (4). Multimodal fusion, i.e. combining several of these into a single representation, tends to produce the best results in emotion recognition (8), as the sketch below illustrates.
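To make the fusion idea concrete, here is a minimal late-fusion sketch in Python. Everything in it is hypothetical: real systems train the per-modality models and often learn the fusion weights rather than simply averaging.

```python
# Late fusion: each modality model emits a probability distribution
# over the same emotion classes; average them into one prediction.
import numpy as np

EMOTIONS = ["anger", "happiness", "sadness", "surprise"]  # illustrative subset

def fuse_predictions(per_modality_probs):
    """Average per-modality class probabilities and pick the top class."""
    fused = np.mean(per_modality_probs, axis=0)
    return EMOTIONS[int(np.argmax(fused))], fused

# Hypothetical outputs from a face model, an EEG model and a speech model
face   = np.array([0.10, 0.70, 0.10, 0.10])
eeg    = np.array([0.20, 0.50, 0.20, 0.10])
speech = np.array([0.05, 0.60, 0.25, 0.10])

label, fused = fuse_predictions([face, eeg, speech])
print(label, fused)  # -> happiness, plus the averaged distribution
```

Averaging like this is the simplest form of decision-level fusion; feature-level fusion, where raw modality features are concatenated before classification, is the other common approach (8).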

With the influx of different types of robots, e.g. Google’s Atlas, assistants such as Alexa and Siri, and even talking heads such as Furhat, emotions are becoming an essential part of creating believable machine characters. One of the most widely used emotion models for machines is the Ortony, Clore and Collins (OCC) model, which specifies 22 emotion categories and applies these to situations classified either as “being goal relevant events, as acts of an accountable agent (including itself), or as attractive or unattractive objects” (3).
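As a toy illustration of how that top-level appraisal could look in code, here is a Python sketch; the branch names and the handful of emotion labels are a drastic simplification of the full 22-category OCC model.

```python
# Toy OCC-style appraisal: a stimulus is judged either as an event,
# as the act of an accountable agent, or as an (un)attractive object,
# and each branch yields different candidate emotions.

def occ_candidate(kind, positive):
    """Map a coarse appraisal to one illustrative OCC emotion label."""
    if kind == "event":         # consequences of events
        return "joy" if positive else "distress"
    if kind == "agent_act":     # acts of agents, including the machine itself
        return "admiration" if positive else "reproach"
    if kind == "object":        # attractive or unattractive objects
        return "liking" if positive else "disliking"
    raise ValueError(f"unknown appraisal kind: {kind!r}")

print(occ_candidate("event", positive=True))       # joy
print(occ_candidate("agent_act", positive=False))  # reproach
```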

Furhat is a Swedish startup developing a robotic platform that combines natural voice and gesture to create a more social, human experience. Furhat can follow and interact with several humans simultaneously (9). Interestingly, making robots seem human-like also has its drawbacks. The presence of an animated face raises users’ expectations, and robots that can express emotions are also expected to understand them (3).

A lot of the current emotion recognition research is done by large players such as Facebook and Google. As Simon Chan mentioned during the lecture, machine learning requires large volumes of data, and perhaps this is why emotion recognition research currently seems to be happening mainly in the private sector.

On average, two days of video material are uploaded to YouTube every minute. Besides speech-to-text recognition, this opens the door to exploiting acoustic information as well (8), but the data is challenging to use because of the diversity of scenes in terms of head pose, illumination, occlusion and background noise (10). Emotion recognition research is therefore shifting “into the wild,” as “lab controlled data poorly represents the environment and conditions faced in real-world situations” (11).
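As a small taste of what “in the wild” preprocessing involves, here is a hedged OpenCV sketch that tackles just the illumination problem with histogram equalization; the file name is hypothetical, and head pose, occlusion and background noise require far heavier machinery than this.

```python
# Crude illumination normalization before face detection, using
# opencv-python (cv2). This only addresses lighting variance; pose
# and occlusion need dedicated models.
import cv2

def normalize(frame):
    """Grayscale conversion plus histogram equalization."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.equalizeHist(gray)

# OpenCV ships a pretrained frontal-face Haar cascade.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")  # hypothetical video frame on disk
gray = normalize(frame)
for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
    face_crop = gray[y:y+h, x:x+w]  # would be fed to an emotion classifier
```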

Disney recently announced that it is using AI to determine how audiences enjoy its films. Together with Caltech, it used infrared cameras to film the audiences of 150 showings of nine movies and now claims it can predict when a moviegoer will smile or laugh (12).

A couple of weeks ago we heard about IBM’s efforts to use expression recognition for emotion tracking, Microsoft has its own “cognitive services,” and most likely both Alexa and Siri will soon be listening for signs of emotion (13). There have also been some interesting acquisitions: Apple bought Emotient (13) and Facebook acquired FacioMetrics (14).

Silver (15) argues that since companies running social software are already tracking our behavior, tracking our actual faces is just an inevitable escalation. This worries many experts, who believe that machines that recognize our emotions could be used for covert purposes (16). Unfortunately, the first examples of this kind of behavior are already surfacing. A leaked document revealed that Facebook had offered advertisers the opportunity to target 6.4 million younger users during moments of psychological vulnerability (17). Nor is this the first time Facebook has played around with our emotions: in 2012 it conducted a mass experiment on 700,000 users to see whether manipulated News Feeds could affect their mood (17). Now patent filings have emerged indicating that Facebook will start using webcams and smartphone cameras to read our emotions (13).

It’s not all evil, though. Facebook also rolled out suicide prevention tools a couple of years ago (18), and this month it added 3,000 people to improve its response (19). As researcher Sarah Strohkorb puts it, “It’s really exciting to see robots used to help change behavior for the better” (20). Let’s hope that good prevails, and that emotion recognition will be used to positively revolutionize our machine interactions in the coming years!


References:

(1) Dwoskin, E. & Rusli, E.M. (2015, Jan 28). The Technology that Unmasks Your Hidden Emotions. The Wall Street Journal. Retrieved at: https://www.wsj.com/articles/startups-see-your-face-unmask-your-emotions-1422472398.

(2) Vaish, A., & Kumari, P. (2014). A comparative study on machine learning algorithms in emotion state recognition using ECG. In Proceedings of the Second International Conference on Soft Computing for Problem Solving (SocProS 2012), December 28-30, 2012 (pp. 1467-1476). Springer, New Delhi.

(3) Bartneck, C., Lyons, M. J., & Saerbeck, M. (2017). The relationship between emotion models and artificial intelligence. arXiv preprint arXiv:1706.09554.

(4) Han, K., Yu, D., & Tashev, I. (2014). Speech emotion recognition using deep neural network and extreme learning machine. In Fifteenth Annual Conference of the International Speech Communication Association.

(5) Doerrfeld, B. (2016, August 11). 20+ Emotion Recognition APIs That Will Leave You Impressed, and Concerned. Retrieved at: http://nordicapis.com/20-emotion-recognition-apis-that-will-leave-you-impressed-and-concerned/

(6) Shan, C., Gong, S., & McOwan, P. W. (2009). Facial expression recognition based on local binary patterns: A comprehensive study. Image and Vision Computing, 27(6), 803-816.

(7) Wang, X. W., Nie, D., & Lu, B. L. (2014). Emotional state classification from EEG data using machine learning approach. Neurocomputing, 129, 94-106.

(8) Cambria, E. (2016). Affective computing and sentiment analysis. IEEE Intelligent Systems, 31(2), 102-107.

(9) Kite-Powell, J. (2016, March 14). From Emotive Heads To Google’s Atlas, Our Relationship With Robots Is Changing. Forbes. Retrieved at: https://www.forbes.com/sites/jenniferhicks/2016/03/14/from-emotive-heads-to-googles-atlas-robot-our-relationship-with-robots-is-changing/#3ad975c8aa56.

(10) Dhall, A., Ramana Murthy, O. V., Goecke, R., Joshi, J., & Gedeon, T. (2015, November). Video and image based emotion recognition challenges in the wild: Emotiw 2015. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction (pp. 423-426). ACM.

(11) Dhall, A., Goecke, R., Joshi, J., Sikka, K., & Gedeon, T. (2014, November). Emotion recognition in the wild challenge 2014: Baseline, data and protocol. In Proceedings of the 16th International Conference on Multimodal Interaction (pp. 461-466). ACM.

(12) Brown, J. (2017, July 26). Disney Is Building Facial Recognition to Figure Out When You’ll Laugh During Toy Story 5. Gizmodo. Retrieved at: http://gizmodo.com/disney-is-building-facial-recognition-to-figure-out-whe-1797267294.

(13) McStay, A. (2017, July 15). Tech firms want to detect your emotions and expressions, but people don’t like it. LSE Business Review. Retrieved at: http://blogs.lse.ac.uk/businessreview/2017/07/15/tech-firms-want-to-detect-your-emotions-and-expressions-but-people-dont-like-it/.

(14) Constine, J. (2016, November 16). Like by smiling? Facebook acquires emotion detection startup FacioMetrics. TechCrunch. Retrieved at: https://techcrunch.com/2016/11/16/facial-gesture-controls/.

(15) Silver, C. (2017, June 8). Patents Reveal How Facebook Wants To Capture Your Emotions, Facial Expressions And Mood. Forbes. Retrieved at: https://www.forbes.com/sites/curtissilver/2017/06/08/how-facebook-wants-to-capture-your-emotions-facial-expressions-and-mood/#30cdfaa76014.

(16) Scharr, J. (2014, January 30). Facial-Recognition Tech Can Read Your Emotions. Live Science. Retrieved at: https://www.livescience.com/42975-facial-recognition-tech-reads-emotions.html

(17) Tiku, N. (2017, May 21). Get ready for the next big privacy backlash against Facebook. Wired. Retrieved at: https://www.wired.com/2017/05/welcome-next-phase-facebook-backlash/.

(18) Gibbs, S. (2015, February 26). Facebook rolls out new suicide prevention and support tools. The Guardian. Retrieved at: https://www.theguardian.com/technology/2015/feb/26/facebook-rolls-out-new-suicide-prevention-support-tools

(19) Menegus, B. (2017, May 3). Facebook Will Add 3,000 More People to Watch Murders and Suicides. Gizmodo. Retrieved at: http://gizmodo.com/facebook-to-add-3-000-more-people-to-watch-murders-and-1794877532.

(20) Revell, T. (2017, March 16). Robot eavesdrops on men and women to see how much they talk. New Scientist. Retrieved at: https://www.newscientist.com/article/2124930-robot-eavesdrops-on-men-and-women-to-see-how-much-they-talk/.


2 comments on “Emotion recognition for good or evil?”

  1. Thanks, Ghita, for your collection of thoughts and sources on the topic of emotion recognition. I can see how the technology could be misused for “evil,” and yet I get the feeling that it is still a rather positive development that may even allow us to become “better” humans. [1] I love how your article inspires one to think about the importance of emotions for us humans, and how often we don’t know how to handle our emotions, detect them in friends, or understand why we feel certain ways at certain times. If we can get help with that through technology, of whatever kind, I can imagine that we could lead more fulfilling and even more compassionate lives. To have robot friends or an AI that feels our emotions and understands us, maybe that would cure a lot of people’s broken hearts.

    [1] https://tedxinnovations.ted.com/2015/08/12/playlist-4-talks-on-technology-to-make-us-better-humans/

  2. Thank you for this interesting article. I certainly also hope that good prevails, as you say, because emotion recognition could be a really powerful tool for understanding people’s emotions and helping them cope with their feelings. I can see this technology being used in the healthcare industry, providing therapeutic assistance to medical patients and helping doctors know how a patient is doing.
    However, the power of emotion recognition will definitely attract the attention of people looking to exploit the information for profit. Take, for example, the mobile game industry, primarily freemium game developers. It is known that these games target the small number of people who spend big on in-app purchases by analyzing their profiles and designing tailor-made offers for them (http://toucharcade.com/2015/09/16/we-own-you-confessions-of-a-free-to-play-producer/). The developers are able to do this just by looking for patterns in data. If you add another layer capable of recognizing emotions, the manipulation could be taken to a whole new level. Let’s hope that this won’t happen, as it could spell disaster for “vulnerable” freemium game players.

