Can Businesses Use “Emotional” Artificial Intelligence?

Raghav Bharadwaj

Raghav serves as an analyst at Emerj, covering AI trends across major industries and conducting qualitative and quantitative research. He previously worked for Frost & Sullivan and Infiniti Research.


Episode summary: This week on AI in Industry, we speak with Rana el Kaliouby, co-founder and CEO of Affectiva, about how machine vision can be applied to detecting human emotion – and the business value of emotionally aware machines.

Enterprises leveraging cameras today to understand customer engagement and emotion will find Rana’s insights engaging, particularly her predictions about the future of marketing and automotive applications.

We’ve had guests on our podcast say that the cameras of the future will most likely be set up for their outputs to be interpreted by AI rather than by humans. Increasingly, machine vision technology is being used in sectors like automotive, security, marketing, and heavy industry – machines making sense of data and relaying information to people. Emotional intelligence is an inevitable next step in our symbiotic relationship with machines, and in this interview we explore the trend in depth.

(This article is part of a series of articles about artificial intelligence in Boston. Interested readers can visit the full Boston AI ecosystem analysis here.)

Subscribe to our AI in Industry Podcast with your favorite podcast service:

Guest: Rana el Kaliouby, CEO & co-founder of Affectiva

Expertise: Artificial intelligence, computer science, emotionally aware machines

Brief recognition: Rana earned BSc and MSc degrees in computer science from the American University in Cairo before going on to receive a Ph.D. from the University of Cambridge and a postdoctoral fellowship at MIT. She then served at MIT as a Research Scientist for four years while co-founding Affectiva, where she serves as CEO and Chief Science Officer.

Current Affiliations: Rana is a member of the World Economic Forum’s Global Future Council on Robotics and Artificial Intelligence and part of the Young Global Leaders class of 2017.

Big Idea

While machine vision has gained massive popularity since the dawn of the machine learning renaissance in 2011, emotional awareness in machine vision is still a nascent field – at least in terms of business applications. According to Rana, a traditional pain point in market research and advertising is that organizations spend millions of dollars on campaigns aimed at creating emotional connections with people, but they’re often unable to gauge the response beyond clicks and mouse activity.

Rana says that machine vision can be applied today to observe customers’ reactions and responses to a given campaign in real time. For example, when a customer watches an ad on their mobile device, they can be asked to allow the camera to study their reaction.

AI software can then detect micro-expressions on the customer’s face throughout the ad in real time (such as a small smile, or a look of surprise or confusion) and categorize the overall reaction as positive or negative sentiment. Rana tells us that this technology is being used in areas like market research, the automotive industry, and security.
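To make the aggregation step concrete, here is a minimal sketch in Python of how per-frame expression scores might be rolled up into an overall positive-or-negative reaction. The expression names, score format, and decision rule are all assumptions for illustration – this is not Affectiva’s SDK or Rana’s actual method, just one plausible way to implement the idea.

```python
# Minimal sketch: aggregating per-frame facial-expression scores into an
# overall ad reaction. The per-frame scores are assumed to come from a
# hypothetical emotion classifier run on each video frame.

from statistics import mean

# Expressions treated as positive vs. negative signals (an assumption).
POSITIVE = {"smile", "surprise"}
NEGATIVE = {"confusion", "disgust", "brow_furrow"}

def overall_sentiment(frame_scores):
    """frame_scores: one dict per video frame, mapping an expression
    name to a probability in [0, 1]."""
    pos = mean(max(f.get(e, 0.0) for e in POSITIVE) for f in frame_scores)
    neg = mean(max(f.get(e, 0.0) for e in NEGATIVE) for f in frame_scores)
    return "positive" if pos >= neg else "negative"

# Example: three frames from a viewer watching an ad.
frames = [
    {"smile": 0.1, "confusion": 0.6},  # puzzled at the opening
    {"smile": 0.7, "surprise": 0.4},   # amused by the twist
    {"smile": 0.8},                    # smiling at the end
]
print(overall_sentiment(frames))  # -> "positive"
```

A production system would of course use a calibrated model and a richer decision rule than this mean-of-maxima heuristic, but the shape of the pipeline – frames in, per-frame expression scores, one aggregated sentiment out – is the same.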

Rana provides an example in the automotive field. She says that under current regulations around self-driving vehicles, there must be a human co-pilot in the vehicle at all times, and the car will need to hand control back to the human operator at some point. In such a scenario, the system needs to know that the human operator is in the right condition to take over control of the vehicle – not just that the driver is sitting up, but that he or she is awake, sober, alert, and able to drive.

Rana explains that machine vision can be used to detect emotions and expressions that give the system an awareness of the situation. She says that AI can today detect humans yawning, track their ‘blink rate’, and actively ‘assess’ how much cognitive load the interfaces are putting on the co-pilot. The system can help identify if the co-pilot is confused or frustrated.
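As a rough illustration of the ‘blink rate’ idea, the sketch below uses the eye aspect ratio (EAR), a common landmark-based blink-detection heuristic (Soukupová & Čech, 2016). The face-landmark model that would supply the eye points per frame, and the threshold values, are assumptions here – this is not any specific driver-monitoring product.

```python
# Sketch: blink counting via the eye aspect ratio (EAR). A real
# driver-monitoring system would obtain the six eye landmarks per frame
# from a face-landmark model; here we show only the EAR math and a
# simple blink counter.

import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks for one eye, ordered p1..p6 as in
    Soukupová & Čech (2016): p1/p4 are the corners, p2/p3 the upper
    lid, p6/p5 the lower lid. EAR drops sharply when the eye closes."""
    vertical = math.dist(eye[1], eye[5]) + math.dist(eye[2], eye[4])
    horizontal = math.dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

EAR_THRESHOLD = 0.21   # below this, the eye is treated as closed (assumed value)
MIN_CLOSED_FRAMES = 2  # consecutive closed frames that count as one blink

def count_blinks(ear_per_frame):
    """Count blinks in a sequence of per-frame EAR values."""
    blinks, closed = 0, 0
    for value in ear_per_frame:
        if value < EAR_THRESHOLD:
            closed += 1
        else:
            if closed >= MIN_CLOSED_FRAMES:
                blinks += 1
            closed = 0
    return blinks

# At 30 fps, blinks per minute = count_blinks(ears) * 1800 / len(ears).
# An unusually low blink rate, or long closed-eye runs, can flag drowsiness.
```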

The transferable lesson here is that AI and cameras are expected to be embedded in all types of environments, gauging customer responses to products or services and monitoring human activity and behavior in critical situations (such as driving or operating heavy machinery). We can expect some degree of “emotional intelligence” to be embedded in everyday consumer use-cases of AI in the years ahead.

Interview Highlights with Rana el Kaliouby from Affectiva

The main questions Rana answered on this topic are listed below. Listeners can use the embedded podcast player (at the top of this post) to jump ahead to sections they might be interested in:

  • (2:55) Where is AI being used today to detect and respond to human emotion through computer vision?
  • (10:50) What kinds of tangible business outcomes can result from using AI to detect human emotions?
  • (16:00) In two to five years, where do you see this technology being applied most actively (i.e. where will it make a difference in everyday life and business)?


Header image credit: Adobe Stock
