A Blog by Jonathan Low


Jun 7, 2012

Does Your Phone Know How Happy You Are? The Emotion Recognition Industry Arrives

We know. You're thinking, 'there's an emotion recognition INDUSTRY? Why didn't I get the memo?'

And the next question probably is, 'who's making a buck out of this?' Advertising is the place to start. Neuroscience research is uncovering new data on how the brain responds to various impulses. The power of emotion has long been recognized, but the neural pathways are less well understood. If emotion can be identified, isolated, measured and managed, it is conceivable that influence can be brought to bear on customer purchase decisions.

Ethical concerns have been raised about the use of such information, but as consumers continue to waive ever greater degrees of privacy in return for discounts and other emoluments, the opposition to these sorts of inducements has lessened - or seems beside the point. The flip side, i.e., the selling proposition, is that by better understanding your emotional needs, businesses can more effectively craft offerings that matter to consumers.

The question of whether it should be done has been subsumed by the excitement over the fact that it can be done. The twain shall meet eventually, but given the trendline, we're betting that convenience and self-interest will prevail, as usual. JL

Kit Eaton reports in Fast Company:
Who among us has never growled like a cave person at their laptop? Or giggled uncontrollably into a smartphone? Exactly. So there are good reasons why the emotion-recognition industry is quickly gearing up--and leading to innovations with all sorts of interesting implications.

Nuance, which makes PC voice recognition systems and the tech that powers Apple's famous Siri digital PA, has revealed that its next tech is voice recognition in cars and for TVs.
But the firm also wants to add more than voice recognition in an attempt to build a real-life KITT--it wants to add emotion detection so its system can tell how you're feeling while you gab away. Combined with other advances, this could turn the ways we interact with our devices into powerful paths to find new services and information.

Nuance's chief of marketing, Peter Mahoney, spoke to the Boston Globe last week about the future of the company's tech, and noted that in a driving environment emotion detection could be a vital tool. For example, if your car thinks you sound stressed, it may SMS your office to say you're late, or even automatically suggest another route that avoids traffic. (Or how about a voice-controlled Ford system that starts playing you, say, Enya to calm the nerves?) Soon enough, you may deviate from your existing "shortest route" algorithms, while being whisked to parts of the city you never otherwise visit. Along the way, you might discover a more pleasant route to the office, or a new place to buy coffee.

But Nuance says it has far bigger plans to make your emotional input valuable: It's looking into ways to monetize its voice systems, including your emotional input, to directly recommend services and venues to you. This could come from Nuance itself, or from a partner like Apple--which we know has some innovative plans already brewing for smarter advertising. The implications of this are considerable: What if when you ask Siri for information about a movie, she works out that you're sad and recommends a comedy film that you otherwise wouldn't have seen, paired with an ad campaign?

This sort of direct emotional input to advertising is already being tested, courtesy of an MIT spin-off firm, Affectiva, which can detect a user's emotions from a video feed of their face and thus gain access to something that's often rather ephemeral, albeit important from an advertiser's perspective: how consumers feel about a brand or an ad. The company just revealed it's earned a $500,000 grant from the National Science Foundation to add to its millions in existing funding, and notes that its tech has already been used experimentally by publications like Forbes to crowdsource readers' responses to ads shown on the company's website, and also to help design ad campaigns for the Super Bowl.

But why deal with something as complicated as one's voice or face when skin can be used to expose feelings? A firm called Sensum uses your galvanic skin response to measure your sweat levels, and thus infer how frightened you are when watching a movie. The company tested its systems out at the SXSW film event, triggering more and more gore in a specially arranged horror movie as viewers' heart rates spiked. In the future the company envisages its tech being used to reward consumers for taking part--perhaps by giving them social media spending points proportional to their fear factor. Microsoft is even building more sophisticated emotion recognition into its Kinect device, meaning next-gen games (and, yes, ads) will be able to react to your facial expression.

Because the smartphones we all carry contain sophisticated computing power, cloud computing connections and, increasingly, a front-facing webcam, it's easy to see that the next generation of advertising will determine how you're feeling and subsequently serve up information related to your mood. And it's not just a question of detecting your mood; it's all about how this leads the person expressing the mood to discover new information. Essentially, advertising will be more relevant to the moment, sophisticated games will react to your emotionality, and even your car will route you to entirely new destinations based on how you're feeling.

Borrowing a phrase from HAL 9000, it looks like our near-future Siri-like systems will not only advise us to "sit down calmly and take a stress pill"--they'll also suggest the best chair and medicine for the job.
