Chris Stokel-Walker reports in New Scientist:
Microsoft has developed an artificial intelligence for its Teams videoconferencing software that aims to put people presenting a remote talk more at ease by highlighting the most positive audience reactions.
The AI, named AffectiveSpotlight, identifies participants’ faces and uses a neural network to classify their expressions into emotions such as sadness, happiness and surprise, and spot movements like head shaking and nodding. It also uses an eyebrow detection system to spot confusion, in the form of a furrowed brow.
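Microsoft hasn't released the model, but the pipeline the article describes (detect each face in a frame, then score its expression with a neural network) is easy to sketch. Below is a minimal, hypothetical illustration in Python using OpenCV's bundled Haar cascade for face detection; the classify_expression stub stands in for the real network, and the label set is only what the article mentions.

```python
import cv2
import numpy as np

# Labels mentioned in the article; the real model's label set is unknown.
EMOTIONS = ["sadness", "happiness", "surprise", "confusion"]

# OpenCV ships a pretrained Haar cascade for frontal-face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_crop: np.ndarray) -> dict:
    """Stand-in for the neural network: one score in [0, 1] per emotion.

    A real system would run a trained CNN on the crop; random scores
    merely keep this sketch self-contained and runnable.
    """
    scores = np.random.dirichlet(np.ones(len(EMOTIONS)))
    return dict(zip(EMOTIONS, scores.tolist()))

def analyse_frame(frame: np.ndarray) -> list:
    """Detect every face in one video frame and score its expression."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [classify_expression(gray[y:y + h, x:x + w]) for (x, y, w, h) in faces]
```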
Each expression is rated between 0 and 1, with positive responses scoring higher. Every 15 seconds, the AI highlights to the presenter the person with the highest score over that period.
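Mechanically, that selection step is a sliding-window argmax. Here is a rough sketch, assuming per-frame (participant, score) pairs arrive as a stream and a hypothetical highlight() callback updates the presenter's view; the aggregation rule (mean over the window) is also an assumption.

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 15.0  # spotlight interval given in the article

def run_spotlight(frame_scores, highlight):
    """Every 15 seconds, spotlight the participant with the highest
    average positivity score (0-1) accumulated over the window.

    frame_scores: real-time iterable of (participant_id, score) pairs.
    highlight: hypothetical callback into the videoconferencing UI.
    """
    window = defaultdict(list)
    deadline = time.monotonic() + WINDOW_SECONDS
    for pid, score in frame_scores:
        window[pid].append(score)
        if time.monotonic() >= deadline:
            # window is non-empty here: a score was just appended.
            best = max(window, key=lambda p: sum(window[p]) / len(window[p]))
            highlight(best)
            window.clear()
            deadline = time.monotonic() + WINDOW_SECONDS
```

Whether the real system averages, takes a peak, or weights certain emotions more heavily isn't specified; the mean here is just one plausible choice.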
A Microsoft Research spokesperson told New Scientist that “spotlighting audience responses makes the presenter more aware of their audience and achieves a communicative feedback loop”. The research team declined an interview.
In a survey of 175 people conducted by the team, 83 per cent of those who give presentations said they often miss relevant audience feedback when presenting online – particularly non-verbal social cues.
To see whether AffectiveSpotlight could help address this problem, the team tested it against software that highlighted audience members at random. AffectiveSpotlight highlighted only 40 per cent of participants during talks, compared with 87 per cent for the random software. Speakers reported feeling more positive about presenting with AffectiveSpotlight, though audience members couldn't discern any difference in presentation quality when speakers used the AI.
Rua M. Williams at Purdue University, Indiana, queries whether the AI is much use. “It is certainly dubious at best that any interpretation based on just audio or video, or both, is ever accurate,” they say.
Williams also worries that relying on AI to parse human emotions – which are more complicated than they may first appear – is troublesome. “While some studies like this one may mention issues of privacy and consent, none ever account for how someone might contest an inaccurate interpretation of their affect.”
Read more: https://www.newscientist.com/article/2267147-microsoft-teams-ai-could-tell-you-who-is-most-enjoying-your-video-call/