Long-press the Action button to bring up Cortana.


Here we go again. It’s Microsoft’s turn in the spotlight thanks to a new report about the company’s practices for reviewing and improving its AI. With today’s news, nearly every company operating a major voice assistant has now been hit with revelations that employees or contractors are hearing recordings made by the AI’s users.

The Microsoft news puts a slightly different spin on the recent allegations faced by Apple, Google, and Amazon. A whistleblower informed Motherboard that contractors are reviewing recordings not just from the company’s Cortana assistant but also conversations conducted over its Skype service.

At least some of the audio files the contractor shared with Motherboard were made by the Translator feature in the Skype Android app. Translator, which Microsoft launched in 2015, uses conversation snippets to improve its understanding of language and natural speech. The FAQ for Translator notes that audio recorded by Microsoft is used in this way, but it doesn’t explicitly state that people are the ones listening to and reviewing the audio.

Microsoft told Motherboard that it has protections in place to ensure that its users stay anonymous to those contract employees. Most of the steps are similar to those Apple and Google used in their own grading programs, such as removing any potentially identifying data and requiring contractors to sign non-disclosure agreements.

But while breaking said NDA, the Microsoft whistleblower said that audio files could still contain personal information and very private conversations:

Some stuff I’ve heard could clearly be described as phone sex. I’ve heard people entering full addresses in Cortana commands, or asking Cortana to provide search returns on pornography queries. While I don’t know exactly what one could do with this information, it seems odd to me that it isn’t being handled in a more controlled environment.

Microsoft’s statement also argued that it does obtain permission to collect and use voice data. “We strive to be transparent about our collection and use of voice data to ensure customers can make informed choices about when and how their voice data is used,” a representative told Motherboard following the leak.

So far, Microsoft doesn’t appear to be making any changes to its policies for reviewing voice data. Apple announced that it would completely suspend its voice-grading system and give users an opt-out choice. Meanwhile, Google has at least stopped its grading program in Europe, where it could face further issues around compliance with General Data Protection Regulation rules. And although Amazon hasn’t been caught up directly in this wave of whistleblowers, it has seen Alexa privacy breaches before and did recently add a way to opt out of human reviews.


