Thursday, August 29, 2019

Apple Changes the Way It Listens to Your Siri Recordings Following Privacy Concerns


Apple today announced major changes to its controversial 'Siri audio grading program' following criticism over its practice of having humans listen to audio recordings of users, collected via its voice-controlled Siri personal assistant, without their knowledge or consent.

The move came a month after The Guardian reported that third-party contractors were regularly listening to private conversations of Apple users giving voice commands to Siri in a bid to improve the quality of Siri's responses.

While the data received by the contractors was anonymized and not associated with Apple devices, the private conversations, which included discussions between doctors and patients, business deals, seemingly criminal dealings, people having sex, and so on, sometimes revealed identifiable details such as a person's name or medical records.

Following the backlash after the report went public, Apple initially responded by temporarily suspending the program earlier this month while it thoroughly reviewed its practices and policies.

Now, Apple has revealed that it intends to resume the program in the fall, but only after making three significant changes to it, as outlined below:

  • First, Apple will no longer retain audio recordings of Siri interactions by default. Instead, the company will continue to use computer-generated transcripts to help Siri improve.
  • Second, Apple will allow users to opt-in to having their audio recordings listened to by human reviewers to help improve Siri's responses. Users who choose to participate can opt-out at any time.
  • Third, for users who opt in to the grading program, only Apple employees will be allowed to listen to audio samples of their Siri interactions, rather than third-party contractors. The company also aims to delete any recording it determines was the result of an accidental Siri trigger.

As a result of these changes, at least 300 contractors in Europe who were part of Apple's grading program have lost their jobs, The Irish Times reports.

Besides announcing the changes, Apple also assured its users that data collected by its Siri personal assistant has never been used outside the company, saying:

"When we store Siri data on our servers, we don't use it to build a marketing profile, and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private."

The next iOS software update for iPhones is expected to be released in early October and could be the one in which Apple delivers the promised opt-in and opt-out controls for its Siri grading program.

Apple is not the only major technology company that has been caught listening to its smart assistant recordings and forced to rethink how it reviews users' audio amid privacy concerns.

Earlier this month, Google temporarily stopped human contractors from listening to Assistant recordings around the world. Amazon also changed its settings to let users opt out of having their Alexa recordings reviewed by humans.



from The Hacker News https://ift.tt/2MJ6uUC
