Report claims accidental Siri recordings reveal your private information

What you need to know

  • A report claims Apple contractors hear private information in accidental Siri recordings.
  • This information could allegedly be used to identify users and track their location.
  • Apple released a statement saying a small portion of Siri requests is analyzed to improve Siri and dictation.

You should always be careful about what you say.

According to a new report from The Guardian, Apple contractors often hear private user information thanks to accidental Siri recordings. These recordings are passed on to contractors, who grade Siri’s responses for accuracy and appropriateness.

The Guardian, speaking to an anonymous source, claims contractors hear a wide range of recordings, from drug deals to users divulging medical information.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” The Guardian’s source said. “These recordings are accompanied by user data showing location, contact details, and app data.”

Meanwhile, The Guardian notes that Apple does not explicitly disclose that contractors may listen to Siri recordings. AppleInsider, however, disputes that claim, saying Apple has disclosed from the beginning that some Siri recordings are reviewed by humans.

In response to the report, Apple said only about 1% of daily Siri activations are heard by contractors. Here’s a statement Apple sent to The Guardian:

A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.

The source told The Guardian they were motivated to speak out by fears that such information could be misused.

“There’s not much vetting of who works there, and the amount of data we’re free to look through seems quite broad,” the source said. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”

Apple isn’t the only company that uses human review to oversee its voice assistant. Both Amazon and Google employ similar practices to improve the quality of their respective assistants.

Source: iMore