Sex, drugs, and medical appointments: what Siri’s indiscreet ears hear

Humanicus
2 min read · Jul 29, 2019


Photo by Tyler Lastovich on Unsplash

“What happens on your iPhone stays on your iPhone,” proclaimed a huge billboard in Las Vegas at the most recent CES, the giant consumer-technology show. It was a display of bravado from Apple, which has consistently touted its superiority over its competitors when it comes to data protection.

But just as Amazon and Google admitted before it, the Cupertino company also listens to audio clips recorded by its voice assistant, Siri, according to revelations from a whistleblower published by The Guardian.

As at Google and Amazon, a small portion of Siri recordings (less than 1%, according to Apple) are randomly selected and sent to human reviewers, who check whether Siri correctly understood the request, whether it answered incorrectly, whether the activation was intentional, and so on.

Apple told The Guardian that the recordings, which last only a few seconds, are not tied to customers’ identifiers, and that everyone able to listen to them has signed strict confidentiality agreements.

But even if identities are not directly linked to the recordings, reviewers still have access to location data, contact details, and app data associated with the device.

Accidental triggers

This process helps identify the assistant’s flaws and improve it. The catch is that the people reviewing these files are not directly employed by Apple: they work for subcontractors whose employees, according to the Guardian’s source, are not carefully vetted.

Many recordings, however, are triggered by mistake. Siri can wrongly believe its wake phrase (“Hey Siri”) has been spoken, particularly when it hears the sound of a zipper. Siri is also triggered automatically whenever an Apple Watch is raised and then detects speech, resulting in an “incredibly high” number of accidental activations.

As a result, “you can hear a doctor with a patient and, while you can’t be sure, some conversations sound very much like drug deals. Sometimes people having sex are accidentally recorded by a connected speaker or watch.”

Unlike Amazon and Google, which let customers opt out of having their recordings used to improve the service, the only way to opt out with Apple is to disable Siri entirely.
