‘Hey Siri,’ stop listening! Apple says contractors can’t hear you anymore
Judging by the privacy-centric climate of today’s technology discourse, it’s a good time to be an Apple fan. Not only has Apple avoided much of the scrutiny trailing its biggest competitors over data and security, but it has also stayed out of the spotlight in terms of how it handles the private data it collects for Siri and Apple Music. Security analysts around the web have spoken highly of Apple and its commitment to protecting users’ most sensitive information.
Well, all of that is about to change. Recent reports suggest that Apple is not only keeping tabs on private conversations through Siri, but doing it in exactly the way Amazon and Google have come under fire for!
In light of this discovery, Apple has shuttered its audio transcription program for Siri. Still, engineers across the industry continue to rely on human reviewers to develop virtual assistants, claiming it’s the only way to properly train them. And with how far Siri lags behind its competition, what excuse does Apple really have for this privacy oversight?
Siri is listening — but for what purpose?
Update 08/02: Apple has now reportedly suspended its manual reviews of Siri recordings, a move the company says was made out of concern for customer privacy. The program isn’t gone for good, merely paused while Apple “re-evaluates” the process. When it returns in future versions of iOS, the company says users will be able to opt in or out of manual reviews of Siri recordings.
According to reporting by The Guardian, anonymous whistleblowers at Apple have revealed that the company isn’t being entirely forthright about how it develops Siri. Just like Amazon and Google with their respective voice assistants, Apple harvests and analyzes samples of Siri queries to train its AI to respond better.
As with the other voice assistants, it’s not uncommon for Siri to wake accidentally and record private conversations that device owners might never want contractors to hear.
Because Siri’s wake phrase is “Hey Siri,” it’s fairly easy to trip the assistant into waking with a similar-sounding phrase. Similarly, the Apple Watch activates Siri with a simple raise of the wrist, meaning that anyone who gestures a lot while talking could fall victim to some unwanted corporate recording.
Whistleblowers say that Apple’s contractors listen to and transcribe the recordings Siri uploads, and have occasionally overheard private incidents such as a discussion between a doctor and patient, a drug deal, and even an intimate encounter!
In response to these recent stories, Apple has stated that “less than 1%” of content is analyzed by contractors. According to Apple, the recordings are randomly selected and are used to improve the way Siri interacts with users.
How can I stop Siri from listening to me?
Here’s where it gets tricky. Apple’s dedication to opacity and user privacy cuts both ways: the same secrecy that shields your data also makes it very difficult to change how the company uses the private data it already holds.
Apple is notoriously tight-lipped about its internal processes, so for the time being, the best way to stop Siri from potentially listening to you is simply to disable its wake triggers.
On the iPhone, this can be done by opening Settings, then Siri & Search. Once inside this menu, toggle off Listen for “Hey Siri” and Allow Siri When Locked.
On the Apple Watch, press the Digital Crown and open the Settings app. Once inside, select General, then Siri. From here, disable Hey Siri and Raise to Speak.
Disabling these options on your phone and watch will prevent you from accidentally triggering the service, and will ensure that the company isn’t passively listening to your activity.
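If you’re a developer curious what this looks like from the app side, here’s a minimal sketch, assuming a SiriKit-enabled iOS app (the checkSiriAccess function name is ours for illustration). Apple’s Intents framework exposes INPreferences, which reports whether the user has granted Siri access to a particular app. Note that this covers per-app access only; the system-wide “Hey Siri” listener can be changed only in Settings, as described above.

```swift
import Intents

// Minimal sketch: report this app's Siri authorization status.
// Assumes an iOS app target with NSSiriUsageDescription set in
// its Info.plist. This covers per-app Siri access only; it cannot
// toggle the system-wide "Hey Siri" listener.
func checkSiriAccess() {
    switch INPreferences.siriAuthorizationStatus() {
    case .authorized:
        print("Siri may access this app's data.")
    case .denied, .restricted:
        print("Siri access is switched off for this app.")
    case .notDetermined:
        // Ask the user; the completion handler receives their choice.
        INPreferences.requestSiriAuthorization { status in
            print("Siri authorization is now: \(status)")
        }
    @unknown default:
        break
    }
}
```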
It honestly comes as a surprise to see Apple engaging in the same practices it has lambasted its competitors for. At this point, one could even be forgiven for believing these companies when they say crowdsourced audio is a necessity for developing voice assistants.
However, Siri’s utter lack of refinement next to Alexa or Google Assistant makes Apple’s rationale seem all the more flimsy.
Still, natural speech has long been considered a holy grail of AI research. Maybe once truly speech-capable AI arrives, big companies won’t need to construct virtual panopticons to develop it anymore.
There’s no shame, however, in choosing not to participate. Privacy is a personal right — so feel free to exercise it as you please.