
Apple says they’re sorry what you told Siri wasn’t private

Apple has never been one for public apologies. Back in 2010, the company came under fire for its defiance during the “antennagate” fiasco. Rather than apologize for a faulty antenna design, Steve Jobs told reporters they were “holding it wrong,” inspiring countless memes and criticism in the process.

And for years, the company continued to put on a bold face in response to criticism. Despite negative press over manufacturing conditions, build quality and long wait times for support, Apple soldiered on without so much as an apology.

But all that changed with the news that Apple, like its industry peers, had been recording and transcribing user interactions with Siri. The company has issued a public apology and made some significant changes to its privacy policy in response. Here’s what it said, and how these changes will affect your privacy going forward.

Apple issues a formal apology for Siri debacle

Apple stunned fans, critics and analysts the world over with a public apology following the revelation that Siri, like Alexa and Google Assistant, had been recording user interactions for transcription.

According to Apple, the company realized that they “haven’t been fully living up to [their] high ideals,” and stated that privacy is a “fundamental human right” to be respected by their designs and services in the years ahead.

In addition to the eloquent statement, the company provided a road map for future privacy changes as they apply to Siri. Starting now, Apple will no longer use audio recordings to improve Siri by default, and will instead allow customers to opt in if they want to participate.

For those who do choose to participate, the company states that only Apple employees will listen to any recordings, and that “accidental recordings” triggered by Siri misunderstandings will be automatically discarded.

What does Apple’s apology actually mean for privacy?

Eagle-eyed observers were quick to point out that Apple, despite no longer listening to the recordings themselves, will still rely on some form of transcription in order to “improve” Siri.

Rather than having reviewers listen to audio clips, Siri will convert what you say into text for Apple’s engineers to look over. This means that, even though your voice isn’t audible to the reviewer, they’ll still know the contents of what you said.

Of course, considering Siri’s skill level at understanding commands, you might not have too much to worry about. She makes enough mistakes as it is.

But therein lies the issue for Apple. Building voice recognition technology takes a great deal of trial and error, and large samples of real-world audio give engineers something to compare Siri’s interpretations against so the software can be corrected. Without those samples, the technology would likely be quite a bit further behind.

Still, there is controversy over whether any form of audio transcription violates user privacy. The real issue, however, is transparency. Had Apple been upfront about its transcription policy, there might even have been an enthusiastic base of willing participants, much like there was for Google’s $5 face-scanning program.

Nearly every major tech company has faced some kind of data reckoning recently, and it’s the responsibility of a corporation to serve the needs of its customers as well as its shareholders. So far, it looks as if the pendulum is swinging back in the direction of privacy. The powers that be would be wise to listen.
