Apple contractors reportedly listen to your conversations with Siri

“Unlike Amazon Alexa and Google Assistant, Siri is present on the Apple Watch, which is activated anytime a user raises their wrist”

According to a new report from The Guardian, Apple hires contractors to listen to recorded Siri conversations. One former contractor revealed that they had heard accidental recordings of users’ personal lives, including doctor’s appointments, addresses, and drug deals. The Siri recordings were sent to contractors, who were asked to grade them on factors like “whether the request was intentional or a false positive that accidentally triggered Siri, or if the response was helpful.” However, the Cupertino giant doesn’t tell users that these recordings may be shared with other people.


Apple’s documentation also doesn’t mention that human workers listen to and analyse Siri conversation data. In a statement to The Guardian, however, Apple acknowledged, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Apple notes that less than 1 percent of daily activations are analysed under this system. Amazon and Google have admitted to similar practices, in which human workers listen to recorded conversations to improve their systems. But unlike Alexa, which is largely confined to smart speakers, and Google Assistant, which lives on Home speakers and phones, Siri is also present on the Apple Watch, which can be activated any time a user raises their wrist.

The Guardian’s source claims that personal conversations routinely made their way to Apple contractors who were complete strangers to the users. “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the report said. Unlike Amazon and Google, which allow customers to opt out of these practices, Apple doesn’t offer a similar privacy-protecting option short of disabling Siri altogether.