“According to Bloomberg, teams of Amazon employees in India and Romania are tasked with reviewing Cloud Cam snippets to help its device work better.”
The debacle around humans reviewing content to improve AI efficiency looks set to continue. The latest in line is, yet again, Amazon: a Bloomberg report reveals that teams of Amazon employees based in India and Romania are tasked with reviewing “voluntarily submitted” snippets of video from the Amazon Cloud Cam AI security camera, in order to help the company train its artificial intelligence algorithms and improve the product.
However, the report further states that nowhere in its end-user disclosure does Amazon mention that footage drawn from the Cloud Cam may be viewed by humans. Sources on the review teams, who reportedly spoke to Bloomberg on condition of anonymity, claimed that the footage these employees receive sometimes includes sensitive or inappropriate clips, such as people in sexually intimate moments. The sources added that such clips are immediately discarded as AI training material on account of being private to the users, but this does not square with Amazon’s claim that the clips are shared voluntarily: why would any Cloud Cam user willingly share their private, intimate moments with complete strangers?
The report’s findings reveal that while Google, Apple and Amazon were earlier found to be using third-party human contractors to listen to voice snippets from Google Assistant, Siri and Alexa in order to improve wake-word activation and AI responses, Amazon’s newly uncovered video review process is undertaken by in-house employees, who work under stringent conditions such as a restricted floor and no access to their mobile phones for the entire duration of the work day. While this is an apparent improvement, it is still far from reassuring: the sources who spoke to Bloomberg further stated that some employees still find ways to pass footage on to non-employees or people without access, a risk that runs perennially with human involvement in such tasks.
After the earlier discoveries, Amazon, Apple and Google scrapped their third-party contractor review programmes, issued apologies and added disclaimers to their end-user licensing agreements that stated their data collection practices more clearly. While Amazon has so far maintained that the sharing of Cloud Cam video snippets, each typically 20 to 30 seconds long, is voluntary, it is yet to explain how this footage is being shared voluntarily, i.e. whether consent is buried in a murkily worded privacy agreement, or whether Amazon takes express permission before passing the footage on to its AI review teams.
While it is understood that AI algorithms require considerable help from external sources to improve their efficiency, privacy advocates across the world have voiced concerns about the sharing of personal data with these companies, given how such data can be breached, misused or fabricated to give rise to unwanted situations. The companies have since been urged to build AI systems that can review some of this data themselves, and to collect it only after receiving express consent from the users in question.
The Amazon Cloud Cam is so far not available in India as part of the company’s Alexa-driven smart home lineup, but can be bought in international markets for $90 (~Rs 6,400). It works around the clock, stores security footage for 24 hours (extendable with a paid subscription), and sends notification alerts to users via the companion smartphone app when it detects an anomaly in what it sees, such as sudden movement in a locked house. While the utility of such a product cannot be questioned, privacy concerns such as the one highlighted here will define how such devices fare in the long run.