Amazon sued over Alexa child recordings in US

Amazon says it only holds profiles for children if parents set them up

Amazon is being sued over its smart assistant's recordings of children.

Two US cases allege the firm lacks consent to create voiceprints that could let it keep track of a youngster's use of Alexa-enabled devices and thus build up a "vast level of detail about the child's life".

Amazon has said it only stores data once a device-owner has given it permission to do so.

And it says parents can delete a child's profile and recordings.

Lawyers in the cases are seeking damages for the two plaintiffs, as well as for others being invited to join the class-action lawsuits in nine states where Amazon is alleged to be in breach of privacy laws.

Amazon said in January more than 100 million devices featuring Alexa had been sold worldwide, ranging from its own Echo speakers to third-party products including headphones, fridges and televisions.

"Amazon has a longstanding commitment to preserving the trust of our customers and their families, and we have strict measures and protocols in place to protect their security and privacy," a spokeswoman told the BBC.


How Alexa works

Software on enabled devices listens out for a wake word - which can be set to be Alexa, Amazon or computer. If it is detected, the audio captured just prior to the wake word, as well as what was said immediately afterwards, is transmitted to Amazon's computer servers for processing.

Because mistakes are sometimes made, recordings can be transmitted when the wake word is not actually used.
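The gating logic described above can be sketched in a few lines of Python. This is purely illustrative and not Amazon's implementation: the detector, end-of-utterance check and uploader are hypothetical stand-ins, and the sketch simply shows how a rolling buffer allows audio from just before the wake word to be included in what is sent.

```python
# Illustrative sketch only - not Amazon's code. `detect_wake_word`,
# `record_until_silence` and `send_to_cloud` are hypothetical callables.
from collections import deque

PRE_ROLL_FRAMES = 25                      # roughly half a second of audio history
pre_roll = deque(maxlen=PRE_ROLL_FRAMES)  # rolling buffer of recent audio frames

def process_stream(frames, detect_wake_word, record_until_silence, send_to_cloud):
    """Only upload audio once a local wake-word match occurs."""
    for frame in frames:
        pre_roll.append(frame)             # always keep a little history locally
        if detect_wake_word(frame):        # local match - may be a false trigger
            utterance = list(pre_roll) + record_until_silence()
            send_to_cloud(utterance)       # only now does audio leave the device
```

A false trigger in `detect_wake_word` would still result in an upload, which is why recordings are sometimes transmitted when the wake word was not actually used.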

The recordings are stored, allowing Amazon to build a model of a user's voice characteristics. This helps the service adapt to the quirks of how different people make requests, and lets it provide tailored responses to different users in the home.

Registered users can prevent this happening by withdrawing consent. They also have the option to actively train the system to better recognise their voice by repeating a series of phrases.
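One common way such voice profiles are used - sketched below as an assumption, not a description of Amazon's actual system - is to compare a numerical "embedding" of a new utterance against each registered household member's stored voiceprint and pick the closest match.

```python
# Conceptual sketch of voice-profile matching - a generic speaker-identification
# approach, not Amazon's implementation. Embeddings are plain NumPy vectors
# standing in for features extracted from an utterance.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_speaker(utterance_embedding, voice_profiles, threshold=0.75):
    """Return the best-matching household profile, or None if nobody matches.

    `voice_profiles` maps a profile name to its stored average embedding;
    the threshold stops unknown or unregistered voices being matched.
    """
    best_name, best_score = None, threshold
    for name, profile_embedding in voice_profiles.items():
        score = cosine_similarity(utterance_embedding, profile_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

In a scheme like this, an unregistered user - such as a child without a profile - would fall below the threshold, but their audio would still have been recorded and processed, which is the behaviour the complaints focus on.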

Human operators listen to some of the clips to tag them in order to help the machine-learning system involved become more accurate.

Users can delete stored utterances via an app or via Amazon's website. In addition, they can ask Alexa to delete the last recording or last day's worth of recordings via a voice command.


Two class action cases are being pursued, one filed in Los Angeles on behalf of an eight-year-old boy and the other in Seattle on behalf of a 10-year-old girl.

The children are said to have used Alexa to tell jokes, play music, recognise movie references, solve maths problems and answer trivia questions.

In both cases, the children had interacted with Echo Dot speakers in their homes, and in both cases the parents claimed they had never agreed for their child's voice to be recorded.

Amazon's marketing materials target its Echo speakers at young families among other users

The complaints say Alexa devices could have been designed to send only a digital query, rather than a voice recording, to Amazon's servers. Processing the audio locally would have disadvantages, however, such as potentially driving up the cost of the devices and making it harder for Amazon to deploy updates to its voice-recognition technology.

Alternatively, it is suggested that Amazon could automatically overwrite the recordings shortly after they have been processed, although this might affect the smart assistant's ability to deliver personalised replies.

Even if neither of these options were adopted, the plaintiffs suggest that more could be done to ensure children and others were aware of what was going on.

"At no point does Amazon warn unregistered users it is creating persistent voice recordings of their Alexa interactions, let alone obtain their consent to do so," the complaints state.

"Neither the children not the parents have consented to the children's interactions being permanently recorded."

'Parental opt in'

Amazon has referred reporters to a blog it published last month about a subscription service designed to help parents manage their children's use of Alexa.

It notes that parents can review and delete their offspring's voice recordings at any time via an app or the firm's website. In addition, it says, they can contact the firm and request the deletion of their child's voice profile and any personal information associated with it.

The BBC also quizzed the executive in charge of Alexa about the matter last week at the firm's re:Mars conference before the complaints had been filed.

Dave Limp said the firm only profiled under-13s if parents had agreed to its terms of service.

Amazon recently launched a new Kids Edition version of the Echo Dot in the US

"[If] they're 13 and below... then the parent opts in for them," he explained.

"You have to verify through a parent that the parent themselves has given consent for the child. And we do that by verifying an actual credit card number.

"So if you don't that, then we do do not keep any of the data for the child and we wouldn't ever do that."

He added that only a fraction of 1% of utterances were ever checked by its staff, and even then those involved would not be shown the user's name or address.

However, he acknowledged that Amazon could do more to flag that people - rather than just automated checks - were involved in reviewing the recordings.

At present, Alexa's privacy notice says that past voice requests are used to improve its services, but does not explicitly say they are listened to by humans.

"I think it's fair feedback," Mr Limp said.

"If people don't think we're being as transparent on that, we have no issue being more transparent."