James Bulger's mum seeks AI law to curb clips of murder victims

The mother of murdered toddler James Bulger has urged the government to pass a new law to clamp down on AI-generated videos of child murder victims.
Denise Fergus said TikTok had not responded to requests to take down videos showing digital clones of her two-year-old son talking about his fatal abduction.
The government said videos like this are considered illegal under an existing law, the Online Safety Act, and should be removed by platforms.
TikTok said AI videos highlighted by the BBC had been removed for violating its rules.
A TikTok spokesperson said: "We do not allow harmful AI-generated content on our platform and we proactively find 96% of content that breaks these rules before it is reported to us."
The BBC found similar videos on YouTube and Instagram and those platforms said the content had been taken down for breaching their rules.
Ms Fergus told the BBC she thought existing laws did not go far enough to force platforms to remove harmful content and to prevent AI from being used for such purposes.
"It's just words at the moment," Ms Fergus said. "They should be acting on it."
Ms Fergus said the AI videos depicting her son were "absolutely disgusting" and those posting them to social media "don't understand how much they're hurting people".
She said: "It plays on your mind. It's something that you can't get away from. When you see that image, it stays with you."
Ms Fergus said she had a meeting with Justice Secretary Shabana Mahmood on Thursday and intended to raise the issue.
James Bulger was out with his mum at a Merseyside shopping centre when he was abducted by two 10-year-old boys, Jon Venables and Robert Thompson, in 1993.
Venables and Thompson led the toddler two-and-a-half miles away to a railway track, where they tortured him and beat him to death.
James's body was found on a railway line two days later.
Thompson and Venables were found guilty of killing James, making them the youngest convicted murderers in modern British history.
Illegal content
The AI videos on social media show animated child avatars telling the story of James's murder in the first person.
The BBC found multiple similar videos shared by accounts that post content about crime and murders, apparently for clicks and monetisation.
Some of the accounts follow the same format in every video, using AI avatars to imitate different murder victims from across the world.
A YouTube spokesperson said the platform's guidelines "prohibit content that realistically simulates deceased individuals describing their death".
The spokesperson said a channel called Hidden Stories had been terminated for "severe violations of this policy".
Ms Fergus said: "We go on social media and the person that's no longer with us is there, talking to us. How sick is that?
"It's just corrupt. It's just weird and it shouldn't be done."
A government source said the users posting these videos could be prosecuted for offences relating to obscene or grossly offensive material under the Communications Act.
A government spokesperson said: "Using technology for these disturbing purposes is vile.
"This government is taking robust action through delivery of the Online Safety Act, under which videos like this are considered illegal content where an offence is committed and should be swiftly removed by platforms.
"Like parents across the country we expect to see these laws help create a safer online world.
"But we won't hesitate to go further to protect our children; they are the foundation not the limit when it comes to children's safety online."

In 2023, the previous Conservative government passed the Online Safety Act, which aims to make social media firms and search engines protect children and adults in the UK from illegal, harmful material.
Ofcom is the regulator that enforces the law and is publishing guidance on how platforms can meet their duties to protect people online.
The regulator has powers to take action against companies that fail to meet their duties, but it cannot force platforms to take down individual pieces of content.
Kym Morris, the chairwoman of the James Bulger Memorial Trust, said the government could amend the Online Safety Act to include specific protections against harmful AI-generated content.
She also said the government would need to pass new legislation covering synthetic media and AI misuse.
"There must be clear definitions, accountability measures, and legal consequences for those who exploit this technology to cause harm - especially when it involves real victims and their families," Ms Morris said.
"This is not about censorship - it's about protecting dignity, truth, and the emotional wellbeing of those directly affected by horrific crimes."
Narrow AI bill
There were plans to include measures to force social media companies to remove some "legal-but-harmful" content in the Online Safety Act, before it became law.
But the proposals were scrapped over censorship concerns.
Online safety campaigners argue the rules around removing harmful content need tightening to close loopholes in the act.
In January this year, Technology Secretary Peter Kyle told the BBC he had "inherited an unsatisfactory legislative settlement" in the Online Safety Act.
"I'm very open-minded and I've said publicly, I think we'll have to legislate into the future again," Kyle said.
But the BBC understands the government has no plans at the moment to pass a new law focused on AI content created and posted online.
A government source said ministers were aiming to introduce a narrow AI bill limited to the regulation of cutting-edge AI models later this year.
Jemimah Steinfeld, chief executive officer of Index on Censorship, said the AI videos of child murder victims appeared to be in breach of existing laws.
She said amending the Online Safety Act to restrict AI content "does run the risk of legitimate videos being caught up in it".
"If it's illegal already, we don't need regulation here," she said.
Ms Steinfeld said that while a "knee-jerk reaction that puts everything in this terrible box" must be avoided, she had sympathy for James Bulger's mum.
"To have to relive what she's been through, again and again, as tech improves, I can't imagine what that feels like," she said.
An Ofcom spokesperson said when content is reported to a platform, it "must decide whether it breaks UK law, and if so, deal with it appropriately".
"We're currently assessing platforms' compliance with these new duties, and those who fail to introduce appropriate measures to protect UK users from illegal content can expect to face enforcement action," the spokesperson said.