'I was deepfaked by my best friend'
"Jodie" found images of herself used in deepfake porn - and then faced another terrible shock. She told BBC File on 4 about the moment she realised the person responsible was one of her best friends.
WARNING: Contains offensive language and descriptions of sexual violence
In the spring of 2021, Jodie (not her real name) was sent a link to a porn website from an anonymous email account.
Clicking through, she found explicit images and a video of what appeared to be her having sex with various men. Jodie's face had been digitally added onto another woman's body - known as a "deepfake".
Someone had posted photos of Jodie's face on a porn site saying she made them feel "so horny" and asking if other users on the site could make fake pornography of her. In exchange for the fakes, the user offered to share more photos of Jodie and details about her.
Speaking for the first time about her experience, Jodie, who is now in her mid-20s, says, "I was screaming and crying and violently scrolling through my phone to work out what I was reading and what I was looking at."
She adds: "I knew that this could genuinely ruin my life."
Forcing herself to scroll through the porn site, Jodie said she felt her "whole world fall away".
Then she came across one particular image and made a horrible realisation.
An unnerving series of events
It was not the first time Jodie had been targeted.
In fact, it was the culmination of years of anonymous online abuse.
When Jodie was a teenager, she discovered that her name and photos were being used on dating apps without her consent.
This went on for years, and in 2019 she even received a Facebook message from a stranger who said he was due to meet her at Liverpool Street station in London for a date.
She told the man that it was not her he had been speaking to. She says she felt "unnerved" because he knew all about who she was and had managed to find her online - he had tracked her down on Facebook after the "Jodie" on the dating app stopped responding.
In May 2020, during the UK's lockdown, Jodie was also alerted by a friend to a number of Twitter accounts that were posting pictures of her, with captions implying she was a sex worker.
"What would you like to do with little teen Jodie?" read one caption next to an image of Jodie in a bikini, which had been taken from her private social media account.
The Twitter handles posting these images had names like "slut exposer" and "chief perv".
All of the images being used were ones she'd been happy to share on her social media with close friends and family - but no one else.
Then she found that these accounts were also posting images of other women she knew from university, as well as from her hometown of Cambridge.
"In that moment, I feel a very strong sense [that] I'm at the centre of this and this person is looking to hurt me," she said.
Fighting back
Jodie began to contact the other women in the pictures to warn them, including a close friend we are calling Daisy.
"I just felt sick," said Daisy.
Together the friends discovered many other Twitter accounts posting their images.
"The more we looked, the worse it got," said Daisy.
She messaged the Twitter users and asked where they had got their pictures. The reply was that the photos were "submissions" from anonymous senders who wanted them shared.
"It's either an ex or someone who gets off on you," one user replied.
Daisy and Jodie drew up a list of all the men who followed both of them on social media, and who could access both sets of their pictures.
The friends concluded it must be Jodie's ex-boyfriend. Jodie confronted him and blocked him.
For a few months, the posts stopped - but then an anonymous emailer got in touch.
"Sorry to remain anonymous," the email read, "but I saw this guy was posting pics of you on creepy subreddits. I know this must be really scary."
Jodie clicked on the link and was taken through to the online forum, Reddit, where a user had posted photos of Jodie and two of her friends, numbering them 1, 2 and 3.
Others online were invited to take part in a game - which of these women would you have sex with, marry or kill?
Beneath the post, 55 people had already commented.
The photos used on the site were recent, and had been posted after Jodie blocked her ex. The women realised they had blamed the wrong person.
Six weeks later, the same emailer got in touch again - this time about the deepfakes.
'The ultimate betrayal'
When drawing up their list, Jodie and Daisy had ruled out a handful of men they completely trusted, such as family members - and Jodie's best friend, Alex Woolf.
Jodie and Alex had struck up a firm friendship as teenagers, bonding over their shared love of classical music.
Jodie had sought comfort from Woolf when she discovered that her name and photos were being used on dating apps without her consent.
Woolf went on to get a double first in music from Cambridge University and won BBC Young Composer of the Year 2012, as well as appearing on Mastermind in 2021.
"He [Woolf] was very aware of the issues that faced women, especially on the internet," says Jodie.
"I really felt that he was an advocate."
However, among the deepfake porn photos was a picture of Jodie in profile, with King's College, Cambridge, in the background.
She clearly remembered it being taken - Woolf had been in the photo too, and he was the only other person she had ever shared the image with.
It was Woolf who had been offering to share more original pictures of Jodie in exchange for them being turned into deepfakes.
"He knew the impact that it was having on my life so profoundly," says Jodie. "And yet he still did it."
'Utterly ashamed'
In August 2021, Woolf, 26, was convicted of taking images of 15 women, including Jodie, from social media and uploading them to pornographic websites.
He was given a 20-week prison sentence, suspended for two years, and was ordered to pay each of his victims £100 in compensation.
Woolf has told the BBC he is "utterly ashamed" of the behaviour which led to his conviction and he is "deeply sorry" for his actions.
"I think about the suffering I caused every day, and have no doubt that I will continue to do so for the rest of my life," he says.
"There are no excuses for what I did, nor can I adequately explain why I acted on these impulses so despicably at that time."
Woolf denies having anything to do with the harassment of Jodie which took place before the events he was charged with.
For Jodie, finding out what her friend had done was the "ultimate betrayal and humiliation".
She says: "I re-lived every conversation that we had, where he had comforted me and supported me and been kind to me. It was all a lie."
We contacted X, formerly Twitter, and Reddit about the posts. X did not respond, but a spokesperson from Reddit said: "Non-consensual intimate media (NCIM) has no place on the Reddit platform. The subreddit in question has been banned." The porn site has also been taken down.
In October 2023, sharing deepfake porn became a criminal offence under the Online Safety Act.
There are tens of thousands of deepfake videos online. Recent research found that 98% are pornographic.
However, Jodie is angry that the new law does not criminalise asking someone else to create a deepfake - which is what Alex Woolf did. Nor does it make creating a deepfake illegal.
"This is affecting thousands of women and we need to have the proper laws and tools in place to stop people from doing this," she says.
If you have been affected by any of the issues in this story, information and support are available via the BBC Action Line.