A woman was left horrified after a stranger turned her mirror selfie into a sick AI-generated deepfake nude photo and then taunted her with it.
Courtney, who MailOnline will not be identifying, woke up last Friday to find a stranger had sent her a nude photo of herself over Instagram, accompanied by a smirking emoji.
Shocked, the aspiring social worker knew the photo wasn’t real.
The person had screenshotted a selfie she had posted on her account and uploaded it to one of the many deepfake nude-generating websites that will ‘virtually undress’ any person in any photo.
Horrified, Courtney, who is in her 20s, blocked the account and has now removed all photos of herself from her social media in fear this will happen again.
But she was left in a constant state of anxiety, wondering whether the photo had been sent to anyone else – to her followers, her friends, her family.
Pictured: The mirror selfie Courtney uploaded to her Instagram account
Pictured: The message on Friday, taunting her with a fake AI-edited nude version of the selfie she had posted to her account
Thousands of women and girls have been victims of deepfake porn, which has risen steeply with the emergence of AI technology.
In 2017, a deepfake pornographic video of Wonder Woman star Gal Gadot was created by a Reddit user and spread like wildfire online.
Since then, the practice has become increasingly common, targeting celebrities like actresses Scarlett Johansson and Emma Watson.
But thousands of women and girls going about their everyday lives are targeted for simply saying no to men – or for no reason at all.
One website which virtually strips women naked received 38 million hits in the first eight months of 2021, according to the UK government.
Describing the ordeal, Courtney told MailOnline: ‘A completely innocent image of me was made into something different.’
In shock, she instantly blocked the account and went back to sleep.
Courtney was in a state of disbelief about what happened to her and felt sick to her stomach.
She said: ‘I hadn’t heard from this person in two years and I woke up one morning, last Friday. I woke up at 7am to go to college but I received this extremely graphic image that definitely wasn’t real.
Twitch star QTCinderella, 28, posted a tearful video begging people to stop accessing the images after deepfake porn images of her were put online
The Twitch streamer posted on X about the traumatic experience she went through
‘I was in shock. No way did that just happen. But I realised the severity of it later in the day… who they could have sent that to.’
It wasn’t until later that she started to think about what someone could do with an image like that – Courtney realised that whoever was behind the account could have access to her follower list.
Courtney is from a small town in the Scottish Highlands, and she began to fret that the person behind the photo could find young girls she knew from home on her profile and do the same thing to them.
‘They had access to my account and who I followed, who liked my posts, my friends.
‘I’m from a small town in the Highlands. I started worrying – what if he could follow young girls from my page and do the same thing to them?’
Courtney was horrified – she is studying to become a social worker and instantly thought about warning everyone she knew about the danger of posting pictures.
She bravely wrote about what she had gone through on Facebook, urging parents to think twice before posting pictures of their children on accounts without privacy settings.
Maya Higa (left) said she felt ‘nauseous’ and ‘vulnerable’ after discovering her image on a porn site. British Twitch star Anita (right) was also included without her consent
The aspiring social worker felt ‘violated’ by the image and said it was ‘humiliating and embarrassing’ to share what had happened but felt it was important to warn others.
But the post encouraged other women who had experienced the same thing to get in contact with her, and that night, after talking with them, she reported what she had gone through to the police.
Courtney said the police were ‘shocked’ and told her they would do everything they could to track down the person behind the account.
However, she said the officer she reported it to did not know how to categorise the crime, as AI technology is so new and unfamiliar.
It’s frighteningly easy for people to create violating fake images like this. Even if you’ve never posed naked for a camera in your life, the technology can make it look like you did.
A simple Google search for ‘AI deepfake nudes’ brings up countless websites which claim you can ‘see any girl naked at the click of a button’.
Users simply upload an image to the app or website, and the resulting fake photo can then be used to humiliate or blackmail the person in it.
One Redditor posted on the forum asking for recommendations for websites that would generate unblurred deepfake nude images without payment.
The post received 5,000 replies and nearly 1,000 upvotes.
Courtney said the person who sent the deepfake image had been following her on Instagram for two years as she had a public account.
She had recently cracked down on her privacy settings online and restricted who could see her content, but said ‘I obviously wasn’t careful enough’.
She said: ‘This account was following me for two years and they’d messaged me before… with uninvited compliments, let’s say.
‘My account was public but in recent months I had changed it to private.’
She added: ‘I did a post about it to spread awareness for parents at home who might post photos of their kids on Facebook. People do this to kids, and it’s disgusting.
‘It’s extremely important to spread awareness because no one expects it. I never thought it would happen. A completely innocent image of me was made into something different.
‘Loads of people reached out to me, saying that this has happened to them.
‘That night I called the police. I wasn’t threatened, but the fact that the account sent me that message had made me feel violated and scared.
‘The account has been deleted now so I don’t know what can be done, and that’s heart-breaking.
The post on Facebook that prompted women to get in touch with Courtney to share their similar experiences of having deepfake images made of them
‘Other people have been threatened, I’m quite lucky. If I hadn’t blocked and reported them, who knows what could have happened.
‘The police were shocked for me when I reported it. They said they were going to try their best and do anything they could, but Instagram profiles are notoriously hard to track the location of.
‘I’m passionate about this, especially for the sake of children. That is what breaks my heart the most, it makes me angry. It makes me want to be a social worker even more.
‘There needs to be more of a discussion about this and what can be done. This stuff is so accessible.
‘The police didn’t even know what category to put this in, if it’s revenge porn or what. AI is obviously new but becoming so that everyone can use it really rapidly – it’s scary.
‘People don’t know how to deal with this, especially without help from social media companies. Everyone needs to keep their profiles private. Pictures of kids are posted on Facebook all the time.
‘Parents post photos of their kids in Halloween costumes, on their first day of school. People can save those to their camera roll and keep them for as long as they want.
‘They can do what they like with it. It was very humiliating and embarrassing to post this on Facebook. All I can do is tell people to be careful. It was awful. We’re in a new world with new technology.’
Since she received the altered mirror selfie, Courtney said she’s removed all pictures of herself and her body from her social media in fear it might happen again.
Her post on Facebook read: ‘This morning, I woke up to a message from an account that has only ever messaged me ONCE in December of 2021 to compliment a picture I had posted of myself.
‘My friend replied to it drunkenly saying to get lost and they’ve not interacted since.
‘Today they sent an emoji and an obviously altered and AI nude image of me from a mirror selfie I had taken.
‘It’s extremely graphic and has very much upset me and made me feel violated.
‘An innocent selfie of myself FULLY CLOTHED has now been altered to be an imagined image of me naked.
‘AI has gotten so so scary and I can’t believe this has happened, so they have been reported and since blocked.
‘All pictures of myself and my body (clothed obviously) have been removed as I feel so sick to my stomach.’
Cases like Courtney’s are astonishingly common.
Parents in Spain recently spoke out about deepfake nudes of their children – aged between 11 and 17 – being widely distributed.
One of the mothers said a boy had asked her daughter for money over Instagram, and when the girl refused, he sent her an AI-generated fake nude photo of herself.
Female Twitch streamers have also spoken out about being victims of these fake images.
The scandal came to a head when one of the victims, 28-year-old QTCinderella, posted a tearful video begging people to stop accessing the images.
The UK government announced last year that it is seeking to outlaw deepfake revenge porn as part of its Online Safety Bill.
It said that one website which virtually strips women naked received 38 million hits in the first eight months of 2021.
In reaction to this announcement, Ruth Davison, CEO of Refuge, said: ‘Refuge welcomes these reforms and is pleased to see progress in tackling abuse perpetrated via technology.
‘As the only frontline service with a specialist tech abuse team, Refuge is uniquely placed to support survivors who experience this form of abuse.
‘We campaigned successfully for threatening to share intimate images with intent to cause distress to be made a crime, via the Domestic Abuse Act, and these reforms will further ensure police and law enforcement agencies rightly investigate and prosecute these serious offences.
‘Tech abuse can take many forms, and Refuge hopes that these changes will signal the start of a much broader conversation on the need for strengthening the response to online abuse and harm.’
A Police Scotland spokesperson said: ‘Around 5.20pm on Friday, 27 October, 2023, we received a report of a woman receiving an inappropriate image online.
‘Enquiries are ongoing.’