The cruel cost to women of the AI boom? Deepfake porn images all over the internet (and there may even be some of you)

It takes less than ten minutes to produce the fake naked photo of myself. I upload into an app a picture of me arriving at an awards ceremony, wearing red lipstick and a mid-length black dress with beaded sleeves.

A short time later, without my spending a penny, the dress has disappeared and I am apparently completely naked on the red carpet.

If I didn’t know it wasn’t my body – not my bikini tan line or my navel – I would be completely fooled.

This is disturbing enough. But I take my experiment further. On another website I upload a photo of myself and, for just £14, it is integrated into one of dozens of porn videos. It looks frighteningly real.

I know this sounds stupid; I know I should have been prepared. But even though the videos are not real, there is something horrifying about seeing yourself in them that you cannot imagine until it happens to you.

I start to sweat and my heart rate rises. I feel as if I need a shower. I delete the video and close the page. But still my muscles tense, my throat tightens, and I am taken back to September 2020, when my book Men Who Hate Women – about how violence and hatred towards women are stoked on internet forums – had just been published.

It was the height of the pandemic and I was doing many talks and interviews online. Not long after, the abuse started. Much of it was the kind I was used to: photos of men holding machetes, saying they were coming for me; graphic discussions about the best way to hang me.

Then something else: a photo of myself with a man performing a sexual act on me. Even now it makes me shiver. Even now it feels like a violation. There is still shock, disgust, fear and shame every time I see it.

Laura Bates's latest book, The New Age of Sexism, investigates how technological progress affects women

Deepfake images and videos are online and, in some cases, take only a few minutes to make

And the worst thought of all: these images could outlive me, perhaps become my legacy – and there is nothing I can do about it.

I wanted to investigate deepfakes – digitally manipulated images and videos that create the false appearance of a person doing or saying something they never did – for my latest book, The New Age of Sexism.

It investigates how technological progress – in particular AI – affects women. Although many believe we are making progress towards gender equality, in reality a new, powerful and easily accessible toolkit for enabling the abuse and oppression of women and girls is exploding right under our noses.

A recent study showed that 143,000 deepfake videos had been viewed 4.2 billion times across 40 popular websites. These videos are not only of celebrities; a poll found that 63 percent of men who would consider making deepfake porn would use images of women they knew. All a perpetrator needs is a photograph.

It can happen – indeed, is happening – to anyone: women like you, your daughter or your granddaughter. If we do nothing about it, I fear that the violation of women will be coded into the fabric of the future.

Of all the abuse I receive, the deepfake images and videos stay with me the most. I think many people who dismiss deepfake pornography as harmless simply cannot imagine how it would feel if it happened to them.

First there is the total shock. The panic and despair. Then the fear sets in. This is 'out there'. How many people have seen it? Oh God, what if your parents see it? You feel you are going to be sick. You should report it. You should get it taken down. Where do you start?

Do you contact the website? But it could be circulating on other platforms. Even if you could force some websites to take it down – and that is a big if – someone could have downloaded it, shared it. It could be on tens of thousands of men's computers. You feel dizzy.

You could call the police. But is it even a crime? Can you imagine showing these images to male officers you don't know? You feel furious, then terrified, then furious again. The perpetrator could be anyone. What if it's your ex? You think of colleagues and friends in a paranoid frenzy. How can you trust anyone? You think about the future and start to feel hopeless. It will always be out there.

No wonder the word used most often by the women I speak to is 'powerless'. Although deepfake pornography is a new form of abuse, the underlying power dynamics are old. It's not just about sexualizing women; it's about subjugating them. When deepfakes emerged a few years ago, there was an obsessive focus on videos of famous women, but the technology has now evolved so that anyone can produce them.

As such, it has already happened to many more women than you might realize, including teenagers. In June last year, one of the first big cases of mass deepfake pornography in the UK was reportedly committed by schoolboys.

Staff at a private girls' school had alerted police to reports that deepfake images and videos were being circulated by pupils at a nearby private boys' school, with about a dozen girls thought to be victims.

At the time of writing, investigations are under way, but no disciplinary action has been taken by the boys' school. Despite the spiralling number of incidents – last week, Children's Commissioner Dame Rachel de Souza called for a ban on apps that produce deepfakes – the conversations I have had with educators suggest most schools do not even know the technology exists. So, given the prevalence and devastating impact of deepfakes, what is being done about them? Almost nothing.

In most countries, creating and sharing non-consensual deepfake pornography remains legal. In the UK, it was only with the Online Safety Act 2023 that laws were introduced. However, there was a loophole in the law: it criminalized only the sharing, not the creation, of sexually explicit deepfakes.

Earlier this year, additional legislation was proposed that would make creation an offence too – although this has yet to be formalized – but perpetrators will not face prison unless the image is widely shared.

It all feels too late.

I think of Holly Willoughby, who quit This Morning after a 37-year-old man was jailed for plotting with others to kidnap, rape and murder her. Police found a device in the man's house containing deepfake pornographic images of her.

I believe that if we do not take action to stem the tide of deepfake image abuse, in the coming years we will see more cases of offences such as stalking and murder that involve an element of manipulated sexualized images. By making these technologies accessible on a mass scale, we hand men a powerful delusion of ownership over the bodies of any women they choose, which will aggravate the already terrible levels of male violence against women.

When they take place at all, public conversations about deepfakes tend to focus on the risks of misinformation, political manipulation or business impact. These are important issues, but research suggests that 96 percent of deepfakes are non-consensual pornography, and 99 percent of those depict women. Yet a Europol report on 'law enforcement and the challenge of deepfakes' uses the word 'women' just once, and devotes only brief paragraphs to deepfake pornography in its 22 pages.

It is clear that society regards the intimidation and abuse of women and girls as less of an existential threat than the spread of political misinformation.

After all, women and girls have been abused since the beginning of time, right? What difference will a little more make?

Adapted from The New Age of Sexism by Laura Bates (£20, Simon & Schuster), published on 15 May. © Laura Bates 2025. To order a copy for £18 (offer valid until 17/05/25; UK P&P free on orders over £25), go to www.mailshop.co.uk/ or call 020 3176 2937.

Laura Bates is the founder of the Everyday Sexism Project.
