
How Putin is ready to weaponize AI-created deep fake PORN with his cyber army in an effort to break down democracies in the West


VLADIMIR Putin could use AI-generated deep-fake pornography to disrupt and break down Western democracies, an intelligence analyst has warned.

Nina Jankowicz, former executive director of the US Department of Homeland Security’s disinformation task force, told The Sun that she is “really concerned” about the risk of AI in the hands of Russia.

AI deepfake porn is fake nude images of real people – usually aimed at high-profile celebrities or politicians

An AI expert has warned that Russia is using fake porn against women in the public eye to disrupt Western democracies. Credit: Getty

In 2017, AI porn of Ukrainian MP Svitlana Zalishchuk surfaced. Credit: Ukrinform/Future Publishing

Professor Jankowicz, who has studied in Russia and worked with the Ministry of Foreign Affairs of Ukraine, analyzes Russian operations aimed at weakening democracies around the world.

She told The Sun that Russia has already used deepfake porn to attack its enemies – and that only one real image is needed to generate convincing AI nudes.

And the AI expert thinks the “sick” fake porn is certainly part of Russia’s “sexualized playbook” when it comes to weakening more countries in the West.

She even said that some of the Russian cybercriminals producing these fake images are acting on Putin’s direct orders.

Jankowicz explained that while AI can be a “transformative technology,” it can be incredibly dangerous in the wrong hands.

The rapid rise of such technology has created “a whole new level of threat”, she said, and is a “great way to upset the balance of power”.

“The vast majority of deepfakes online are non-consensual deepfake pornography of women,” she told The Sun.

And Russia has already used this terrifying method to “attack women who are part of the democracies” in Ukraine and Georgia.

“If I were Russia, I would consider using deep-fake pornography to undermine democracies… to upset the balance of power.”

“Targeting women is a great way to do this,” she warned.


Jankowicz told The Sun that 96% of deepfake images online are pornography.

“We are in an age where a single photo of a woman can be used to create a realistic deepfake.

“Russia is using these in very strategic ways. They tear at the structure, the vulnerabilities of society, the misogynistic tendencies.”


“The sexualized playbook has been part of their tools and tactics for a long time,” she explained.

“If you want to upset a candidate for office or a senior military official, this is a good way to damage her credibility.

“It’s really ubiquitous.”

The result is that other women are more reluctant to take on public or high-level positions in government, the armed forces and more, disrupting the balance of a modern and democratic society.

Jankowicz, who has also written two books about disinformation and women online, has even been targeted herself.

In How to be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back, she talks about her own experiences with deepfake porn.

Nina told The Sun: “The deepfakes of me were made in the weeks after I resigned from government in 2022.

“I didn’t discover them for a while after that – it was a Google alert that alerted me.

“Of course it was part of a wider interference in public life and that is why I continue to talk about it.

“It’s sick,” she said, “I don’t want to let the bad guys win.”

Are Russian cyber groups acting on Kremlin orders?

Nina told The Sun that the Russian cyber groups producing these perverse images and clips may be doing so on Putin’s direct orders.

“We have seen Russian security services get involved in these kinds of cases before, especially with Russian opposition figures.

“The Kremlin employs these groups directly through the security services, or has struck deals with them.

“The criminals will do it as a favor to keep themselves out of jail.”

The idea is to undermine a woman’s credibility… by knocking her down a peg or two. They see it as punishment for the women they portray

Nina Jankowicz

Russia’s security services have historically spent enormous amounts of time and money spying on the regime’s enemies – and they can now artificially generate the material they want to use, she warned.

And if they were to interfere with Western governments in this way, they would be targeting high-profile individuals.

“If they were to interfere in this way, it would have to be a pretty high-value target.

“I think it’s possible they’re looking at government officials … a Cabinet secretary or a high-ranking appointee.”

As the US, Britain and dozens of other countries enter an election year, Nina says she is “absolutely concerned” about the prospect of AI-generated porn.

“If you look at the way this has been used against women in politics in democratic countries before, it has been a major problem for about eight years.


“It has been allowed to continue without much intervention from the tech companies or the government.

“When I look at the vulnerabilities we have in Western society, misogyny is certainly one, and we know [Russia] has previously used a kind of misogynistic rhetoric and misogynistic disinformation.”

Taylor Swift became the target of deepfakes in January. Credit: Getty

Deepfake porn in Ukraine and Georgia

Nina gives an example of Ukrainian MP Svitlana Zalishchuk who was targeted by deepfake porn seven years ago.

In 2017, a fake tweet appearing to be written by Zalishchuk circulated online, showing her promising to run naked through the streets of Kiev if Ukraine lost a key battle.


Alongside the post were fake images of her completely naked.

Jankowicz told The Sun that the apparently real post went so viral that “a reporter then asked her about it at a UN meeting for women”.

“It was really disturbing that a journalist from a European country would take this so seriously and bring it up at the United Nations,” Nina said.

“Similarly, there have been sex tape scandals in Georgia,” a country that Nina said is home to criminal groups “in cahoots with the Kremlin.”

“Russia could use sex tapes attributed to certain women but not actually theirs, or actually spy on them while they were involved in extramarital affairs or in their own bedrooms.”

Nina told The Sun that Russia has a pattern of doing this to opposition figures, and the idea behind it is to shame them out of public life.

She added: “Sexualizing them, especially in a country like Georgia, which is very traditional, can be very harmful to them.”

“There have been several women who have had this happen to them and who have left public life completely.”

The rise of deepfakes

DEEPFAKE porn is nothing new, but it has gained more attention in recent years as more and more high-profile targets fall victim to it.

The term was first coined in 2017, when the faces of high-profile figures were photoshopped or edited onto pornographic content.

In its early form, a combination of machine learning algorithms would be used with AI software to create them.

But as AI becomes more sophisticated, highly realistic images can easily be created from scratch – using just a single ordinary photograph.

Just a few weeks ago, megastar Taylor Swift became the target of explicit deepfake images posted online.

She is arguably the most famous person ever to fall victim to the dangerous technology – one of the images was viewed approximately 45 million times.

Other famous women targeted over the years include Gal Gadot, Emma Watson, Natalie Portman and Scarlett Johansson.

Professor Jankowicz is aware of four major websites specializing in such images, but many more subthreads, copies and sites exist on the internet, and regulation has proven difficult.

Governments, scientists and intelligence analysts like Jankowicz are increasingly looking for ways to combat this warped technology.

Examples in Great Britain

Cara Hunter, a Northern Irish politician, saw a fake porn video using her likeness published online as she stood as a candidate for the April 2022 election.

It was shared tens of thousands of times online and led to her being sexually harassed even while walking on the street.

One of the first “deepfake” political incidents in Britain – and the first involving such a prominent figure – targeted Labour leader Keir Starmer.

In October 2023, an audio clip of Starmer apparently swearing at his staff circulated online.

Another fake soundbite, this time of London Mayor Sadiq Khan, implied he was against Remembrance Day commemorations and instead favored a pro-Palestinian march.

Although neither was pornographic, both were convincing to people online and spread quickly.

The UK’s Electoral Commission chairman also recently warned that female MPs could be directly targeted by deepfake porn, especially in the run-up to this year’s general election.

John Pullinger told the Financial Times several weeks ago that AI could “block the real campaign.”

And he warned that any fake AI-generated pornography would be “much more targeted at female candidates.”

2024 is a pivotal year for AI

A University of Oxford study concluded a few weeks ago that AI will “sweep through the information space this year at a time of intense political and economic volatility around the world.”

This, the researchers say, is particularly dangerous in a year when there are elections in more than forty democracies around the world and wars on several continents.

The study warned of “bad actors” who could use technology such as AI to influence the outcome of the 2024 election.

Big tech is working hard and fast to counter the threat of such real-world disinformation online, but it is difficult to judge at this point whether they will be able to do so effectively and consistently.
