
AI expert issues a chilling warning for the future after Taylor Swift's deepfake nightmare


EXPOSURE to deepfakes will give us a warped sense of reality and the problem will only get worse, an AI expert warns.

AI and disinformation expert Wasim Khaled told The US Sun that deepfakes are a so-called Pandora's box and that the worst is yet to come.


Taylor Swift recently faced a deepfake nightmare when explicit AI-generated images of her went viral on Twitter. Credit: Getty

Khaled is the CEO and co-founder of Merel.AI, an AI-driven risk and narrative intelligence platform that fights disinformation.

“Unfortunately, it's a real Pandora's box,” he told us.

“People's ability to simulate this type of behavior will become easier, cheaper and more realistic.

“Today the videos are much harder to produce than the images.

“Just as images used to be more difficult to create than text.

“At the rate we're moving, the models are getting infinitely better every month or every quarter.”

Khaled is not alone in these concerns, and other experts agree that the danger of deepfakes is increasing.

“Every day these tools become cheaper and more accessible to the public.

“We are seeing a 900% year-over-year increase in the creation of deepfakes, and we estimate that there will already be more deepfakes online in January 2024 than in all of 2023.

“Eventually, it could be possible for everyone to be deepfaked,” Michael Matias, CEO and co-founder of Clarity, told The US Sun.

One of Khaled's biggest concerns about deepfakes is the impact they have on the mental health of those affected and the people who view them.

The AI expert said his organization has been working on this for some time, but previously focused mainly on text-based “made-up realities.”

“That also distorted people's sense of reality over the last four or five years, and that was just text and its reach.

“What we're adding to that text and reach, as we saw with Taylor Swift's images, is what happens when you put very realistic media content into the mix.

“I mean, it would be very difficult for the average person to prove that it wasn't them.

“Whereas with someone of her stature, people clearly know it's not her.”

Khaled is concerned that ordinary people will struggle to convince others that a deepfaked image is not really them, even when they have proof.

He thinks our brains are not ready for exposure to deepfakes, and that we may find it difficult to get at the truth.

“It could be a leaked video, it could be leaked footage, so if ten people see that, they get an impression of you, even if you say 'that wasn't really me'.

“That's why sextortion rings exist, because they know it will damage the credibility of the person involved.

“On a large scale it will absolutely have an impact on people. It's just old marketing principles taken into bizarre territory.

“Where, OK, you see something seven times and it starts to sink in.”

As AI continues to develop, ultra-realistic images could make deepfakes much harder to refute.

Khaled added: “Over time, a lot of people will believe that 100%.

“Even if they are given irrefutable evidence, they will still believe it.”
