
Elections and disinformation are colliding like never before in 2024


Billions of people will vote in major elections this year — about half the world’s population, by some estimates — in one of the largest and most consequential democratic exercises in living memory. The results will influence the way the world is governed for decades to come.

At the same time, false stories and conspiracy theories have become an increasingly global threat.

Unsubstantiated claims of election fraud have damaged confidence in democracy. Foreign influence campaigns regularly target polarizing domestic issues. Artificial intelligence has supercharged disinformation efforts and distorted perceptions of reality, even as major social media companies have scaled back their safeguards and downsized their election teams.

“Almost every democracy is under pressure, regardless of technology,” said Darrell M. West, a senior fellow at the Brookings Institution think tank. “If you add disinformation to that, it just creates a lot of opportunities for disaster.”

It is, he said, a “perfect storm of disinformation.”

The stakes are enormous.

Democracy, which spread globally after the end of the Cold War, faces mounting challenges worldwide – from mass migration to climate disruption, from economic inequality to war. The struggle in many countries to respond adequately to such tests has eroded confidence in liberal, pluralistic societies, opening the door to appeals from populists and strongman leaders.

Autocratic countries, led by Russia and China, have seized on these currents of political discontent to spread narratives undermining democratic governance and leadership, often by sponsoring disinformation campaigns. If those campaigns succeed, the elections could accelerate the recent rise of authoritarian-minded leaders.

Fyodor A. Lukyanov, an analyst who leads a Kremlin-linked think tank in Moscow, the Council on Foreign and Defense Policy, recently argued that 2024 “could be the year when the West’s liberal elites lose control of the world order.”

The political establishment in many countries, as well as intergovernmental organizations such as the Group of 20, appear poised for upheaval, said Katie Harbath, founder of the tech policy firm Anchor Change and a former public policy director at Facebook, where she managed election work. Disinformation – spread via social media but also through print, radio, television and word of mouth – threatens to destabilize the political process.

“We’re going to get to 2025 and the world will look very different,” she said.

Among the biggest sources of disinformation in election campaigns are autocratic governments seeking to discredit democracy as a global governance model.

Russia, China and Iran have all been cited by researchers and the U.S. government in recent months as countries likely to mount influence operations to disrupt other countries’ elections, including this year’s U.S. presidential election. Those countries see the coming year as “a real opportunity to embarrass us on the world stage, exploit social divisions and ultimately undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security firm that recently reported on possible threats to the American race.

The company also investigated a Russian influence effort that Meta first identified last year called “Doppelgänger,” which appeared to impersonate international news organizations and create fake accounts to spread Russian propaganda in the United States and Europe. Doppelgänger appeared to have used widely available artificial intelligence tools to create news channels dedicated to American politics, with names like Election Watch and My Pride.

Disinformation campaigns like this easily cross borders.

Conspiracy theories—such as claims that the United States is plotting with collaborators in several countries to engineer local power shifts, or that it operates secret biological weapons factories in Ukraine—have sought to discredit American and European political and cultural influence around the world. The same narrative might appear in Urdu in Pakistan while also surfacing, with different characters and language, in Russia, shifting public opinion in those countries in favor of anti-Western politicians.

The false narratives circulating around the world are often shared by diaspora communities or orchestrated by state-backed agents. Experts predict that stories of election fraud will continue to evolve and resonate, as they did in the United States and Brazil in 2022 and then in Argentina in 2023.

An increasingly polarized and combative political environment breeds hate speech and disinformation, further siloing voters. A motivated minority of extreme voices, aided by social media algorithms that amplify users’ biases, often drown out a moderate majority.

“We are redefining our societal norms about speech and how we hold people accountable for that speech, online and offline,” Ms. Harbath said. “There are a lot of different views on how to do that in this country, let alone around the world.”

Some of the most extreme voices connect on alternative social media platforms, such as Telegram, BitChute and Truth Social. Calls to preemptively stop voter fraud — which historically has not been statistically significant — have recently been popular on such platforms, according to Pyrra, a company that monitors threats and disinformation.

The “prevalence and acceptance of these narratives is only gaining momentum,” even directly influencing electoral policy and legislation, Pyrra found in a case study.

“These conspiracies are beginning to take root among the political elite, who are using these narratives to gain public favor while eroding the transparency and checks and balances of the system they are designed to maintain,” the company’s researchers wrote.

Artificial intelligence “holds promise for democratic governance,” according to a report from the University of Chicago and Stanford University. Politically focused chatbots could inform voters about key issues and better connect them with elected officials.

The technology could also be a vector for disinformation. Fake AI images have already been used to spread conspiracy theories, such as the baseless claim that there is a global plot to replace white Europeans with non-white immigrants.

In October, Michigan Secretary of State Jocelyn Benson wrote to Senator Chuck Schumer, Democrat of New York and the majority leader, saying that “AI-generated content can increase the credibility of highly localized disinformation.”

“A handful of states — and certain districts within those states — will likely decide the presidency,” she said. “Those looking to influence outcomes or sow chaos can use AI tools to mislead voters about wait times, closures or even violence at specific voting locations.”

Lawrence Norden, who leads the elections and government program at the Brennan Center for Justice, a public policy institute, added that AI could be used to mimic large amounts of material from election offices and spread it widely. Or it could produce late-breaking October surprises, like the audio with signs of AI intervention that spread during Slovakia’s tight election this fall.

“All the things that have been a threat to our democracy for some time may be made worse by AI,” Mr. Norden said while participating in an online panel in November. (During the event, the organizers introduced an artificially manipulated version of Mr. Norden to underscore the technology’s capabilities.)

Some experts worry that the mere presence of AI tools could weaken trust in information and allow political actors to dismiss genuine content. Others said the fears are exaggerated for now. Artificial intelligence is “just one of many threats,” said James M. Lindsay, senior vice president of the Council on Foreign Relations think tank.

“I wouldn’t lose sight of all the old-fashioned ways of spreading misinformation and disinformation,” he said.

In countries with general elections scheduled for 2024, a large majority of respondents to a recent survey by UNESCO, the United Nations’ cultural organization, said disinformation was a major concern. And yet social media companies’ efforts to limit toxic content, which escalated after the 2016 U.S. presidential election, have recently waned, if not reversed outright.

Meta, YouTube and other platforms have retreated from some of those efforts, according to a recent report by Free Press, an interest group. Some offer new features, such as one-way private broadcasting, that are particularly difficult to monitor.

The companies are starting the year with “little bandwidth, very little written accountability and billions of people around the world turning to these platforms for information” — not ideal for protecting democracy, said Nora Benavidez, a senior adviser at Free Press.

Newer platforms, such as TikTok, will most likely play a larger role in political content. Substack, the newsletter startup that said last month it would not ban Nazi symbols and extremist rhetoric from its platform, wants to make the 2024 voting season “the Substack election.” Politicians are planning livestreamed events on Twitch, which will also host a debate between AI-generated versions of President Biden and former President Donald J. Trump.

Meta, owner of Facebook, Instagram and WhatsApp, said in a blog post in November that it was in a “strong position to protect the integrity of next year’s elections on our platforms.” (Last month, a company-appointed oversight board took issue with Meta’s automated moderation tools and its handling of two videos related to the conflict between Israel and Hamas.)

YouTube wrote last month that its “election-focused teams have been working non-stop to ensure we have the right policies and systems in place.” The platform said this summer that it would stop removing false narratives about voter fraud. (YouTube said it wanted voters to hear all sides of a debate, though it noted that “this is not a license to spread harmful misinformation or promote hateful rhetoric.”)

Such content spread on X after the billionaire Elon Musk took over in late 2022. Months later, Alexandra Popken left her role as a trust and safety manager for the platform. Many social media companies rely heavily on unreliable AI-powered content moderation tools, leaving stripped-down human teams in constant firefighting mode, said Ms. Popken, who later joined the content moderation company WebPurify.

“Election integrity is such a huge effort that you really need a proactive strategy, lots of people, brains and war rooms,” she said.
