Billions of people will vote in major elections this year – by some estimates, nearly half the global population – in one of the largest and most consequential democratic exercises in living memory. The results will impact how the world will be run for decades to come.
At the same time, false narratives and conspiracy theories have evolved into an increasingly global threat.
Unfounded claims of electoral fraud have damaged confidence in democracy. Foreign influence campaigns routinely target polarizing domestic challenges. Artificial intelligence has fueled disinformation efforts and distorted perceptions of reality. Meanwhile, major social media companies have scaled back their safeguards and downsized their election teams.
“Almost every democracy is under stress, independent of technology,” said Darrell M. West, a senior fellow at the Brookings Institution think tank. “When you add disinformation on top of that, it creates a lot of opportunities for mischief.”
This, he said, is a “perfect storm of disinformation.”
The stakes are huge.
Democracy, which spread globally after the end of the Cold War, faces increasing challenges around the world – from mass migration to climate disruption, economic inequalities to war. The struggle in many countries to respond adequately to such tests has undermined trust in liberal, pluralistic societies, opening the door to the appeal of populists and strongman leaders.
Autocratic countries, led by Russia and China, have harnessed political dissent to push narratives that undermine democratic governance and leadership, often by sponsoring disinformation campaigns. If those efforts succeed, this year’s elections could accelerate the recent rise of authoritarian-minded leaders.
Fyodor A. Lukyanov, an analyst who leads the Foreign and Defense Policy Council, a Kremlin-aligned think tank in Moscow, recently argued that 2024 “could be the year when the West’s liberal elites lose control over the world order.”
The political establishment in many countries, as well as intergovernmental organizations like the Group of 20, appears poised for upheaval, said Katie Harbath, founder of the technology policy firm Anchor Change and a former director of public policy at Facebook, where she managed elections. Misinformation – spread through social media as well as print, radio, television and word of mouth – risks destabilizing the political process.
“We’re going to get to 2025 and the world is going to look very different,” she said.
Aggressive state actors
The biggest sources of disinformation in election campaigns are autocratic governments trying to discredit democracy as a global model of governance.
Russia, China and Iran have been cited in recent months by researchers and the US government as likely sources of influence campaigns meant to disrupt other countries’ elections, including this year’s US presidential election. Those countries see the coming year as “a real opportunity to embarrass us on the world stage, exploit social divisions and undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security company that recently reported on potential threats to the American race.
The company also investigated a Russian influence effort, first identified by Meta last year and called “Doppelganger,” which impersonated international news organizations and created fake accounts to spread Russian propaganda in the United States and Europe. Doppelganger appears to have used widely available artificial intelligence tools to create news outlets dedicated to American politics, with names like Election Watch and My Pride.
Such disinformation campaigns easily cross borders.
Conspiracy theories – such as claims that the United States plots with allies in various countries to orchestrate local regime change, or that it operates secret biological weapons factories in Ukraine – have sought to discredit American and European political and cultural influence around the world. The same narratives may appear in Urdu in Pakistan and then resurface in Russia with different characters and language, shifting public opinion in those countries in favor of anti-Western politicians.
False narratives circulating around the world are often shared by diaspora communities or amplified by state-backed operatives. Experts predict that stories of election fraud will continue to evolve and resonate, as they did in the United States and Brazil in 2022, and again in Argentina in 2023.
A cycle of polarization and extremism
The increasingly polarized and combative political environment is fueling hate speech and misinformation, which further alienates voters. A motivated minority of extreme voices often drowns out the moderate majority, aided by social media algorithms that reinforce users’ biases.
“We are in the midst of redefining our societal norms about speech and how we hold people accountable for that speech online and offline,” Ms. Harbath said. “There are a lot of different perspectives on how to do this in this country, let alone around the world.”
Some extreme voices seek out one another on alternative social media platforms such as Telegram, BitChute and Truth Social. Calls to preemptively stop voter fraud – which is historically statistically insignificant – recently trended on such platforms, according to Pyrra, a company that tracks threats and misinformation.
Pyrra found in a case study that “the prevalence and acceptability of these narratives are only gaining popularity,” even directly influencing electoral policy and law.
The company’s researchers wrote: “These conspiracies are taking root among political elites, who are using these narratives to win public favor while degrading the transparency, checks and balances of the very system they are meant to maintain.”
AI’s risk-reward proposition
Artificial intelligence “holds promise for democratic governance,” according to a report from the University of Chicago and Stanford University. Politically focused chatbots could inform voters about key issues and better connect voters with elected officials.
Technology can also be a vehicle for disinformation. Fake AI images have already been used to spread conspiracy theories, such as the baseless claim that there is a global conspiracy to replace white Europeans with non-white immigrants.
In October, Michigan Secretary of State Jocelyn Benson wrote to Senator Chuck Schumer, Democrat of New York and majority leader, saying that “AI-generated content could enhance the credibility of highly localized misinformation.”
“A handful of states – and particular enclaves within those states – are likely to decide the presidency,” she said. “Those who want to influence the results or sow chaos could use AI tools to mislead voters about wait times, closures or even violence at specific polling locations.”
Lawrence Norden, who runs the elections and government program at the Brennan Center for Justice, a public policy institute, said AI could be used to mimic large quantities of material from election offices and disseminate it widely. Or it could produce late-stage October surprises, like the audio bearing signs of AI interference that circulated during Slovakia’s tight election this autumn.
“All the things that have been threatening our democracy for some time are potentially made worse by AI,” said Mr. Norden, who took part in an online panel in November. (During the event, organizers introduced an artificially manipulated version of Mr. Norden to highlight the technology’s capabilities.)
Some experts worry that the mere presence of AI tools could undermine trust in information and enable political actors to dismiss genuine content. Others said such fears are overblown for now. Artificial intelligence is “one of many threats,” said James M. Lindsay, senior vice president of the Council on Foreign Relations think tank.
“I wouldn’t lose sight of all the old-fashioned ways of sowing misinformation or disinformation,” he said.
Big Tech scales back protections
In countries with general elections planned for 2024, disinformation was a major concern for a majority of people surveyed by UNESCO, the UN cultural organization. And yet efforts by social media companies to limit toxic content, which intensified after the 2016 US presidential election, have recently waned, if not reversed entirely.
According to a recent report by Free Press, an advocacy organization, Meta, YouTube and X have rolled back policies and teams that protected against hate and disinformation. Some platforms are introducing new features, such as private one-way broadcasts, that are particularly difficult to monitor.
Nora Benavidez, senior counsel at Free Press, said companies were starting the year with “low bandwidth, very little accountability in writing, and billions of people around the world turning to these platforms for information” – not ideal conditions for safeguarding democracy.
Newer platforms, such as TikTok, will likely begin to play a larger role in political content. Substack, the newsletter start-up that said last month it would not ban Nazi symbols and extremist rhetoric from its platform, wants the 2024 voting season to be “the Substack election.” Politicians are planning livestreamed events on Twitch, which is also hosting a debate between AI-generated versions of President Biden and former President Donald J. Trump.
Meta, which owns Facebook, Instagram and WhatsApp, said in a blog post in November that it was “strongly positioned to protect the integrity of next year’s elections on our platforms.” (Last month, a company-appointed oversight board took issue with Meta’s automated tools and its handling of two videos related to the Israel-Hamas conflict.)
YouTube wrote last month that its “election-focused teams are continuing to work to ensure we have the right policies and systems in place.” The platform said this summer that it would stop removing false narratives about fraud in past elections. (YouTube said it wants voters to hear all sides of a debate, though it noted that “this is not a free pass to spread harmful misinformation or promote hateful rhetoric.”)
Such content proliferated on X after the billionaire Elon Musk took over the platform in late 2022. Months later, Alexandra Popken left her role managing trust and safety for the platform. Ms. Popken, who later joined the content moderation company WebPurify, said many social media companies were leaning heavily on unreliable AI-powered content moderation tools, leaving skeletal teams of humans in a constant state of firefighting.
“Election integrity is such a massive effort that you really need a proactive strategy, a lot of people and minds and war rooms,” she said.