Billions of people will vote in major elections this year – about half the world’s population, by some estimates – in one of the largest and most consequential democratic exercises in living memory. The results will affect the way the world will be run for decades to come.
At the same time, false narratives and conspiracy theories have become an increasingly global threat.
Unfounded accusations of electoral fraud have damaged trust in democracy. Foreign influence campaigns regularly target polarizing domestic issues. Artificial intelligence has amplified disinformation efforts and distorted perceptions of reality. All this while the major social media companies have scaled back their safeguards and downsized their election teams.
“Almost all democracies are under pressure, regardless of technology,” said Darrell M. West, a senior fellow at the Brookings Institution think tank. “When you add misinformation to that, you just create a lot of opportunities for mischief.”
It is, he said, a “perfect storm of misinformation.”
The stakes are enormous.
Democracy, which spread globally after the end of the Cold War, faces mounting challenges around the world, from mass migration to climate disruption, from economic inequality to war. The struggle in many countries to respond adequately to such threats has eroded confidence in liberal, pluralistic societies, opening the door to appeals from populists and strongman leaders.
Autocratic countries, led by Russia and China, have harnessed currents of political discontent to push narratives that undermine democratic governance and leadership, often sponsoring disinformation campaigns. If those efforts are successful, the elections could accelerate the recent rise of authoritarian-minded leaders.
Fyodor A. Lukyanov, an analyst who heads a Kremlin-aligned think tank in Moscow, the Foreign and Defense Policy Council, recently argued that 2024 “could be the year in which Western liberal elites lose control of the world order.”
The political establishment in many nations, as well as intergovernmental organizations like the Group of 20, appears to be bracing for upheaval, said Katie Harbath, founder of the tech policy firm Anchor Change and a former public policy director at Facebook who managed its elections work. Disinformation, spread through social media but also through print, radio, television and word of mouth, risks destabilizing the political process.
“We are going to get to 2025 and the world will be very different,” she said.
Aggressive state operations
Among the greatest sources of disinformation in electoral campaigns are autocratic governments that seek to discredit democracy as a global model of governance.
Russia, China and Iran have been cited in recent months by researchers and the U.S. government as likely to attempt influence operations to disrupt other countries’ elections, including this year’s U.S. presidential election. Those countries see the coming year as “a real opportunity to embarrass us on the world stage, exploit social divisions and simply undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security company that recently reported on potential threats to the American race.
The company also examined a Russian influence effort that Meta first identified last year, called “Doppelgänger,” that appeared to impersonate international news organizations and create fake accounts to spread Russian propaganda in the United States and Europe. Doppelgänger appeared to have used widely available artificial intelligence tools to create media outlets dedicated to American politics, with names like Election Watch and My Pride.
Disinformation campaigns like this easily cross borders.
Conspiracy theories – such as claims that the United States schemes with collaborators in various countries to engineer local power shifts, or that it operates secret biological weapons factories in Ukraine – have sought to discredit American and European political and cultural influence around the world. They might appear in Urdu in Pakistan and also surface, with different characters and language, in Russia, swaying public opinion in those countries in favor of anti-Western politicians.
False narratives circulating around the world are often shared by diaspora communities or orchestrated by state-backed actors. Experts predict that electoral fraud narratives will continue to evolve and reverberate, as they did in the United States and Brazil in 2022 and then in Argentina in 2023.
A cycle of polarization and extremism
An increasingly polarized and combative political environment is breeding hate speech and misinformation, pushing voters further into isolated silos. A motivated minority of extremist voices, aided by social media algorithms that reinforce users’ biases, often drowns out a moderate majority.
“We are in the midst of redefining our social norms around speech and how we hold people accountable for that speech, online and offline,” Ms. Harbath said. “There are a lot of different views on how to do that in this country, let alone around the world.”
Some of the most extreme voices have sought out alternative social media platforms, such as Telegram, BitChute and Truth Social. Calls to preemptively stop voter fraud – which is historically statistically insignificant – recently trended on such platforms, according to Pyrra, a company that monitors threats and misinformation.
The “prevalence and acceptance of these narratives is only gaining ground,” even directly influencing politics and electoral legislation, Pyrra found in a case study.
“These conspiracies are taking root among the political elite, who are using these narratives to curry favor with the public while degrading the transparency and checks and balances of the system they are meant to uphold,” the company’s researchers wrote.
The risk-reward proposition of AI
Artificial intelligence “holds promise for democratic governance,” according to a report by the University of Chicago and Stanford University. Politically focused chatbots could inform voters about key issues and better connect them with elected officials.
The technology could also be a vector for misinformation. Fake AI-generated images have already been used to spread conspiracy theories, such as the baseless claim that there is a global plot to replace white Europeans with nonwhite immigrants.
In October, Jocelyn Benson, Michigan’s secretary of state, wrote to Senator Chuck Schumer, Democrat of New York and the majority leader, saying that “AI-generated content can boost the credibility of highly localized misinformation.”
“A handful of states, and certain districts within those states, are likely to decide the presidency,” she said. “Those seeking to influence results or sow chaos can use artificial intelligence tools to mislead voters about wait times, closures or even violence at specific polling locations.”
Lawrence Norden, who directs the elections and government program at the Brennan Center for Justice, a public policy institute, added that AI could mimic large amounts of material from election offices and disseminate it widely. Or it could produce last-minute October surprises, like the audio clips bearing signs of AI manipulation that circulated during Slovakia’s closely contested elections this fall.
“All the things that have been threats to our democracy for some time can get worse with AI,” Norden said while participating in an online panel in November. (During the event, organizers presented an artificially manipulated version of Mr. Norden to highlight the technology’s capabilities.)
Some experts worry that the mere presence of artificial intelligence tools could undermine trust in information and allow political actors to dismiss real content. Others said fears, for now, are exaggerated. Artificial intelligence is “just one of many threats,” said James M. Lindsay, senior vice president of the Council on Foreign Relations think tank.
“I wouldn’t lose sight of all the old-fashioned ways of spreading misinformation or disinformation,” he said.
Big tech companies reduce protections
In countries with general elections scheduled for 2024, misinformation has become a major concern for a large majority of people surveyed by UNESCO, the United Nations cultural organization. And yet, social media companies’ efforts to limit toxic content, which intensified after the 2016 US presidential election, have recently eased, if not completely reversed.
Meta, YouTube and other platforms offer new features, such as private one-way broadcasts, which are especially difficult to monitor.
Companies are starting the year with “little bandwidth, very little written accountability, and billions of people around the world turning to these platforms for information,” which is not ideal for safeguarding democracy, said Nora Benavidez, senior advisor at Free Press.
It is very likely that newer platforms, such as TikTok, will begin to play a larger role in political content. Substack, the newsletter startup that last month said it would not ban Nazi symbols and extremist rhetoric on its platform, wants the 2024 voting season to be “the Substack election.” Politicians are planning livestreamed events on Twitch, which is also hosting a debate between AI-generated versions of President Biden and former President Donald J. Trump.
Meta, which owns Facebook, Instagram and WhatsApp, said in a blog post in November that it was in a “strong position to protect the integrity of next year’s elections on our platforms.” (Last month, a company-appointed oversight board took issue with Meta’s automated tools and its handling of two videos related to the conflict between Israel and Hamas.)
YouTube wrote last month that its “election-focused teams have been working around the clock to make sure we have the right policies and systems in place.” The platform said this summer that it would stop removing false narratives of voter fraud. (YouTube said it wanted voters to hear all sides of a debate, though it noted that “this is not a free pass to spread harmful misinformation or promote hateful rhetoric.”)
This type of content has proliferated as many social media companies lean heavily on unreliable AI-powered content moderation tools, leaving skeleton crews in constant firefighting mode, said Ms. Popken, who later joined the content moderation company WebPurify.
“Election integrity is such a gigantic effort that you really need a proactive strategy, a lot of people, brains and war rooms,” she said.