How foreign actors are using media to influence opinion before Election Day

A Georgia expert says the intent is not just to get people to believe something that is false, but to undermine confidence and faith in institutions

Last month, a website billing itself as a trusted source for conservative news in Savannah was found to be anything but.

“Savannah Time,” as the website was called, published articles focusing on Republican politics and LGBTQ issues, including sex reassignment, and claimed it delivered “accurate, insightful and locally relevant news coverage.” But none of its articles, even under its local news tab, had anything to do with the city itself.

Analysts with Microsoft’s Threat Analysis Center soon uncovered that it was one of at least four fake news websites created by groups connected to the Iranian government to target voter groups on the far left and right. These websites published articles stitched together with generative artificial intelligence tools, Microsoft said, and likely reached few, if any, readers in Georgia.

Then, earlier this month, federal prosecutors unsealed indictments in a much more high-profile case. A grand jury indicted two employees of RT, a Russian state-controlled media outlet, in an alleged scheme to distribute video content containing foreign propaganda to U.S. audiences. The RT employees allegedly funneled $10 million to a right-wing outlet matching the description of Tenet Media, though Tenet was not named in the filing.

Federal agencies also announced a series of actions tackling a Russian government-backed effort to influence the election, which included sanctions on 10 individuals and entities and the seizure of several internet domains.

Foreign actors spreading disinformation to influence the outcome of an election is not a new threat. But such efforts have grown more blatant and sophisticated since 2016, when federal investigations determined Russia threw its weight behind Donald Trump. They continue to escalate as the barriers to entry fall and the stakes of manipulating public opinion in an already polarized electorate rise.

U.S. intelligence officials have found that foreign actors are creating websites that publish fake news articles, fabricating false personas on social media to engage audiences on divisive political issues and using generative AI to create fake audio and visual content. They’re also attempting to hack and leak sensitive information from computers, emails and databases linked to both campaigns.

The intent, said Tony Lemieux — a Georgia State University professor who studies conflict and terrorism — is not just to get people to believe something that is false.

“There is some of that, of course. But the broader point is to undermine confidence and faith in institutions and processes,” Lemieux said. “And the electoral process is one of them.”

‘Weakening of the fabrics of society’

What’s alleged in the indictments of the RT employees is indicative of the types of foreign influence activity most evident in this cycle.

The DOJ alleged that RT and the two employees recruited influencers with large audiences, including Tim Pool and Benny Johnson, to create and share short-form videos across social media addressing topics related to domestic and foreign policy, such as immigration and inflation.

Though the outlet was not mentioned explicitly, the details provided about the Tennessee company matched Tenet Media. At the time of the indictment, Tenet’s videos had attracted 16 million views across YouTube and other platforms.

Most major U.S. television distributors dropped RT following Russia’s invasion of Ukraine in early 2022. Even so, the Russian government uses RT to direct disinformation and propaganda at Western audiences, according to the indictment.

During a Russian television appearance in February 2024, RT’s editor-in-chief said that RT had built “an enormous network, an entire empire of covert projects that is working with the public opinion, bringing truth to Western audiences.” In the indictment, the DOJ calls the Tennessee company believed to be Tenet Media one of its covert projects.

The influencers contracted by Tenet said they were unaware of the Russian ties to the publication. On X, formerly Twitter, Pool wrote: “Should these allegations prove true, I, as well as the other personalities and commentators, were deceived and are victims.” Johnson wrote a similar statement on X, adding that his lawyers will handle anyone “who states or suggests otherwise.”

Dave Rubin, another hired commentator, wrote a similar statement on X, saying he knew “absolutely nothing about any of this fraudulent activity. Period.”

None of the commentators have been accused of any wrongdoing.

On Sept. 13, the State Department said RT has moved beyond being simply a media outlet and is now engaged in information operations, covert influence and military procurement. In a press briefing, Secretary of State Antony Blinken said the government will not “stand by as RT and other actors carry out covert activities in support of Russia’s nefarious activities.” This week, after the government announced sanctions on RT and its related entities, Meta banned RT from Facebook and Instagram.

Separately, federal prosecutors are also preparing criminal charges in connection with an Iranian hack targeting the campaign of Trump, the Republican nominee, the Washington Post reported last week.

But there are challenges in prosecuting these types of efforts, said Byung “BJay” Pak, who served as the U.S. Attorney for the Northern District of Georgia before resigning in 2021.

For one, attributing cyberattacks or other types of meddling to the responsible parties can be difficult given the relative anonymity the internet provides. Gathering evidence is complex and time-consuming. And prosecutors will likely have to draw on information from intelligence agencies, which can be classified or otherwise too sensitive to share in discovery if a domestic criminal charge is brought against a defendant.

Russia, China and Iran are the three main countries the intelligence community is monitoring. This is because they’ve been the most capable and active foreign influence actors in prior election cycles, an official with the Office of the Director of National Intelligence said during a Sept. 6 press briefing.

Actors from each of these countries have different motivations, intelligence experts said. Russians are amplifying narratives to diminish American support for Ukraine. In Iran, actors are looking to exacerbate tensions over the Israel-Gaza conflict. Chinese groups are looking to portray the U.S. as a declining global power.

But wherever they’re coming from — and whether they’re motivated by an outcome favoring the Republicans, Democrats or neither — the intention is to sow division, said Lawrence Norden, the vice president of the elections and government program at the Brennan Center for Justice, a nonprofit public policy institute.

The overriding goal is to weaken the U.S. and American belief in the democratic system, Lemieux said.

“When you undermine faith in institutions and democratic norms, then you start to see the weakening of the fabrics of society,” Lemieux said.

Many of the foreign actors are conducting their operations on social media platforms, which are built around free expression and creativity. These platforms use algorithms that tailor content to users based on their interests. This content, often packaged into short sound bites or video clips, can reinforce existing belief systems.

“People tend to gravitate towards things that confirm their views instead of keeping an open mind,” Pak said.

Information silos have increased over time, Norden said. To save costs last year, some social media companies, including Facebook’s parent, Meta, and Google’s parent, Alphabet, laid off staffers and contractors who moderated content. After acquiring X, then known as Twitter, in 2022, Elon Musk slashed its trust and safety staff and changed its approach to content that violates company policy: instead of removing such posts altogether, X now limits their reach.

“It’s just easier to get less credible information out to people,” Norden said. “There aren’t the same kind of gatekeepers to make sure that the information is accurate.”

After decades of getting their news mostly from established media sources, the public is still adapting to a changing media environment and technology, Norden said. Everyone should bring a healthy skepticism to news from unknown sources, particularly if it is emotionally charged. And the closer Election Day gets, the more this false information will focus on elections themselves.

“There’s no question in my mind that whatever the result is, there are going to be actors and foreign adversaries that are looking not just to create doubt but to increase volatility and generate anger. We saw the results of that in 2020, 2022 and, obviously, Jan. 6,” Norden said.