Last month, a video began circulating on social media purporting to tell the story of an internet troll farm in Kyiv targeting the American election.
Speaking in English with a Slavic accent, “Olesya” offers a first-person account of how she and her colleagues initially worked in support of President Volodymyr Zelensky of Ukraine. Then, she says, after a visit by mysterious Americans who were “probably C.I.A.,” the group began sending messages to American audiences in support of President Biden.
“We were told our new target was the United States of America, especially the upcoming elections,” the woman in the video says. “Long story short, we were asked to do everything to prevent Donald Trump from winning the elections.”
The video is fake, part of an effort to cloud the political debate ahead of the U.S. elections.
U.S. officials say the video is consistent with Russian disinformation operations, and internet warriors aligned with Russia appear to be honing their strategy. Some of the old tactics of 2016 and 2020 could be used again, with new refinements.
While there has been much hand-wringing over the role that artificial intelligence could play this year in fooling voters, current and former officials said that staged videos like the one featuring “Olesya” were among the most immediate threats.
Microsoft said the video featuring “Olesya” probably came from a group it calls Storm-1516, a collection of disinformation experts who now focus on creating videos they hope might go viral in America.
The group most likely includes veterans of the Internet Research Agency, a Kremlin-aligned troll farm that sought to influence the 2016 election. The agency was run by Yevgeny Prigozhin, the founder of the Wagner mercenary group who led a rebellion against the Kremlin and then was killed in a plane crash that American and allied officials believe was orchestrated by Russian intelligence agencies.
Microsoft said the group also included people associated with Valery Korovin, the figurehead of an obscure Moscow-based think tank called the Center for Geopolitical Expertise, a conservative organization affiliated with Aleksandr Dugin, an ultranationalist writer who faces U.S. sanctions for his role in recruiting fighters for Russia’s war in Ukraine.
Russian operatives are leaning into videos, many of which falsely purport to be made by independent journalists or whistle-blowers. Unlike blog or social media posts, the videos are more likely to spread beyond the conspiratorial fringes of America and become part of mainstream discourse.
On Wednesday afternoon, Avril D. Haines, the director of national intelligence, told the Senate Intelligence Committee that Russia was the most active threat to the coming election. Russia, she said, tries to erode trust in democratic institutions, exacerbate social divisions and undermine support for Ukraine.
“Russia relies on a vast multimedia influence apparatus, which consists of its intelligence services, cyberactors, state media proxies and social media trolls,” she said. “Moscow most likely views such operations as a means to tear down the United States.”
China has a sophisticated influence operation and is increasingly confident in its ability to affect election results, Ms. Haines said. But she added that the intelligence community assessed that China did not try to influence the 2020 presidential election, and that so far there was no information that China would be more active in this year’s contests.
Senator Mark Warner, Democrat of Virginia and the chairman of the Intelligence Committee, said that adversaries had a greater incentive than ever to intervene in elections but that the public had too often treated such meddling “as trivial or quaint.”
Clint Watts, the general manager of Microsoft’s Threat Analysis Center, said pushing out written disinformation with bots was largely a waste of time; in 2024, it is video disinformation that has the best chance of spreading among American audiences.
The video invoking the C.I.A., Mr. Watts said, was a classic Russian tactic: accusing your adversary of the very thing you are doing. “When they say there’s a troll farm operated by Zelensky in Ukraine going after the U.S. election, what they’re saying is this is what we’re doing,” Mr. Watts said.
Walter Trosin, a spokesman for the C.I.A., said the agency was not involved in the activities described in the video.
“This claim is patently false and precisely the type of disinformation that the intelligence community has long warned about,” Mr. Trosin said. “C.I.A. is a foreign-focused organization that takes our obligation to remain uninvolved in American politics and elections very seriously.”
At the Senate hearing, Ms. Haines praised the C.I.A. for calling out the video publicly, saying it was an example of how the government will identify disinformation by Russia or other countries during the current election.
Multiple groups in Russia push out disinformation aimed at America. In addition to the videos, researchers and government officials say, Russia has created a handful of fake American local news sites and is using them to push out Kremlin propaganda, interspersed with stories about crime, politics and culture.
Gen. Paul M. Nakasone, who retired from the Army this year after serving as director of the National Security Agency, said the best defense against Russian disinformation remained the same: identifying the propaganda push and publicizing it. The United States, he said, needs to expand its information sharing both domestically and around the world so people can identify, and discount, disinformation spread by Moscow.
“The great antidote to all of this is being able to shine a light on it,” said General Nakasone, who last week was named as the founding director of Vanderbilt University’s new Institute for National Defense and Global Security. “If they are trying to influence or interfere in our elections, we should make it as hard as possible for them.”
Some mainstream Republicans have already warned fellow lawmakers to be wary of repeating claims that originated in Russian disinformation or propaganda.
“We see directly coming from Russia attempts to mask communications that are anti-Ukraine and pro-Russia messages, some of which we even hear being uttered on the House floor,” Representative Michael R. Turner, an Ohio Republican who is the chairman of the House Intelligence Committee, told CNN’s “State of the Union” on April 7.
Russia’s information warriors have pushed fake videos to spread lies about Ukraine, aiming to undermine its credibility or paint it as corrupt. Republican politicians opposed to sending more aid to Ukraine have repeated baseless allegations that Mr. Zelensky has tried through associates to buy a yacht, disinformation that first appeared in a video posted to YouTube and other social media sites.
Most of the videos produced by Storm-1516 fail to gain traction. Others come close. A video pushed out on a Russian Telegram channel purported to show Ukrainian soldiers burning an effigy of Mr. Trump and blaming him for delays in aid shipments.
The video was highlighted on Alex Jones’s right-wing conspiracy site, InfoWars, and other English-language outlets. But it was quickly discounted — the purportedly Ukrainian soldiers had Russian accents and were masked.
“This campaign has been working to advance some of Russia’s key objectives, particularly that of portraying Ukraine as a corrupt, rogue state that cannot be trusted with Western aid,” Mr. Watts said.
Since last August, Microsoft has identified at least 30 videos produced by Storm-1516. The first ones were aimed at Ukraine. But others are trying to influence American politics by appealing to right-wing audiences with messages that Mr. Biden is benefiting from Ukrainian assistance.
Intelligence officials, lawmakers and security firms have warned about the use of artificial intelligence by China, Russia and other nation-states intent on spreading disinformation. But so far, Russian groups like Storm-1516 have mostly avoided using A.I. tools, according to security firms.
“Many of the A.I. campaigns are easy to detect or unwind,” said Brian Murphy, the general manager of national security at Logically, which tracks disinformation. “A.I. is getting better, but it is still not at the stage this year wherein it is going to be used at the scale and with the quality some predict. Maybe in a year or so.”
Both government officials and outside experts, however, have said that A.I.-altered audio has proved more effective than altered video. At the hearing on Wednesday, Ms. Haines highlighted a fake audio recording released in Slovakia two days before its parliamentary election. Though the recording was quickly identified as fake, news organizations and government agencies struggled to publicize the manipulation, and the target of the recording lost a close election.
Artificial intelligence and other innovations, she said, “have enabled foreign influence actors to produce seemingly authentic and tailored messaging more efficiently at greater scale and with content adapted for different languages and cultures.”
For now, though, basic videos like the supposed C.I.A. troll farm confession or the yacht story, which purport to feature authentic narrators with access to exquisite information, are the most prevalent threat.
In 2016, Russian-controlled propagandists could push out fake news articles or social media posts and, in some cases, have an impact. But those old techniques no longer work.
“No one will pay attention to that nowadays,” Mr. Watts said. “You have to have a video form to really grab an American audience today, which 10 years ago was just not even technically that possible.”