Researchers Say Russia Is Using AI to Predict Terrorism at Paris Olympics

Russia is kicking its disinformation machine into high gear ahead of the 2024 Paris Olympic Games — and according to Microsoft cybersecurity researchers, it’s using AI to do it.

The Microsoft Threat Analysis Center (MTAC) released a report yesterday outlining Russia’s sophisticated use of generative AI tools to spawn content designed to disparage the International Olympic Committee (IOC) and “create the expectation of violence breaking out in Paris during the 2024 Summer Olympic Games.”

According to the report, these disinformation attacks kicked off in June 2023, when a Russia-affiliated “influence actor” dubbed Storm-1679 posted a fabricated Netflix documentary titled “Olympics Has Fallen” (a spoof of the 2013 film “Olympus Has Fallen”) to the messaging app Telegram. Noting in its report that the clip “demonstrated more skill than most influence campaigns we observe,” MTAC explained that Storm-1679 had used AI to deepfake the voice and likeness of American actor Tom Cruise, who “narrated” the fake video; the AI Cruise was blended with “bogus five-star reviews” falsely attributed to real and trustworthy outlets like The New York Times, The Washington Post, and the BBC, all “amid slick computer-generated special effects.”

The goal of the video, according to the report, was to denigrate the IOC, which has barred Russia from the upcoming Games over its ongoing and unprovoked war against Ukraine.

But the doctored doc was just the beginning, according to the report. More recently, Storm-1679 and another “prolific” group called Storm-1099 have been churning out fake news videos and articles, many falsely attributed to real news outlets, designed to imply that terrorism and violence are widely expected to break out at the forthcoming Games.

In one fake video “impersonating French broadcaster France24,” for example, “Storm-1679 falsely claimed that 24 percent of purchased tickets for Olympic events had been returned” because of “fears of terrorism,” according to the report. The disinformation actors also drafted fake press releases from intelligence agencies like the Central Intelligence Agency and the French General Directorate for Internal Security urging citizens not to attend due to security risks, and created “likely digitally generated” imagery of fake Parisian graffiti “threatening violence against Israelis” planning to attend the Games.

The MTAC report didn’t name the specific AI companies or products these bad actors might be using. But as MTAC noted, Storm-1099 is also known as “Doppelganger.” In a blog post published on May 30, OpenAI, the Microsoft-funded AI company, announced that it had caught and quashed Doppelganger-perpetrated disinformation efforts in which affiliated actors had used its models to “translate and edit articles in English and French” that were posted to Doppelganger-controlled websites; “generate headlines”; and “convert news articles into Facebook posts,” among other uses.

In other words, Russian actors are likely using American technology, at least in part, to create and spread destructive propaganda.

Russia’s beef with the IOC is longstanding, as is its well-documented history of spreading disinformation about the Olympics specifically (seriously, that history traces back to analog efforts in the Soviet era of the 1980s). And though the effectiveness of Russia’s most recent attacks on the Games remains unclear, the reality stands: state operators are using generative AI tools, pioneered in Silicon Valley, to propel disinformation campaigns.

Add this latest report to the growing pile documenting AI’s use in political content and campaigns, and the technology’s steady creep into global information and influence structures marches on. Not believing everything you see online has always been good advice, but now more than ever, you’ll want to triple-check your sources.

More on AI and information: Even Google’s Own Researchers Admit AI Is Top Source of Misinformation Online
