Doppelgänger campaign: Are we prepared for information warfare-as-a-service?

Matthias Schulze, October 2, 2024


On September 7, the FBI published a dossier on Russia's systematic information operations against the West and Ukraine. The 270-page document reveals a kind of automated information warfare-as-a-service. The scope, frequency and methods of these operations are frightening. Is Germany able to counter this?


Social media agencies in the information war

We have been familiar with Russian disinformation campaigns and information operations since the 2016 US election at the latest. The influence attempts by intelligence services and the Internet Research Agency at the time were comparatively crude, cheap, improvised and rather limited in scope. Fast forward a few years and the evolution becomes clear. In the fall of 2022, the Doppelgänger disinformation campaign was uncovered for the first time; it continues to this day and is much broader and more systematic than initially assumed. The core of Doppelgänger is that Russian disinformation actors replicated Western media websites such as Der Spiegel, Washington Post, Fox News, Bild and Le Monde. On these fake websites, articles pushing Russian disinformation narratives, for example that Western sanctions against Russia were ineffective and would only harm the West's own population, were placed alongside legitimate content from Spiegel and co. Links to this fake news were then shared thousands of times via social media and the comment sections of legitimate media.

While the content of the campaign is now being analyzed by many anti-disinformation initiatives, less was previously known about the infrastructure behind it. Documents obtained by the FBI, in the form of PR material and recorded conversations between Russian actors, paint a more comprehensive picture for the first time. That infrastructure consists of various Russian PR and social media agencies that carry out psychological and information warfare for the Russian state. Their services include sowing fear, insecurity, doubt and hatred. The long-term goal is to stir up social unrest, delegitimize Western democratic systems and political parties, and bring nationalist parties into government positions (the documents make no secret of which parties are meant).


KGB techniques in digital marketing

Analyzing Doppelgänger reveals a combination of old KGB disinformation tactics mixed with elements of modern cybercrime and digital marketing. In the 1980s, the Czechoslovak intelligence defector Ladislav Bittman described the disinformation cycle in a seminal book that is still used today: first, misleading information with convincing narratives is generated (e.g. "refugees are a burden on the welfare state"). Then it is disseminated via bought journalists or unwitting, ideologically aligned mouthpieces with great social influence. Finally, the messages are reinforced through repetition and accompanying reporting (e.g. via RT, Sputnik and co.). The operation is successful when the narratives are taken up by established media and politicians, e.g. in political talk shows, and disseminated to a broad audience.

The documents published by the FBI illustrate how easy it has become in the age of generative AI to mass-produce plausible disinformation material in the form of images, videos and text. Particularly credible narratives mix facts with suggestive half-truths and rely on emotionalization and outrage in "Fox News style", as the documents call it. More relevant, however, is that publication and amplification have become easier and more effective with modern digital marketing methods, in the form of data analysis pipelines and a touch of cybercriminal know-how, generating historically unprecedented reach. Russian PR agencies are offering "information warfare-as-a-service".


Analysis of the Western information space

This includes a range of services, such as the systematic monitoring of Ukrainian and Western media and the entire "information space" by dedicated analysis teams in order to identify social narratives and lines of tension. To this end, modern data science methods are apparently used to aggregate various news feeds, analyze them semi-automatically and visualize the results. Disinformation dashboards measure the resonance and click rates of authentic and fake narratives and track their life cycles. The success of Western counter-communication or "anti-Russian narratives", for example by the Ukrainian government or civil society measures against disinformation ("fact checking", "debunking"), is also measured.
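
The documents contain no source code, but the basic building block of such monitoring (aggregating feeds and counting how often tracked narratives appear) is standard data science fare. A minimal, hypothetical sketch in Python; the feed URLs and narrative keywords are invented for illustration:

```python
# Hypothetical sketch: aggregate news feeds and count how often tracked
# narratives appear. Feed URLs and keyword lists are purely illustrative.
from collections import Counter

import feedparser  # pip install feedparser

FEEDS = [
    "https://example-news-site.test/rss",   # placeholder feed URLs
    "https://another-outlet.test/feed.xml",
]

NARRATIVES = {
    "sanctions_backfire": ["sanctions", "energy prices", "inflation"],
    "war_fatigue": ["weapons deliveries", "escalation", "negotiations"],
}

def narrative_mentions(feeds: list[str]) -> Counter:
    """Count headline/summary mentions per tracked narrative across feeds."""
    counts: Counter = Counter()
    for url in feeds:
        for entry in feedparser.parse(url).entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            for narrative, keywords in NARRATIVES.items():
                if any(kw in text for kw in keywords):
                    counts[narrative] += 1
    return counts

if __name__ == "__main__":
    for narrative, n in narrative_mentions(FEEDS).most_common():
        print(f"{narrative}: {n} mentions")
```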

The agencies also claim to systematically analyze Ukrainian and Western influencers and content creators. The documents refer to more than 2,800 people in 81 countries. The influencers come from all strata of society. Their posts are collected and stored in databases in order to draw conclusions about how they resonate with their followers and to measure their reach. The aim is to identify posts that (unknowingly) represent Russian positions in order to amplify them massively. Amplification is achieved through retweets and likes by social media accounts (including the notorious bots) under Russian control. This systematically exploits the engagement algorithms of modern social media platforms, which automatically push posts with high "engagement" into the feeds of other users and thus spread the disinformation further. In the USA, the role of Russian-directed influencers is just beginning to be understood. In Germany, there are still many question marks.
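
Why bought likes and retweets translate into organic reach becomes clear if you look at engagement-based ranking in principle. The following toy score is not any platform's real algorithm, just an illustration of the mechanism the campaign exploits:

```python
# Toy illustration (not any platform's real algorithm): a feed that ranks
# posts by raw engagement shows why coordinated likes buy organic reach.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    shares: int
    age_hours: float

def engagement_score(p: Post) -> float:
    # Shares weigh more than likes; recency boosts both.
    return (p.likes + 3 * p.shares) / (1 + p.age_hours)

posts = [
    Post("legit_outlet", likes=40, shares=5, age_hours=2),
    Post("unwitting_influencer", likes=900, shares=120, age_hours=2),  # bot-inflated
]

# The artificially inflated post wins the ranking and is pushed into more feeds.
for p in sorted(posts, key=engagement_score, reverse=True):
    print(f"{p.author}: score {engagement_score(p):.1f}")
```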


Toxic comments × 100,000

Word has gotten around that you shouldn't read the comments section anywhere on the internet. Now it is clearer why: Russian PR agencies systematically analyze the comment sections of Ukrainian and Western news sites and track the most discussed topics. To do so, they have set up several website-crawling services in the background that automatically scan many different news websites, collect all comments from the comment sections, store them in a database and perform linguistic analysis on the data. This includes topic analysis and sentiment analysis, i.e. which topics produce a lot of "engagement" and which sentiments they trigger. According to the published report, the whole process is evaluated using AI. The aim is to create behavioral models of users and calculate their susceptibility to certain narratives. In other words, the aim is to measure how users react to certain stories in order to identify which ones generate the most feedback. The narratives that work best are then amplified the most.

In addition, the sensitivity to certain topics can then be exploited specifically for amplification, e.g. through micro-targeted, purchased advertisements. This mechanism was already uncovered in 2016 in the Cambridge Analytica scandal, but it now appears to be applied systematically and on an industrial scale: despite countermeasures by platform companies and the EU, disinformation actors are buying targeted political advertising on Facebook and co., aimed at previously identified demographic groups. The documents speak of up to 1,000,000 advertising measures per month.
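
The report describes this pipeline only at a high level, but its skeleton (crawl, extract, store, score) is easy to picture. A deliberately simplified sketch, assuming Python with requests and BeautifulSoup; the URL, CSS selector and sentiment lexicon are placeholders, and real news sites typically forbid such scraping in their terms of service:

```python
# Hedged sketch of the kind of comment-harvesting pipeline the documents
# describe: crawl a page, extract comments, store them, score sentiment.
# URL and CSS selector are placeholders; real sites differ.
import sqlite3

import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

NEGATIVE = {"crisis", "failure", "betrayal", "lies", "collapse"}
POSITIVE = {"success", "hope", "support", "progress"}

def naive_sentiment(text: str) -> int:
    """Crude lexicon score: positive minus negative keyword hits."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def harvest(url: str, selector: str, db: sqlite3.Connection) -> None:
    """Fetch a page, extract comment elements, persist them."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    rows = [(url, c.get_text(strip=True)) for c in soup.select(selector)]
    db.executemany("INSERT INTO comments (url, text) VALUES (?, ?)", rows)
    db.commit()

db = sqlite3.connect("comments.db")
db.execute("CREATE TABLE IF NOT EXISTS comments (url TEXT, text TEXT)")
harvest("https://example-news.test/article-1", "div.comment-body", db)

for (text,) in db.execute("SELECT text FROM comments"):
    print(naive_sentiment(text), text[:60])
```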

In addition, an automated "comment machine" has been programmed to post masses of comments on Ukrainian and Western news sites, with a target of 100,000 comments per month. The report speaks of 60,000 comments per month for Germany and France combined. We are apparently considered particularly vulnerable. And it is not just news sites, Twitter and Facebook that are being targeted, but basically every social media platform. For TikTok and Instagram, the agency offers to create 30-second clips (50 per month). For meme platforms such as 9Gag and 4chan, a meme generator was developed that produces funny, politicized images (approx. 200 per month). In addition, writing services are offered for short articles (200 per month) and longer analysis pieces, e.g. for fake news sites (approx. 70 per month). On request, the agencies also create and curate Telegram channels for distribution and amplification; this service includes 40 news items per day.


Disinformation KPIs

This paints the picture of a comprehensive, semi-automated disinformation production and amplification pipeline that provides tools to easily create fake material and automatically feed it into every conceivable channel in the information space of modern societies: comments on real news sites, fake sites, social media postings, events, purchased election ads and online groups (Facebook, X, Instagram, TikTok, Telegram). On top of this comes dissemination via "open" propaganda channels such as RT, Sputnik or official government accounts. Amplification is then carried out by the algorithms of social media platforms and the market dynamics that drive social media content creators, the "invisible rulers", as Renée DiResta aptly analyzes in her new book.

All of this appears to be backed up by comprehensive data analysis tools that measure the success of the content produced. As is usual for PR agencies, KPIs (key performance indicators) are specified for all services, intended to make the success of the disinformation campaign measurable (e.g. how many views, how many new followers for fake accounts). The documents also refer to an analysis dashboard for measuring the success of disinformation. It shows the volume of content production per week and its distribution data. In addition, key sociological indicators in the target societies are recorded, such as opinion polls on topics addressed in the disinformation material (election polls, say, or satisfaction with the government). In other words, the aim is to measure the extent to which the disinformation campaign leads to changes in the mood of the population. The actors are apparently so convinced that these manipulation tools are effective that they are also being used inside Russia, for example in the 2024 "presidential election".
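
What such KPI reporting might boil down to technically is unspectacular. A toy example, with an invented schema and invented numbers, that computes weekly production volume and reach per channel:

```python
# Toy sketch of the KPI reporting the documents describe: weekly volume of
# produced content and its reach, computed from a hypothetical posts table.
# Schema and numbers are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE posts (week TEXT, channel TEXT, views INTEGER)")
db.executemany(
    "INSERT INTO posts VALUES (?, ?, ?)",
    [("2024-W35", "telegram", 1200), ("2024-W35", "x", 5400),
     ("2024-W36", "telegram", 900), ("2024-W36", "x", 8100)],
)

# KPI: pieces published and total views per week, per channel.
query = """
    SELECT week, channel, COUNT(*) AS pieces, SUM(views) AS views
    FROM posts GROUP BY week, channel ORDER BY week
"""
for week, channel, pieces, views in db.execute(query):
    print(f"{week} {channel}: {pieces} pieces, {views} views")
```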


Cybercriminal energy

From a technical point of view, the campaign exploits some elements that we also know from cybercrime, such as typosquatting. Typosquatting means registering an internet domain that looks like a legitimate domain but is not exactly the same. It may contain deliberate typing errors ("typos"), e.g. www.tagesspeigel.de or www.tagesspiegei.de instead of www.tagesspiegel.de. The Russian disinformation actors are taking advantage of the free and open internet: anyone can register a domain name, and anyone can host a website and link it to that domain. Basically, all you need is a credit card and some billing information. Where background checks exist, they can easily be circumvented with stolen identities and credit card details. The Russian actors disguised their access to these online services using proxy servers, often several chained in a row to obscure their IP addresses. This is an established playbook of cyber attackers.
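
Typosquatting variants are trivial to generate programmatically, which is also how defenders check whether their own brand has been cloned. A small sketch that reproduces the two manipulations from the example above, character swaps and look-alike substitutions:

```python
# Illustrative sketch (not from the FBI documents): generating the kind of
# look-alike domains used in typosquatting, as defenders do when checking
# whether their brand has been cloned.
def typo_variants(domain: str) -> set[str]:
    name, _, tld = domain.rpartition(".")
    variants: set[str] = set()
    # Swap adjacent characters: tagesspiegel -> tagesspeigel
    for i in range(len(name) - 1):
        swapped = name[:i] + name[i + 1] + name[i] + name[i + 2:]
        variants.add(f"{swapped}.{tld}")
    # Replace visually similar characters: l -> i, o -> 0, i -> 1
    for old, new in [("l", "i"), ("o", "0"), ("i", "1")]:
        if old in name:
            variants.add(f"{name.replace(old, new)}.{tld}")
    variants.discard(domain)
    return variants

print(sorted(typo_variants("tagesspiegel.de")))
# A defender would now check which of these variants are actually registered.
```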

The FBI was able to take over a total of 30 domains hosting fake news sites via the DNS registries and redirect them to its own IP addresses. Data from some of the proxy services located on US territory was also seized. It turned out that some of them were controlled via IP addresses in Germany or had been registered with German email addresses.

The seizure also highlights another weakness of the free and open internet: these services were often paid for in Bitcoin or other cryptocurrencies. Some cryptocurrencies are traceable, as all transactions are stored on the blockchain. This is why the disinformation actors used various obfuscation tools established among cybercriminals ("mules" and "mixers") to make tracking more difficult.

Virtual currency exchanges (VCEs) were also exploited. These are internet platforms that enable the buying, selling and trading of digital currencies such as Bitcoin for real money. Some of these VCEs were confiscated by the FBI, which gave investigators access to all transaction logs, accounts and IP addresses. The FBI discovered that various money laundering accounts were accessed from Moscow-based IP addresses during Moscow office hours.


Conclusion

The mix of KGB disinformation techniques, generative AI, data science and behavioral analysis, cybercrime and the high degree of automation of this information warfare-as-a-service is frightening. Some of the documents are PR material, which of course exaggerates the agencies' own capabilities; but even if only half of what is mentioned works, it will still have a certain impact. The material also maps out a development path for the future: even where none of this works today, it shows the goals these actors want to achieve.

Information warfare strategies often speak of "information superiority". The Doppelgänger campaign shows what this can mean. These data collection and analysis databases and dashboards generate an enormous intelligence advantage that Western actors probably do not have (or are not allowed to have); at least, little is publicly known about any equivalent. The systematic, AI- and data-driven evaluation of all information channels and the behavioral analysis of entire societies raise difficult fundamental rights issues for democracies. Authoritarian states do not care about this and are therefore currently likely to be playing in a different league, both quantitatively and qualitatively. The analog press review folders common in German ministries seem anachronistic by comparison. I am happy to be proven wrong. At the very least, one can conclude that such a systematic effort, with its high personnel and financial expenditure, is likely to be superior to ad hoc civil society initiatives for "fact-checking" and "debunking" disinformation. After all, how effective is fact-checking against an actor that has already anticipated reactive fact-checking and long since adapted its own campaign accordingly?

It is also questionable whether our own, rather digitally conservative security authorities can keep up with the technical innovations on display, and whether they are even legally allowed to. A few analog contact points, inter-agency exchange formats and relatively slow awareness campaigns, which presumably do not achieve the reach of influencers in the social media age, will not be enough in the future.