Jeremy Mears

NATION-STATE DISINFORMATION CAMPAIGNS

NEVER LET A CRISIS GO TO WASTE

The rise of COVID-19 has created an array of fresh opportunities for state actors to exploit the virus, preying on fear and anxiety across the globe. More broadly, countries are becoming more aggressive in the disinformation arena, spreading false claims to discredit foreign adversaries, often while disguising their hand in the activity to gain greater freedom of operation.


In this realm, COVID has created the perfect storm for disinformation to thrive: humanity’s voracious appetite for information about the virus; a mass migration to working from home as never seen before, effectively increasing connectivity to devices and social media; and all of this taking place within an incredibly short timeframe.


Both China and Russia are deeply involved in a range of disinformation campaigns, among other nations, and have taken a keen interest in exploiting humanity’s curiosity about the pandemic.

Regarding the virus’s origin, by far the likeliest explanation is that the virus was transmitted from bats to humans, perhaps via an intermediary animal, at a market in Wuhan, China. Early on, however, China and Russia spread propaganda against the US, claiming that the virus originated in America, and sought to slap down foreign critics of their governments’ responses to the crisis.


Russia allegedly also looked to exploit COVID by planting falsehoods targeting China, in an effort to magnify the problem and negatively affect China’s economy in part by harming tourism and cross-border trade. Also during the early days of the virus, Chinese operatives purportedly attempted to spread panic across the United States by deploying fake messages to phones and social media accounts, some promoting a pro-China view with respect to the country’s response, while suggesting that America was negligent. China and Russia probably perceive that such information warfare sows a degree of chaos, helping them to gain a strategic advantage, advance foreign policy goals, and keep the West off balance.


China’s efforts at disinformation are relatively new; however, this is certainly not Russia’s first outbreak of the practice. At least in modern history, Russian disinformation campaigns targeting the health sector date back to the early 1980s, when the KGB spread disinformation that the United States had orchestrated the rapid rise of AIDS as part of a biological weapons campaign against marginalized groups in the US and abroad. Russia’s intent with “Operation Infektion,” as it was called, was to sow distrust of the US and strain its relations with countries around the globe.


A big difference between then and now is how quickly social media can spread the hogwash. In the current digital environment of rapid information sharing, operatives can reach a much larger swathe of the population within a much shorter time frame, oftentimes disseminating untruths before governments and firms can detect what has hit them. By the time a response comes, it may be too late, as much of the damage has already spread. So how can we defend against such campaigns, and to what extent can we determine attribution?


In other words, how are security teams handling the disinformation problem, and what are the key indicators that such activity is taking place?

Analysts and investigators of disinformation campaigns are drowning in an ocean of data and must look to emerging technologies such as artificial intelligence to unearth perpetrators, methods, and intended targets, as well as to measure public reaction to such falsehoods. Social media cooperation is critical; however, it has thus far been underwhelming, as these firms have only begun to post warnings on suspected false statements. Advanced data science can do much more, recognizing activities and patterns in real time. The problem is not a lack of data, but rather quickly understanding what the data is telling us. We must deepen the resilience of populations through activity awareness, increasing the ability to distinguish fake from fact and altering the outcome before it occurs.
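To make the idea of real-time pattern recognition concrete, here is a minimal, hypothetical sketch of one such signal: the same message pushed verbatim by many distinct accounts within a short window, a crude marker of coordinated amplification. The function name, thresholds, and toy data below are illustrative assumptions, not a description of any actual platform's detection system.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def find_coordinated_bursts(posts, window_minutes=10, min_accounts=3):
    """Flag message texts posted verbatim by many distinct accounts
    within a short window -- a crude signal of coordinated amplification.
    `posts` is a list of (account, text, timestamp) tuples.
    Thresholds here are arbitrary, for illustration only."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text.strip().lower()].append((account, ts))

    flagged = []
    window = timedelta(minutes=window_minutes)
    for text, hits in by_text.items():
        hits.sort(key=lambda h: h[1])
        # slide over the posts; count distinct accounts per time window
        for i, (_, start) in enumerate(hits):
            accounts = {a for a, t in hits[i:] if t - start <= window}
            if len(accounts) >= min_accounts:
                flagged.append(text)
                break
    return flagged

# toy timeline: three accounts push the same claim within minutes
t0 = datetime(2020, 3, 15, 9, 0)
posts = [
    ("acct_a", "The virus was made abroad!", t0),
    ("acct_b", "The virus was made abroad!", t0 + timedelta(minutes=2)),
    ("acct_c", "The virus was made abroad!", t0 + timedelta(minutes=5)),
    ("acct_d", "Stay safe and wash your hands.", t0 + timedelta(minutes=3)),
]
print(find_coordinated_bursts(posts))  # -> ['the virus was made abroad!']
```

Real systems would of course layer many more features (account age, network structure, text similarity rather than exact matches), but even this toy example shows why the bottleneck is interpretation rather than data collection: the raw posts already contain the pattern.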