Australia needs to consider global perspectives to weed out online deception and disinformation
5 Dec 2023

Fallopia japonica, better known as Japanese knotweed, is a highly invasive plant that forms dense thickets, outcompeting native vegetation.

Present-day disinformation is a lot like Japanese knotweed. It takes just one post (or plant) to kick off an infestation. It spreads fast through a continuously growing horizontal underground stem—and it’s really hard to eradicate.

Reflecting on the recent inaugural OECD conference on the global disinformation challenge, I was struck by the parallels between strategies to combat knotweed and efforts to counter disinformation.

The conference showcased international efforts akin to battling the pervasive and aggressive weed, with different nations sharing their models for managing the complex issue. Just as various methods, including herbicides and encapsulation, are employed against knotweed, governments, alongside academia, civil society and the private sector, must take a multi-pronged approach to control and prevent the spread of disinformation.

At many disinformation forums I’ve attended, the conversation has either admired the problem without resolving it or focused on the technological components driving today’s accelerated spread of synthetic media and fake news. This OECD summit, by contrast, was not mere rumination but a focused exploration of practical solutions. Experts, policymakers and industry leaders from around the world converged on the theme of strengthening democracy through information integrity, and the event did not disappoint.

From Europe to the Asia-Pacific and across Latin America, disinformation has emerged as the most significant threat to societies and democracies. Next year, a record 3.2 billion people worldwide will vote in elections across 40 countries, including Taiwan, Indonesia, Pakistan and the US. These are consequential elections, the outcomes of which will set the tone for global events for years to come.

Slovakia’s recent election clearly shows the danger we already face, bringing to the fore the stark reality of deepfakes and disinformation in elections and serving as a warning for the 40 countries preparing to vote in 2024. This is not a theoretical concern; it demands our immediate attention.

It’s not just deepfakes we need to worry about. Generative AI combined with data mining is a real threat. In the same way personal data is used for micro-targeting to sell us stuff, it can also be used for personalised disinformation: creating persuasive narratives and convincing dialogue that engage us as individuals, manipulating our beliefs. It’s precise, relevant and fine-tuned to you.

The broad consensus at the conference was that social media incentive structures rewarding clicks over account and content authenticity are a deeply rooted element of the problem. Modern content creation and what counts as news were widely discussed, with repeated calls for social media companies to make their algorithms transparent.

Opaque amplification models have created a murky world that benefits only the platforms, advertisers, content distributors and threat actors. This goes to the distinction, touted by many of the speakers, between freedom of speech and freedom of reach.

One needn’t go far for a practical example. In Paris, where the conference was held, there’s been mass hysteria over a little bug. Not the cyber kind. Bed bugs. Would there have been the same level of real-world panic if a few social media posts hadn’t gone viral?

Unfortunately, no one at the conference saw a clear path to upending current incentive structures and obscured algorithms, even with regulation.

The call instead was for social media companies to be more transparent about their data to help researchers better understand social media networks, content distribution, recommendation algorithms and social impact. At least Meta has come to the party.

But a single post or click is not entirely the problem. The real dilemma lies in campaigns and narratives, often pushed in a coordinated and artificial way to sow discord and dissent. They attack the ideas underpinning democracy as well as institutions and individuals with privileged access.

How do we deal with this? Data and algorithmic transparency aside, one view often cited by speakers was the need for generational change: fostering a new breed of critical thinkers. Of course, this doesn’t address the immediate disinformation challenge we face. But it serves to build awareness over time, through media literacy and education in schools and workplaces, about disinformation techniques and targets.

There’s merit in this approach. From Senegal to Colombia, young people are concerned and want to tackle the problem.

The role of independent media and journalism also received much attention, with an emphasis on the need for robust domestic information sources. Although the era of traditional media’s monopoly on information is over, there was a view that we could instead create a monopoly on quality information, prioritising it over quantity and speed.

Locally, the ABC is often criticised for bias. But upon hearing my Australian accent, conference attendees had nothing but praise for the fact that Australia has public service media. Built on values of integrity, respect, collegiality and innovation, the ABC has an implied responsibility to produce fact-based news content—the antithesis of disinformation.

Another approach is to recognise that technology is both a threat and an opportunity, able to generate and amplify content while also aiding in the detection and analysis of disinformation. The newly established Advanced Strategic Capabilities Accelerator’s first focus is on synthetic media and disinformation, indicating that the Australian Defence Force thinks technology can be leveraged in information warfare as much as it poses a challenge.

Finally, there were examples aplenty of success found through coordinated government approaches. Underpinned by a commitment to democratic values, transparency, accountability and individual freedoms, several governments are working alongside civil society and the private sector through a central coordination body.

France’s Viginum agency has identified several instances of complex and persistent digital information manipulation campaigns, including some involving Russia. Canada established a protecting democracy unit in its Privy Council Office, bringing together traditional intelligence and security agencies with government statisticians, communication experts and election agency staff to focus on disinformation that undermines democratic institutions. Lithuania has a new crisis-management centre to surge resources against a range of challenges, including disinformation.

The message is that government coordination is vital. Just as local councils in the battle against knotweed are responsible for identifying infestations, raising awareness and implementing control measures, government plays a vital role in fighting the global disinformation menace.

But in Australia, there is no single responsible body. Instead, responsibility is spread across a myriad of agencies, including Home Affairs, DFAT, Defence, ACMA, ASIO and the ASD. The absence of a coordination body means there’s no centre of excellence that can align interested parties and move with agility. We desperately need this.

Representatives from around the world agree there is no single silver bullet for a problem with so many dimensions. Teaching kids to think before they link, bolstering media transparency and regulating algorithms are ineffective on their own. We need to implement not one but all of these approaches, strategically and simultaneously.

Disinformation is proliferating largely unchecked across the digital terrain, infiltrating minds and landscapes, and devaluing the truth. It is the digital knotweed.

And just as knotweed erodes property values, disinformation erodes trust, distorts reality, and undermines the foundations of informed societies.

Total eradication of knotweed has proved elusive, and complete elimination of disinformation may be similarly improbable. But that’s no reason not to act.

We each have an individual responsibility to enable ad blockers, reject web cookies and encourage our communities to be alert and alarmed by the digital infodemic. Through vigilance and persistent effort as individuals and as a nation, and by taking actionable steps such as those proposed at the OECD disinformation conference, there’s hope for effective resistance.