The arrest of Telegram founder Pavel Durov in France has underscored the urgent need for more regulation of messaging and social media platforms that can be exploited for hybrid operations by both states and non-state groups.
Once celebrated as the ultimate tool for free communication thanks to its encryption and lax moderation practices, Telegram now stands accused by French authorities of facilitating criminal activities and possibly being exploited for hybrid threats, particularly by Russian state actors. Hybrid threats blend military force with non-military tactics including cyberattacks and disinformation.
To counter these threats, policymakers around the world must prioritise regulation, platform accountability, and the promotion of alternative platforms that are less susceptible to misuse, while also protecting free speech. It’s a delicate balancing act.
Telegram’s role in such hybrid operations, particularly in the context of Russian state-backed activities, has become increasingly evident. The platform has been used not only for legitimate private communication but also as a tool for spreading disinformation, propaganda, and extremist content. This is particularly concerning in conflict zones such as Ukraine, where Telegram has 7 million users, including government officials and opinion leaders.
Despite Durov’s claim that his platform is not backed by the Kremlin, Telegram’s financial ties to Russian oligarchs and state-controlled entities suggest otherwise. Investments from figures such as Roman Abramovich and Sergey Solonin—both of whom are linked to the Russian government—raise significant concerns about the platform’s susceptibility to state influence.
This financial entanglement is alarming given Telegram’s extensive reach across Eastern Europe, Central Asia, and the Middle East—regions that are subject to Russian hybrid operations.
The Russian government’s 2018 attempt to block Telegram and the app’s purported relocation to Dubai may have been a strategic façade, giving cover to Telegram channels that promote pro-Russia narratives including glorifying separatists, justifying the invasion of Ukraine and spreading extremist propaganda—some of which reportedly fuelled the recent anti-immigration riots in the UK.
This manoeuvre preserved the illusion of Telegram’s independence while keeping it accessible for Kremlin use, which aligns with Russia’s hybrid strategy. It’s an entirely understandable tactic: using disinformation to weaken adversaries, either as an alternative or as a complement to military confrontation, is a cost-effective approach.
Claims that Telegram’s servers and data centres are located in Russia, and are therefore subject to Russian law, raise doubts about the platform’s assertions of independence. This creates serious security concerns, particularly for users who challenge the Russian government or operate in conflict zones where hybrid tactics are prevalent.
Democracies must take proactive steps to mitigate these risks. First, they should enforce stringent regulations that require transparency in how messaging platforms operate and how they manage user data. This includes clear guidelines on privacy, content moderation, data storage, and co-operation with law enforcement, ensuring that platforms cannot be easily exploited by malicious actors.
Second, governments and civil society should promote the use of alternative messaging platforms that prioritise transparency and accountability. Platforms like Signal, which offer end-to-end encryption and operate with a commitment to user privacy without the financial entanglements seen in Telegram, can serve as safer alternatives. This is particularly relevant for government officials and those working in sensitive sectors, for whom encrypted messaging tools based in their own country or another trusted nation can reduce the risks.
Finally, enhancing media literacy and public awareness about the risks of disinformation is crucial. Educating users on how to identify and counteract disinformation campaigns can help build resilience against these types of hybrid threats. This approach should be coupled with efforts to develop and promote technologies that can detect and mitigate the spread of false information on digital platforms.
As messaging platforms become increasingly central to both communication and conflict, the lessons from Telegram’s rise and its connections to Russian interests underscore the importance of transparency, regulation, and the promotion of secure, accountable platforms.
More stringent regulations for digital platforms can be implemented to ensure transparency in content-moderation policies, ownership, applicable legal regimes and data storage. The world can learn from, among others, Germany’s Network Enforcement Act (NetzDG), which requires social media platforms to remove illegal content promptly or face significant fines.
The European Union’s Digital Services Act (DSA) compels large online platforms to assess and mitigate risks related to the dissemination of illegal content, disinformation, and other harmful activities. In the United States, Section 230 of the Communications Decency Act governs platform liability for hosted content, balancing moderation obligations with free speech. In Australia, regulation includes the Online Safety Act 2021 and the Digital Platform Regulators Forum.
While the global community has arguably been too slow to hold messaging and social media platforms accountable, these steps towards greater government action reflect a positive shift towards better oversight. The Telegram episode serves as a reminder of the need for democracies to guard against multifarious hybrid threats and to implement measures tailored to their unique security concerns.
- The Strategist is running a short series of articles in the lead up to ASPI’s Sydney Dialogue on September 2 and 3. The event will cover key topics in critical, emerging and cyber technologies, including hybrid threats, disinformation, electoral interference, artificial intelligence, clean technologies and more.