
Removing the risks from a decentralised internet

Posted on July 30, 2021 @ 15:15

Increasingly, people worry about the concentration of power in the digital environment and the control that large companies exercise over users’ data and experiences online. The Australian government has opted to regulate ‘big tech’ to address a range of online harms. More broadly, this concern has led to calls to ‘re-decentralise’ the internet, harking back to the early days of the web, before the companies that now serve as gatekeepers to the internet existed.

Under a decentralised internet, often referred to as the ‘DWeb’ or ‘Web 3.0’, people’s data, information and interactions are widely distributed. Power is also redistributed: people can access online services and platforms without relying on a small number of large technology companies that operate centralised servers.

While this allows users to protect their information and control their online experiences, it can also make it more difficult to hold users (or the entities behind them) responsible for illegal and harmful content and conduct.

Highly decentralised networks are currently used by a minority of users with special interests—and, unfortunately, some bad actors. However, there’s growing interest within the tech community in developing decentralised platforms and services for messaging, file sharing and social networking. For example, Twitter’s Bluesky project is looking at an open decentralised standard for social media.

At eSafety, we understand the importance of taking a balanced, nuanced and proactive approach to emerging technologies and digital trends. As an agency with a mandate to ensure that Australians have safer and more positive experiences online, it is incumbent on us to assess the risks in emerging technologies. We help prevent harm through research, awareness-raising and education. We aim to better protect citizens when harm has occurred through our statutory content and reporting schemes and investigations, and to support, guide and assist industry to develop safer online products through our Safety by Design [1] initiative.

Decentralisation can improve users’ security, privacy and autonomy by giving them greater control over their personal information and online experiences. It can enhance freedom of expression by removing the ability of technology companies and authorities to control who can connect and communicate online, or what content and conduct are allowed. Conceptually, and assuming a spirit of altruism and benevolence, this could protect diversity of thought and opinion and reduce the risk of monitoring, tracking and targeting of at-risk or marginalised individuals or groups, including whistleblowers and advocates for social change.

The risks centre on the absence of centralised servers, the lack of central authority, and the fact that storage and distribution of data are spread across many computers on decentralised services and platforms. These factors make it difficult to moderate, regulate or manage illegal and harmful content and activities. Similar to the dark web, these niche spaces can attract groups with an interest in violent extremism, child sexual exploitation or other forms of crime, particularly when they have been barred from mainstream centralised services.

A range of decentralised services—especially those that are also encrypted—can be used to facilitate the spread of harmful and illegal content and to organise harassment and violence with impunity.

eSafety and other INHOPE member hotlines around the world facilitate removal of child sexual exploitation and abuse material to minimise ongoing harm and re-traumatisation of victims. That’s done by determining where the content is hosted and alerting authorities in the relevant country so they can enforce removal, assuming that the content is illegal in that jurisdiction.

In a decentralised system where content is not hosted by a single server within a particular country, but stored and passed around in many ways from computer to computer, this takedown method is no longer effective.

Child sexual abuse offenders have been observed, in online forums, sharing tips on how to evade detection using peer-to-peer and end-to-end encrypted communications channels. Offenders preoccupied with preserving their ‘collections’ of material may also seek the perceived immutability of decentralised environments built on blockchain and peer-to-peer technology.

As mainstream platforms increasingly respond to violent extremist content and activity, it has become clear that extremist groups have started moving to decentralised services to fundraise, share propaganda and organise hate-based violence and harassment.

One of the most notable examples is the migration of Gab—a social network known to have users linked to Nazi ideology—to Mastodon, a decentralised software platform. While Mastodon’s creator has made clear his opposition to Gab’s aims and philosophy, he has also conceded that he can’t ban Gab from the platform because it’s decentralised. Most Mastodon administrators have blocked Gab users, minimising their reach into the broader federation. However, new users continue to join and connect on Gab.

Unchecked online environments could allow bullying, harassment, intimidation, discrimination and other abuses to grow, without providing any way for users to get help or for consequences to be imposed on those responsible. It would be up to the members of individual online communities on each decentralised service or platform, or the nodes within them, to decide and apply standards in their own environment or across their networks.

While decentralised communication systems can protect some marginalised voices from being silenced, these same environments can also allow racism, homophobia, misogyny and other forms of hatred to flourish. eSafety’s research [2] and reporting trends show that online abuse is most often targeted at individuals and groups who are more at risk than others because they are socially, politically or financially marginalised. For these people, the inability to enforce standards for conduct and content within a decentralised internet may harm freedom of expression instead of improving it. The current trend towards decentralisation may push marginalised groups away from the services and platforms that would otherwise allow them to be seen and heard, deepening the divide between those who can enjoy the internet and those who cannot.

To be socially responsible, decentralised services and platforms must commit to protecting the safety of users, and not just their privacy and security. That means being aware of the safety risks in what they provide, informing users about those risks and taking steps to reduce or eliminate them. It means taking a safety-by-design approach to the development of these platforms and broader Web 3.0 infrastructure so that the online safety risks of decentralisation are considered along with the benefits.

Safety protections for decentralised services may include community moderation and incentives, whereby an online community maintains a moderation policy based on agreed rules. Features such as voting systems can allow users to decide acceptable conduct and accessible content. In addition, built-in incentives, such as micropayments or other rewards, may encourage positive behaviour and safer environments.
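As an illustration only, the sketch below shows one way a community-run node might combine a vote threshold with a small reward for upheld removal reports. The class names, quorum and reward values are hypothetical and are not drawn from any existing platform.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of community moderation by vote threshold, with a small
# reward credited to users whose removal reports are upheld.

@dataclass
class Post:
    post_id: str
    author: str
    remove_votes: set = field(default_factory=set)  # users voting to remove
    keep_votes: set = field(default_factory=set)    # users voting to keep

class CommunityModerator:
    def __init__(self, quorum: int = 5, removal_ratio: float = 0.6, reward: int = 1):
        self.quorum = quorum                # minimum votes before any decision
        self.removal_ratio = removal_ratio  # share of votes needed to remove
        self.reward = reward                # micropayment-style credit for upheld reports
        self.credits: dict[str, int] = {}

    def vote(self, post: Post, voter: str, remove: bool) -> None:
        # Only a voter's most recent vote counts.
        if remove:
            post.remove_votes.add(voter)
            post.keep_votes.discard(voter)
        else:
            post.keep_votes.add(voter)
            post.remove_votes.discard(voter)

    def decide(self, post: Post) -> str:
        total = len(post.remove_votes) + len(post.keep_votes)
        if total < self.quorum:
            return "pending"
        if len(post.remove_votes) / total >= self.removal_ratio:
            for voter in post.remove_votes:  # reward upheld removal votes
                self.credits[voter] = self.credits.get(voter, 0) + self.reward
            return "removed"
        return "kept"
```

In practice, the quorum and removal ratio would themselves be matters for the community to decide, which is exactly the governance question the next paragraph turns to.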

Opt-in governance can be used on blockchain networks to allow users to agree to community standards or rules, without the need for a central authority to manage the agreement. In a blockchain network, these agreements are traceable and transparent. In theory, this means accountability and enforcement measures can be applied to terms of service breaches.
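A minimal sketch of that idea follows, assuming only an append-only, hash-chained log in which users record their agreement to a particular version of a community's rules. Real blockchain networks add consensus, signatures and replication across nodes; none of that is shown here, and the class and field names are illustrative.

```python
import hashlib
import json
import time

# Illustrative sketch: an append-only, hash-chained log of opt-in agreements.
# Anyone holding the log can audit who agreed to which version of the rules.

class GovernanceLedger:
    def __init__(self, rules_text: str):
        self.rules_hash = hashlib.sha256(rules_text.encode()).hexdigest()
        self.entries = []  # each entry links to the hash of the previous one

    def opt_in(self, user_id: str) -> dict:
        prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        entry = {
            "user": user_id,
            "rules_hash": self.rules_hash,  # which version of the rules was accepted
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        entry["entry_hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

    def has_agreed(self, user_id: str) -> bool:
        # Transparency: any participant can check whether a user opted in.
        return any(e["user"] == user_id for e in self.entries)
```

Because each entry commits to the previous one, tampering with an earlier agreement breaks the chain, which is what makes breaches of agreed terms traceable without a central record-keeper.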

Verifying and storing a user’s digital identity through a decentralised system can allow people to access different services and platforms with multiple identities and pseudonyms without having to reveal personal information to the technology companies that own and operate centralised servers. A socially responsible decentralised community could allow users to endorse content from digital identities or pseudonyms which they trust not to engage in harm or abuse.
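One way to picture this is to reduce a pseudonymous identity to a keypair: endorsements are just signatures that other users choose to trust, and no personal information is involved. The sketch below uses the third-party Python 'cryptography' package and is purely illustrative; the function names and the trust-list approach are assumptions, not a description of any deployed identity system.

```python
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def content_digest(content: bytes) -> bytes:
    return hashlib.sha256(content).digest()

def endorse(private_key: Ed25519PrivateKey, content: bytes) -> bytes:
    # The pseudonym signs the content digest to endorse it.
    return private_key.sign(content_digest(content))

def trusted(content: bytes, signature: bytes,
            trusted_keys: list[Ed25519PublicKey]) -> bool:
    # A client accepts content endorsed by any pseudonym on its own trust list.
    for key in trusted_keys:
        try:
            key.verify(signature, content_digest(content))
            return True
        except InvalidSignature:
            continue
    return False

# Usage: a pseudonymous identity generates a keypair and shares only the public key.
alice = Ed25519PrivateKey.generate()
post = b"hello from a pseudonym"
print(trusted(post, endorse(alice, post), [alice.public_key()]))  # True
```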

Decentralised services and platforms can be built using technology protocols that allow third-party content moderation tools to, for example, scan for child sexual abuse material. Their operation would have to be agreed to by the community of users.
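As a rough sketch only, hash-list matching of the kind such tools rely on might look like the code below. Real systems typically use perceptual hashes supplied by hotlines or authorised bodies rather than exact SHA-256 matches, and the empty hash set here is a hypothetical placeholder.

```python
import hashlib

# Generic, illustrative sketch of hash-list matching on a node that has
# opted in to third-party content scanning.

KNOWN_HASHES: set[str] = set()  # in practice, supplied by an authorised third party

def file_hash(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def should_flag(path: str) -> bool:
    # Matches are flagged for human review or reporting, not silently deleted.
    return file_hash(path) in KNOWN_HASHES
```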

This trend underscores the need to strive for improved safety on centralised services and platforms, ensuring that safety by design is given the same priority as security and privacy by design. We must work across borders and encourage greater international consistency and shared approaches to help counter online risks and harms on decentralised services and platforms.

Given that decentralised services currently have little reach into the general population, many bad actors continue to rely on mainstream platforms to find targets. It therefore remains critical to keep pushing big tech companies to enforce their terms of service and to collaborate with one another to remove pathways to online harm.

eSafety will continue to work collaboratively across sectors and jurisdictions to ensure that the safety and wellbeing of citizens in digital environments are being addressed. We’ll do that with an eye to the future, so that we can shape the next-generation internet to be the Web 3.0 that we all want and need.

You can find a more detailed brief on decentralisation [3], as well as other tech trends and challenges briefs, on eSafety’s website [4].



Article printed from The Strategist: https://www.aspistrategist.org.au

URL to article: https://www.aspistrategist.org.au/removing-the-risks-from-a-decentralised-internet/

URLs in this post:

[1] Safety by Design: https://www.esafety.gov.au/about-us/safety-by-design

[2] research: https://www.esafety.gov.au/sites/default/files/2020-12/Protecting%20voices%20at%20risk%20online.pdf

[3] brief on decentralisation: https://www.esafety.gov.au/about-us/tech-trends-and-challenges/decentralisation

[4] eSafety’s website: https://www.esafety.gov.au/about-us/tech-trends-and-challenges

Copyright © 2024 The Strategist. All rights reserved.