
Government shouldn’t rush social media regulation

Posted on April 2, 2019 @ 12:58



The Morrison government has announced its intention [1] to introduce new legislation into parliament this week to stop content like the video of the Christchurch massacre from proliferating on social media platforms.

The new law would impose heavy fines on social media companies, and up to three years’ imprisonment for their executives, if they fail to ‘expeditiously’ take down content flagged to them by the Office of the eSafety Commissioner.

The government’s desire to be seen to be responding swiftly and strongly to the abuse of social media platforms in the wake of the Christchurch tragedy is understandable.

Action on the abuse of social media platforms by terrorists and their supporters is clearly necessary, and regulation may well form one part of a multifaceted response. But rushed regulation is almost inevitably bad regulation.

The government is right to highlight the importance of this issue. But it’s precisely because it is important that, rather than trying to slam this legislation through in the last sitting week before the election, the government—both the current one and the next—should slow down and take the time to get it right.

Here are three things which regulators should consider.

First, conceptual clarity. What are we trying to achieve, and is this the best way to achieve it? It’s not clear from the information currently available whether the goal of the legislation is, for example, to prevent Christchurch-style events in which a single attack is broadcast and amplified globally at lightning speed, or to disrupt long-term terrorist propaganda and influence campaigns. These goals are related but distinct, and the best methods for achieving them differ. Google’s senior vice president for global affairs and chief legal officer, Kent Walker, has warned [2] that the government’s proposal may be neither feasible nor an ‘appropriate model’ for managing extremist content online.

It’s also not clear how much consideration has been given to whether imposing criminal penalties on social media companies, and a regulatory focus on content, are better options than alternative approaches such as focusing on the users uploading the content. (An objection that many of these users might be outside Australia is valid, but that also applies to the many social media companies that have no physical presence in Australia.)

Second, technical feasibility. How will it work in practice, and is it really going to be an improvement on the current situation? The mechanism proposed in the new legislation, as far as we know, will involve Australia’s eSafety Commissioner notifying social media companies of ‘abhorrent violent material’ on their platforms, which the companies must then expeditiously take down.

At the height of the online storm which followed the Christchurch shooting, new copies of the video were being uploaded to YouTube at the rate of one per second. Hundreds of new accounts were created just to share it. Facebook says it took down 1.5 million copies in the first 24 hours after the attack.

It’s hard to imagine that the 45-person-strong Office of the eSafety Commissioner would have the capacity to identify, let alone issue notices on, anywhere near that volume of content. Even if it could, it’s not clear whether issuing such notices would have helped social media companies respond more effectively, or whether the notices would simply have consumed resources and time that could otherwise have gone to addressing the problem directly.
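The scale mismatch is worth spelling out. The following back-of-envelope sketch (written in Python purely for illustration; the per-staff review rate is an assumed figure, not one reported anywhere in this piece) compares the reported upload volumes with what a 45-person office could plausibly notify:

# Rough sketch of the notice-and-takedown workload implied by the figures above.
# Values marked 'assumed' are illustrative assumptions, not reported numbers.
SECONDS_PER_DAY = 24 * 60 * 60

uploads_per_second = 1             # reported peak rate of new copies appearing on YouTube
copies_first_24_hours = 1_500_000  # Facebook's reported takedowns in the first 24 hours
esafety_staff = 45                 # reported size of the Office of the eSafety Commissioner
notices_per_staff_per_day = 200    # assumed: a generous manual review-and-notify rate

uploads_per_day = uploads_per_second * SECONDS_PER_DAY
office_capacity_per_day = esafety_staff * notices_per_staff_per_day

print(f"New YouTube copies per day at the peak rate: {uploads_per_day:,}")
print(f"Facebook takedowns in the first 24 hours: {copies_first_24_hours:,}")
print(f"Plausible daily notice capacity of the office: {office_capacity_per_day:,}")

Even on these generous assumptions, the office could flag only a small fraction of the material the platforms were already finding and removing on their own.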

This doesn’t mean that regulation isn’t worth doing. What it does mean is that—coming back to conceptual clarity—we need to recognise what this kind of regulation is, and is not, good for in practice. The government’s proposed legislation would probably not have stopped the Christchurch massacre video from spreading as quickly as it did.

Third, regulators need to consider adverse consequences. I can already tell you how terrorists, extremists and their supporters will respond to laws like the one the government is proposing. Rather than uploading their content to mainstream platforms like Facebook and YouTube, they will upload it to third-party platforms over which Australia has no influence, and then continue to share links to that content on mainstream sites. I know they’ll do this, because it’s what they do already [3] to circumvent the efforts of the mainstream platforms to automatically detect and remove such content. An increased crackdown by the big social media players will not take this content offline; it will simply disperse it more widely.

Again, the regulation may still be worthwhile, but—coming back to conceptual clarity and technical feasibility—we need to consider whether (further) fracturing extremist communications online is something we want to do, and whether it will make it technically easier or more difficult to achieve the ultimate goal of protecting the public.

There’s another major adverse consequence of the proposed law which the government must consider. If it passes, the legislation will give companies like Google and Facebook a very strong incentive to move as many senior executives as possible out of Australia. The government has complained [4] about the social media companies sending their junior executives rather than decision-makers to a meeting on the response to the Christchurch attack held in Brisbane last week. If the government is unhappy with the level of senior representation in the country now, just wait until social media executives foolish enough to turn up in Australia are greeted by the prospect of jail time. The ultimate effect of this regulation could well be to make it much more difficult for the government to establish high-level contacts at the big social media companies.

The Morrison government is absolutely right about the necessity to address the abuse of online spaces by terrorist and extremist movements. The events surrounding the Christchurch massacre clearly call for an effective, powerful and coordinated response. Precisely because it is so important, however, this issue should not become hostage to the electoral cycle. There’s bipartisan support for the fight against online extremism and this is an issue on which the major parties can work together. The government, both now and after the election, needs to slow down, think clearly, consult widely and take the time to get this right.


Article printed from The Strategist: https://www.aspistrategist.org.au

URL to article: https://www.aspistrategist.org.au/government-shouldnt-rush-social-media-regulation/

[1] announced its intention: https://www.pm.gov.au/media/tough-new-laws-protect-australians-live-streaming-violent-crimes

[2] warned: https://www.theaustralian.com.au/nation/politics/google-rejects-governments-bid-to-force-vetting-of-all-videos/news-story/8ec900839af175e95a6339b8b94e52aa

[3] what they do already: https://www.tandfonline.com/doi/full/10.1080/1057610X.2018.1513984?af=R

[4] complained: https://www.smh.com.au/politics/federal/facebook-censured-by-government-for-failure-to-act-on-livestreaming-concerns-20190326-p517sb.html