eSafety reaching across borders: Federal Court grants injunctions in X Corp proceedings
The Australian eSafety Commissioner has succeeded in obtaining an interim injunction requiring X Corp to hide extremely violent video content depicting an alleged terrorist act.
The eSafety Commissioner
The eSafety Commissioner is Australia's independent regulator for online safety. Its prescribed regulatory functions under section 27 of the Online Safety Act 2021 (Cth) ("Online Safety Act") are broad, including promoting online safety for Australians, coordinating Commonwealth Government activities relating to online safety for Australians, undertaking research about online safety for Australians, and monitoring and promoting compliance with the Online Safety Act.
The eSafety Commissioner has a range of powers, including investigating complaints under the online content scheme and issuing blocking requests and notices in relation to abhorrent violent material. Under the online content scheme, it can also issue removal notices, link deletion notices, and app removal notices in relation to Class 1 material (material that would be refused classification) and Class 2 material (such as material that would be classified X 18+).
Under section 23(1) and Part 10 of the Online Safety Act, these enforcement powers (supplemented by the Regulatory Powers (Standard Provisions) Act 2014 (Cth) ("Regulatory Powers Act")) are expressly extended to acts, omissions, matters and things outside of Australia.
The eSafety Commissioner enforces the Online Safety Act through civil penalties for non-compliance, infringement notices, enforceable undertakings, and injunctions, and can apply to the Federal Court for an order that a person cease providing a social media service, relevant electronic service, or designated internet service, or cease supplying an internet carriage service.
The interim injunction
On 16 April 2024, the eSafety Commissioner issued a Class 1 removal notice to X Corp under section 109 of the Online Safety Act, formally seeking removal of the video content of the alleged terrorist act. The application to the Federal Court was commenced after the eSafety Commissioner was not satisfied with the actions X Corp took in response to the removal notice. Under the Online Safety Act, the Federal Court is empowered to impose a civil penalty for non-compliance with a removal notice.
On 22 April 2024, the Honourable Justice Kennett granted the eSafety Commissioner's application for an interim injunction against X Corp under section 122(1)(b) of the Regulatory Powers Act to compel X Corp to hide video content of an alleged terrorist act that occurred in Sydney on 15 April 2024. This interim injunction was then successfully extended on 24 April 2024 and has effect until 10 May 2024, with a further hearing to take place on that day. Reasons for decision have not yet been published.
The proceeding is likely to:
- clarify when "material can be accessed by end-users in Australia" – a condition for issuing a removal notice – so as to enliven the eSafety Commissioner's enforcement powers; and
- address the extraterritorial scope of the eSafety Commissioner's enforcement powers, noting the practical reality that 'geoblocking' only restricts the visibility of content to Australian end-users rather than removing access to it, leaving the content visible to Australians who use Virtual Private Networks ("VPNs") to set their apparent location overseas (see the sketch after this list).
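To illustrate why geoblocking falls short of removal, the following is a minimal, hypothetical sketch of a location-based serving check. All names (serve_content, BLOCKED_COUNTRIES, the content fields) are invented for illustration and do not describe how X Corp or any other service actually implements geoblocking; the point is simply that geoblocked content continues to exist and is served to any request whose apparent location – for example, an overseas VPN exit node – is outside Australia.

```python
# Hypothetical illustration only: a simplified content-serving check showing why
# geoblocking differs from removal. All names are invented for this sketch and do
# not reflect how any real platform implements geoblocking.

BLOCKED_COUNTRIES = {"AU"}  # jurisdictions where the content is hidden


def serve_content(content, apparent_country):
    """Return the content unless the request appears to come from a blocked country.

    `apparent_country` is typically inferred from the request's IP address, so a
    user connecting through an overseas VPN exit node presents as that country.
    """
    if content.get("removed"):
        # Removal: the content is unavailable to every end-user, everywhere.
        return None
    if apparent_country in BLOCKED_COUNTRIES:
        # Geoblocking: the content still exists; it is merely withheld from
        # requests that appear to originate in the blocked jurisdiction.
        return None
    return content


video = {"id": "example-video", "removed": False}

print(serve_content(video, "AU"))  # None – hidden from an apparent Australian user
print(serve_content(video, "US"))  # served – an Australian on a US VPN still sees it
```

As the sketch suggests, geoblocking is a presentation-layer restriction keyed to apparent location, whereas a removal (the `removed` flag in this illustration) takes the content out of circulation for all end-users regardless of location.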
While the Online Safety Act focuses on Australian end-users, it may have a practical effect on all end-users in a borderless digital ecosystem. For example, where the eSafety Commissioner issues a removal notice to a social media service operating in multiple jurisdictions in respect of a URL accessible to both Australian and international end-users, this case may establish that removing the content at that URL for all end-users is within the remit of the eSafety Commissioner's enforcement powers and is proportionate to the objective of ensuring the content is not accessible to Australian end-users.
Expansion of the eSafety Commissioner's remit
The Australian Government has initiated an independent review of the Online Safety Act that is due to be completed by 31 October 2024 ("Review"). The Issues Paper published in April 2024 reflects broad-ranging Terms of Reference, which "includ[e] whether the law should be amended to impose a new duty of care on platforms towards their users".
The Review will also consider whether to address other online harms not currently captured under existing laws, including online hate, volumetric (pile-on) attacks, technology-facilitated abuse and gender-based violence, online abuse of public figures, and harms from emerging technologies such as generative artificial intelligence, immersive technologies, recommender systems, end-to-end encryption, and decentralised platforms. The Review will also draw on international examples from the United Kingdom, the European Union, and Canada.
The Issues Paper acknowledges the importance of global "cooperation" to address the absence of effective "international boundaries" in digital technologies, including the internet. In view of this, the Issues Paper highlights that "the enforceability of penalties upon individuals or platforms based overseas" is a key issue the Review aims to address, notwithstanding that section 23 of the Online Safety Act extends its operation outside Australia. The Review is also likely to examine the "practical challenges" the eSafety Commissioner faces in enforcing its powers against services that are accessed by Australian end-users but have "little or no local presence" in Australia.
What's next?
Operators of digital services should monitor the outcome of the proceeding, which is expected to address the extraterritorial scope of the eSafety Commissioner's enforcement powers and whether removal of content for all end-users (not just Australian end-users) falls within that remit.