On Wednesday evening (Sydney time), Australia’s Federal Court extended an interim injunction requiring X Corp to hide posts containing footage of a live-streamed stabbing attack at a Sydney church.
The injunction followed an official removal notice issued by the eSafety Commissioner, ordering that X remove the posts from its platform globally. Having already geo-blocked the content in Australia, X has contested the legal grounds for the removal notice. The issue will be ruled on in a final hearing scheduled for May 10.
This case coincides with heightened scrutiny of online platforms in Australia. Lawmakers are considering changes to the Online Safety Act (OSA) that would increase obligations on service providers and sharpen enforcement mechanisms. At the same time, a specialised task force on algorithmic harms has been convened, and a misinformation bill is expected to be introduced soon. The opposition in Parliament has also called for social media to be blocked for children and for age verification to be made mandatory.
New regulatory currents are gathering pace in Australia. Here’s what online service providers should know about where things currently stand and what may lie ahead.
What are “removal notices” under Australia’s Online Safety Act?
The eSafety powers that X is challenging come from Australia’s Online Safety Act 2021 (OSA). Under the OSA, the Australian regulator—the eSafety Commissioner—can issue removal notices that require in-scope service providers to remove certain material from their services.
In-scope providers include:
- Social media services;
- Relevant electronic services;
- Designated internet services; and
- Hosting service providers.
eSafety is empowered to assess whether online content qualifies for a removal notice under one of four categories:
- Cyber-bullying material targeted at an Australian child;
- Cyber-abuse material targeted at an Australian adult;
- Intimate images non-consensually shared on the service; and
- Class 1 and class 2 material under the Online Content Scheme (including material related to child exploitation, terrorism, and revolting or abhorrent phenomena).
Notices must be complied with within 24 hours, or any longer period the Commissioner allows. Non-compliance can result in court injunctions and ongoing daily fines. In cases of repeated non-compliance within 12 months, eSafety may seek a court order requiring the service provider to cease providing its service in Australia.
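For teams building compliance tooling, the four categories and the 24-hour clock translate naturally into a data model. Below is a minimal sketch of how a provider might track incoming notices and their deadlines; the class names, fields, and defaults are illustrative assumptions on our part, not anything prescribed by the OSA.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from enum import Enum

# Hypothetical internal model of an eSafety removal notice; names and
# defaults are illustrative, not drawn from the OSA's text.
class NoticeCategory(Enum):
    CYBER_BULLYING_CHILD = "cyber-bullying material (Australian child)"
    CYBER_ABUSE_ADULT = "cyber-abuse material (Australian adult)"
    NON_CONSENSUAL_INTIMATE_IMAGE = "non-consensual intimate image"
    CLASS_1_OR_2 = "class 1/2 material (Online Content Scheme)"

@dataclass
class RemovalNotice:
    notice_id: str
    content_url: str
    category: NoticeCategory
    received_at: datetime  # timezone-aware
    # 24 hours is the statutory default; the Commissioner may allow longer.
    compliance_window: timedelta = timedelta(hours=24)

    @property
    def deadline(self) -> datetime:
        return self.received_at + self.compliance_window

    def is_overdue(self, now: datetime | None = None) -> bool:
        return (now or datetime.now(timezone.utc)) > self.deadline
```

An intake pipeline could create one RemovalNotice per eSafety communication and alert well before is_overdue() turns true, leaving time for escalation or for requesting an extension.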
What could the Federal Court showdown mean for Australia’s removal notice regime?
There are two key issues at play in the eSafety vs X case in the Federal Court that may fundamentally redefine Australia’s notice-and-takedown regime.
The first issue is jurisdictional. X says that it has geo-blocked the relevant content in Australia. However, eSafety argues that X must remove the content globally. The Commissioner is concerned about the risks of radicalisation associated with the content, and about the ease with which Australian users can circumvent the geo-block by using a VPN. The court’s answer on whether eSafety can compel global content removals will carry significant compliance implications for online service providers.
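The practical difference between the two positions can be made concrete. The sketch below contrasts the two takedown scopes, assuming a hypothetical moderation service that persists one visibility rule per item; TakedownScope and apply_takedown are illustrative names, not any real platform’s API.

```python
from enum import Enum

# The two scopes at issue in the case, modelled for a hypothetical service.
class TakedownScope(Enum):
    GEO_BLOCK = "withheld in specified countries only"
    GLOBAL = "removed for all users everywhere"

def apply_takedown(item_id: str, scope: TakedownScope,
                   countries: set[str] | None = None) -> dict:
    """Return the visibility rule the service would persist for the item."""
    if scope is TakedownScope.GEO_BLOCK:
        # Country-level withholding can be circumvented with a VPN, which
        # is part of eSafety's argument for requiring global removal.
        return {"item": item_id, "withheld_in": sorted(countries or {"AU"})}
    return {"item": item_id, "removed_globally": True}
```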
The second issue is definitional. This case may test eSafety’s definition of class 1 content under the Online Content Scheme. X argues that the content does not violate its terms of service. However, eSafety claims it is “gratuitous or offensive violence with a high degree of impact or detail”. If and how the court rules on this issue will have consequences for the Commissioner’s scope to issue removal notices. It will also influence online service providers’ content moderation practices by more clearly defining what is lawful content under the Australian regime.
Are there any general-scope obligations in Australia?
Where Australia’s removal notice regime addresses online harm “downstream” at the content level, general-scope regulations impose broad-based responsibility “upstream” on service providers for the design and functioning of their services.
Australia’s 2021 reforms introduced Basic Online Safety Expectations (BOSEs) as a new general-scope instrument.
BOSEs are set by the Government and require in-scope service providers to take a series of broadly defined systemic actions to protect their users from harm. This includes taking “reasonable steps” to (see the sketch after this list):
- Ensure safe use (including conducting safety risk and impact assessments);
- Prevent access by children to age-restricted material;
- Minimise the provision of illegal content; and
- Ensure mechanisms for users to report illegal content.
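As a rough illustration, a provider might track its “reasonable steps” against each expectation in a structure like the one below, ready to be serialised into a compliance report. The expectation strings paraphrase the themes above, and ExpectationStatus and reporting_gaps are hypothetical helpers, not part of any official schema.

```python
from dataclasses import dataclass

@dataclass
class ExpectationStatus:
    expectation: str
    reasonable_steps_taken: list[str]
    evidence_links: list[str]  # e.g. risk assessments, policy documents

# Two entries shown; a real checklist would cover every expectation.
BOSE_CHECKLIST = [
    ExpectationStatus(
        expectation="Safe use (safety risk and impact assessments)",
        reasonable_steps_taken=["Annual safety-by-design review"],
        evidence_links=["https://intranet.example/risk-assessment-2024"],
    ),
    ExpectationStatus(
        expectation="Prevent children accessing age-restricted material",
        reasonable_steps_taken=["Age assurance at sign-up"],
        evidence_links=[],  # no documented evidence yet
    ),
]

def reporting_gaps(checklist: list[ExpectationStatus]) -> list[str]:
    """List expectations with no documented evidence, ahead of any reporting order."""
    return [e.expectation for e in checklist if not e.evidence_links]
```

Here, reporting_gaps(BOSE_CHECKLIST) would flag the age-restriction expectation, exactly the kind of gap a provider would want to close before eSafety orders a compliance report.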
The BOSEs themselves are non-mandatory, but eSafety can order online platforms to report on their compliance. Non-compliance carries reputational risk, as eSafety can publish a report detailing a service provider’s failures.
Next steps for Australia: an overarching duty of care?
The ongoing review of Australia’s OSA was brought forward following a parliamentary inquiry into social media and online safety in 2023. The review has been tasked with considering, among other things, the “introduction of a duty of care requirement towards users”. This would “place broad obligations on platforms to focus on their systems and mitigate against harms before they happen”, according to think tank Reset Tech Australia.
The nature of these obligations might closely reflect the risk assessment and mitigation requirements of the European Union’s Digital Services Act (DSA). Affected platforms would need rigorous record-keeping and documentation to demonstrate compliance.
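In practice, such record-keeping usually means an append-only trail of moderation decisions. The sketch below assumes a simple JSON Lines log with illustrative field names; it shows the kind of record a platform could point to when evidencing its risk mitigations.

```python
import json
from datetime import datetime, timezone

def log_moderation_decision(path: str, item_id: str, action: str,
                            policy_ref: str, reviewer: str) -> None:
    """Append one moderation decision to a JSON Lines audit log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "item_id": item_id,
        "action": action,          # e.g. "geo_block_AU", "global_removal"
        "policy_ref": policy_ref,  # internal policy or removal notice reference
        "reviewer": reviewer,
    }
    # Appending, never rewriting, preserves a chronological, auditable trail.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```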
In the Australian context in particular, a duty of care would shift the focus of regulatory exposure from reactive to systemic responsibilities. Currently, compliance centres on cooperation with eSafety’s removal notices regarding specific items of content. Under a duty of care, platforms would need to deploy significant resources to mitigate harms across their services.
The bottom line: expect new systemic obligations and stronger enforcement
The trend in Australia indicates a shift towards general-scope obligations with stronger enforcement. In 2021, Australia updated its regime by expanding eSafety’s notice-and-takedown powers and adding the non-mandatory BOSEs. The ongoing review is now considering replacing or augmenting the BOSEs with a new mandatory general-scope instrument, such as an overarching duty of care. This would mean mandatory responsibilities for platforms to address systemic risks, and tougher enforcement.
How can Tremau help?
A new era of online safety regulation is upon us. Online platforms are increasingly subject to strict obligations in key markets across the globe. The compliance challenge will only become more complex and harder to escape, and the consequences of failure include significant fines and reputational harm. Our advisory team can help you decipher and disentangle the complex web of online safety regulation, and ensure efficient and functional compliance. Send us an email at info@tremau.com to get in touch!