Submission to the Statutory Review of the Online Safety Act 2021

June 25, 2024

Responsible Technology

Australia has developed an online safety regulatory framework built on strong content-focused laws, including complaints and content-based removal notice schemes to support those who have experienced online harm. However, it lacks provisions to address and oversee the risk-producing systems themselves.

Globally, there is an emerging trend in digital service regulation that seeks to compel greater accountability, transparency and compliance with human rights from large social media companies and digital service providers. These laws address the impact providers have on what is essentially public space, and focus on their responsibilities and obligations to the public as stewards of the modern public square. Australia should take a similar approach.

  • Social media platform providers should be responsible for the impact their business decisions have on what is essentially public space – the burden of responsibility for online harms should fall on them and not solely on individual users.
  • Australian lawmakers should take a risk-management approach in online safety legislation – digging beneath the content layer to target the risk-producing systems themselves.
  • Laws should be amended to impose a statutory duty of care on platforms to ensure, so far as is reasonably practicable, that the users of their service, and that people who may be affected by the service and are not users of that service, are not appreciably harmed as a result of its operation or use.
  • This duty of care should be broad, singular, and enforceable by the regulator.
  • Strong transparency measures will need to be implemented to ensure appropriate steps are being taken to identify and mitigate risks and comply with the duty of care. Australian lawmakers should incorporate transparency requirements similar to those in the EU Digital Services Act into our own legislation. These represent best practice, and large global companies are already operating under these rules in some parts of the world.
  • Large social media platforms have been a democratising force and arguably make up our modern town square. Laws need to be carefully designed to ensure freedom of expression online is not unreasonably burdened, while also considering other fundamental human rights.
  • Large social media companies have shown reluctance to act in the best interests of their users on numerous occasions. It has become clear that governments must regulate to secure transparency, accountability and compliance with human rights from these companies.

Grayson Lowe provided research assistance and made contributions to this submission.