Navigating the Digital Services Act and Online Safety Act: A Quick Guide for Digital Platform Providers on the Need to Police Content
In the wake of increased scrutiny from European regulators, particularly concerning Elon Musk and the platform X (formerly Twitter), businesses with digital platforms are advised to pay close attention to compliance with the EU's Digital Services Act (DSA). With potential penalties of up to 6% of annual global turnover for non-compliance, understanding and adhering to the DSA are crucial. In addition, UK companies should consider how they will comply with their obligations under the Online Safety Act (OSA) once the OSA's provisions fully take effect in early 2025.
Understanding the DSA
Taking a step back, as we reported in 2022, the DSA aims to create a safer and more transparent digital space, with a focus on consumer protection and clear accountability for online platforms. It introduces a range of obligations that vary depending on the size of the company. However, certain fundamental principles apply to all companies covered by the DSA:
- Reactive Measures: Platforms must promptly remove illegal content following orders from the relevant authorities, and must have systems in place both to identify and remove such content quickly and to address user grievances about content removal decisions.
- Proactive Reporting: Companies must report to the appropriate law enforcement authorities any suspected criminal offence that poses a threat to life or safety.
- Transparency Reporting: At least annually, platforms must publish reports detailing their content moderation activities.
Understanding the OSA
In the UK, the OSA places particular emphasis on tackling illegal content and protecting children online. Key anticipated duties include:
- Staff Training: Companies must train staff on a Code of Conduct focused on protecting users from illegal content.
- Tracking and Reporting: Platforms should have mechanisms to monitor and report on illegal content trends to their senior governance body.
- Content Moderation: Effective systems must be in place to quickly remove illegal content, informed by risk assessments and evidence of emerging harm.
- Complaint Procedures: Platforms must operate an efficient complaints process so that UK users and affected persons can report issues and prompt appropriate action.
By December 2024, Ofcom is expected to release the Illegal Harms Codes of Practice, which companies will need to comply with by March 2025.
Conclusion
Whilst the situation with Musk and X starkly illustrates large platforms' obligations under the DSA to police illegal content, it is also a timely reminder for online businesses more generally of the importance of DSA and OSA compliance, and of the other reasons why companies should monitor content on their platforms. Keep an eye out for further articles and insights in which we will delve deeper into these areas, their effect on digital platforms and businesses, and how businesses can adopt a proactive approach to compliance.
Technology has developed and spread rapidly in recent years, and that progress has not come without negative impacts. Further technological developments are inevitable, however, and will drive the need for complementary legislation in this space to protect the vulnerable from harm.