Increases in Online Safety
The EU’s recently adopted Digital Services Act (“DSA”) will impose obligations on various online intermediaries offering their services to the single market, whether they are based in the EU or outside, including social media platforms, providers of hosting services and online marketplaces.
The aim of the DSA is to ensure a safer and more transparent digital space for users, with an increased focus on protecting consumers online and establishing a clear accountability framework for online platforms.
The UK government is also proposing, through the UK Online Safety Bill (“Bill”), measures to improve online safety. The Bill, which will introduce new rules for search engines and for platforms that allow users to post content or interact with others, is currently awaiting its Third Reading in the House of Commons before it proceeds to the House of Lords.
Digital Services Act:
The obligations the DSA imposes on an individual online intermediary will depend on the intermediary’s role and size and on the risks associated with its activities and functions.
The measures introduced will include obligations:
- for online marketplaces to combat the online sale of illegal products and services;
- for platforms to counter illegal content online and to act quickly when ordered by a relevant authority to remove such content;
and will:
- ban deceptive interfaces known as ‘dark patterns’ and other practices aimed at misleading users;
- increase transparency around advertising, including measures to protect minors; and
- increase transparency and accountability for platforms.
Stricter rules apply for very large online platforms and search engines, which will have to:
- offer users a system for recommending content that is not based on profiling; and
- analyse the systemic risks they create (including risks relating to the dissemination of illegal content and negative effects on fundamental rights, gender-based violence and mental health).
UK Online Safety Bill:
Platforms and companies in scope of the proposed Bill are those that host ‘user-generated’ content or allow users to communicate and interact with other users online. These may include:
- social media platforms;
- forums, messaging applications, online games and cloud storage services; and
- search engines (which may enable users to access harmful content).
The proposed Bill aims to impose obligations that increase the protection of online users when interacting with such platforms, with a particular focus on protecting children, while balancing that protection against freedom of speech. The Bill has recently been delayed because of the change in Prime Minister and concerns over obligations for platforms to remove illegal material and to address content regarded as ‘legal’ but ‘harmful’. The Bill may be amended further; we will have to wait to see the outcome of the remaining Parliamentary stages. The Bill is set to return to Parliament in December 2022.
Under the current draft Bill:
- platforms are to introduce more ‘empowerment tools’, which may include verification settings as well as tools giving users more control over the content they see and who they interact with;
- platforms will have a duty to protect both children and adults (but see the concerns noted above) from legal yet harmful content, to make clear in their terms and conditions that such content is not permitted on their platform, and to enforce that restriction; and
- larger platforms will have an obligation to implement systems to prevent fraudulent adverts from being displayed on their platform.
Under the proposed Bill, Ofcom will have enforcement powers to ensure that platforms in scope comply.
Conclusion
The development and use of technology has progressed rapidly over recent years, but such progress does not come without negative impacts. Legislation has failed to keep pace, leaving users, in particular children, exposed to harmful content and influences. The changes proposed under the DSA and the Bill aim to tackle these negative impacts and influences. However, further technological developments are inevitable and will dictate the need for further, complementary legislation in this space to protect the vulnerable from harm.