The EU's Digital Services Act reaches its first milestone as the UK's Online Safety Bill weaves towards the finish

Overview

The EU and the UK are each determined to regulate online content and protect users from online harms.  The EU got there first.  Its Digital Services Act, which will affect all online intermediaries operating in the EU to varying degrees, is already in force.

17 February 2023 marked the DSA's first milestone, the deadline for online platforms and search engines to publish average active user figures as a precursor to the European Commission identifying the very largest online platforms and search engines, which will face the most stringent controls and responsibilities.  Meanwhile, the journey of the UK's Online Safety Bill, which has seen four prime ministers since its inception as the Online Harms White Paper, has been a troubled one.  This briefing looks at the obligations on online intermediaries under the Digital Services Act and at some of the key similarities and differences between the Digital Services Act and the Online Safety Bill.

The scope of the EU's Digital Services Act (DSA)

The DSA, an EU Regulation that applies directly in each Member State, aims to create a harmonised set of rules redefining the responsibilities and accountabilities of online intermediaries providing goods, services and content in the EU.

Extra-territorial effect of the DSA

The DSA has extra-territorial effect: the rules apply if the online intermediary operates in the EU and there is a "substantial connection" to the EU (an establishment in the EU, a significant number of users in the EU, or the targeting of one or more Member States).  UK-based intermediaries could therefore be subject to both the EU and UK regimes.

A tiered approach for obligations under the DSA

The DSA applies to a range of providers across the digital ecosystem, dividing them into four categories, where each category after the first is a subset of the one before:

  • online intermediaries
  • hosting services (such as cloud and web hosting services)
  • online platforms (bringing together sellers and consumers and disseminating information to the public at their request, such as online marketplaces, app stores, collaborative economy platforms and social media platforms); and
  • "Very Large Online Platforms" (VLOPs) and "Very Large Online Search Engines" (VLOSEs). A platform or search engine is "very large" if it has 45 million or more average monthly active recipients in the EU and the European Commission designates it as such.

The DSA imposes some obligations on all online intermediaries, but the obligations build through the tiers, with the most stringent duties being reserved for the highest tier, the VLOPs and VLOSEs.  Key obligations that apply cumulatively per tier are set out below.

Obligations that apply to all online intermediaries:

The definition of "intermediary services" is based on the familiar classes of intermediary services from the eCommerce Directive: mere conduits, caches and hosts.  The DSA preserves the hosting liability shield from the eCommerce Directive, prohibits Member States from imposing a general obligation on providers to monitor content, and retains the "notice and takedown" process whereby a hosting provider is only liable for illegal content that it hosts if, having obtained actual knowledge of the illegality of that content, it fails expeditiously to remove or block access to it.  The DSA reinforces the liability shield with a new "Good Samaritan" clause to ensure that providers do not lose these defences by reason of voluntarily taking proactive steps to remove illegal content.

Meanwhile, it increases transparency and bolsters the takedown process by requiring providers to:

  • publish two contact points to facilitate communication with authorities and users respectively and, if based outside the EU, appoint a legal representative in the EU (who can be held directly liable for non-compliance)
  • set out in their terms and conditions any restrictions on use of the services, content moderation measures and algorithmic decision-making
  • issue an annual transparency report on matters such as content moderation measures, including the number of takedown and disclosure orders received; and
  • inform authorities of action taken in response to takedown and disclosure orders.

Plus...

Providers of hosting services must also:

  • implement a notification mechanism for illegal content, report back to requesters on whether the content is removed and notify users if content is restricted; and
  • report to law enforcement authorities if the hosted content creates a suspicion that a criminal offence has occurred involving a threat to life or safety.

Plus...

Providers of online platforms must also:

  • publish average numbers of active recipients by 17 February 2023 and every six months thereafter (see "First milestone for the DSA" below)
  • not use "dark patterns", interfaces that manipulate or distort the choices taken by users, e.g. design choices that benefit the provider but are not in users' interests, options presented in a biased manner, processes for cancellation of a service that are more difficult than the process for subscription and default settings that are awkward to change
  • offer users a complaint and out-of-court settlement process following a takedown decision
  • suspend repeat offenders
  • provide transparency on adverts so that users know that what they are receiving is an advert, on whose behalf it is sent and why the advertisement was selected for them.  In a similar way, terms and conditions must set out the main parameters used in recommender systems (which rely on previous choices that the user has made to target future content), as well as any options for the recipients to change those parameters
  • where platforms are accessible to minors, put in place measures to ensure the privacy, safety, and security of minors and not present adverts based on profiling where the provider is reasonably certain the recipient is a minor; and
  • for online marketplaces (such as Amazon), vet and ensure traceability of traders on their platforms to help identify sellers of illegal goods (including notifying consumers if they become aware of illegal products or content offered on the platform by a trader in the preceding six months).

Plus...

The heaviest compliance burden falls on VLOPs and VLOSEs, which must also:

  • monitor, report on, and mitigate any systemic risks stemming from the design or functioning of their platform (whilst balancing these against restrictions on freedom of expression). The list of risks to consider is substantial and includes not just illegal content but other societal risks such as impacts on fundamental rights, election manipulation, harms to children, and physical and mental well-being.
  • create an independent compliance function
  • commission an annual independent audit
  • as well as publishing the independent audit report and their reports on content moderation, adhere to additional, stringent transparency requirements
  • provide access to data to regulators and independent researchers to monitor and assess their compliance with the DSA (including access to the workings of their algorithms)
  • provide at least one option for their recommender system that is not based on profiling; and
  • co-operate fully with the Commission in responding to public health and security crises.

From when do the obligations under the DSA apply?

For most entities, the DSA will apply from 17 February 2024, but there are earlier deadlines:

  • 17 February 2023: the deadline for providers of online platforms and online search engines to publish, for each online platform or online search engine, information on the average monthly active recipients in the EU of the service (AMARs), calculated as an average over the past six months (see "First milestone for the DSA" below).
  • Providers of VLOPs and VLOSEs must comply with the rules specifically applicable to them four months after their designation by the European Commission (or 17 February 2024 if earlier).

First milestone for the DSA

In the absence of any official methodology, online platforms and online search engines that were required to meet the 17 February 2023 deadline to publish AMARs had to rely on the definitions in the DSA (including recital 77) and the Q&A guidance published by the Commission on 1 February 2023.  Unfortunately, the Q&A guidance was not particularly enlightening.  It is reasonable to assume that providers' calculations are therefore based on a range of different interpretations; providers should document their methodology and ensure it is consistently applied for each review.
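By way of illustration only, a minimal sketch of the kind of internal calculation a provider might document is set out below.  It assumes a simple arithmetic mean of six hypothetical monthly figures compared against the 45 million designation threshold; the DSA does not prescribe this (or any) methodology, and the figures and function names are invented for the example.

    # Illustrative sketch only: the DSA prescribes no official AMAR methodology.
    # The monthly figures and the helper function below are hypothetical.

    VLOP_THRESHOLD = 45_000_000  # designation threshold for VLOPs/VLOSEs

    def average_monthly_active_recipients(monthly_counts):
        """Average the EU active-recipient counts over the past six months."""
        if len(monthly_counts) != 6:
            raise ValueError("expected figures for exactly six months")
        return sum(monthly_counts) / len(monthly_counts)

    # Hypothetical monthly active-recipient figures for the past six months
    figures = [44_100_000, 44_800_000, 45_200_000, 45_900_000, 46_300_000, 46_700_000]

    amar = average_monthly_active_recipients(figures)
    print(f"AMAR: {amar:,.0f}")
    print("at or above the 45m threshold" if amar >= VLOP_THRESHOLD else "below the 45m threshold")

Whatever approach is taken, the point made above stands: the methodology should be recorded and applied consistently at each six-monthly review.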

At the time of writing, Amazon, Facebook, Google Maps, Google Search, Instagram, Pinterest, Snapchat, TikTok, Twitter and YouTube have all declared figures in excess of the threshold.  No particular surprises there.  Others have only stated that they are below the threshold without giving any figures.  Commission spokesperson Johannes Bahrke has warned that this is not considered sufficient:

"The rules are clear. A number is a number. We call on those platforms that haven't done so yet to publish the numbers without delay."

Enforcement of the DSA

Each Member State will appoint a "Digital Services Coordinator" with responsibility for supervising the intermediary services established in that Member State.  The Commission will also have powers to oversee and enforce the obligations placed on VLOPs and VLOSEs.  A "European Board for Digital Services", made up of the Digital Services Coordinators from each Member State and chaired by the Commission, will be created to coordinate compliance and enforcement and act as an advisory board at EU level.

Failure to comply with the DSA can result in fines of up to 6% of global annual turnover and, for the most serious contraventions, a temporary suspension of the service.  Users are also able to seek compensation from a provider for loss suffered due to the provider’s non-compliance.

And how does the UK's Online Safety Bill (OSB) compare?

There are significant areas of cross-over between the DSA and the OSB.  They each:

  • have a shared objective of creating a safe online environment and tackling illegal content
  • catch the same types of entities and have extra-territorial effect, meaning that many UK and EU organisations will be subject to both regimes
  • focus on providers' systems, processes and practices, rather than enabling regulators to challenge individual content
  • impose a more onerous set of responsibilities on the largest providers/those perceived to pose the greatest risk and require them to undertake risk assessments and mitigate risks
  • impose obligations with a view to protecting rights of free speech
  • take a tougher stance on fraudulent and misleading advertising; and
  • carry a similar threat of large fines for non-compliance (maximum fines under the OSB are 10% of global annual turnover).

But scratch beneath the surface of this apparent common ground and you still find a very different underlying approach; key differences include:

  • The DSA's scope is much broader than that of the OSB, with the DSA regulating areas such as intellectual property, consumer protection and illegal goods, "dark patterns" and crisis response, which are beyond the ambit of the OSB. But within that narrower scope, the obligations that the OSB imposes on providers are arguably more onerous than the more specific demands placed on providers by the DSA in similar areas. 
  • Category 1 (highest risk) providers under the OSB will not necessarily be designated as VLOPs under the DSA (and vice versa).
  • The DSA doesn't define illegal content (which will be determined by other EU laws or the laws of Member States) and doesn't differentiate between types of illegal content, whereas the OSB has different tiers of illegal content, with varying levels of obligation attaching to each tier.
  • The DSA only requires VLOPs and VLOSEs to undertake systemic risk assessments whereas, under the OSB, all services must carry out illegal content risk assessments and mitigate the risks identified (albeit that Category 1 providers and services likely to be accessed by children are required to carry out further assessments). In this respect, the OSB demands much more of providers than the DSA.
  • The DSA specifically addresses illegal content whereas the OSB goes some way towards tackling legal but harmful content as well (although the UK Government was forced to curtail its approach to lawful content in response to concerns around freedom of speech).  The OSB still requires providers to have systems and processes designed to prevent children from having access to harmful content.  In relation to adults, the Government compromised with a "triple shield" whereby, in addition to removing illegal content, a provider is required to remove content that breaches its terms and conditions (although it isn't clear how much discretion providers will have to decide what legal content is permitted under their terms).  Providers also need to provide users with the tools to control what legal content the user sees and with whom the user interacts.
  • The OSB requires Category 1 services to remove fraudulent adverts swiftly when alerted to them and put in place systems and processes to prevent users from encountering them, whereas the DSA's focus is more around transparency in relation to adverts (as well as banning targeted advertising by profiling children or based on special categories of personal data such as ethnicity, political views or sexual orientation).   

The compliance task ahead for organisations subject to both the EU and UK regimes, seeking where possible to shape systems and processes that will serve the demands of both, is a particularly challenging one given these disparities.  This is not helped by the fact that the DSA and OSB are following different timetables: at the time of writing, the OSB is still making its way through the House of Lords.  Moreover, even when the OSB finally hits the statute books, much of the detail will be left to secondary legislation, guidance and codes of practice, and so providers will not have certainty around requirements for some time yet.

GET IN TOUCH

Dan Reavill
Helen Reddish