Digital Services Act

The Digital Services Act (Regulation (EU) 2022/2065, "DSA") is a regulation in EU law that updates the Electronic Commerce Directive 2000 regarding illegal content, transparent advertising, and disinformation. It was submitted, along with the Digital Markets Act (DMA), by the European Commission to the European Parliament and the Council on 15 December 2020. The DSA was prepared by Margrethe Vestager, Executive Vice President of the European Commission for A Europe Fit for the Digital Age, and by Thierry Breton, European Commissioner for the Internal Market, as members of the Von der Leyen Commission.

On 22 April 2022, European policymakers reached an agreement on the Digital Services Act. The European Parliament approved the DSA along with the Digital Markets Act on 5 July 2022. On 4 October 2022, the Council of the European Union gave its final approval to the regulation. It was published in the Official Journal of the European Union on 19 October 2022. Affected service providers had until 17 February 2024 to comply with its provisions. Very large online platforms and search engines must comply with their obligations four months after being designated as such by the European Commission.

Objectives of the DSA
Ursula von der Leyen proposed a "new Digital Services Act", in her 2019 bid for the European Commission's presidency.

The expressed purpose of the DSA is to update the European Union's legal framework for illegal content on intermediaries, in particular by modernising the e-Commerce Directive adopted in 2000. In doing so, the DSA aims to harmonise the different national laws that have emerged in the European Union to address illegal content. The most prominent of these laws is the German NetzDG, with similar laws in Austria ("Kommunikationsplattformen-Gesetz") and France ("Loi Avia"). With the adoption of the Digital Services Act at the European level, those national laws are superseded and have to be repealed.

In practice, this will mean new legislation regarding illegal content, transparent advertising and disinformation.

New obligations on platform companies
The DSA is meant to "govern the content moderation practices of social media platforms" and address illegal content. It is organised in five chapters, with the most important chapters regulating the liability exemption of intermediaries (Chapter 2), the obligations on intermediaries (Chapter 3), and the cooperation and enforcement framework between the Commission and national authorities (Chapter 4).

The DSA proposal maintains the current rule according to which companies that host others' data become liable when informed that this data is illegal. This so-called "conditional liability exemption" is fundamentally different from the broad immunities given to intermediaries under the equivalent rule ("Section 230 CDA") in the United States.

The DSA applies to intermediary service providers that offer their services to users based in the European Union, irrespective of whether the intermediary service provider is established in the European Union.

In addition to the liability exemptions, the DSA would introduce a wide-ranging set of new obligations on platforms, including some that aim to disclose to regulators how their algorithms work, while other obligations would create transparency on how decisions to remove content are taken and on the way advertisers target users. The European Centre for Algorithmic Transparency was created to aid the enforcement of this.

A 16 November 2021 Internet Policy Review article listed some of the new obligations, including mandatory "notice-and-action" requirements that must, for example, respect fundamental rights; mandatory redress for content removal decisions; and a comprehensive risk management and audit framework.

A December 2020 Time article said that while many of its provisions only apply to platforms which have more than 45 million users in the European Union, the Act could have repercussions beyond Europe. Platforms including Facebook, Twitter, TikTok, and Google's subsidiary YouTube would meet that threshold and be subjected to the new obligations.

Companies that do not comply with the new obligations risk fines of up to 6% of their global annual turnover. In addition, the Commission can impose periodic penalties of up to 5% of average daily worldwide turnover for each day of delay in complying with remedies, interim measures, and commitments. As a last resort, if an infringement persists, causes serious harm to users, and entails criminal offences involving a threat to persons' life or safety, the Commission can request the temporary suspension of the service.

Large online platforms
On 25 April 2023, the European Commission named a first list of 19 online platforms that would be required to comply starting 25 August 2023. They include the following very large online platforms (VLOPs), each with more than 45 million monthly active users in the EU as of 17 February 2023:
 * Alibaba AliExpress
 * Amazon Store
 * Apple App Store
 * Booking.com
 * Facebook
 * Google Play
 * Google Maps
 * Google Shopping
 * Instagram
 * LinkedIn
 * Pinterest
 * PornHub (added 20 December 2023)
 * Shein (added 26 April 2024)
 * Snapchat
 * Stripchat (added 20 December 2023)
 * Temu (added 31 May 2024)
 * TikTok
 * Wikipedia
 * X (formerly Twitter)
 * XNXX (added 10 July 2024)
 * XVideos (added 20 December 2023)
 * YouTube
 * Zalando

Very large online search engines (VLOSEs):
 * Bing
 * Google Search

Amazon and Zalando both initiated proceedings in the General Court challenging their designations, claiming unequal treatment compared with other large retailers and arguing that their core business model is retail, not the distribution of third-party content. Zalando argued that the criteria and methodology lack transparency, for instance in how active users are counted, while Amazon said the VLOP rules are disproportionate for its business model and asked to be exempted from transparency requirements around targeted advertising.

As of December 2023, 13 VLOPs had received a request for information (RFI), the procedure used to verify compliance with the DSA, and one was subject to formal proceedings. Three further platforms, all providing adult content, were added on 20 December 2023.

Legislative history
The Digital Services Act builds in large part on the non-binding Commission Recommendation (EU) 2018/334 of 1 March 2018 when it comes to illegal content on platforms. However, it goes further in addressing topics such as disinformation and other risks, especially on very large online platforms. As part of the preparatory phase, the European Commission launched a public consultation on the package to gather evidence between July and September 2020. An impact assessment with the relevant evidence base was published alongside the proposal on 15 December 2020.

The European Parliament appointed Danish Social Democrat Christel Schaldemose as rapporteur for the Digital Services Act. On 20 January 2022, the Parliament voted to introduce amendments to the DSA for tracking-free advertising and a ban on using a minor's data for targeted ads, as well as a new right for users to seek compensation for damages. In the wake of the Facebook Files revelations and a hearing of Facebook whistleblower Frances Haugen in the European Parliament, the Parliament also strengthened the rules on fighting disinformation and harmful content and introduced tougher auditing requirements.

The Council of the European Union adopted its position on 25 November 2021. The most significant change introduced by the Member States was to entrust the European Commission with the enforcement of the new rules, in the wake of allegations and complaints that the Irish data protection watchdog was not effectively policing the bloc's data protection rules against platform companies.

The Data Governance Act (DGA) was formally approved by the European Parliament on 6 April 2022. This sets up a legal framework for common data spaces in Europe which will increase data sharing in sectors such as finance, health, and the environment.

With Russia using social media platforms to spread misinformation about the 2022 Russian invasion of Ukraine, European policymakers felt a greater sense of urgency to move the legislation forward to ensure that major tech platforms were transparent and properly regulated, according to The Washington Post. On 22 April 2022, the Council of the European Union and the European Parliament reached a deal on the Digital Services Act in Brussels following sixteen hours of negotiations. According to The Washington Post, the agreement reached in Brussels solidified the two-bill plan of the Digital Services Act and the Digital Markets Act, a law regulating competition aimed at preventing larger "gatekeepers" from abusing their power against smaller competitors. The next stage before the bills became law, considered a formality, was approval by the Parliament and by policymakers from the 27 EU member states; the law was published on 19 October 2022.

Influence of the European Court of Human Rights
The DSA was passed alongside the Digital Markets Act and the Democracy Action Plan. The latter is focused on addressing the nuanced legal interpretation of free speech on digital platforms, a fundamental right that has been extensively guided by the European Court of Human Rights (ECtHR) and the European Convention on Human Rights. Accordingly, the Democracy Action Plan, and subsequently the DSA, were strongly influenced by the Delfi AS v. Estonia and Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary ECtHR cases, which outlined a framework for assessing intermediary liability on digital platforms.

In Delfi AS v. Estonia, the ECtHR applied proportionality analysis when considering whether the Estonian courts' decision to hold the online platform Delfi liable for hate speech posted by its users was a proportionate restriction on Delfi's right to freedom of expression. The court found that, given the serious nature of the hate speech, the Estonian courts' actions were justified to protect the rights of others. In other words, the ECtHR upheld the liability of online platforms for hate speech posted by their users, underlining that platforms could be expected to take proactive steps to control content when there is a clear risk of harm from unlawful comments. This case highlighted the responsibilities of platforms to prevent the spread of harmful content.

On the other hand, the MTE and Index.hu v. Hungary case illustrated the nuanced limits of freedom of speech on digital platforms. In its application of proportionality analysis, the ECtHR found that the Hungarian courts had failed to strike a fair balance between protecting reputation and ensuring freedom of expression. The Hungarian courts imposed strict liability on the platforms for user comments that were offensive but did not constitute hate speech, constituting a disproportionate interference in the platforms' right to freedom of expression. The ECtHR ruled that imposing strict liability on platforms for user comments, without consideration of the nature of the comments or the context in which they were made, could infringe on freedom of expression. This judgment emphasized the need for a balance between protecting reputation and upholding free speech on digital platforms.

These decisions by the ECtHR provided critical legal precedents that shaped the EU’s decision-making process on the framework of the DSA. In particular, the DSA drew from the ECtHR's distinction between different types of illegal content, as well as its proportionality analysis in both cases, by incorporating nuanced rules on intermediary liability and ensuring that measures taken by platforms do not unreasonably restrict users' freedom of expression and information.

Reactions
Media reactions to the Digital Services Act have been mixed. In January 2022, the editorial board of The Washington Post stated that the U.S. could learn from these rules, while whistleblower Frances Haugen stated that it could set a "gold standard" of regulation worldwide. Tech journalist Casey Newton has argued that the DSA will shape US tech policy. Mike Masnick of Techdirt praised the DSA for ensuring the right to pay for digital services anonymously, but criticised the act for not including provisions that would have required a court order for the removal of illegal content.

Scholars have begun critically examining the Digital Services Act. Some academics have expressed concerns that the Digital Services Act might be too rigid and prescriptive, excessively focused on individual content decisions or vague risk assessments.

Civil society organisations such as the Electronic Frontier Foundation have called for stronger privacy protections. Human Rights Watch has welcomed the transparency and user remedies but called for an end to abusive surveillance and profiling. Amnesty International has welcomed many aspects of the proposal in terms of fundamental rights balance, but also asked for further restrictions on advertising. Advocacy organisation Avaaz has compared the Digital Services Act to the Paris Agreement for climate change.

Following the 2023 Hamas-led attack on Israel, Thierry Breton wrote public letters to X, Meta Platforms, TikTok, and YouTube on how their platforms complied with the DSA regarding content related to the conflict and upcoming elections. The Atlantic Council's Digital Forensic Research Lab reported that Breton's letters did not follow DSA processes, and digital rights group Access Now criticised Breton's letters for drawing a "false equivalence" between illegal content and disinformation.

Tech companies have repeatedly criticised the heavy burden of the rules and the alleged lack of clarity of the Digital Services Act, and have been accused of lobbying to undermine some of the more far-reaching demands by lawmakers, notably on bans for targeted advertising; leaked plans by Google to lobby against the Digital Services Act prompted a high-profile apology from Sundar Pichai to Breton.

A bipartisan group of US senators has called the DSA and DMA discriminatory, claiming that the legislation would "focus on regulations on a handful of American companies while failing to regulate similar companies based in Europe, China, Russia and elsewhere."

The DSA was mostly welcomed by the European media sector. Due to the influence gatekeepers have in selecting and controlling the visibility of certain journalistic articles over others through their online platforms, the European Federation of Journalists encouraged EU legislators to further increase the transparency of platforms' recommendation systems via the DSA.

Nevertheless, the DSA's later stage inter-institutional negotiations, or Trilogues, have been criticized as lacking transparency and equitable participation. These criticisms mirror past experiences with the drafting of the EU Regulation on Preventing the Dissemination of Terrorist Content Online as well as the General Data Protection Regulation (GDPR).

Swedish MEP Jessica Stegrud argued that the DSA's focus on preventing the spread of disinformation and "harmful content" would undermine freedom of speech.