The Digital Services Act[1] (DSA) is an EU regulation adopted in 2022 that addresses illegal content, advertising transparency, and disinformation.[4]
Key requirements for platforms include disclosing to regulators how their algorithms work, providing users with explanations for content moderation decisions, and implementing stricter controls on targeted advertising.[6]
The stated purpose of the DSA is to update the European Union's legal framework governing illegal content on intermediaries, in particular by modernising the e-Commerce Directive adopted in 2000.[8]
The "conditional liability exemption" carried over from that directive is fundamentally different[9][10] from the broad immunities given to intermediaries under the equivalent rule, Section 230 of the Communications Decency Act, in the United States.[12]
A December 2020 Time article said that while many of the DSA's provisions apply only to platforms with more than 45 million users in the European Union, the Act could have repercussions beyond Europe.
In addition, the Commission can apply periodic penalties of up to 5% of a provider's average daily worldwide turnover for each day of delay in complying with remedies, interim measures, or commitments.
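To make the scale of that cap concrete, the sketch below works through the arithmetic under stated assumptions: the 5% rate comes from the text above, while the turnover figure, delay length, and helper function are hypothetical illustrations, not drawn from any actual DSA proceeding.

```python
# Back-of-the-envelope bound on cumulative periodic penalties under the DSA:
# up to 5% of average daily worldwide turnover per day of delay.

PENALTY_RATE = 0.05  # the 5% daily cap described above

def max_periodic_penalty(annual_turnover_eur: float, days_of_delay: int) -> float:
    """Upper bound on total periodic penalties for a given delay (hypothetical helper)."""
    average_daily_turnover = annual_turnover_eur / 365  # simple 365-day average
    return PENALTY_RATE * average_daily_turnover * days_of_delay

# Hypothetical provider with EUR 80 billion annual worldwide turnover,
# 30 days late in complying with an ordered remedy:
print(f"EUR {max_periodic_penalty(80e9, 30):,.0f}")  # EUR 328,767,123
```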
As a last-resort measure, if the infringement persists, causes serious harm to users, and entails criminal offences involving a threat to persons' life or safety, the Commission can request the temporary suspension of the service.[18][19]
As of December 2023, 13 very large online platforms (VLOPs) had received a request for information (RFI),[15] the procedure used to verify compliance with the DSA, and one was subject to formal proceedings.[23]
With respect to illegal content on platforms, the Digital Services Act builds in large part on the non-binding Commission Recommendation 2018/334 of 1 March 2018.[24]
On 20 January 2022, the European Parliament voted to introduce amendments to the DSA providing for tracking-free advertising and a ban on using minors' data for targeted ads, as well as a new right for users to seek compensation for damages.[44]
Accordingly, the Democracy Action Plan, and subsequently the DSA, were strongly influenced by the European Court of Human Rights (ECtHR) cases Delfi AS v. Estonia and Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, which outlined a framework for assessing intermediary liability on digital platforms.[48]
In the latter case, applying proportionality analysis, the ECtHR found that the Hungarian courts had failed to strike a fair balance between protecting reputation and ensuring freedom of expression.
In particular, the DSA drew on the ECtHR's distinction between different types of illegal content, as well as its proportionality analysis in both cases, by incorporating nuanced rules on intermediary liability and ensuring that measures taken by platforms do not unreasonably restrict users' freedom of expression and information.[53]
Mike Masnick of Techdirt praised the DSA for ensuring the right to pay for digital services anonymously, but criticised the Act for not including provisions that would have required a court order for the removal of illegal content.[55][56]
Some academics have expressed concerns that the Digital Services Act might be too rigid and prescriptive,[57] and excessively focused on individual content decisions or vague risk assessments.[63]
Tech companies have repeatedly criticised the heavy burden of the DSA's rules and their alleged lack of clarity,[64] and have been accused of lobbying to undermine some of law-makers' more far-reaching demands, notably the proposed bans on targeted advertising;[65] leaked plans by Google to lobby against the Act prompted a high-profile apology from Sundar Pichai to European Commissioner Thierry Breton.[72]
Swedish member of the European Parliament Jessica Stegrud argued that the DSA's focus on preventing the spread of disinformation and "harmful content" would undermine freedom of speech.