Things To Know About EU’s Digital Content Law

The Digital Services Act, a landmark piece of EU legislation, requires digital companies to take aggressive measures against illicit and problematic content.

The law has already applied since August to "very large" platforms, defined as those with over 45 million monthly active users in the European Union. The world's biggest tech companies risk significant fines if they violate it.

The sweeping law goes into effect on Saturday for all other businesses, though the smallest will have some exemptions.

More actions are anticipated from the European Commission, which has already launched a flurry of investigations to find out what the digital behemoths have done to comply.


These are the main components of the regulation:

1. Guidelines for every platform

All platforms are required, among other things, to take immediate action to either remove or prevent access to any illegal content as soon as they become aware of it.

They must also promptly notify the authorities when they suspect a criminal offense that threatens people's lives or safety.


Companies are required to release an annual report detailing their content moderation efforts and the time it took them to act upon receiving notice of unlawful content. They will also provide an update on the decisions made in user disputes.

The law requires online shopping sites to authenticate users and ban repeat fraudsters, while platforms must suspend users who regularly share illegal content, such as hate speech or fraudulent advertisements.

Furthermore, targeted advertising aimed at children under the age of 17 is banned, and advertising in general is subject to stricter rules.

The EU forbids targeted advertising based on sensitive data, including sexual orientation, religion, or ethnicity. It also wants users to be able to see how their data is used.


Small businesses, which are defined as those with fewer than 50 employees and a yearly revenue of less than 10 million euros, are exempt from the more onerous requirements of the law.

2. Specific guidelines for big platforms

The clothing retailer Zalando, Apple, Amazon, Facebook, Google, Instagram, Microsoft, Snapchat, TikTok, and three significant adult websites are among the 22 “very large” platforms that the EU has identified.

Amazon and Zalando have filed lawsuits contesting their designations, and Meta and TikTok are suing to avoid paying an enforcement fee.

These big platforms must assess the risks posed by the use of their services, such as the spread of illegal content and invasions of privacy.


Additionally, they need to put internal mechanisms in place to reduce these risks, like better content moderation.

The platforms must also grant regulators access to their data so that authorities can verify whether they are following the law.

Vetted researchers will also be given access to this data.

Businesses will pay for an annual independent audit by third-party organizations to verify compliance. They must also appoint an independent internal supervisor to monitor adherence to the rules.

3. Penalties and complaints

The DSA aims to make it easier for users to lodge complaints and have them taken seriously.

Users will be able to file a complaint with the competent authority in their country if they believe a platform is violating the DSA.

Online retailers could be held liable for any harm caused by hazardous or noncompliant goods purchased by users.

Fines for violations could reach six percent of a company's worldwide revenue, and the EU may even bar repeat offenders from operating in Europe.

Sanctions against the "very large" platforms will be imposed directly by the European Commission.

4. National and EU coordination

The law requires each of the EU's 27 member states to designate a competent authority with the power to investigate and sanction infractions committed by smaller businesses.