Internet giants must implement measures against illegal content and disinformation in the EU

The Digital Services Act (DSA) became legally applicable this weekend for the 19 platforms and search engines with more than 45 million users in the European Union, such as Google, Meta (the parent company of Facebook and Instagram), X (formerly Twitter), and TikTok, among others.

This means they must comply with the regulation's strictest provisions, which oblige them to take measures against illegal content disseminated on the Internet and to curb online disinformation.

The list of online ‘giants’ also includes platforms such as Amazon and Zalando, which have appealed the Commission’s decision before the EU courts, arguing that they do not fit this definition and seeking to avoid having the rules applied to them.

The new law’s battery of measures obliges the large platforms and search engines operating in the EU to establish terms and conditions, and an appeal mechanism for content moderation decisions, “that even a child can understand”, EU sources explained.

The deadline also expired this weekend for the platforms to present their first annual risk-assessment exercise, “aimed at reducing the risks associated with the dissemination of illegal content or the manipulation of services with an impact on democratic processes and public safety.” In addition, they must present their transparency reports within two months.

Likewise, the law introduces the concept of “algorithmic accountability”, under which “the European Commission, as well as the member states, will have access to the algorithms of the large online platforms”, which are now required to remove “illegal products, services or content quickly after they have been reported”.

The DSA also introduces new transparency requirements for the parameters of the recommendation systems that digital platforms use to present users with the content they consider relevant to their interests. For this reason, large content platforms, as well as Google and other prominent search engines, “will have to offer users a recommendation system that is not based on their profile.”

In this regard, some “important changes” have already been observed, in particular in the updated terms and conditions of use of some of these ‘giants’, and EU sources have pointed out that TikTok has presented a new algorithm that is not based on user profiles, as well as a “brand new” ad repository.

In the case of other platforms such as X (formerly Twitter), which are undergoing a major transformation, the same sources said that Elon Musk himself “read the DSA and liked it”, although, when a delegation from Brussels visited the platform’s headquarters in June, “they were not even remotely prepared to comply with the regulations”. They have, however, been “receptive”.

For their part, the Member States will have to appoint digital services coordinators before February 17, 2024, the date on which platforms with fewer than 45 million active users — which fall within the scope of national legislation — will also have to comply with all DSA rules.

In the event of a breach, platforms and search engines could face fines of up to 6% of their worldwide turnover, while in the case of the ‘giants’, the European Commission holds exclusive power to enforce compliance.

Although the legislation will not have effects “immediately visible to users”, as Brussels acknowledges, a “long-term impact” is expected, generating “substantial” changes that help eliminate illegal content online.

The law also places limits on so-called “dark patterns”, interface designs used by some companies on the Internet to exploit their users economically through misleading questions, the absence of price comparisons, or the introduction of artificial obstacles to cancelling certain services.

This new legislation also requires any content platform accessible to minors “to establish special protection measures to guarantee their safety, particularly when they are aware that a user is a minor”. Thus, “platforms will be prohibited from presenting targeted advertising based on the use of personal data of minors as defined in EU law”.