Not ‘too big to care’ any longer: The Digital Services Act becomes applicable for Very Large Online Platforms and Search Engines
On Friday, August 25th, 2023, the EU’s Digital Services Act (DSA) became applicable to very large online platforms and search engines. The legislation, which will become fully applicable to all intermediary services by February 17th, 2024, aims to provide a safer digital space by establishing obligations for digital service providers operating in the EU regarding transparency, privacy, and illegal and harmful content.
Digital services have become an integral part of our day-to-day life. From communication and shopping to ordering food, seeking information, and indulging in entertainment through movies and music, these services have revolutionized the way we live. Yet, beneath the surface of this digital transformation lies a complex landscape fraught with challenges, ranging from the clandestine exchange of illegal goods and services to the misuse of algorithmic power to propagate misinformation and other harmful content. To navigate the tension between the rise of the digital society and the safeguarding of users’ fundamental rights in the digital age, the European Union adopted the Digital Services Act.
The DSA establishes a comprehensive framework of obligations for intermediary service providers, from those offering network infrastructure, such as internet access providers and cloud services, to online platforms operating within the European Union, regardless of their geographical origin. These obligations are designed to foster the transparency and accountability of digital service providers, who play a pivotal role in facilitating digital interactions and content dissemination. Some of the key obligations envisioned are:
- Establishing robust mechanisms to combat the proliferation of illegal goods, services, or content, for example, by setting up user-driven reporting systems that empower individuals to flag such material or by engaging with organizations with expertise in identifying and addressing harmful content (trusted flaggers).
- Enhancing the traceability of business users by implementing reasonable measures, such as enabling the identification of sellers peddling illegal goods or carrying out random checks to verify whether the products or services offered have been identified as illegal in official databases.
- Empowering users to challenge content moderation decisions made by platforms. This added layer of transparency and accountability ensures that users have recourse when their content is removed or their accounts are restricted.
- Restricting certain types of targeted advertising on online platforms. Specifically, the DSA prohibits targeting ads at children and using sensitive personal data categories, such as ethnicity, political views, or sexual orientation, for ad targeting.
- Increasing transparency. Platforms are required to disclose information about the algorithms they employ for content recommendations, shedding light on the mechanics that shape users’ online experiences.
The DSA stipulates that the scope of each service provider’s obligations will depend on its size, role, and influence in the online ecosystem. Recognising the unique influence exercised by some digital giants and their ‘too big to care’ attitude, in the words of the EU’s Internal Market Commissioner Thierry Breton, the DSA introduces a specialised regulatory regime covering 17 very large online platforms (VLOPs) and 2 very large online search engines (VLOSEs), i.e. those with more than 45 million monthly active users in the EU, as designated by the Commission on April 25th, 2023. These entities are subject to additional, more stringent requirements to prevent the misuse of their systems, such as adopting proactive risk-based measures, having their risk management systems assessed by independent auditors, and granting researchers access to crucial data. Furthermore, the compliance of these very large platforms with the DSA will be supervised directly by the Commission, whereas smaller platforms will be monitored by national Digital Services Coordinators, to be established by Member States by February 2024.
In the weeks leading up to the DSA becoming applicable to the major digital platforms, several of them rolled out new features and measures to comply with the new legislation. Meta has assembled a substantial team to develop solutions for DSA compliance, expanded its ads transparency tools, adjusted the ads experience for teens aged 13-17, and offered users more control over their content viewing experience. It is also providing unprecedented insights into AI content ranking, tools for researchers, and greater transparency in ad targeting. Changes made by TikTok include an option for users to report potentially illegal content, a dedicated team to assess content legality, enhanced transparency in content moderation decisions, and the option for users to turn off personalization of their feeds. In addition, TikTok is safeguarding the privacy of teenagers by making accounts of users under 16 private by default and limiting personalized advertising.
Regarding Twitter, there has been mostly silence. Although the social media platform withdrew from the EU’s voluntary code of practice against disinformation in May, Elon Musk emphasized weeks later that “Twitter will obey the law. If laws are passed, Twitter will obey the law.” So far, the platform has launched a Digital Services Act portal where users in the EU can report illegal content, appeal decisions on illegal content, or obtain information about out-of-court dispute resolution. There is no information about other major changes.