Taking a closer look at the digital services act: content moderation, trader traceability and transparency for advertising and recommender systems
Introduction
The European Union's (EU) Digital Services Act (DSA) entered into force on 16 November 2022. Part of a comprehensive regulatory package for online services, the DSA aims at "setting out harmonised rules for a safe, predictable and trusted online environment" where both innovation and fundamental rights are protected.
The DSA introduces a broad range of requirements for providers of digital "intermediary services" – a definition which captures a wide variety of businesses that store or transmit the content of third parties, such as web hosting services (including cloud providers, online marketplaces, social media networks and other online platforms) as well as online search engines and internet service providers.
In this article we examine key DSA measures relating to:
- notice and action mechanisms for illegal online content
- internal complaint-handling systems in relation to content moderation
- traceability of traders
- transparency of online advertising
- recommender systems.
These measures impose obligations on a large number of providers of hosting services (including online platforms), no matter their size, as long as they offer services to users established or located in the EU. In particular, social media platforms, online platforms that display advertising or recommend content, and those allowing consumers to conclude distance contracts should pay close attention to these new measures. These obligations apply alongside a broader set of requirements within the DSA, which follows a 'graded' structure, applying different requirements according to the nature of the service provided and the size of the service provider. For a general overview of the requirements introduced by the DSA, see our article: The Digital Services Act – what is it and what impact will it have?
Entry into force
Following the approval of both the European Parliament and the Council of the European Union, the DSA was adopted on 4 October 2022 and published in the Official Journal of the EU on 27 October 2022. The DSA entered into force on 16 November 2022, 20 days after its publication in the Official Journal. The majority of the DSA's provisions will apply from 17 February 2024, save for certain provisions addressing providers defined as 'very large online platforms' or 'very large online search engines', which became effective on 16 November 2022. As a result, the providers designated as very large online platforms or very large online search engines by the European Commission (Commission) will be required to comply with the DSA within four months of being so designated. In addition, as of 17 February 2023 and at least once every six months thereafter, all providers, regardless of their size, will be required to publish information on the average monthly active recipients of their respective service(s) in the EU. The same information has to be communicated to the Digital Services Coordinator (i.e., the authority in charge of supervising the application of the DSA and enforcing it in each Member State) and to the Commission, if they so request.
ALL Hosting Service Providers: The Notice and Action Mechanism
Under the DSA, hosting service providers will be required to adopt a clearly identifiable reporting mechanism for the purpose of the "notice and action mechanism". In a nutshell, any individual or entity must be able to indicate to providers of hosting services any content on their platform which they consider to be illegal, in order to protect the rights and legitimate interests of all affected parties (which particularly applies to the fundamental rights guaranteed by the Charter of Fundamental Rights of the European Union ('Charter')). It is worth noting that certain entities with specific expertise (called 'trusted flaggers') have the right to submit prioritised notices if they deem that certain products or services offered are illegal.
The DSA clarifies that a service provider is considered to be on notice of illegal content (and therefore exposed to liability in relation to it) only when the notice fulfils certain requirements (e.g., includes information on why the individual or entity considers the information illegal and the electronic location of the content).
Online Platforms: Establishment of an Internal Complaint-Handling System
A sub-set of hosting service providers which fall into the definition of "online platforms" are required to set up an internal complaint-handling system, enabling their service recipients to object to the online platform provider's content moderation decisions under the DSA. This requirement seeks to help ensure freedom of speech is protected when content is removed or restricted on the ground that the information is illegal or incompatible with the platform's terms and conditions. Service recipients must be able to submit complaints against decisions that restrict the visibility, availability, or accessibility of the information submitted by the recipient, as well as against decisions that limit the provision of the service to the recipient (in whole or in part), that suspend or terminate the recipient's account, or that restrict the ability to monetise the relevant content.
It must be possible for recipients to lodge a complaint for up to six months following the date when the user is informed of the decision. The platform is then required to handle the complaint "in a timely, non-discriminatory, diligent and non-arbitrary manner", which includes requiring the platform to (a) reconsider its decision regarding the provided information and potentially reverse it if there is reasonable ground to consider the content, in fact, legitimate, (b) inform complainants without undue delay of its reasoned decision in respect of the complaint, and (c) ensure that decisions are made under the supervision of appropriately qualified staff, not solely on the basis of automated means. It is, however, unclear whether the platform provider may disable content while an assessment of its alleged illegality (whether internal to the provider or judicial) is still ongoing.
Setting up an internal complaint-handling system as required by the DSA will likely involve a significant administrative effort as well as a noticeable cost burden for service providers. The decision not to impose these obligations on platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC is clear evidence that the legislators foresaw the economic impact of implementing the complaint-handling system.
Besides the internal complaint-handling system, the DSA provides that any recipient of the service may resort to an out-of-court dispute settlement body which has been certified by the Digital Services Coordinator of the relevant Member State. The certification of such bodies is subject to certain conditions, including impartiality, independence (which includes financial independence), expertise, ease of accessibility, and capability of settling disputes in a swift, efficient and cost-effective manner and in at least one of the official languages of the institutions of the Union. The DSA further promotes an easily accessible out-of-court procedure by reducing – to the highest extent possible – the fees borne by the recipient of the service. The settlement is free of charge for the recipient (or charged at a nominal fee). If the dispute settlement body rules in favour of the service recipient, the service provider bears all fees charged by the settlement body and is required to reimburse the recipient for any reasonable expenses it has paid in relation to the dispute settlement. If the dispute is settled in favour of the service provider, the recipient is not required to reimburse any fees or expenses of the service provider, unless it is found that the recipient manifestly acted in bad faith.
Online Platforms Allowing Consumers to Conclude Distance Selling Contracts With Traders: Ensuring Trader Traceability
Providers of online platforms will be involved in ensuring (i) the traceability of those traders allowed to enter into distance agreements with consumers using their platform, and (ii) the lawfulness of the products and/or content offered through the platform. Under the DSA, providers of online platforms must, to the best of their ability, check the reliability and completeness of the information provided by traders (including, inter alia, information on the trader, such as name, address, contact information, the trader's bank account details, a copy of an identification document, information on registration in a public register, and a self-certification issued by the trader committing it to offer only products or services that comply with applicable EU law) and request traders to correct this information in case of doubt regarding its accuracy or any incompleteness. Notably, the platform is not responsible for the accuracy of such information, since that liability lies with the trader. However, if the trader fails to correct or complete the information upon request of the platform, the platform is required to "swiftly" suspend the provision of its service to the trader. Therefore, the service provider may effectively be liable vis-à-vis consumers if it does not timely remove the products offered by a trader which failed to provide complete and sufficient information upon the platform's request.
In addition, providers of online platforms shall (i) adopt an online interface which is designed and organised in a way that enables traders to comply with their obligations regarding the provision of mandatory pre-contractual information under applicable EU law and (ii) react if they become aware of illegal products or content offered on the platform by a trader located in the EU in the six months preceding the moment when the service provider became aware of the illegality. The service provider shall inform the recipient of the acquired product or service in question about the illegality, the responsible trader, and options for seeking redress. If the provider is not in possession of the consumer's contact details, it shall make this information publicly available through its online interface. Interestingly, the DSA ultimately adopted a softer approach regarding the duties of providers when illegal products or services are offered on their platform than that originally foreseen in the first draft.
Several issues remain uncertain, such as the possible liability of platform providers vis-à-vis consumers for traders' information that turns out to be false, or the enforcement of the DSA when sellers are based in third countries outside of the EU – or when the disclosure of required information conflicts with the national legislation of such a third country.
Online Platforms: Transparency of Online Advertising
The DSA addresses online advertising and related issues. Although the European Parliament and the European Data Protection Board considered "a phase-out leading to a prohibition of targeted advertising on the basis of pervasive tracking", the DSA takes a less radical approach. It requires online platforms hosting advertising content to comply with transparency obligations, consistent with the increasing call for transparent use of technology set out, for example, in the draft AI Act and draft Data Act. Notably, many national courts and regulators in the EU have already embraced this quest for transparent technology by forcing digital and gig economy businesses to exploit AI and monetise data in more transparent, unbiased and ethical ways (see, for example, the Deliveroo, Uber, Ola Cabs, Facebook cases in Italy and the Netherlands).
Article 26 of the DSA aligns with the E-Commerce Directive (which required commercial communications, and the entity on whose behalf the communication is made, to be clearly identifiable as such) in focusing on the platform's disclosure duties. Interestingly, the DSA pursues the objective of informing the recipient of the following key elements: (i) the "marketing" nature of the information, clarifying the identity of the person who paid for the advertisement if different from the person on whose behalf the advertisement is made, and (ii) the parameters used to determine the recipients to whom the advertisement is displayed, and how to change those parameters. So, in terms of approach, the EU legislators chose a clear "privacy DNA" for this provision, by requiring online platforms to:
- provide their users with satisfactory information (obviously without prejudice to the requirement for privacy notices under the General Data Protection Regulation (GDPR));
- rely on a legal basis for any personal data processing (in accordance with the GDPR). Interestingly, the express reference to "the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising" in the DSA's recitals suggests that the DSA identifies consent as the only lawful basis on which online targeted advertising may rely, and accordingly calls for the adoption of easily enforceable consent refusal and withdrawal mechanisms. This approach is consistent with the original intention to take a stringent approach to targeted advertising, but seems to take a very narrow view of the GDPR, which does not automatically close the door to profiling based on legitimate interest; and
- prevent profiling (as defined in the GDPR) using special categories of personal data for advertising purposes. This provision would interact with Article 9(2)(a) GDPR, under which consent may not be used to process special categories of data (e.g., data revealing racial or ethnic origin, political opinions, etc.) in cases where "Union … law provide[s] that the prohibition [to process that kind of data] may not be lifted by the data subject".
In practical terms, explaining profiling and tracking by referring to generic parameters and clusters (e.g. 'Male Italian speaker, aged 18-25') will not be sufficient to meet the transparency duties set out in the DSA. Instead, providers of online platforms will be required to publish on their websites suitable terms and conditions – or standalone 'transparency statements' – outlining the logic underlying the processing. Explaining this in simple terms that every web user can understand may be challenging when tracking and processing rely on highly sophisticated processes and techniques, such as machine learning technology, which may rely on counterintuitive or unconventional logic.
Online Platforms: Regulation of Recommender Systems
The DSA regulates the use of recommender systems, i.e. software which uses information to predict a user's preferences. Recommender systems are responsible, for example, for generating 'Recommended for today' or 'Made for you' playlists on music platforms or 'Films you may like' on video streaming platforms.
In a similar vein to the transparency requirements for online advertising, the DSA requires websites recommending content to provide users with readily available and easily retrievable information regarding:
- The parameters used in the recommender systems. The DSA contains a minimum list, including (i) the criteria that are most significant in determining recommendations and (ii) the reasons for the relative importance of those criteria; and
- The options available to the user to modify or influence the main parameters of the recommender system. Additionally, the user must be put in a position to easily select and modify at any time his/her preferred option for each of the recommender systems that determines the relative order of information presented to him/her.
Interestingly, the DSA does not expressly contemplate a mandatory 'disable recommendations' option for platforms to make available to users who do not wish to receive recommendations. The reason may lie in the fact that the EU legislators acknowledge that recommendations are an inherent reason for many platforms' success (the vast majority of the most viewed films and shows on certain platforms are 'recommended'). Processing for recommendation purposes may lawfully rely on the platform's legitimate interest in maximising the user's experience by suggesting content he/she may like. But what if the recommender system embeds bias in its logic and ends up discriminating against users based on flawed criteria? As currently drafted, the DSA does not seem to be concerned with this possibility, thereby leaving the prevention of discriminatory or biased recommender systems to be addressed in other laws.
Practical Considerations
How will the DSA ultimately impact service providers on a practical level?
Compliance with the DSA will likely put a significant financial burden on online platforms and other providers of digital intermediary services. Adapting a platform's technical design and providing adequate resources for maintenance of the notice and action mechanism and for the complaint-handling system will most likely involve substantial costs. The exact costs arising from compliance with the DSA will depend on the size and the reach of the service provider, with the highest cost – including a yearly fee intended to cover the costs involved in monitoring compliance – remaining limited to very large online platforms and very large online search engines.
For many businesses, preparation for the DSA will include, for example:
- assessing the current status of their services against DSA definitions, taking into account guidance issued by authorities to clarify the measures;
- reviewing their tools and interfaces, as well as their terms and conditions and other notices, to determine whether these comply with the DSA obligations; and
- ensuring that they have sufficient financial resources and personnel designated ('appropriately qualified staff') for implementation and maintenance of the relevant measures, including for the information gathering processes necessary for reporting obligations.
Several measures the DSA introduces may already have been implemented by service providers, as they correspond with obligations imposed on them by current national legislation in EU Member States. In this regard, the DSA may actually be considered by some to induce positive change, as it consolidates the currently fragmented regulatory landscape for digital services across EU Member States into a set of uniform rules, which may bring efficiency benefits for companies operating across the EU single market. The practical changes the DSA brings might be less extensive for companies that have already implemented measures to comply with such other laws. However, businesses will nevertheless have to check whether their current processes and controls are sufficient to meet the new requirements imposed on them by the DSA.