The Digital Services Act (DSA) aims to create a safer digital space by implementing a common set of rules relating to intermediary service providers' obligations and accountability.

Obligations placed on intermediary services, hosting platforms, Very Large Online Platforms (VLOPs) and Very Large Search Engines (VLOSEs) will impact brands, their marketing communications, and the content brands place online. This article examines certain aspects of the DSA that will indirectly affect brands: advertising transparency; traceability of traders; algorithm transparency; and content moderation.

Advertising Transparency (Ad Repositories)

The DSA requires VLOPs to compile and publish advertisement repositories that detail certain information about advertisements, advertisers, targeting parameters and customer reach. This applies to all advertisements, not just commercial advertisements. Increased advertising transparency has several potential implications for brands in the context of marketing.

Enhanced transparency will allow brands to strengthen their influencer strategy, enabling them to identify which influencers are prominent in which markets. It will also enable brands to identify industry trends more easily, potentially giving them a competitive edge. Brands will be able to access their competitors' ads and gain insight into their marketing strategies. However, although increased transparency offers access to previously unavailable insights, it also has the potential to weaken competitive advantage. Disclosing information about confidential advertising strategies could weaken the competitive positions of some major brands, leading to a loss of market share. Amazon recently succeeded in obtaining interim measures suspending its obligation to provide an advertisement repository for this very reason. This is part of its wider challenge against its designation as a VLOP (see our previous analysis of this decision here).

Traceability of Traders

Trader traceability under the DSA will affect brand owners operating in online marketplaces. The DSA requires online marketplaces to trace traders who enter contracts with consumers on their platforms. Brands wishing to trade on such platforms must provide the following information:

  • name, address, telephone number, and email address (contact details);
  • identity document or other electronic identification;
  • payment account details;
  • if applicable, extract and number from the commercial register or details from a similar public register; and
  • self-certification committing to comply with EU law.

It is important to note that traders are responsible for the accuracy of their details and information. However, online marketplaces must also verify the accuracy and completeness of this information, and may request supporting documentation from traders to assist with the verification process. If an online marketplace suspects that a trader's information is inaccurate, incomplete, or outdated, it must request the trader to correct it. If the trader fails to do so, the platform is required to temporarily suspend the trader's access to its services until the correct information is provided.

This data, and an overarching "know-your-business-customer" approach, is intended to act as a deterrent against bad actors and will be critical for brand owners who are looking to trace sellers of counterfeit goods. Where an online marketplace becomes aware that illegal products have been sold on its platform, it must inform the customers who purchased those goods about: the illegality; the identity of the trader; and any relevant means of redress. This will likely result in traders relying more heavily on traceability solutions to ensure that illegal goods do not reach European consumers. This enhanced traceability will add to brand owners' arsenal in combatting the sale of counterfeit goods and trade mark infringement.

Algorithm Transparency

The DSA requires all intermediary service providers to provide information in their terms and conditions on the policies, procedures, measures, and tools used for content moderation. This includes information relating to algorithmic decision-making. There is an additional obligation on VLOPs and VLOSEs that use recommender systems: they must provide transparency on the main parameters of the algorithms used on their platforms and justify the use of those parameters. This involves explaining the criteria that affect the information users receive, and the reasons for the use of those criteria. Furthermore, they must provide users with at least one option for each of their recommender systems which is not based on profiling. Amazon recently sought to avoid this opt-out obligation. It applied for interim measures excluding it from complying with the recommender system opt-out obligation, arguing that giving customers the ability to opt out would create a risk of irreversible loss of market share and result in negative effects on shopping experiences. However, Amazon was not successful, and the European General Court dismissed the application.

Algorithm transparency is positive for brand owners, as a greater understanding of how the algorithms work will enable them to forecast and predict how certain content or marketing communications will perform or, for example, how their products could be ranked on an online marketplace. Furthermore, it is anticipated that enhanced transparency will result in platforms taking steps to combat issues of potential bias and discrimination embedded in algorithms. This may result in a greater variety of brand content and marketing reaching users from different demographics.

Content Moderation

Brands now, more than ever, must ensure that their content is appropriate and neither misleading nor illegal. Under the DSA, platforms are required to set up mechanisms for users and brands to report illegal content on their platforms. They are also required to allow users to "flag" content that they think is inappropriate and to cooperate with so-called "trusted flagger" entities to identify and remove illegal content.

As platforms tighten content moderation, brands will have to assume more responsibility for the content they post. A lack of diligence by brands and their partners could result in content being reported and removed, platform services being withdrawn, or accounts being suspended.

Conclusion

The DSA has the potential to significantly impact how digital businesses and brands operate in the digital space. Brand owners should take note of the potential implications and plan accordingly, ensuring that they:

  • have appropriate marketing guidelines which are compliant with the DSA; and
  • use the trader traceability mechanisms as an additional tool for IP enforcement.

This is an evolving space, and we expect to see the DSA become a more prominent tool in the fight against brand manipulation, infringements and counterfeit activities.

Contributed by Sophie Jones

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.