
The Digital Services Act: No Longer The Wild West

Posted: 8th March 2022 by Catherine McGregor

The world in 2022 looks very different from how it did even five years ago. Shifts in economic, consumer, and working practices were already underway; Covid-19 has accelerated these changes at a pace once unimaginable. Central to many of these developments in how we work, live and shop is digital innovation, which brings with it new communication and transactional opportunities for platforms, brands, and consumers.

However, regulation of this digital frontier has lagged behind the pace of innovation. The convenience enjoyed by online consumers must be balanced against the dangers that have accompanied rapid growth: fraudulent activity, counterfeits, scams, hate speech, fake reviews and unauthorised goods are now common in the digital marketplace. According to research by consumer rights group Which?, cases of fraud reported to the UK police unit Action Fraud rose by one third during 2020, driven by the boom in online shopping caused by Covid-19.

Until now, online platforms and marketplaces have not been liable for user-generated content on their sites, and the legislation governing platforms has failed to evolve as the digital world has developed and expanded. But change is afoot.

This cannot come too soon, says one expert in the field. Chris Downie is the co-founder of Pasabi, a technology company that helps organisations weed out fake goods and content online:     

“To date, online platforms and marketplaces have enjoyed considerable success without legal responsibility or ramifications for their content. They have been focused on growth, which has been accelerated by the shift to digital during the pandemic. Managing risk has not been their top priority, resulting in consumers being exposed to counterfeit goods, fake reviews and online fraud.”

The DSA Is On Its Way

The EU has responded to the need for digital regulation of online platforms with the creation of the Digital Services Act (DSA). Recently opened to negotiation by the EU member states, the DSA aims to define how digital services should operate and moderate content in the EU. It focuses on creating a safer digital space for online consumers and companies through safe, trusted and transparent platforms. Among the core concerns addressed by the legislation are the trade and exchange of illegal goods, services and content online, and algorithms that help spread disinformation. Users will have more control over what they see, the choice of whether to allow targeted advertising, and clear information about why specific content is aimed at them.

One significant element of the DSA is how platforms must respond to illegal content. Four categories of content are illegal across all 27 EU member states: child sexual abuse material, racism and xenophobia, terrorism, and intellectual property rights infringements. Online platforms will be liable for users’ unlawful behaviour if they are aware of illegal content and fail to remove it.

The new legislation, expected to take effect from June 2022, applies to UK businesses offering products and services to businesses or individuals in the EU. Trading in the EU means complying with the DSA, regardless of your physical location. Platforms without a physical EU presence must designate an EU-based legal representative responsible for cooperating with supervisory authorities, the European Commission and the European Board for Digital Services. Designated legal representatives can be held liable for non-compliance.

Safer Marketplaces

The DSA will require platforms to take counterfeiting seriously. Online marketplaces will be required to ‘trace their traders’ – to know their business customers and discourage traders from selling unsafe or counterfeit goods. Platforms will be asked to create a system of ‘trusted flaggers’ to help ensure easier and faster identification and removal of fake goods, and public authorities will be given new tools to order the removal of unsafe products.

For larger platforms – those reaching more than 10% of Europe’s 450 million users, i.e. roughly 45 million people – the DSA sets out specific additional rules (a rough illustration of the threshold follows the list below). These include:

  • Access to key data. Larger platforms must provide researchers with access to key data to understand how online risks evolve over time. 
  • Independent audits. Independent auditors experienced in risk management and algorithms will be introduced to prevent the misuse of larger platforms’ systems and to check for compliance with the DSA.  
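As a back-of-the-envelope illustration of where that threshold sits, consider the following short Python sketch. The 60-million figure is a hypothetical example, not any real platform’s numbers:

# Illustrative only: the DSA's additional rules apply to platforms
# reaching more than 10% of the EU's roughly 450 million users.
EU_USERS = 450_000_000
LARGE_PLATFORM_THRESHOLD = EU_USERS * 0.10  # 45 million users

def faces_additional_dsa_rules(monthly_eu_users: int) -> bool:
    """Return True if a platform crosses the 'larger platform' threshold."""
    return monthly_eu_users > LARGE_PLATFORM_THRESHOLD

# Hypothetical platform with 60 million EU users
print(faces_additional_dsa_rules(60_000_000))  # True -> extra obligations apply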

A Focus for All

It is essential that all platforms and digital service providers, regardless of size, act proactively in advance of the DSA. Monitoring will require an active approach, specifically to content moderation, product listings and the way interactions take place on platforms.

  • Larger platforms (e.g. Facebook and YouTube) will need to be more transparent about content recommendations made using algorithms and AI tools, and about their advertising practices. They must keep records of these practices as evidence.
  • Marketplaces (e.g. Amazon and Wish) will be required to remove illegal or fraudulent content. They will need to create reporting channels for users to identify unlawful content and outline a process for contesting removal decisions. 
  • Community-based platforms (e.g. TripAdvisor and Mumsnet) must become proactive in detecting and removing hate speech.
  • Gig-economy platforms (e.g. Uber and Deliveroo) will face greater responsibility for criminal offences occurring via the platform. 

Non-compliance with the requirements of the DSA can mean fines of up to 6% of annual income or turnover, plus periodic penalty payments of up to 5% of average daily turnover in the previous financial year for continuing infringements. Platforms that proactively detect illegal content themselves, however, are not liable for it.
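To put those caps in perspective, here is a minimal Python sketch of the maximum exposure they imply. The turnover figure is hypothetical, and actual penalties would be set by regulators case by case:

# Illustrative only: headline penalty caps under the DSA.
def max_fine(annual_turnover: float) -> float:
    """Fines of up to 6% of annual income or turnover."""
    return 0.06 * annual_turnover

def max_periodic_penalties(prev_year_turnover: float, days: int) -> float:
    """Periodic penalty payments of up to 5% of average daily turnover,
    accruing for each day a continuing infringement persists."""
    average_daily_turnover = prev_year_turnover / 365
    return 0.05 * average_daily_turnover * days

# Hypothetical platform with EUR 2 billion annual turnover
turnover = 2_000_000_000
print(f"Maximum one-off fine: EUR {max_fine(turnover):,.0f}")  # EUR 120,000,000
print(f"30 days of penalties: EUR {max_periodic_penalties(turnover, 30):,.0f}")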

Chris Downie at Pasabi welcomes the change:

“Innovation in technology is happening so quickly that regulators thus far haven’t been able to keep up with it. The tide is now turning, as greater efforts are being made to safeguard users online. Consumers are looking for change and regulators are finally starting to take proper notice and action.”

What Can Platforms Do To Prepare?     

  • Undertake a full risk assessment establishing potential threats to your platform from illegal or fraudulent activity.
  • Review the complaints-handling process, defining clear channels for users to report unlawful content and planning how to act on those reports (a minimal sketch of such a reporting workflow follows this list).
  • Define ‘transparency’ for users by outlining the rationale for content removal and being open about algorithmic content recommendations.
  • Introduce transparency reports and consider how these will be incorporated into workflows.
  • Create a single point of electronic contact for interactions with the authorities overseeing the legislation.
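To make the reporting-channel point concrete, below is a minimal Python sketch of a ‘notice and action’ workflow of the kind the DSA envisages: a user reports content, a moderator decides, and the decision can be contested. All of the names, statuses and fields here are assumptions for illustration; the DSA does not prescribe any particular implementation.

# Illustrative sketch of a DSA-style "notice and action" workflow.
# Names, statuses and fields are assumptions, not prescribed by the DSA.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Status(Enum):
    RECEIVED = "received"
    REMOVED = "removed"
    REJECTED = "rejected"
    CONTESTED = "contested"

@dataclass
class Notice:
    content_id: str
    reporter: str
    reason: str              # e.g. "counterfeit goods", "hate speech"
    status: Status = Status.RECEIVED
    history: list = field(default_factory=list)

    def log(self, event: str) -> None:
        # Keep an auditable trail, in line with the DSA's transparency aims.
        self.history.append((datetime.now(timezone.utc).isoformat(), event))

def decide(notice: Notice, unlawful: bool) -> None:
    """Moderator decision: remove the content or reject the notice."""
    notice.status = Status.REMOVED if unlawful else Status.REJECTED
    notice.log(f"decision: {notice.status.value}")

def contest(notice: Notice) -> None:
    """Users can challenge a removal decision, as the DSA requires."""
    notice.status = Status.CONTESTED
    notice.log("user contested the decision")

# Hypothetical usage
n = Notice(content_id="listing-123", reporter="user-42", reason="counterfeit goods")
n.log("notice received")
decide(n, unlawful=True)
contest(n)
print(n.status, n.history)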

About the author: Catherine McGregor, author of Business Thinking in Practice for In-House Counsel: Taking Your Seat at the Table (Globe Law & Business, 2020), provides thought leadership consultancy and workshops on a range of subjects relating to human-centred skills in business and Diversity, Equity and Inclusion. She is currently the editor of Modern Lawyer, which focuses on ideas, opinions, learning and creativity for legal leaders.

About Lawyer Monthly

Lawyer Monthly is a news website and monthly legal publication with content that is entirely defined by the significant legal news from around the world.