The new EU digital rulebook sets out unprecedented standards on the accountability of online companies, within an open and competitive digital market.
On Tuesday, Parliament held the final vote on the new Digital Services Act (DSA) and Digital Markets Act (DMA), following deals reached between Parliament and Council on 23 April and 24 March respectively. The two bills aim to address the societal and economic effects of the tech industry by setting clear standards for how tech companies operate and provide services in the EU, in line with the EU’s fundamental rights and values.
The Digital Services Act was adopted with 539 votes in favour, 54 votes against and 30 abstentions. The Digital Markets Act was adopted with 588 votes in favour, 11 against and 31 abstentions.
What is illegal offline, should be illegal online
The Digital Services Act (DSA) sets clear obligations for digital service providers, such as social media or marketplaces, to tackle the spread of illegal content, online disinformation and other societal risks. These requirements are proportionate to the size and risks platforms pose to society.
The new obligations include:
- New measures to counter illegal content online and obligations for platforms to react quickly, while respecting fundamental rights, including the freedom of expression and data protection;
- Strengthened traceability and checks on traders in online marketplaces to ensure products and services are safe, including efforts to perform random checks on whether illegal content resurfaces;
- Increased transparency and accountability of platforms, for example by providing clear information on content moderation or the use of algorithms for recommending content (so-called recommender systems); users will be able to challenge content moderation decisions;
- Bans on misleading practices and certain types of targeted advertising, such as those targeting children and ads based on sensitive data. The so-called “dark patterns” and misleading practices aimed at manipulating users’ choices will also be prohibited.
Very large online platforms and search engines (with 45 million or more monthly users), which present the highest risk, will have to comply with stricter obligations, enforced by the Commission. These include preventing systemic risks (such as the dissemination of illegal content and adverse effects on fundamental rights, on electoral processes, and on gender-based violence or mental health) and being subject to independent audits. These platforms will also have to give users the choice not to receive recommendations based on profiling, and to facilitate access to their data and algorithms for authorities and vetted researchers.
A list of “do’s” and “don’ts” for gatekeepers
The Digital Markets Act (DMA) sets obligations for large online platforms acting as “gatekeepers” (platforms whose dominant online position makes them hard for consumers to avoid) on the digital market, to ensure a fairer business environment and more services for consumers.
To prevent unfair business practices, those designated as gatekeepers will have to:
- allow third parties to inter-operate with their own services, meaning that smaller platforms will be able to request that dominant messaging platforms enable their users to exchange messages, send voice messages or files across messaging apps. This will give users greater choice and avoid the so-called “lock-in” effect where they are restricted to one app or platform;
- allow business users to access the data they generate on the gatekeeper’s platform, to promote their own offers and conclude contracts with their customers outside the gatekeeper’s platform.
Gatekeepers can no longer:
- Rank their own services or products more favourably (self-preferencing) than other third parties on their platforms;
- Prevent users from easily un-installing any pre-loaded software or apps, or using third-party applications and app stores;
- Process users’ personal data for targeted advertising, unless consent is explicitly granted.