The EU has decided on another ambitious piece of legislation to police the online world.
After hours of negotiations, the bloc agreed early Saturday morning on the broad terms of the DSA, or Digital Services Act, which will force tech companies to take greater responsibility for the content that appears on their platforms.
New obligations include:
- Removing illegal content and goods more quickly.
- Explaining to users and regulators how their algorithms work.
- Taking stricter action against the spread of misinformation.
Companies face penalties of up to 6% of their annual turnover for non-compliance.
“The DSA will upgrade the ground rules for all online services in the EU,” said European Commission President Ursula von der Leyen. “It gives practical effect to the principle that what is illegal offline should be illegal online. The greater the size, the greater the responsibilities of online platforms.”
Margrethe Vestager, the EU Commissioner for Competition who has spearheaded much of the bloc’s tech regulation, said the act would “ensure that platforms are held accountable for the risks their services can pose to society and citizens.”
The DSA shouldn’t be confused with the DMA, or Digital Markets Act, which was agreed in March. Both acts concern the tech world, but the DMA focuses on creating a level playing field between businesses, while the DSA deals with how companies police content on their platforms. The DSA is therefore likely to have a more immediate effect on internet users.
Although the legislation only applies to EU citizens, its effects will be felt in other parts of the world, too. Global tech companies may decide it is more cost-effective to implement a single content-moderation policy everywhere and adopt the EU’s comparatively stringent regulations as their benchmark. Lawmakers in the US who are keen to rein in Big Tech with rules of their own have already begun looking to the EU’s legislation for inspiration.
The final text of the DSA has yet to be released, but the European Parliament and European Commission have detailed several obligations it will contain:
- Targeted advertising based on an individual’s religion, sexual orientation, or ethnicity will be banned.
- “Dark patterns” (confusing or deceptive user interfaces designed to nudge users into making certain choices) will be prohibited. The EU says that, as a rule, canceling a subscription should be as easy as signing up for one.
- Large online platforms like Facebook will have to make the workings of their recommender algorithms transparent to users. Users should also be offered a recommender system “not based on profiling.” In the case of Instagram, for example, this would mean a chronological feed.
- Hosting services and online platforms will have to explain clearly why they have removed illegal content and give users the ability to appeal such takedowns. The DSA itself does not define what content is illegal, though, leaving that to individual countries.
- The biggest online platforms will have to share key data with researchers to “provide more insight into how online risks evolve.”
- Online marketplaces must keep basic information about traders on their platforms to help track down individuals selling illegal goods or services.
- Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).
The DSA will, like the DMA, distinguish between tech companies of different sizes, placing greater obligations on larger firms. The largest firms, those with at least 45 million users in the EU, like Meta and Google, will face the most scrutiny. These tech companies have lobbied hard to water down the requirements of the DSA, particularly those concerning targeted advertising and handing over data to outside researchers.
Although EU member states have now agreed on the broad terms of the DSA, the legal language still needs to be finalized and the act officially voted into law. That last step is seen as a formality at this point, though. The rules will apply to companies 15 months after the act is voted into law, or from 1 January 2024, whichever is later.