The “Digital Services Act” (DSA) is already one of the biggest and most ambitious legislative projects of this mandate: the EU wants to set new rules for how internet services and platforms deal with illegal content, strengthen the rights of individual users, and impose stricter rules on market-dominant tech companies such as Google and Facebook. The goal of the effort is an update of the 20-year-old e-Commerce Directive (ECD).
The planned reform will set new rules for digital services active in the EU. The process is likely to be as long and complex as the work on the General Data Protection Regulation during the past mandate and may well take several years. Europe faces the next major confrontation between many conflicting interests: authors, media groups, online platforms, anti-discrimination and civil rights organisations, NGOs, consumer protection groups and many more.
Even though the legislative proposal for the Digital Services Act will only see the light of day towards the end of February 2021, work has already begun in the EU Parliament: with three (legally non-binding) own-initiative reports, MEPs are telling the Commission what they wish to see included in the proposed text next year.
The main report is being prepared by the Internal Market and Consumer Protection Committee (IMCO), for which I am responsible as shadow rapporteur for the Greens/EFA group. This means that I am negotiating the content of the report that my Social Democrat colleague Alex Agius Saliba has drafted (see his draft, pdf).
The draft report is very balanced and solid. It is good to see that important points that we shadow rapporteurs raised with the rapporteur during a first meeting have been taken into account. I especially welcome these proposals:
However, an own-initiative report must also be seen as an opportunity to submit innovative proposals to the EU Commission, proposals the Commission will not be able to ignore next year when it legislates. The draft misses this opportunity: it is not ambitious enough on several important points, especially since platform regulation will be a huge project and a long-awaited opportunity to fix the internet at crucial points.
In particular, I find these three central points missing in the draft:
In Germany, the requirements of the NetzDG law were a good start, but experience has shown that there is considerable room for improvement (see for instance these two analyses here and here). Firstly, the introduction of standardised reporting formats would be incredibly helpful, enabling researchers to analyse and compare companies’ practices. Secondly, transparency reports need to provide more detailed data: while it is important to get the overall compliance rate and take-downs by type of content, it is also interesting to know the compliance rate for each type of content. For example: is the deletion of hate speech faster than the handling of defamation cases on one social network or another?
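To make the idea concrete: a standardised, machine-readable report format would let anyone compute and compare per-category compliance rates across platforms. The sketch below is purely illustrative, the field names and figures are invented, not taken from the NetzDG or any draft text:

```python
# Hypothetical standardised transparency report: complaints and removals
# broken down by content category. All names and numbers are invented
# for illustration; no real platform data is used.
report = {
    "platform": "ExampleNet",
    "period": "2020-H1",
    "categories": {
        "hate_speech": {"reported": 12000, "removed": 9000, "removed_within_24h": 7500},
        "defamation":  {"reported": 4000,  "removed": 1800, "removed_within_24h": 900},
    },
}

def compliance_rates(report):
    """Compute, per content category, the share of reported items removed
    and the share of removals that happened within 24 hours."""
    rates = {}
    for category, c in report["categories"].items():
        rates[category] = {
            "removal_rate": c["removed"] / c["reported"],
            "within_24h_rate": c["removed_within_24h"] / c["removed"],
        }
    return rates

for category, r in compliance_rates(report).items():
    print(f"{category}: {r['removal_rate']:.0%} removed, "
          f"{r['within_24h_rate']:.0%} of removals within 24h")
```

With a common format like this, the question above (is hate speech handled faster than defamation?) becomes a one-line comparison instead of a manual reading of incompatible PDF reports.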
We also need transparency rules for the recommendation systems of online platforms, such as YouTube’s “Up next” or the pages suggested by Facebook. Studies have shown that for common search terms on YouTube, people are increasingly recommended extreme content, because attention on social networks is lucrative. For example: if you search for “vaccine facts”, it takes only a few steps to get to anti-vaccination conspiracy theories; if you look for “global warming”, you get climate change denial, and so on.
According to which rules and criteria do the algorithms and humans employed by the tech companies decide which content we get to see and which content they make disappear?
We urgently need three things: firstly, annual transparency reporting requirements for systemic platforms, disclosing content-specific ranking and recommender decisions; secondly, binding rules for public interfaces (APIs); and thirdly, the power for authorities to conduct independent audits of the algorithms.
The Digital Services Act could establish the principle of “freedom of expression by design”, just as the GDPR introduced “privacy by design”. This means, among other things, that the general rules of tech companies, such as terms and conditions or community standards, should meet certain minimum standards in Europe: they should be transparent, fair, easy to understand, non-discriminatory and predictable.
Online services should therefore be designed from the outset in such a way that they do not drive people affected by hate speech away from their networks and do not erroneously delete legal content, thus offering a safe space for everyone. The dynamics on digital services must not encourage hate speech against women, LGBTQI people, people with a migration background and others.
Moreover, dissuasive penalties and conditions should be imposed on internet services that breach these rules. The fines applied in data protection law to date have not proved effective in view of the enormous financial strength of some corporations.
The next steps are planned as follows: