
Digital Services Act: Proposals for the future of platform governance

The “Digital Services Act” (DSA) is already one of the biggest and most ambitious legislative projects of the mandate: the EU wants to set new rules for internet services and platforms covering how they deal with illegal content and the rights of individual users, and it aims to set stricter rules for market-dominant tech companies such as Google and Facebook. The goal of the effort is an update of the 20-year-old e-Commerce Directive (ECD).

The planned reform is to set new rules for digital services that are active in the EU. The process is likely to be as long and complex as the work on the General Data Protection Regulation during the past mandate and will probably take several years. Europe faces the next major confrontation between many conflicting interests: authors, media groups, online platforms, anti-discrimination and civil rights organisations, NGOs, consumer protection groups and many more.

Even though the legislative proposal for the Digital Services Act will only see the light of day towards the end of February 2021, work has already begun in the EU Parliament: with three own-initiative reports (legally non-binding), MEPs are telling the Commission what they wish to see included in the proposed text next year.

The main report is being prepared by the Internal Market and Consumer Protection Committee (IMCO), for which I am responsible as shadow rapporteur for the Greens/EFA group. This means that I am negotiating the content of the report that my Social Democrat colleague Alex Agius Saliba has drafted (see his draft, pdf).

We discussed it for the first time on 20 May in the IMCO Committee (see recording of the session, minute 18:06). You can find all amendments that I have tabled here (pdf).

The draft report is very balanced and solid. It is good to see that important points that we shadow rapporteurs raised with the rapporteur during a first meeting have been taken into account. I especially welcome the proposals:

  • to maintain the basic safe harbour principles of the ECD;
  • to maintain the prohibition of a general monitoring obligation to encourage the creation of innovative services and to protect the fundamental rights of users, in particular freedom of expression and information;
  • to extend the scope to third countries when services are targeted at EU consumers;
  • to substantially strengthen transparency requirements;
  • to adopt additional ex-ante rules for platforms with significant market power.

However, an own-initiative report must also be seen as an opportunity to submit innovative proposals to the EU Commission, proposals the Commission will not be able to ignore next year when it legislates. The draft misses this opportunity: it is not ambitious enough on several important points, even though platform regulation will be a huge project and a long-awaited opportunity to fix the internet at crucial points.

In particular, three central points are missing from the draft:

1. Meaningful transparency

In Germany, the requirements of the NetzDG law were a good start, but experience has shown that there is plenty of room for improvement (see for instance these two analyses here and here). Firstly, the introduction of standardised reporting formats would be incredibly helpful, enabling researchers to analyse and compare companies’ practices. Secondly, transparency reports need to provide more detailed data: aggregate compliance rates and take-downs by type of content are important, but the compliance rate for each individual type of content is just as interesting. For example: is hate speech deleted faster than defamation is dealt with on one social network or another?
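To illustrate what a standardised, machine-readable reporting format could enable, here is a minimal sketch; all field names and figures are purely illustrative assumptions, not anything contained in the draft report or the NetzDG:

```python
# Hypothetical example only: a standardised, machine-readable transparency
# report entry. All field names and numbers are illustrative assumptions,
# not part of any legislative text.
from dataclasses import dataclass

@dataclass
class TransparencyEntry:
    platform: str                  # reporting service
    period: str                    # reporting period, e.g. "2020-H1"
    content_type: str              # e.g. "hate_speech", "defamation"
    notices_received: int          # notices from users or trusted flaggers
    removals: int                  # items actually taken down
    median_hours_to_action: float  # speed of response

    @property
    def compliance_rate(self) -> float:
        """Share of notices that led to a removal, per content type."""
        return self.removals / self.notices_received if self.notices_received else 0.0

# With one common format, researchers could compare categories and platforms
# directly, e.g. whether hate speech is handled faster than defamation:
entries = [
    TransparencyEntry("PlatformA", "2020-H1", "hate_speech", 12000, 9600, 22.5),
    TransparencyEntry("PlatformA", "2020-H1", "defamation", 3000, 1200, 96.0),
]
for e in entries:
    print(f"{e.platform} {e.content_type}: "
          f"{e.compliance_rate:.0%} removed, {e.median_hours_to_action}h median")
```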

We also need transparency rules for the recommendation systems of online platforms, such as YouTube’s “Up next” or the pages Facebook suggests. Studies have shown that for common search terms on YouTube, people are increasingly recommended extreme content, because attention on social networks is lucrative. For example: if you search for “vaccine facts”, it takes only a few steps to reach anti-vaccination conspiracy theories; if you look for “global warming”, you get climate denial content, and so on.

According to which rules and criteria do the algorithms and humans employed by the tech companies decide which content we get to see and which content they make disappear?

We urgently need three things: annual transparency reporting requirements obliging systemic platforms to disclose content-specific ranking and recommender decisions; binding rules for public interfaces (APIs); and the power for authorities to conduct independent audits of the algorithms.
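As a thought experiment, a public transparency API could look something like the following sketch; the host, endpoint, parameters and response fields are entirely hypothetical assumptions, since no such interface exists yet:

```python
# Hypothetical sketch of a public transparency API for platform recommender
# systems. The host, endpoint, parameters and response fields are assumptions
# made for illustration; no such API exists yet.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "https://transparency.example-platform.eu/v1"  # hypothetical host

def recommendation_stats(topic: str, country: str) -> dict:
    """Fetch aggregate data on what the recommender suggests for a topic,
    so that researchers and regulators can audit it independently."""
    query = urlencode({"topic": topic, "country": country})
    # A real API would also need authentication, rate limits and privacy
    # safeguards; they are omitted in this sketch.
    with urlopen(f"{BASE}/recommendations?{query}") as resp:
        return json.load(resp)

# For example, recommendation_stats("vaccine facts", "DE") might return the
# most frequently recommended follow-up videos and their share of impressions,
# making chains from "vaccine facts" to conspiracy content measurable.
```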

2. Freedom of expression by design

The Digital Services Act could establish the principle of “freedom of expression by design”, just as the GDPR introduced “privacy by design”. This means, among other things, that the general rules of tech companies, such as their terms and conditions or community standards, should meet certain minimum standards in Europe: they should be transparent, fair, easy to understand, non-discriminatory and predictable.

Online services should therefore be designed from the outset in such a way that people affected by hate speech are not driven off their networks and that legal content is not erroneously deleted, thus offering a safe space for everyone. The dynamics on digital services must not encourage hate speech against women, LGBTQI groups, people with a migration background and others.

3. Sanctions

Moreover, dissuasive penalties and conditions should be imposed on internet services that breach these rules. The fines applied under data protection law to date have not proved effective, given the enormous financial strength of some corporations.

To be improved…

  • Fundamental rights: this part of the draft report is a bit weak; Europe must show much more teeth. Otherwise, the practices of large digital companies will continue to contribute to silencing vulnerable groups and minorities, as the companies often profit financially from driving more attention towards offending content. We cannot leave this to the failed “self-regulation” efforts of companies, or beg them on bended knee to set standards for what they believe constitutes “responsible” behaviour in Europe.
  • In addition to the proposed dispute resolution mechanism, which I very much agree with in this form, there should also be Social Media Councils: bodies that develop ethical guidelines and provide an open, transparent, accountable and participatory forum to address general issues of content governance on online platforms. These councils should be composed of people from different backgrounds to ensure that all interests are represented.
  • Finally, I very much agree with the idea of creating a new European body to deal with these issues, as national authorities are often overburdened. I propose supplementing its competences: this body could support the creation of a European research repository bringing together data from several platforms. Such a repository could facilitate appointment procedures and allow regulators, researchers and other observers to review and analyse platform decisions.

The next steps are planned as follows:

  • Consideration of amendments: 8 June (tbc)
  • Consideration of compromise amendments: 6 July (tbc)
  • VOTE in IMCO: 28 September (tbc)
  • Plenary in October (tbc)
