This article is the first in a series on the Digital Services Act (DSA). Watch this space for the next articles on my priorities in the DSA: Algorithmic transparency for recommender systems, ad tech and limitations to massive data exploitation, interoperability and unbundling to break up big tech monopolies, user rights via notice and action procedures and Social Media Councils.
The legislative initiative report, EU Parliament
In the EU Parliament, there are currently three initiative reports ongoing on the Digital Services Act.
The report in IMCO is considered one of the main reports, alongside the report in JURI, because it is “legislative” (though not binding, so it does not itself create legislation): the Parliament can suggest specific language in its final resolution that the Commission should incorporate into its legislative proposal.
I am responsible for this report on behalf of the Greens/EFA. Title: 2020/2018(INL) Digital Services Act: Improving the functioning of the Single Market
JURI’s very strong report (Rapporteur: Tiemo Wölken, S&D) is also “legislative”. It will be interesting to see which committee – IMCO or JURI – will in the end have competence and take the lead on the actual piece of legislation.
The legislative proposal: Digital Services Act, EU Commission
In her political guidelines from July 2019, the President of the European Commission, Ursula von der Leyen, committed to “upgrade the Union’s liability and safety rules for digital platforms, services and products, with a new Digital Services Act” (DSA). In December 2020/January 2021, the Commission is expected to publish the proposal for the Digital Services Act. It will very likely be a package containing two individual pieces of legislation:
Adopted nearly 20 years ago, the E-Commerce Directive (ECD) sets out liability exemptions for online services for content that users upload and share on their networks. Until recently, these rules applied horizontally to all sorts of illegal content, including copyright infringements, hate speech, and child abuse material. The current rules for take-downs and removals are therefore (indirectly) defined by the ECD.
While the ECD is not perfect and has led to problems, mainly due to a lack of clarity, its safe harbour provisions have encouraged the protection of users’ fundamental rights, in particular the freedom of expression and of information. Since the adoption of the ECD, however, the landscape of services has changed drastically. Notably, cloud services and social media platforms have become very important players, and some have gained significant market power. Today, a small number of dominant platforms (mostly US-based) have a high impact on individuals’ rights and freedoms, on our societies and on our democracies.
The nature of the internet has also vastly changed in the past 20 years towards an increasingly participatory community. As a result, the amount of user-generated content has increased exponentially. At the same time, we have witnessed growing government pressure on companies to implement voluntary mechanisms against allegedly illegal or “harmful” content. These two parallel developments have resulted in an increasing number of wrongful removals and blockings of legitimate speech, making a clear legal framework even more important.
Timeline and current status
The DSA will revise the rules contained in the E-Commerce Directive of 2000 that affect how online companies regulate and influence user activity on their platforms, including people’s ability to exercise their rights and freedoms online, such as freedom of expression, the right to privacy and data protection.
First, this review aims to update the rules on how online services, such as social media networks, should or should not delete or block illegal and “harmful” content. A reform might also bring changes to how these companies can be held liable when such content is not taken down. Second, the DSA will propose new rules for online companies that have gained a dominant market position. It will therefore be an opportunity to start fixing some of the societal problems that big tech is causing today.
This is why reforming those rules has the potential to be either a big threat to fundamental rights or a major improvement of the current situation online. It is also an opportunity for the European Union to decide how central aspects of the internet will look in the coming ten years and to become a global standard-setter.
The reform is expected to tackle important issues including centralised platform monopolies, broken business models based on profiling and tracking, and illegal online content and behaviour.
Photo credit: CC-BY 2.0 Oliver Hinzmann, oohaa.de