Digital – for Justice and Participation
Digital technologies make it easier for us to communicate, travel, work, learn, study and make new friends – across borders. In keeping with the founding principles of the internet, digital networking should lead to more participation, more involvement and more knowledge for everyone and thus put our democracy on a broader footing.
In recent years, however, digital technologies have been used to concentrate power and wealth in a very few hands. Google, Facebook and co. are gaining control over ever larger parts of the economy and of political opinion-forming in Western countries. In China, digitalisation is being used to monitor every aspect of social life, so that any form of dissent can be nipped in the bud.
More than ever, we need freedom and justice. I intend to campaign for solidarity-based, green and feminist digitalisation that leaves no one behind.
These are the issues I will focus on during this parliamentary term:
The internet should be a free forum. The state must not outsource the task of filtering content to private firms; it must decide itself whether material posted by users complies with the law. Clear procedures are also needed to safeguard the rights of users whose content is wrongly removed from the internet.
Google and Facebook earn money from the adverts they run. It is in their interest, therefore, to keep users on their platforms for as long as possible. To do so, they use algorithms that decide what we see there. Unfortunately, it is human nature to interact longest with content that generates strong feelings in us – hatred, anxiety and anger. As a result, our timelines are flooded with content that deals with issues – including political issues – in an emotionally charged and often factually inaccurate manner. This is a threat to our democracy, because many people find it increasingly difficult to distinguish facts from fake news. I want to change that, and I am calling for greater transparency: research bodies must be allowed to study how these algorithms work. On the basis of their findings, we will have to decide what free but pluralist information platforms should look like.
Micro-targeting means that certain content is disseminated only to small groups of people, who can be targeted very precisely on the basis of their characteristics, preferences or psychological profile. As a result, the messages disseminated cannot be challenged in public spaces because they are not visible to the majority of users. There are suspicions that the Brexit referendum and the election of US President Trump were influenced by micro-targeting. This is unacceptable. I have already organised an event with experts to discuss the regulatory options. I will continue to work on this issue, because every democracy needs universally accepted facts so that opinions can be shaped fairly. Only when we are aware of the information made available to others is true communication possible.
Hate speech, abuse and threats are part of everyday life online. According to a Campact study, half of those surveyed are now afraid to express their views publicly online. This massive restriction on freedom of expression affects, in particular, women, young people and people with an immigration background or a disability, i.e. groups who already find it more difficult to gain a hearing offline. We must put a stop to this. The internet belongs to everyone. You can find details of all the measures I am proposing to protect people against hate speech and harassment online here.
The tech giants not only control large sections of public opinion; they also exert huge influence over online trade and earn more and more money while employing fewer and fewer people. Promising start-ups are bought up, and data held by different companies (e.g. WhatsApp, Instagram and Facebook) are combined. This makes life difficult for small and medium-sized enterprises, particularly in Europe, with its many languages and legal systems. These monopolies must be broken up to give smaller competitors a chance. As a first step, we must review competition law and guarantee data protection.
Many AI applications currently work in a flawed and discriminatory manner because the underlying data sets are not sufficiently balanced, neutral or comprehensive. We therefore need to devise ways of compiling high-quality data sets that can be made available to industry for research and product development. Data trustees could be one solution. Good data are fundamental to the development of effective products. If we in Europe focus on the quality rather than the quantity of data, our industry has an opportunity to set itself apart from its competitors through the quality and non-discriminatory nature of the products it offers – with ‘AI made in Europe’.