#SOTEU Commission President Von der Leyen: “Algorithms must not be a black box”

I’m happy she has taken up my fight!

Today, EU Commission President Ursula von der Leyen set out her priorities for the year ahead in her first State of the Union address to the European Parliament. I welcome her declaration that “the next decade must be Europe’s digital decade”. She announced three priorities: first, data; second, artificial intelligence; and third, infrastructure.

Regarding artificial intelligence (AI) technologies she said:

> Algorithms must not be a black box and there must be clear rules if something goes wrong. The Commission will propose a law to this effect next year.

In addition, in a clear commitment to fighting racism in the EU, she said:

> We will tackle unconscious bias that exists in people, institutions and even in algorithms.

Tackling bias and discrimination in algorithmic decision-making and artificial intelligence is one of my key priorities for this mandate – one that I have repeatedly raised in my plenary speeches (YouTube), Commissioner hearings and Committee meetings in the EU Parliament. I am pleased that my concern was heard and found its way to the top of the EU Commission hierarchy.

Moreover, that algorithms should not be a “black box” is a strong statement, and I hope that the Commission proposals announced for next year will be up to the task. Unfortunately, the White Paper published in February this year was not ambitious enough and could have contained much stronger proposals to limit the use of AI where it can have a negative impact on fundamental rights.

When it comes to the announced legislative proposal and the transparency of algorithms, the devil will clearly be in the details – you can find my political demands and suggestions for the upcoming Commission proposal in more detail further below.

Regarding her first priority, it is welcome that she wants Europe to make better use of industrial, non-personal data, since a “real data economy (…) would be a powerful engine for innovation and new jobs” – by securing this data and making “it widely accessible”. A European cloud based on Gaia-X could boost European industry and is needed to protect European privacy standards.

Her third and last digital priority is infrastructure. One of the action points, bringing fast broadband connections to rural areas, is hugely important – even for Germany, where a lack of access to fast internet has been demonstrated over and over again.

I completely missed any mention of open source software and the role of civil society.

In conclusion: there is still much work to do for Europe to get into the driver’s seat on AI and platform regulation, foster civil society participation in digital development and boost tech tools to fight climate change and make our economy future-proof. I’ll be happy to work on these priorities and to usher in Europe’s digital decade!

My suggestions to make sure that we really open the “black box” are as follows:

  • A nuanced risk-based approach: AI systems need to be assessed in advance on the basis of their potential harm to the individual as well as to society as a whole – although systems that may affect an individual’s access to resources or participation in social processes must never be placed in the lowest risk category. A binary approach to assessing risk (either low or high) is clearly insufficient, since a wrong categorization of applications could cause collateral damage in either direction.
  • No discrimination: AI systems must be tested for their potential to discriminate or violate other rights before they are purchased and deployed.
  • Strong oversight, supervision and accountability: Supervisory authorities should be given the right to access companies’ software documentation, code and data sets – as agreed in my opinion (pdf) on “the framework of ethical aspects of AI”, adopted by all Members of the European Parliament in the IMCO Committee. A European agency or board is needed to coordinate efforts and support Member State authorities that struggle with complicated technical decisions.
  • A ban or at least a moratorium on biometric applications in publicly accessible spaces, because the threat to our fundamental rights is far too high.
