Whether you're taking out new car insurance, applying for a mortgage, queueing to pass security at Schiphol airport, or booking a holiday: every day you are confronted with digital systems making decisions about you, often without your being able to influence them, or even realising it is happening. These decisions can have a major influence on your life: you might be picked out of the queue at Schiphol, or have to pay a higher monthly premium for your car insurance.
‘Discrimination based on an address suffix’
“Dutch law does not yet address the choices that these algorithms sometimes make,” Zuiderveen Borgesius warns. “We now have clear rules to protect people from discrimination based on skin colour, sex or similar characteristics. These rules also apply to discrimination by computers. But digital discrimination of this kind tends to be difficult to discover.”
Zuiderveen Borgesius: “There is a second category of problems that is even more difficult. Digital differentiation might be unfair even though it does not specifically impact people who share a particular skin colour or similar characteristic. Insurers can use algorithms, for example, to predict which clients make damage claims more often. The computer then uses data to discern correlations that might help predict that.”
“This has led to some insurers charging higher premiums to people whose address contains a letter suffix, 1A or 90L for example. That can be unfair. Digital differentiation can also impact the poor. An example of this is when an energy supplier demands higher advance payments from residents of a neighbourhood with a higher incidence of payment arrears. The people affected by this are often those in poorer neighbourhoods. And for people who always pay their bills on time, it seems unfair if they are treated as defaulters simply because of where they live.”
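The mechanism described above can be illustrated with a small sketch. The data, feature names and premium formula below are entirely invented for illustration: they show how a pricing rule driven purely by historical correlations can end up charging one group more because of an arbitrary proxy feature, such as a letter suffix in the address, without anyone having intended to discriminate.

```python
# Hypothetical illustration of proxy-based pricing.
# All records and numbers are invented; no real insurer works exactly this way.

# Historical records: (address_has_letter_suffix, filed_a_claim)
history = [
    (True, True), (True, False), (True, True), (True, False),
    (False, False), (False, False), (False, True), (False, False),
]

def claim_rate(records, has_suffix):
    """Fraction of past customers in this group who filed a claim."""
    group = [claim for suffix, claim in records if suffix == has_suffix]
    return sum(group) / len(group)

BASE_PREMIUM = 40.0  # euros per month, an invented base rate

def premium(has_suffix):
    # Premium scaled by the group's observed claim rate: applicants with
    # a letter suffix pay more purely because of a historical correlation.
    return BASE_PREMIUM * (1 + claim_rate(history, has_suffix))

print(premium(True))   # higher: suffix group had a 50% claim rate -> 60.0
print(premium(False))  # lower: no-suffix group had a 25% claim rate -> 50.0
```

The unfairness the lecture points to is visible here: the model never sees income, skin colour or payment behaviour, yet two otherwise identical applicants get different premiums based only on how their house is numbered.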
In his inaugural lecture, Zuiderveen Borgesius will be considering the legal complications involved in this type of digital differentiation in more detail. There are a number of complications: discriminating against poor people or people with a particular type of house number is not prohibited under the Dutch non-discrimination statutes. And it may be tricky to determine which data formed the basis for the computer's decisions, and to make this transparent.
“As it stands now the law can protect people from unfair digital differentiation to a certain extent. There are, for example, useful rules in the general non-discrimination statutes, and in the General Data Protection Regulation (GDPR). We must start by enforcing these rules better. Yet there are gaps in the current rules. Supplementary rules are needed.
“However, for certain forms of digital differentiation we do not know what standards to apply. Should discrimination against poor people be prohibited? And, if so, in which cases? To get answers to these types of questions, we need more interdisciplinary collaboration between computer scientists, lawyers, philosophers and researchers from other disciplines.”
About Frederik Zuiderveen Borgesius
Frederik Zuiderveen Borgesius obtained his doctorate at the University of Amsterdam's Institute for Information Law (IViR) in 2014 with a dissertation on behavioural advertising and privacy. After this he worked as a post-doctoral researcher at the same institute, and later at Vrije Universiteit Brussel in Belgium. Since 2019 he has been at Radboud University, where he teaches law to information science students and is involved with iHub, the Interdisciplinary Hub for Digitalization and Society.