Workshop: Human values in the age of Artificial Intelligence

  • Radboud University: Bert Kappen (Department of Neurophysics, Donders Institute), Alex Khajetoorians (Institute for Molecules and Materials), Hans Thijssen and Tamar Sharon (iHub for Security, Privacy, and Data Governance)
  • External Partners: Humathon

The success of artificial intelligence (AI) and deep learning over the last decades has been unprecedented. At its core, AI is about building machines that can think and act intelligently, and it encompasses tools ranging from Google's search algorithms to the systems that make self-driving cars possible.


While most current applications are intended to benefit humankind, questions are growing about the role of AI in the digital age and its interplay with humanity. Are these technologies improving our quality of life, or are they forever changing the way we function as a society? The best-known examples involve the gathering of big data and the role of privacy in this new age of information.

New initiatives, such as Humathon.org, are springing up with the aim of discussing human values in the digital age. At the same time, the education system at all levels pays far too little attention to the security-related, philosophical, ethical, and scientific implications of AI for society.

As Radboud University, we aim to take a leading role in raising awareness of human values in the digital age and to kick-start the discussion on how we as academic institutions can contribute to new education on these issues. To this end, we will organise a one-day workshop on the role of AI in society, aimed at a nationwide Dutch academic audience. With contributions from all parts of society, including government, education, and industry, we will discuss the following topics:

Social manipulation

Social media, through its AI-powered algorithms, is very effective at targeted marketing. Social media companies know who we are and what we like, and they are remarkably good at surmising what we think. Social media may also influence our voting behaviour, which poses a direct risk to our democratic principles. How should we as citizens react to this new reality?

Invasion of privacy and social grading

It is now possible to track and analyse an individual's every move online as they go about their daily business. With facial-recognition cameras deployed nearly everywhere, our movements in the physical world can be tracked and recorded as well. This is exactly the kind of information that will power China's social credit system, which is expected to give each of the country's 1.4 billion citizens a personal score based on their tracked behaviour. What is the risk that such systems will develop in Europe? And how should citizens respond to this?

Discrimination

Since machines can collect, track, and analyse so much about you, there is a real possibility of this information being used against you. It is not hard to imagine an insurance company declaring you uninsurable based on the number of times you were caught on camera talking on your phone, or an employer withholding a job offer based on your “social credit score.” Who ultimately owns our data, and what can we do to ensure its integrity?

For more information on the workshop, keep an eye on this website.