About the research
When you apply for your next job or loan, you might be assessed by a computer. Many banks and companies already use smart algorithms, also known as AI, to judge your data. But what happens if such an algorithm learns unacceptable patterns, leading to discrimination? Carmen Mazijn (VUB) developed a technique called LUCID to reveal how an AI model makes decisions and to determine whether the model is biased. "Together, we can ensure that computers make fair decisions without discriminating," she says.