Roberta Sinatra awarded a Velux Foundation grant to fight discriminatory algorithms
Algorithms used for ranking publications and evaluating researchers on the basis of their citations are far from fair. The Velux Foundation has awarded associate professor Roberta Sinatra the Young Investigator grant to pursue her ambition of laying the foundation for better algorithms and metrics.
Written 25 January 2021, 09:40 by Jari Kickbusch
Today, we have fast and efficient algorithms that search, sort and rank scientific information. Yet, these algorithms have an issue: they are based on citations, which carry human biases. Therefore, explains Roberta Sinatra, associate professor at the IT University of Copenhagen, their output is also biased, creating inequalities and raising serious concerns about discrimination:
- For example, there is scientific evidence that, given the same quality, women receive less recognition than men. Similarly, given the same quality, a researcher from a Danish university receives less recognition than, say, a researcher from Harvard University. In general, citations are distorted by effects that have nothing to do with quality, like gender, ethnicity or affiliation. Why is this a problem? Because we want algorithms and metrics to find high-quality papers and researchers, regardless of their gender or affiliation, she says.
Young Investigators
In 2021, Roberta Sinatra is among 19 researchers to receive the Velux Foundation's Young Investigator grant, part of a funding programme that "aims to support early career researchers with ambitions of creating their own, independent research identity, and with the potential to significantly contribute to research in technical and natural sciences at a Danish research institution."
The foundation has invested DKK 6 million in Roberta Sinatra's project Bias Explained: Pushing Algorithmic Fairness with Models and Experiments. The project aims to uncover the mathematical mechanisms behind bias and to create fair algorithms.
- We all know biases exist. Yet, we have no quantitative understanding of the long-term effects of bias mechanisms on recognition. In the long term, I expect that this research will empower us to study recognition inequalities not only in science, but also in many other fields where we use biased measures to quantify output, like in art or movies. We have the moral responsibility to create systems where equal opportunities are not undermined by our own biases or by poorly designed algorithms. This project aims to build a fairer society by creating a debiased framework to recognize true talent and to allocate resources in a fair way, she concludes.
Learn more about the Velux Foundation's Young Investigator Programme
Learn more about this year's 19 awarded researchers
Jari Kickbusch, phone 7218 5304, email jark@itu.dk