Online emancipation: Protecting users from algorithmic bias
News feeds, search engines and product recommendations increasingly use personalisation algorithms to cut through the mountains of available information and find those bits that are most relevant to us – but how do we know if the information we get really is the best for us?
In an age of ubiquitous data collecting, computer analysis and processing, our online personal information is increasingly being used to make decisions on our behalf. But how trustworthy are these systems and how fair are they?
In an effort to establish a system of auditability, and to build trust and transparency in internet use, researchers from The University of Nottingham, the University of Oxford and the University of Edinburgh want to learn more about the concerns and perspectives of internet users, with the aim of drawing up policy recommendations, ethical guidelines and a 'fairness toolkit'.
UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy is an Engineering and Physical Sciences Research Council (EPSRC) funded research project that will look at the user experience of algorithm-driven internet services and the process of algorithm design. It draws on expertise from HORIZON Digital Economy Research at The University of Nottingham, the Oxford e-Research Centre at the University of Oxford and the Centre for Intelligent Systems and their Applications (CISA) at the University of Edinburgh.
An example of the type of issues the UnBias project will be looking at is given in the piece Dr Ansgar Koene, Senior Research Fellow at HORIZON Digital Economy Research in the School of Computer Science at Nottingham, recently published in The Conversation, titled 'Facebook's algorithms give it more editorial responsibility – not less'.
Dr Koene said: "Selections made by algorithms are commonly presented to consumers as if they are inherently fair and free from human bias, but there is no such thing as a neutral algorithm."
"There is a growing need to provide internet users with a better understanding of the systems that control online environments, and to raise awareness among online providers about the concerns and rights of citizens. The project is relevant to young people, and to society as a whole, in ensuring that trust and transparency are not missing from the internet."
Working with young people and other stakeholders, the research team will also co-produce educational materials and resources to support young people's understanding of online environments, as well as to raise awareness among online providers about the concerns and rights of young internet users.
The results of this two-year project will be widely disseminated in peer-reviewed journals and through community groups, schools and youth clubs. Progress updates will appear on the project blog and Twitter feed (@UnBias_algos).