More than half of UK adults don’t trust computer algorithms
BCS found that just 7% of respondents trusted the education sector to use algorithms
More than half (53%) of UK adults don’t trust computer algorithms to make decisions on issues ranging from welfare to education, according to a survey conducted by BCS, The Chartered Institute for IT.
The research was conducted in the aftermath of the A-level results scandal, in which 36% of young people had their grades marked down by an algorithm. Although the algorithm was ultimately scrapped in favour of teachers’ predictions, the episode may have heightened distrust in automated decision-making, which is becoming increasingly prevalent.
BCS found that as few as 7% of respondents trusted the education sector to use algorithms, on a par with social services and the armed forces.
Trust in algorithms also differed between age groups. While only one in twenty (5%) over-55s expressed confidence in the use of algorithms, the figure was more than three times as high among 18-24-year-olds, at 16%.
Older people were generally less trusting of the use of algorithms in public life, with 63% of over-55s feeling negative about the idea, compared with 42% of 18-24-year-olds.
The disparity was also reflected in attitudes towards computerised decisions made by the NHS, private healthcare, and local councils. Almost one in three (30%) 18-24-year-olds said they trusted the use of algorithms in these sectors, compared with just 14% of over-55s.
Overall, automated decision-making was most likely to generate trust when used by the NHS, at 17%, followed by financial services, at 16%, and intelligence agencies, at 12%. These services use algorithms to determine issues such as medical diagnosis, credit scoring, and national security. Police forces and tech giants were among the least trusted when it came to using algorithms to make personal decisions about the respondents, both at 11%.
BCS director of policy Dr Bill Mitchell said that, despite the deep distrust in algorithms, "there is little understanding of how deeply they are embedded in our everyday life".
"People get that Netflix and the like use algorithms to offer up film choices, but they might not realise that more and more algorithms decide whether we’ll be offered a job interview, or by our employers to decide whether we’re working hard enough, or even whether we might be a suspicious person needing to be monitored by security services," he said.
According to Dr Mitchell, the government and businesses face problems with "balancing people’s expectations of instant decisions, on something like credit for a sofa, with fairness and accounting for the individual, when it comes to life-changing moments like receiving exam grades".
"That’s why we need a professionalised data science industry, independent impact assessments wherever algorithms are used in making high-stakes judgements about people’s lives, and a better understanding of AI and algorithms by the policymakers who give them sign-off," he added.
Following the release of the algorithm-based A-level results, the UK government and Ofqual faced at least three legal challenges, including one alleging a potential GDPR violation.
Having only graduated from City University in 2019, Sabina has already demonstrated her abilities as a keen writer and effective journalist. Currently a content writer for Drapers, Sabina spent a number of years writing for ITPro, specialising in networking and telecommunications, as well as charting the efforts of technology companies to improve their inclusion and diversity strategies, a topic close to her heart.
Sabina has also held a number of editorial roles at Harper's Bazaar, Cube Collective, and HighClouds.