The UK government wants to upskill regulators in the age of generative AI
The government wants regulators to develop practical tools to monitor the use of AI across a raft of industries
The UK government has pledged £100 million in funding in a bid to launch new artificial intelligence (AI) research centers and prepare regulators for the widespread adoption of the technology.
The new funding is a response to the consultation on the AI Regulation White Paper, with £10 million allocated to help regulators develop practical tools to monitor and address the risks and opportunities in their sectors, from telecoms and healthcare to finance and education.
Key regulators, including Ofcom and the Competition and Markets Authority (CMA), are being asked to publish their approach to managing the technology by April 30. They'll be expected to set out the AI-related risks in their areas, detail their current skills capabilities, and outline plans on how to regulate the use of AI over the coming year.
Some have already embarked on this process, including the Information Commissioner’s Office (ICO). The data watchdog has updated its guidance on how data protection laws apply to AI systems that process personal data, for example, and has already begun issuing enforcement notices.
As part of the move, a steering committee will be launched this spring to support and guide the activities of a formal regulatory structure on AI within the government.
Announcing the funding, technology secretary Michelle Donelan said the government aims to ensure regulators are agile enough to keep pace with the rapid advances in AI seen over the past year since the emergence of generative AI.
"AI is moving fast, but we have shown that humans can move just as fast," she said.
"By taking an agile, sector-specific approach, we have begun to grip the risks immediately, which in turn is paving the way for the UK to become one of the first countries in the world to reap the benefits of AI safely."
New AI research hubs will drive R&D
Around £90 million in funding will go towards launching nine new research hubs across the UK, the government said. These will focus on areas including healthcare, chemistry, and mathematics.
There's also £9 million of funding through the government’s International Science Partnerships Fund, bringing together researchers and innovators in the UK and the US to focus on developing safe, responsible, and trustworthy AI.
Tommy Shaffer Shane, AI policy advisor at the Centre for Long-Term Resilience, welcomed the move as a positive step toward ensuring regulatory flexibility as AI adoption continues to grow globally.
"We’re pleased to see this update to the government’s thinking on AI regulation, and especially the firm recognition that new legislation will be needed to address the risks posed by rapid developments in highly-capable general-purpose systems," he said.
"Moving quickly here while thinking carefully about the details will be crucial to balancing innovation and risk mitigation, and to the UK’s international leadership in AI governance more broadly."
Responsible AI development a key government focus
Another £2 million of Arts and Humanities Research Council (AHRC) funding will go to support new research projects aimed at helping to define responsible AI across sectors such as education, policing and the creative industries.
Meanwhile, £19 million will go towards 21 projects aimed at developing trusted and responsible AI and machine learning solutions to accelerate deployment of these technologies and drive productivity.
Julian David, CEO at techUK, said the funding announcement underlines the government’s commitment to a “pro-innovation and pro-safety approach” and that regulators will play a crucial role in promoting long-term responsible development.
"We now need to move forward at speed, delivering the additional funding for regulators and getting the central Function up and running. Our next steps must also include bringing a range of expertise into government, identifying the gaps in our regulatory system and assessing the immediate risks.”
Emma Woollacott is a freelance journalist writing for publications including the BBC, Private Eye, Forbes, Raconteur and specialist technology titles.