Wikipedia co-founder warns that the USA could run away with AI development
Jimmy Wales and OpenUK fear EU regulations could stifle open-source AI
Wikipedia co-founder Jimmy Wales has warned that the EU risks being outpaced by the US on AI development because of the stringent legislative approach it is taking to the rapidly developing technology.
Referring to the EU’s proposed AI Act, he said it represented the ‘typical’ attitude the Union takes whenever it wants to control a potentially harmful technology.
“The EU AI Act is going to be a classic European attempt to regulate something that’s just going to completely leave Europe behind the US.”
Wales’ comments were made during a roundtable discussion at which Amanda Brock, CEO at OpenUK, was also present.
Brock agreed with Wales about the dramatic pace of change in recent months and called for an agile approach to legislation, one flexible enough to adapt as AI evolves.
With the Wikipedia co-founder in agreement, Brock warned that AI regulation in the EU would present issues, saying: “It’s very prescriptive, it’s very detailed, it’s going to be very hard to follow and the open-source exclusions don’t go far enough”.
“It’s not going to work,” she said.
What is the EU AI Act?
The purpose of the act is to regulate the use of AI in the EU. At a high level, it aims to ensure that AI systems in the EU are “safe, transparent, traceable, non-discriminatory and environmentally friendly”.
Significantly, the European Parliament also wants to ensure that AI systems are overseen by people rather than automation in order “to prevent harmful outcomes”.
It also wants to establish a technology-neutral, uniform definition of AI that could be applied to future AI systems.
The concern over the EU’s plans is the potentially inhibiting effect of the regulation on development.
Brock noted that existing regulations already govern how data is handled in specific use cases - citing healthcare and finance as examples - and said “the amount of regulation around AI itself should probably be quite light and minimal”.
While the UK is no longer part of the EU, the bloc’s regulatory moves and the flow of AI venture capital remain two points of concern for the country.
The UK, AI, and open source
According to OpenUK’s report, the Organisation for Economic Co-operation and Development (OECD) - using data from GitHub - found that the UK was the third-largest contributor to public AI projects, accounting for 4% in 2022. India was top at 23%, while the 27 EU member states and the USA accounted for 14% each.
So far in 2023, the UK has received $29.2 billion in venture capital investment for AI, an 8.8% increase over 2022. The US, by contrast, has attracted $461.4 billion so far, 20.7% above its 2022 figure. The numbers suggest the UK still has some way to go before it can truly call itself a leader in global AI.
The report also examined attitudes towards AI and how it intersects with open source.
For example, although 40% of respondents to the State of Open Survey 2023 said that open source software offered a solution to concerns regarding AI ownership, 45% were neutral on the matter.
Transparency was seen as hugely important to respondents, with 85% of them saying that datasets used to train an AI should be included in the software bill of materials (SBOM), while only 2% disagreed.
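As a rough illustration only - not drawn from the report, and with every name, version and checksum hypothetical - the sketch below shows how training datasets could be listed alongside a model in a machine-readable bill of materials, loosely modelled on established SBOM formats such as CycloneDX or SPDX.

```python
# Minimal sketch of an AI bill of materials that records training datasets
# alongside the model itself. All names, versions and checksums are hypothetical;
# a real SBOM would follow an established format such as CycloneDX or SPDX.
ai_bom = {
    "component": {
        "type": "machine-learning-model",
        "name": "example-sentiment-model",
        "version": "1.2.0",
    },
    "training_data": [
        {
            "type": "data",
            "name": "example-reviews-corpus",
            "version": "2023-06",
            "licence": "CC-BY-4.0",
            "sha256": "<checksum of the dataset snapshot>",
        },
    ],
}

if __name__ == "__main__":
    # Print the datasets a downstream user would see when auditing the model.
    for dataset in ai_bom["training_data"]:
        print(f'{dataset["name"]} ({dataset["version"]}), licence {dataset["licence"]}')
```

Recording the datasets this way is what gives downstream users the transparency respondents asked for: anyone consuming the model can see what it was trained on, under what licence, and verify the data has not changed.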
Brock told ITPro she was concerned that AI was in a similar position to other key technologies developed over the last few decades. She said that unless action was taken to “open it up and democratize it” there was every chance it might end up in the hands of a few.
She said: “The fundamental decision to be made is a simple one”.
“Will our government’s approach be one of control established through transparency and building trust engendered by openness, or will it be a decision to seek control by closing down innovation, thereby leaving the outputs of this most important of innovations in a proprietary ‘black box’ without transparency?”
Richard Speed is an expert in databases, DevOps and IT regulations and governance. He was previously a Staff Writer for ITPro, CloudPro and ChannelPro, before going freelance. He first joined Future in 2023, having worked as a reporter for The Register. He has also attended numerous domestic and international events, including Microsoft's Build and Ignite conferences and both US and EU KubeCons.
Prior to joining The Register, he spent a number of years working in IT in the pharmaceutical and financial sectors.