What the UK’s Online Safety Act means for IT companies

Experts reveal the potential impact and necessary requirements of the UK’s Online Safety Act, introduced to protect children from harmful content online.


The UK government’s Online Safety Act (OSA) comes into force on 25 July 2025, requiring technology platforms to implement processes and systems to prevent users – particularly children – from accessing “harmful and age-inappropriate digital content”.

The UK’s media regulator, Ofcom, which will be enforcing the new act, says that 59% of teenage children (aged 13-17) have encountered potentially harmful content online. It has determined that technology providers can do more to address this and has set out new codes of practice to which these providers must adhere.

“The Online Safety Act introduces a fundamental shift in how online services – both UK-based and international – must operate when accessible by UK users,” explains Dr. Loredana Tassone, Managing Consultant and Head of EU and UK Representative Services at GRCI Law. “Ofcom now holds wide-ranging investigatory and enforcement powers, making compliance not just a legal necessity, but an operational and ethical imperative.”

The legislation regulates operators of digital platforms in a number of new ways, making them more responsible for their users’ safety. This includes a duty to implement systems and processes that reduce the risk of illegal activity, and to remove illegal content when it does appear.

“The regulator, Ofcom, has the power to require the provision of certain information, to issue fines of up to the greater of 10% of global annual turnover or £18m and to impose various business disruption measures,” says Nick Harrison, senior associate at global law firm Taylor Wessing. “New criminal offences are also introduced for certain breaches or failures to comply.”

All tech businesses operating in the UK need to carefully consider whether their services comply with the law and keep it in mind when seeking to expand in the region.

Ofcom’s guidance

On 24 April 2025, Ofcom presented its codes of practice to the UK Parliament, publishing guidance on how providers should carry out risk assessments for evaluating the potential harm to children, with services given three months to complete these assessments ahead of the 24 July deadline.

“The Act places significant demands on platform governance and user content moderation, especially for services likely to be accessed by children,” says Tassone. “Businesses will need to embed safety into their governance structures, assess and mitigate content risks proactively, and implement age assurance mechanisms that are effective yet privacy conscious.”

The OSA doesn’t apply only to UK businesses, or to those with a physical presence in the United Kingdom; it applies to any service that has “links with the UK”.

“This threshold can be met in a number of ways: if the number of UK users of the service is ‘significant’ (this is deliberately not defined and will be context-dependent), if UK users are a target market of the service, or if content on the service presents a material risk of significant harm to users in the UK who can access the service,” Harrison tells ITPro.

The OSA sets out a range of duties and explains how providers should approach them. For example, one of the safety duties surrounding illegal content is to take proportionate measures to prevent users from encountering that content. It also stipulates that this applies to how a service is designed, operated and used.

“It is simply not enough for a provider to say that it was happy with the original design of a service and the real-world operation of it was uncontrollable,” says Daniel Milnes, partner at Forbes Solicitors. “What happens to users, specifically including the effect of algorithms, is a provider’s responsibility. The existence of this specific set of duties, mandatory assessments and reporting, and dedicated enforcement powers, is certainly intended to make providers focus more on protecting users.”

Concerns from SMEs

Smaller companies, especially SMEs and startups, have expressed concerns that the Online Safety Act may place excessive pressure on them.

"Implementation of the Online Safety Act faces hurdles in cost and technical feasibility,” explains Jason Soroko, Senior Fellow at Sectigo, a leading provider of SSL certificates. “Platforms, especially smaller or independent operators, may struggle with the expense of robust age verification and content moderation tools.

“Enforcement also poses challenges due to varied jurisdictional reach and resource constraints. Regulators risk focusing on easily targeted platforms while larger, multinational sites exploit legal loopholes or inconsistent international cooperation.”

Kevin Quirk, director at AI Bridge Solutions, agrees that the Online Safety Act may result in smaller companies being disproportionately affected, whilst also facing greater scrutiny from Ofcom.

“While large tech firms have long had legal teams and compliance departments in place, the Act presents a real challenge for smaller companies like ours, particularly when trying to remain agile and cost-effective,” Quirk tells ITPro. “Since the Act came into force, our clients have become far more cautious when launching new platforms. We’re seeing delays in deployment timelines while legal teams reassess features. Others have asked us to rebuild or modify platforms to better align with safety-by-design principles.”

Quirk also claims that, whilst the Act requires “reasonable steps” to be taken, it doesn’t do a satisfactory job of defining what “reasonable” looks like for an SME. “That ambiguity creates risk,” he says. “We’re spending more time than ever on policy, legal consultations, and risk assessments – resources that would otherwise go toward development and innovation.”

Potential knock-on effects

Alongside the added pressure of cost and compliance, concerns have also been raised about a lack of clarity over what counts as “harmful content”. This ambiguity may further complicate compliance, as platforms worried about incurring fines are forced to navigate between the Scylla of harmful content and the Charybdis of excessive censorship.

“One consequence may be censorship,” says Boris Cipot, senior security engineer at Black Duck, which helps teams manage security and compliance risks. “In many cases, content will be removed or prohibited under generic rules that disregard context, out of fear of facing penalties. This may mean that different viewpoints might be disregarded.”

Nick Henderson-Mayo, Director of Learning and Content at compliance eLearning and software provider VinciWorks, agrees that censorship needs to be approached with care, as the Act could also affect areas such as encryption – an issue raised by companies including WhatsApp and Signal in 2023.

“Encryption could be at risk if scanning mandates go too far, and fear of fines might lead to over-censorship. Ofcom says it’ll act proportionately, but privacy-focused platforms are watching closely,” says Henderson-Mayo. “In fact, several chat app providers have warned that if compelled to scan private messages proactively, it would ‘nullify the purpose of end-to-end encryption’. Some secure messaging apps might even withdraw from the UK rather than compromise their encryption, but we will need to see if they are serious or just scaremongering.”

While the UK’s Online Safety Act mirrors the EU’s goal of making the internet safer, there are concerns that the OSA and the EU’s Digital Services Act (DSA) may together dissuade companies from launching and operating in Europe.

But Cipot tells ITPro that while firms may be concerned about this, the long-term effect is likely to be beneficial.

“Some services might not be willing to enter the UK market as they do not want to implement the additional safeguards. On the other hand, would you be OK using a service that does not have the safeguards provided by the Act?” says Cipot. “I believe the Act will be a good way to ensure online safety improves.”

Preparing for the Online Safety Act

To prepare for the OSA, IT companies will need to determine whether they are in scope and, if so, familiarize themselves with the various duties under the OSA and related guidance from Ofcom. They should then begin carrying out risk assessments as soon as possible. “This is a significant task and should not be put off since the timelines are quite tight and many deadlines for compliance have already passed,” explains Harrison.

By 25 July, online services that are likely to be accessed by UK children need to have completed their risk assessments and to comply with Ofcom’s Protection of Children Codes.

“The July 2025 deadline to protect children from harmful content adds urgency,” says Tassone. “Organizations should act now – conducting gap analyses, assigning accountability, and building cross-functional governance – to meet expectations and reduce compliance risk.”

Dan Oliver
Freelance writer

Dan Oliver is a writer and B2B content marketing specialist with years of experience in the field. In addition to his work for ITPro, he has written for brands including TechRadar, T3 magazine, and The Sunday Times.