Ofcom’s draft guidelines on illegal online content set stringent rules for big tech
The codes of practice give an insight into what the Online Safety Act will mean in practice


UK regulator Ofcom has published its first draft codes of practice following the Online Safety Act's Royal Assent, as it looks to tackle harmful online content.
The draft codes unveiled by Ofcom seek to crack down on illegal content distributed online. This includes terrorist or extremist content, as well as child abuse and fraudulent content.
Technology secretary Michelle Donelan said the new rules will obligate online businesses, such as social media firms, to address long-standing issues around harmful materials online.
"Before the Bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first," she said.
"By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today."
Ofcom research shows that three in five secondary school children have been contacted online in a way that made them feel uncomfortable. The new rules propose introducing measures to control friend requests on social media platforms, for example.
'Larger and higher-risk services' should not present children with lists of suggested friends; children shouldn't appear as suggested friends; they shouldn't be visible in other users' connection lists; and their own connection lists shouldn't be visible.
Accounts outside a child’s connection list should not be able to send them direct messages, and their location information shouldn't be visible to anybody else.
Meanwhile, these larger platforms should use hash matching to check images against a database of known child sexual abuse material (CSAM), and use automated tools to detect URLs that have been identified as hosting CSAM.
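Hash matching of this kind works by computing a digital fingerprint of each uploaded image and checking it against a list of fingerprints of known illegal material. The sketch below is purely illustrative and assumes a hypothetical hash list; real deployments rely on perceptual hashing systems such as PhotoDNA and vetted databases supplied by bodies like the Internet Watch Foundation, not plain cryptographic hashes.

```python
import hashlib

# Illustrative sketch only: real CSAM hash matching uses perceptual hashes
# and vetted hash lists from recognised bodies, not plain SHA-256.
# The set below is a hypothetical placeholder.
KNOWN_ILLEGAL_HASHES: set[str] = {
    "0f3a9c1b...",  # entries would come from a vetted hash list provider
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the known-bad list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES
```

In practice, perceptual hashes are preferred because they still match when an image has been resized or re-encoded, which an exact cryptographic hash would miss.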
All services should block accounts run by banned terrorist organizations.
With regard to fraudulent content, services should use keyword detection to find and remove posts linked to the sale of stolen credentials, such as credit card details.
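Keyword detection can be as simple as scanning post text for terms associated with the sale of stolen credentials and flagging matches for review. The snippet below is a minimal sketch with a hypothetical keyword list; a production moderation pipeline would use maintained, regularly reviewed term lists and route flagged posts to human reviewers.

```python
import re

# Hypothetical keyword list for illustration only; a real system would use
# a maintained term list and human review of anything flagged.
FRAUD_KEYWORDS = [r"\bfullz\b", r"\bcvv\b", r"\bcarding\b", r"\bdumps\b"]
FRAUD_PATTERN = re.compile("|".join(FRAUD_KEYWORDS), re.IGNORECASE)

def flag_for_review(post_text: str) -> bool:
    """Flag posts mentioning terms linked to the sale of stolen credentials."""
    return bool(FRAUD_PATTERN.search(post_text))

# Example: flag_for_review("Fresh CVV dumps for sale") returns True.
```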
Under the new guidelines, services that offer to verify accounts will also be forced to explain how they do it.
Ofcom’s move receives industry approval
Industry stakeholders such as Which? have given the move their seal of approval. The consumer rights group said the new guidelines will force online businesses to adhere to more stringent rules and protect users.
"The Online Safety Act becoming law is a vital moment in the fight back against fraud," said director of policy and advocacy Rocio Concha.
"It should force tech giants to take more responsibility for tackling fraudulent adverts on their platforms, and it is positive Ofcom is progressing with the regulatory codes so quickly to make this happen."
The codes also include a series of requirements already laid out in the Online Safety Act. All services will have to name somebody responsible for compliance with their duties on illegal content, reporting and complaints.
Content and search moderation teams must be well resourced and trained, performance targets must be set, and progress monitored, Ofcom said.
In addition, the regulator will require organizations to draft and implement policies on how content is reviewed to ensure transparency.
Ofcom seeks user-led feedback
A key aspect of the new rules highlighted by Ofcom is ensuring user-led feedback and reporting of harmful content.
Users should be able to easily report harmful content to businesses, the regulator said. This includes making complaints, blocking other users on social media sites, and disabling comments on posts.
The new codes of practice are now up for consultation and expected to come into force at the end of 2024. Ofcom said it will propose guidance on how adult sites should make sure children can't access pornographic content later this year.
In spring 2024, it will also launch a consultation on more protections for children from harmful content such as suicide, self-harm, eating disorders and cyber bullying.

Ross Kelly is ITPro's News & Analysis Editor, responsible for leading the brand's news output and in-depth reporting on the latest stories from across the business technology landscape. Ross was previously a Staff Writer, during which time he developed a keen interest in cyber security, business leadership, and emerging technologies.
He graduated from Edinburgh Napier University in 2016 with a BA (Hons) in Journalism, and joined ITPro in 2022 after four years working in technology conference research.
For news pitches, you can contact Ross at ross.kelly@futurenet.com, or on Twitter and LinkedIn.