If your business is registered in the United Kingdom, the UK Online Safety Act is a legal requirement you cannot ignore.
The law holds businesses accountable for protecting children, moderating harmful content, verifying user ages, and being transparent about their safety measures. Failure to comply can result in fines up to 10% of global revenue or £18 million (whichever is higher).
That means understanding and acting on this Act isn’t optional; it’s essential for staying in business. So let’s break down what this law means for your business, who it applies to, and the practical steps you can take to stay compliant.
The UK Online Safety Act in a Nutshell
The UK Online Safety Act was passed on 26 October 2023 to address growing concerns about online content. Its creation was driven by cases of children encountering harmful material online and the spread of illegal content.
This includes fraud, child sexual abuse, terrorism, adult content, and racial or religious offences, among others. The Act's primary aims are clear: companies must take accountability for the content published on their platforms, protect children from harmful material, and tackle illegal content.
OFCOM, the UK’s communications regulator, is responsible for enforcing the Act through a set of final codes and guidance. It monitors and takes action against companies that don’t follow the rules, imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
As Chris Sherwood, Chief Executive of NSPCC, states:
“This will – without a doubt – create safer, more age-appropriate online experiences for young users across the UK.”
Who Does the Online Safety Act Apply To?
Research by OFCOM reveals that more than 100,000 online services, offered through apps and websites, fall under the scope of the UK Online Safety Act. It also applies to social media platforms, online forums, instant messaging services, cloud storage, and even search engines.
In simple terms, any business in the United Kingdom that allows users to share, interact with, or access content online is covered by the Act. Businesses registered overseas with no UK operations must also comply if their services are accessible to UK users or target the UK market.
Key Rules and Offences Businesses Must Know
The UK Online Safety Act establishes online safety rules that sit alongside existing data protection obligations such as the UK GDPR. Here are some key rules and offences defined under the Act:
1. Criminal Offences
OFCOM has mandated that all businesses complete a comprehensive risk assessment of illegal content on their platforms by mid-March 2025. Companies should take measures to proactively remove illegal content from their platforms or act promptly when users report it.
They also need to design their platforms in a way that reduces the risk of being used for illegal activities. Some of the priority offences that require monitoring include content related to:
- Child sexual abuse
- Extreme pornography
- Terrorism
- Fraud
- Stirring up racial or religious hatred (public order offences)
- Private image abuse
- Promoting suicide
2. Self-Harm Content
What children see online directly affects their mental health. Recent research shows that harmful online content is one of the ‘biggest culprits’ behind depression and self-harm. The UK Online Safety Act addresses this by placing the responsibility on businesses.
OFCOM requires platforms to actively monitor and moderate harmful content, especially when it targets children. Big companies must also stay transparent about their moderation practices and provide easy-to-use reporting tools, so users can flag harmful material quickly.
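As a rough illustration of what an "easy-to-use reporting tool" can look like under the hood, here is a minimal sketch of a flagging flow: a user reports a piece of content, the report is logged for moderator review, and the reporter receives an acknowledgement. All function and field names here are illustrative assumptions, not an official OFCOM API.

```python
# Minimal sketch of a user-facing content-reporting flow (names are assumptions).
from datetime import datetime, timezone

REPORT_QUEUE: list[dict] = []  # stand-in for a real moderation database


def flag_content(content_id: str, reporter_id: str, reason: str) -> dict:
    """Record a user report and return an acknowledgement for the reporter."""
    report = {
        "content_id": content_id,
        "reporter_id": reporter_id,
        "reason": reason,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "pending_review",  # a moderator picks this up next
    }
    REPORT_QUEUE.append(report)
    return {"ok": True, "report_id": len(REPORT_QUEUE)}


ack = flag_content("post-123", "user-9", "self-harm content")
```

The key design point is that flagging is a single, low-friction action for the user, while everything else (queuing, review, removal) happens on the platform side.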
3. Harmful Content for Children
Under the UK Online Safety Act, businesses need to make sure children aren’t exposed to harmful or age-inappropriate content on their platforms. Not all of this content is illegal, but it can still harm children mentally, emotionally, or physically.
That’s why companies must:
- Completely block children from accessing the most harmful content (called Primary Priority Content, like adult content or instructions for self-harm).
- Restrict and filter content that’s less severe but still dangerous (called Priority Content, like bullying, violent videos, or dangerous challenges). This is to ensure children only see what’s appropriate for their age.
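The two tiers above can be sketched as a simple gating function: Primary Priority Content is blocked outright for children, while Priority Content is filtered rather than shown directly. The category labels and the `is_child` flag are assumptions made for illustration; a real service would rely on proper age assurance and content classification.

```python
# Minimal sketch of two-tier content gating for child users (labels are assumptions).
PRIMARY_PRIORITY = {"adult_content", "self_harm_instructions"}   # block entirely
PRIORITY = {"bullying", "violent_videos", "dangerous_challenges"}  # restrict/filter


def can_view(label: str, is_child: bool) -> str:
    """Return 'allow', 'filter', or 'block' for a labelled piece of content."""
    if not is_child:
        return "allow"
    if label in PRIMARY_PRIORITY:
        return "block"   # children must never see this tier
    if label in PRIORITY:
        return "filter"  # downrank or restrict rather than show directly
    return "allow"


assert can_view("adult_content", is_child=True) == "block"
assert can_view("bullying", is_child=True) == "filter"
```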
4. Age Appropriate Content
A recent survey reveals that children as young as 8 have accessed adult content online, which highlights the need for stricter protection. To prevent underage exposure and ensure children are only exposed to age-appropriate content, platforms should enforce age restrictions.
Businesses are required to carry out a children’s risk assessment in line with the Protection of Children Codes of Practice. Platforms can verify the age of users by utilising facial scans, photo IDs, driving licences, and credit card checks.
5. User Control
In addition to child protection, the UK Online Safety Act requires Category 1 services to provide adult users with enhanced control over their interactions. Adults should be able to verify identities, filter content, and prevent non-verified users from engaging with them.
Optional tools will allow users to limit exposure to harmful content, such as abusive, violent, self-harm, or eating disorder-related content.
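The optional adult-user controls described above boil down to two per-user settings: which content categories to mute, and whether non-verified accounts may interact with you. Here is a minimal sketch under those assumptions; all names are illustrative rather than taken from any real platform.

```python
# Minimal sketch of per-user safety controls (all names are assumptions).
from dataclasses import dataclass, field


@dataclass
class UserPrefs:
    verified_only: bool = False                      # only verified users may interact
    muted_categories: set[str] = field(default_factory=set)


def allow_interaction(prefs: UserPrefs, sender_verified: bool) -> bool:
    """May this sender message or reply to the user under their settings?"""
    return sender_verified or not prefs.verified_only


def show_content(prefs: UserPrefs, category: str) -> bool:
    """Should content with this category label appear in the user's feed?"""
    return category not in prefs.muted_categories


prefs = UserPrefs(verified_only=True, muted_categories={"self_harm", "abuse"})
assert allow_interaction(prefs, sender_verified=False) is False
assert show_content(prefs, "self_harm") is False
```

Because these are opt-in controls, the defaults in `UserPrefs` leave everything visible; protection only tightens when the user chooses it.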
Simple Steps Every Business Must Take Now
The UK Online Safety Act has revolutionised the operations of digital platforms. However, it has also made it harder for non-compliant businesses to survive. Therefore, as an SME in the UK, you need to take timely measures to avoid costly fines and penalties.
If navigating the Act feels overwhelming, partner with a trusted third-party provider, such as Rejuvenate IT. A reliable compliance partner can help conduct risk audits and provide relevant IT support for content moderation, ensuring your platform stays compliant with the Safety Act and avoids legal penalties.
Stay Ahead by Securing Your Platform
Compliance with the UK Online Safety Act is not only about avoiding penalties. Following the rules and regulations provided by OFCOM demonstrates that your business values safety and transparency.
Partnering with experienced providers like Rejuvenate IT makes the process less exhausting. From website development to IT management, we help your business fulfil all legal requirements.
FAQs
1. What is section 77 of the Online Safety Act?
Section 77 of the Safety Act requires providers of categorised services to submit annual transparency reports to OFCOM. OFCOM issues a notice to the providers specifying the type of information, format, submission deadline, and publishing protocol. These reports establish accountability by providing insights into how the business manages illegal content and age verification.
2. How Does the UK Online Safety Act Protect Children?
The UK Online Safety Act protects children from illegal and harmful content by limiting or restricting their online exposure. It makes it mandatory for platforms to conduct age verification through IDs and facial scans, and prevents children from accessing age-inappropriate content. This includes adult content, self-harm, violence, terrorism, and other harmful content.
3. How is the Safety Act Enforced?
The UK Online Safety Act is enforced by OFCOM, the UK’s official communications regulator. All in-scope businesses must follow the final codes and guidance released by OFCOM to ensure compliance with the Safety Act. OFCOM also monitors the effectiveness of safety measures and has the authority to take action against those who don’t follow the rules.