The British government has quietly rolled out a mandatory ‘Digital ID’ regime under the Online Safety Act, effective July 25, 2025, that stands to purge independent media by blocking thousands of non-compliant platforms from UK users. Framed as age verification for adult content, this insidious law enforces ID checks through third-party services to suppress dissenting voices, threatening free speech under the guise of safety.
The Act requires all sites hosting or distributing potentially “harmful” content to implement robust age verification or face severe penalties, including fines of up to £18 million or 10% of global revenue (whichever is higher) and outright blocking for UK audiences. Sites that fail to verify users’ ages through approved technology, such as government-linked ID systems or third-party verification services, will be deemed non-compliant, fundamentally altering how they operate online and raising alarms about broader censorship of independent journalism.
Msn.com reports: “The days of ticking a box to say you’re over 18 are over,” said Adam Jones, Internet Law Specialist at HD Claims. “Any adult site operating in the UK must now take meaningful steps to prevent underage access, or risk being banned entirely.”
This applies not just to traditional porn sites, but also to platforms that host user-generated explicit content such as Reddit, X, and OnlyFans.
Under the Act, Ofcom now has the power to:
- Issue content takedown orders and demand access restrictions
- Fine platforms that fail to prevent underage access
- Hold UK-based users and distributors accountable for uploads that breach the law
Even sites hosted overseas can be blocked if they are accessible in the UK and fail to comply. Adam said: “While sites must verify age, they are not permitted to retain sensitive personal data without clear user consent. The law includes privacy protections, and any platform storing unnecessary data risks breaching GDPR.”
Users can expect to see an increase in secure, privacy-first age verification tools such as digital wallets, credit checks, or government ID-based systems.
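The shift described above, from a self-declared checkbox to verification by an approved method, can be sketched in code. This is a minimal illustrative sketch, not any real provider’s API: the `VerificationResult` type, the method names, and the `grant_access` function are all hypothetical, standing in for whatever interface a third-party verification service would actually expose.

```python
# Hypothetical sketch of a server-side age gate. The provider interface
# and method names are illustrative assumptions, not a real vendor's API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class VerificationResult:
    method: str          # e.g. "digital_wallet", "credit_check", "gov_id"
    age_verified: bool   # the provider's over-18 determination
    # Note: only the boolean outcome is kept, not the underlying ID
    # document, to minimise retained personal data (per GDPR).


def grant_access(result: Optional[VerificationResult]) -> bool:
    """Allow entry only when an accepted method has confirmed the user is 18+."""
    accepted_methods = {"digital_wallet", "credit_check", "gov_id"}
    if result is None:                      # no verification attempted
        return False
    if result.method not in accepted_methods:
        return False                        # e.g. a bare self-declared checkbox
    return result.age_verified


# A ticked box no longer counts; a verified government ID does:
print(grant_access(VerificationResult(method="checkbox", age_verified=True)))  # False
print(grant_access(VerificationResult(method="gov_id", age_verified=True)))    # True
```

The design point is that the decision rests on *how* the age was established, not just on the claimed result, which mirrors the law’s rejection of simple self-declaration.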
Companies that don’t meet Ofcom’s standards could face significant penalties of up to £18 million or 10% of global turnover, whichever is higher. In the most serious cases, senior managers or executives could be held personally responsible.
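The “whichever is higher” cap works as a simple maximum of the two figures. The turnover amounts below are made up purely for illustration:

```python
# Illustrative calculation of the maximum fine cap under the Act:
# the higher of a fixed £18 million or 10% of global annual turnover.
def max_fine_gbp(global_turnover_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_turnover_gbp)


# For a smaller firm, the £18m floor applies; for a large one, 10% dominates:
print(max_fine_gbp(50_000_000))     # 18000000
print(max_fine_gbp(1_000_000_000))  # 100000000.0
```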
Senior Commercial Technology and Data Protection Solicitor at law firm Harper James, David Sant, said: “This is just the latest phase of the online safety rollout. Businesses should already be well-advanced in their compliance work if they operate a search engine or any online service which allows users to interact with each other or encounter their content (“user-to-user” services). Those businesses should already have completed an illegal content risk assessment by March 2025 and implemented appropriate safety measures. And they should also have completed a children’s access assessment by April 2025 to determine whether children might use their platforms, triggering comprehensive children’s risk assessment requirements.
“Any services which missed the deadlines for earlier assessments could already be at risk of enforcement action from Ofcom, and may now be automatically deemed as likely to be accessed by children, triggering the children’s risk assessment requirements.
“Even if a business missed the deadline, they should note that the ‘child access assessments’ need to be carried out annually to determine whether a user-to-user service or search service is likely to be accessed by a significant number of children.
“To determine whether a service is likely to be accessed by children, the business must consider whether it’s technically possible for children to use the service – something that can only be confidently ruled out if the service uses highly effective age-assurance tools such as facial age estimation, digital ID verification or photo-ID matching at first use; simply stating a minimum age in your terms of service is not enough. If it’s technically possible for children to access the service, the business must consider other factors, like whether their commercial strategy relies on children accessing the service and whether the content or design of the service is likely to appeal to children. If any of those apply, then it is likely to be accessed by children, which means that the business must carry out a further risk assessment and implement appropriate measures.”
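The test described above is effectively a two-stage decision tree, which can be sketched as follows. The function name and boolean inputs are hypothetical labels for the factors the quote lists; this is a sketch of the reasoning, not legal advice or Ofcom’s own methodology:

```python
# Sketch of the "likely to be accessed by children" determination as a
# decision tree. Parameter names are illustrative, not statutory terms.
def likely_accessed_by_children(
    uses_highly_effective_age_assurance: bool,
    commercial_strategy_relies_on_children: bool,
    appeals_to_children: bool,
) -> bool:
    # Stage 1: child access can only be confidently ruled out when highly
    # effective age assurance (facial age estimation, digital ID
    # verification, photo-ID matching) is applied at first use. A minimum
    # age stated in the terms of service does not count.
    if uses_highly_effective_age_assurance:
        return False
    # Stage 2: access is technically possible, so if any child-access
    # factor applies, the service is deemed likely to be accessed by
    # children, triggering a further children's risk assessment.
    return commercial_strategy_relies_on_children or appeals_to_children


# Age assurance at first use rules children out; without it, a
# child-appealing design alone triggers the duty:
print(likely_accessed_by_children(True, False, True))   # False
print(likely_accessed_by_children(False, False, True))  # True
```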
David added: “While the headline fines under the Online Safety Act (up to £18m or 10% of worldwide revenue) may sound daunting, enforcement will focus on cases where non-compliance poses real risk, particularly to children. Businesses should also consider the reputational consequences of Ofcom publishing details of investigations.
“Ofcom has already launched enforcement programmes to monitor whether businesses are complying with their illegal content risk assessment duties and their children’s risk assessment duties.”