Today, the UK’s new Online Safety Act comes into effect, with the aim of making the internet safer – especially for children. It will apply to all “search services” and services that allow users to interact with one another, including social media, video-sharing platforms, and of course – video games.
While it’s a UK law, it will apply to any company or service with a significant number of UK users, with Ofcom – the UK’s independent regulator of online safety – enforcing compliance.
Specifically, the Act requires companies to take “robust action” against illegal content (including child sexual abuse material, extreme pornography, and terrorism content), and against content that is harmful to children (including pornography, bullying, and abusive content). Adults are also affected: they gain greater control over the type of content they see and the people they can engage with online.
But what does this mean for the video game industry? How will online game services be affected? And just how enforceable will this new Act be?
The Online Safety Act will be wide-reaching in its impact across the internet, but to what extent has the games industry really been considered during the hearing process?
“From the moment the government had the idea of an Online Safety Bill and published its white paper on it in 2019, social media firms were in the crosshairs,” says George Osborn, editor of Video Games Industry Memo, who was head of campaign and communications at Ukie while the bill was being negotiated.
“This makes loads of sense because those platforms are more likely to be somewhere you can share harmful stuff. They have a bigger reach to more people. They let you share images and video content more easily. And it was hard to hold them to account, despite things like terror content, child sexual exploitation material and disinformation popping up on them.”
This is why the likes of Discord, Reddit and Bluesky are rolling out new safety features. Just this week, Bluesky added age verification to its direct messaging.
“However, the decision to bring in all services that have user-to-user communication meant that games would get folded in,” Osborn continues. “And even though the industry did respond to consultations, spoke with ministers and with the new regulator Ofcom, there was little to no effort to understand how the bill would hit games companies or much focus on them.
“It’s why the 300-page act barely mentions games (I think it literally references it once), even though it puts loads of responsibilities on companies.”

Dr. Celia Pontin, director of public policy and public affairs at Flux Digital Policy, notes that such a wide-ranging regulation needs a variety of companies to be involved in the regulatory process – including video game companies of all sizes for diversity of voice.
“Although trade bodies played their vital part in engaging with legislators and regulators, when you look at the responses submitted to public consultations there were no individual voices from the video games industry beyond a couple of the largest organisations,” says Pontin. “Video games companies of all types and sizes need to feel empowered to get involved in policy development because when it comes down to implementation, they are the experts in their field.”
From today, any game with user-to-user communication (such as voice or text chat) that’s available in the UK will need to follow the law. Broadly, Osborn explains, studios will need moderation tools to remove harmful content, better reporting processes, and measures in place to protect children if a game is accessible to them.
“That’s the bare-bones version of the law,” Osborn warns. “The reality is that a lawyer will likely tell you that you’ve got to do a hell of a lot more to meet the provisions of the 300-page act and the many volumes of guidance put out by Ofcom. And given that the regulator can dish out fines of up to £18m or 10 percent of your global turnover, and in some rare cases bang up a senior exec, it is something that you really need good advice on quickly.”
He adds: “The only crumb of comfort is that Ofcom is still rolling out the final bits of the act and has said it’ll take its time to fully enforce. But with games having historically been a target for moral panics and policymaker madness, you don’t want to bet too closely on not being picked up.”

Speed is certainly of the essence, as Mike Pappas, CEO of Modulate, explains. Modulate provides a voice moderation service for regulation and compliance, used in Activision’s Call of Duty and Rockstar’s GTA Online. Pappas says the new requirements “re-affirm the value of what we’re doing.”
“While most online platforms today have some flavour of content moderation, the new codes require things like ‘swift action’ on harmful content and determining appeals ‘promptly’, requiring a renewed focus on speed,” he says.
“This reinforces the need for AI moderation (ideally in combination with humans!), as AI is the only way to maintain high speed at high scale when conducting this work. Further, these codes push platforms to more proactively monitor for harmful content, relying less on user reports alone – which is something we’ve been pushing for years, and a capability ToxMod was built to support from the very beginning.”
AI will likely feature heavily in age verification tools. Just last week, Roblox introduced new safety features, including an AI tool that estimates a teenager’s age from a video selfie (a similar feature has been in use on Bluesky since 25th July).
k-ID is another company focused on digital compliance, listing the likes of Capcom and Discord as clients. “This is a significant moment that puts online safety at the heart of game design,” says CEO and co-founder Kieran Donovan. “It has been encouraging to see some businesses not view these obligations as a compliance burden and instead focus on the positive opportunities that effective child safety systems provide. Forward-thinking studios understand that it will enable them to create age-appropriate experiences that empower young users and build trust, unlocking audiences they couldn’t safely serve before.”
There are plenty of challenges ahead, then, for developers. Yet while Ofcom may have the biggest AAA games in its sights, will this new legislation disproportionately affect indie developers who may not have the resources to meet its requirements?

“It is my understanding there are different expectations placed on studios of different sizes to help mitigate these challenges, both in terms of what they are expected to do and in what timeframe,” notes Dr. Rachel Kowert, research psychologist and founder of Psychgeist.
However, Osborn states the Act is “far too big for indies who have chat functions to comply with easily”.
“Back when it was being passed,” he says, “the bill was constantly referred to as a ‘Christmas tree’ because people kept whacking baubles onto it while it took years and years and years to pass.
“And the result of that is it empowered Big Tech and social media companies, rather than putting a muzzle on them. Yes, there are loads of ways they can be collared now. But because they have big legal teams, great external counsel and policy pros and public affairs agencies lobbying for them, the biggest businesses have been able to pay up for the right advice to adapt their services to meet the rules in advance.”
He adds: “But for small businesses including indie game developers who have the misfortune to have things like text and voice chat in their games, following all the rules is really hard.” Even with the likes of Modulate and k-ID to assist, it remains hard for small businesses to comply in the short and long term, says Osborn.
Pontin concurs: “Although Ofcom has acknowledged concerns over varying levels of impact and adopted what they term a ‘risk based and proportionate approach’, there’s no getting away from the fact that some requirements will have disproportionately larger effects on indie developers in terms of resources or even technical feasibility. Depending on what services they provide, they may need to implement or license age assurance systems, content moderation tools and secure reporting mechanisms, potentially requiring substantial investment in technology and engineering.
“There are also costs in understanding and executing the auditing and reporting requirements, since risk assessments, audits, and reports require specific staff and processes. The compliance cost associated with the OSA could represent a significant barrier to entry or continued presence in the market, particularly for small developers, who might seek to minimise this by reducing the use of social and UGC features in their games. Importantly, some indie studios may not have access to the legal expertise to understand the extensive requirements of the Act in the first place.”

Live-service games will likely feel the brunt of the legislation, due to their heavy use of online communication. And it’s already proving difficult to launch and maintain live-service games in the current climate, especially as a handful of big names dominate the genre.
Kowert believes this new legislation will have a positive impact on the genre. “It will require greater investment and prioritisation in their trust and safety efforts, which in the end is a win-win for everyone,” says Kowert. “Research has consistently shown that players do not want to spend time in social spaces that are characterised by disruptive or exploitative behaviour. Making them safer by design, and integrating more robust trust and safety features, will not only create a better player experience but a safer online experience for everyone.”
Still, Osborn notes the barrier for entry for developers has now been raised, and therefore favours “established big players over new market entrants”.
“This means that if you want to enter the market with anything that’s live-service and brings users together, you have to think about designing games with no meaningful chat functionality at all, limiting interactions to pre-programmed button-based inputs, or embedding something like Discord’s social SDK into your game to shift the compliance burden elsewhere,” he says.
“And if you’re thinking about doing anything user-generated content related or with rich media sharing – like video or images – you’re going to have to think even harder about what you need to do to comply.”

Perhaps the biggest question of all is, just how enforceable is the Online Safety Act? And will users simply find workarounds for new safety features? After all, underage children have long played adult games without parental supervision.
“This is the big question and a valid one,” says Kowert. “While it’s true that kids often find ways to bypass age restrictions, this act is still a step in the right direction. It’s certainly more effective than an outright ban, which not only fails to stop access but also removes any opportunity for regulation or safeguards. At least with a structured framework like this, there’s room for oversight, education, and harm reduction. It’s not a perfect solution, but it’s far better than leaving players completely unprotected.”
Pontin notes that age ratings have little bearing here, as they don’t reflect the functions that bring a game within scope of the Online Safety Act. The Act is concerned specifically with online communication, which a game may or may not include regardless of its rating.
“All in all, the OSA is very enforceable when it comes to the actions that providers are expected to take,” says Pontin. “In a clear signal of how seriously they intend to take their role, Ofcom has already started an enforcement programme on aspects of the OSA and is engaging with a number of games companies – we are not flying under the radar as a sector and companies cannot assume that they will be overlooked. Ofcom has a lot of tools for enforcement, including requiring ISPs to block sites in extreme cases, and penalties can be high.”
Osborn concludes Ofcom has a “serious challenge” ahead of it, as it’s essentially “regulating the whole internet” and it’s “practically impossible for it to stay on top of everything online.”
“But generally, the point of laws like this is not to catch every infraction,” he continues. “Instead, it gives regulators tools to tackle harmful stuff when it hits the biggest platforms (i.e. lots of people can be exposed to it), to tackle the most obviously harmful stuff when it pops up (e.g. Ofcom pursuing forums that promote self-harm and suicide), or address consumer complaints when they see something that the law doesn’t allow.”
This, he believes, is a “mixed picture” for games, and the lack of attention paid to the industry during the bill’s passage points to what he sees as a lack of games expertise within Ofcom. In all, Osborn describes the Act as “disappointing”.
“The Act has really good intentions, but it’s been bounced into a place where it is bafflingly large, confusing and ultimately strengthens services it was meant to contain,” he says. “That’s not what anyone who is interested in online safety wants, and I hope the act is amended in the years ahead.”