UK to force tech companies to take down abusive images in 48 hours
Companies that fail to remove the content in time could be fined as much as 10 per cent of their global revenue or have their service blocked in the UK, the government said
Elon Musk’s X came under widespread criticism last month after its Grok AI tool was used to create and distribute sexualised images of real people Image: Bloomberg
Last Updated : Feb 19 2026 | 7:52 AM IST
By Amy Thomson
The UK government has proposed rules that would require tech companies to remove abusive images from their sites within 48 hours, weeks after X users flooded the social media platform with thousands of pictures of undressed women they’d generated with the company’s artificial intelligence tool.
Companies that fail to remove the content in time could be fined as much as 10 per cent of their global revenue or have their service blocked in the UK, the government said in a statement on Wednesday. Regulator Ofcom is also considering whether to digitally tag intimate images of people shared without their consent so they are automatically removed, as is already done with child sexual abuse and terrorism content.
“The online world is the front line of the 21st century battle against violence against women and girls. That’s why my government is taking urgent action against chatbots and ‘nudification’ tools,” Prime Minister Keir Starmer said in the statement.
Elon Musk’s X came under widespread criticism last month after its Grok AI tool was used to create and distribute sexualised images of real people in underwear or bathing suits. Child safety groups also found sexualised AI-generated images of children on the dark web.
X restricted access to Grok and blocked the feature, referred to as “bikini mode,” after an outcry from governments worldwide, and emphasised that it takes down illegal content from its site. But the episode sparked a regulatory crackdown on the company and provided fodder for a broader movement to restrict social media companies. Several European governments are now weighing bans on social media for younger teenagers, building on a law passed in Australia last year.
Sharing non-consensual intimate images is already illegal in the UK. The offence is defined as sharing pictures of people in an “intimate state”: engaged in a sex act, nude or partially nude, or in a “culturally intimate” state, such as a Muslim woman who has removed her hijab.
The UK is proposing to add the new rules as amendments to the Crime and Policing Bill, which would give police greater powers to enforce takedowns.
The UK’s Revenge Porn Helpline, a group that assists people in getting this content removed, has said that while it succeeds in more than 90 per cent of cases, platforms aren’t always compliant and removal can take several requests.
“When the team are supporting victims, I am continually reminded that when they want help and support, they want help and support now. They want their content removed now, not in a few hours or a few days,” David Wright, chief executive officer of the UK Safer Internet Centre, said in a report to Parliament last year.