Ofcom calls on tech firms to make online world safer for women and girls
February 25, 2025

Ofcom has proposed “concrete measures” that tech firms should take to tackle online harms against women and girls, setting a new standard for their safety online. Informed by insights from victims, survivors, women’s advocacy groups and safety experts, the draft guidance sets out practical, ambitious but achievable measures that providers can implement to improve women’s and girls’ safety. It focuses on four issues:
- Online misogyny – content that actively encourages or cements misogynistic ideas or behaviours, including through the normalisation of sexual violence.
- Pile-ons and online harassment – when a woman or groups of women are targeted with abuse and threats of violence. Women in public life, including journalists and politicians, are often affected.
- Online domestic abuse – the use of technology for coercive and controlling behaviour within an intimate relationship.
- Intimate image abuse – the non-consensual sharing of intimate images, including those created with AI, and cyberflashing, the sending of explicit images to someone without their consent.
Ofcom’s guidance identifies nine areas where technology firms should do more to improve women’s and girls’ online safety by taking responsibility, designing their services to prevent harm and supporting their users.
It promotes a safety-by-design approach, demonstrating how providers can embed the concerns of women and girls throughout the design and operation of their services, as well as their features and functionalities. To illustrate the specific changes that providers can make to improve women’s and girls’ safety, the guidance includes practical examples of good industry practice, such as:
- ‘Abusability’ testing to identify how a service or feature could be exploited by a malicious user;
- Technology to prevent intimate image abuse, such as matching uploads against databases of known non-consensual images and removing them (a sketch of this hash-matching approach follows this list);
- Prompts asking users to reconsider before posting harmful material – including detected misogyny, nudity or content depicting illegal gendered abuse and violence;
- Simpler account controls, such as bundled default settings that make it easier for women experiencing pile-ons to protect their accounts;
- Visibility settings, allowing users to delete or change the visibility of their content, including material they uploaded in the past;
- Strengthening account security, for example by adding authentication steps, making it harder for perpetrators to monitor accounts without the owner’s consent (a second sketch after this list illustrates one such step);
- Removing geolocation by default, since leaked location data can lead to serious harms, including stalking and threats to life;
- Training moderation teams to deal with online domestic abuse;
- Reporting tools that are accessible and support users who experience harm;
- User surveys to better understand people’s preferences and experiences of risk, and how best to support them; and
- More transparency, including publishing information about the prevalence of different forms of harm, user reporting and outcomes.
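To make the hash-matching idea above concrete, here is a minimal Python sketch using the open-source imagehash library. It is illustrative only, not Ofcom’s specification or any platform’s production system: the database contents, the example hash value, the distance threshold and the helper names are all assumptions.

```python
# Minimal sketch of hash-based matching against a database of known
# non-consensual intimate images. Everything here is illustrative:
# the hash database, threshold and helper names are hypothetical.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known abusive images,
# e.g. hashes generated from images submitted by victims.
KNOWN_ABUSE_HASHES = {
    imagehash.hex_to_hash("d1c4f0e2a5b39687"),  # placeholder entry
}

# Hamming-distance threshold: 0 means an exact match; small values
# tolerate re-encoding, resizing and minor edits. Value is an assumption.
MATCH_THRESHOLD = 8

def is_known_abusive(path: str) -> bool:
    """Return True if the uploaded image matches a database entry."""
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - known <= MATCH_THRESHOLD
               for known in KNOWN_ABUSE_HASHES)

if is_known_abusive("upload.jpg"):
    # A real service would block the upload and route it to moderators.
    print("Upload blocked: matches a known non-consensual image.")
```

A perceptual hash is used rather than a cryptographic one because it tolerates re-encoding and resizing; victim-led initiatives such as StopNCII work on a similar hash-matching principle.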
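Similarly, the account-security measure could involve a time-based one-time password (TOTP) as an extra authentication step, so that a perpetrator who knows a victim’s password still cannot sign in and monitor the account. The sketch below uses the open-source pyotp library and is again an assumption-laden illustration rather than a recommended implementation; the enrolment flow, user identifier and secret handling are hypothetical.

```python
# Minimal sketch of a second authentication step (TOTP). The secret
# handling and user store here are hypothetical simplifications.
import pyotp

# In a real service the secret is generated once per user at enrolment
# and stored server-side; here it is created inline as a placeholder.
user_totp_secret = pyotp.random_base32()

# Shown to the user once, e.g. as a QR code for an authenticator app.
provisioning_uri = pyotp.TOTP(user_totp_secret).provisioning_uri(
    name="user@example.com", issuer_name="ExampleApp")

def second_factor_ok(secret: str, submitted_code: str) -> bool:
    """Verify the six-digit code from the user's authenticator app."""
    return pyotp.TOTP(secret).verify(submitted_code)

# Login flow: password check first (not shown), then the second step.
if second_factor_ok(user_totp_secret, input("Enter 6-digit code: ")):
    print("Signed in.")
else:
    print("Second factor failed; access denied.")
```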
Why this matters
The online world can be a hostile and dangerous place for women and girls. Online spaces can facilitate online domestic abuse, silence women who wish to express themselves, create communities where misogynistic views thrive, and sometimes affect women’s ability to do their jobs. Women report more harm and greater concerns about the internet than men.
Under the UK’s online safety laws, services such as social media, gaming, dating apps, discussion forums and search engines have new responsibilities to protect people in the UK from illegal content, and children from harmful content – including harms that disproportionately affect women and girls.
This means companies must assess the risk of gender-based illegal harms, such as controlling or coercive behaviour, stalking and harassment, and intimate image abuse on their services. They must then take action to protect users from this material, including by taking it down once they become aware of it. Sites and apps must also protect children from harmful material, such as abusive, hateful, violent and pornographic content.
To help services meet these duties, Ofcom has already published final Codes and guidance on how it expects tech firms to tackle illegal content, and it will shortly publish final Codes and guidance on the protection of children. Once these duties come into force, Ofcom’s role will be to hold tech companies to account, using the full force of its enforcement powers, whenever and wherever necessary.
But beyond enforcing these core legal duties, the Act also requires Ofcom to produce additional, dedicated industry guidance setting out how providers can take action against harmful content and activity that disproportionately affects women and girls, in recognition of the unique risks they face.
What happens now
Ofcom is now inviting feedback on its draft Guidance, as well as further evidence on any additional measures that could be included to address harms that disproportionately affect women and girls. Responses must be submitted by May 23rd. Once Ofcom has examined all responses, it will publish a statement with its decisions, along with final guidance, later this year.
The regulator also expects technology firms to assess new or emerging threats regularly, and it will report on how well they have tackled harms to women and girls around 18 months after the final guidance comes into effect.
Dame Melanie Dawes, Ofcom Chief Executive, commented: “No woman should have to think twice before expressing herself online, worry about an abuser tracking her location, or face the trauma of a deepfake intimate image of herself being shared without her consent. Yet these are some of the very real online risks that women and girls face today – and many tech companies are failing to act. Our practical guidance is a call to action for online services – setting a new and ambitious standard for women and girls’ online safety. There’s not only a moral imperative for tech firms to protect the interests of female users, but it also makes sound commercial sense – fostering greater trust and engagement with a significant proportion of their customer base.”
Cally Jane Beech, campaigner and influencer, who experienced deepfake intimate image abuse, said: “I want things to be better, for my daughter, and for women and girls all over the UK. We should all be in control of our own online experience so we can enjoy the good things about it. Tech companies need to be made more accountable for things being hosted on their sites.”
Domestic Abuse Commissioner, Dame Nicole Jacobs, added: “Everyone should be free to live out their lives online without the fear that they will be abused, stalked or harassed. But far too often, victims and survivors are expected to keep themselves safe from online abuse, rather than tech companies taking steps to protect their users. I’m pleased that Ofcom are stepping up to start the process of providing guidance to tech companies on how to tackle this. It’s now on these firms to implement these recommendations and ensure that perpetrators can no longer weaponise online platforms for harm. By taking meaningful practical action, not only will people be safer online, but it will demonstrate that tech companies are ready to play their part in tackling domestic abuse.”