Online Safety: Ofcom publishes guidance on keeping women and girls safe online

Ofcom has published practical Guidance for tech companies on how to keep women and girls safe online.

One of the many obligations that the Online Safety Act 2023 (“OSA”) imposes on online services is to protect people in the UK from illegal content and from content that is harmful to children, including harms that disproportionately affect women and girls. The Guidance is intended to help organisations achieve this aim by setting out a series of practical steps, supplementing the recently published codes and guidance on illegal content (which we have discussed here) and the soon-to-be-published codes and guidance on protecting children.

The Guidance is divided into nine sections, each setting out ‘foundational’ and ‘good practice’ steps that organisations could take. Ofcom is clear that it does not expect every service provider to implement all of these steps (whether because of the provider’s size or because a specific step may not be relevant to its service), but it nonetheless “strongly encourages” providers to implement the relevant good practice steps.

1. Ensure governance and accountability processes address online gender-based harms

Good practice steps include: setting policies that are designed to tackle forms of online gender-based harms that are prevalent on the service (such as stalking, harassment, or intimate image abuse); consulting with experts on gender-based harms when setting policies; and training staff.

2. Conduct risk assessments that capture harms to women and girls

In addition to conducting risk assessments as required under the OSA, Ofcom recommends that services conduct user research to better understand users’ preferences and experiences of risk; engage with victims to understand their experiences; and work with external assessors to “monitor the threat landscape”.

3. Be transparent about women and girls’ online safety

Good practice steps in this area include sharing information about the prevalence of online gender-based harms, and exercising caution in sharing information that perpetrators of abuse could exploit to circumvent safety measures.

4. Conduct abusability evaluations and product testing

According to Ofcom, ‘abusability’ evaluations “test how easy it is to abuse a tool or feature for harm, and therefore point to ways that abusability can be minimised in design”. For example, the guidance suggests using red teaming to find vulnerabilities, working with experts who understand the behaviour of perpetrators, and using personas to discover how different users may experience features on the service.
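As a loose illustration of the persona-based approach, the sketch below (in Python, with entirely hypothetical personas and feature logic, none of which comes from the Guidance) checks whether a live-location feature would leak information to a non-contact under the safe setting:

```python
from dataclasses import dataclass

@dataclass
class Persona:
    """Hypothetical persona for an abusability evaluation."""
    name: str
    is_contact_of_target: bool

# Illustrative personas: a trusted contact and a stranger (e.g. a stalker).
PERSONAS = [
    Persona("trusted contact", is_contact_of_target=True),
    Persona("unknown account", is_contact_of_target=False),
]

def location_visible_to(persona: Persona, contacts_only: bool) -> bool:
    """Hypothetical feature logic under test: who can see live location?"""
    return persona.is_contact_of_target or not contacts_only

if __name__ == "__main__":
    # Under the safe setting, a stranger should never see live location.
    for persona in PERSONAS:
        assert location_visible_to(persona, contacts_only=True) == persona.is_contact_of_target
    print("location feature passes the abusability check")
```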

5. Set safer defaults

Ofcom explains that safer defaults can be a powerful safety measure, and provides examples of action that service providers could take, including: setting strong, customisable defaults that determine who can contact a user, or requiring a user’s permission before they are added to a group chat; setting strong privacy defaults, such as limiting what information users can see about another user; employing two-factor authentication; and combining relevant safety and privacy settings into ‘bundles’ to make it easier for users to make informed choices about their online settings.
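A minimal sketch of what a ‘bundled’ set of safer defaults might look like in code, assuming hypothetical setting names (none of these identifiers appear in the Guidance):

```python
from dataclasses import dataclass

# Hypothetical settings bundle; field names are illustrative only.
@dataclass(frozen=True)
class SafetySettings:
    who_can_message: str = "contacts_only"     # safer than "everyone"
    group_add_requires_consent: bool = True    # ask before adding to group chats
    profile_visible_to: str = "contacts_only"  # limit what strangers can see
    prompt_two_factor_setup: bool = True       # nudge users towards 2FA

# A strict bundle applied by default; opting into a more open configuration
# becomes one informed choice rather than many scattered toggles.
DEFAULT_BUNDLE = SafetySettings()
OPEN_BUNDLE = SafetySettings(
    who_can_message="everyone",
    group_add_requires_consent=False,
    profile_visible_to="everyone",
)
```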

6. Reduce the circulation of online gender-based harms

Service providers will be expected to use a combination of measures to address harms, from persuasion, to reduction, to removal. Persuasive measures might include ‘nudges’ which prompt users to reconsider posting particular content. As for reducing the prevalence of harmful content, Ofcom recommends deprioritising it in recommender algorithms, removing links, and de-monetising such content. Finally, in terms of removing content, Ofcom recommends the use of hash matching to prevent the uploading of known intimate image abuse, implementing ‘time out’ features for those who misuse the service, or implementing prompt and output filters for generative AI models.
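By way of illustration, hash matching works by comparing a fingerprint of each upload against a database of fingerprints of known abusive images. The sketch below uses a plain SHA-256 digest for simplicity; real deployments typically use perceptual hashes (such as PhotoDNA or PDQ) so that resized or re-encoded copies still match, and the hash list itself would come from an industry hash-sharing scheme (the set below is an empty placeholder):

```python
import hashlib

# Illustrative only: in production this set would be populated from an
# industry hash-sharing database, and a perceptual hash would be used
# so that re-encoded or resized copies of known images still match.
KNOWN_ABUSE_HASHES: set[str] = set()

def should_block_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known-abuse fingerprint."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES
```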

7. Give users better control over their experiences

Various tools are recommended to give particularly vulnerable users more control over what they experience on a service. They include allowing users to delete or change the visibility settings of content they upload; providing them with tools to block or mute accounts; allowing users to filter out content from users who have not completed identity verification checks; and allowing users to set preferences over the type of content they wish to see.
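As a simple sketch of how block and mute lists might feed into what a user sees (all identifiers are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Viewer:
    """Hypothetical per-user control lists."""
    muted: set[str] = field(default_factory=set)
    blocked: set[str] = field(default_factory=set)

def filter_feed(posts: list[dict], viewer: Viewer) -> list[dict]:
    """Drop posts authored by accounts the viewer has muted or blocked."""
    hidden = viewer.muted | viewer.blocked
    return [post for post in posts if post["author"] not in hidden]

# Usage: a viewer who has muted "account_b" sees only the first post.
viewer = Viewer(muted={"account_b"})
feed = [{"author": "account_a", "text": "hello"}, {"author": "account_b", "text": "spam"}]
assert filter_feed(feed, viewer) == [{"author": "account_a", "text": "hello"}]
```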

8. Enable users who experience online gender-based harm to make reports

Ofcom stresses that improving reporting systems is crucial, and that providers can encourage and enable users to make reports by designing systems that are “accessible, transparent, easy-to-use, and account for the specific dynamics of online gender-based harms”. Examples of good practice include allowing users to track and manage their reports, providing mechanisms for users to give feedback to the service provider on the reporting process, and establishing a ‘trusted flagger programme’ in partnership with organisations that have expertise in online gender-based harms.

9. Take appropriate action when online gender-based harm occurs

Finally, providers should embed an understanding of online gender-based harms into their systems and processes, allowing them to respond appropriately to incidents and to minimise the risk of future occurrences. This includes taking enforcement action against those who repeatedly violate the terms of service, adding watermarks and metadata to indicate where content is synthetic, sending high-risk user reports of gender-based harms for review by specialist moderators, and creating dedicated reporting and review channels for online gender-based harms.
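To illustrate the metadata point only: the sketch below uses the Pillow library to embed a text chunk in a PNG flagging it as AI-generated. The key name is hypothetical, and production systems would more plausibly attach signed provenance information (for example a C2PA manifest) rather than an easily stripped text chunk:

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def save_with_synthetic_flag(image: Image.Image, path: str) -> None:
    """Save a PNG with a text chunk flagging it as AI-generated."""
    meta = PngInfo()
    meta.add_text("ai_generated", "true")  # hypothetical key name
    image.save(path, pnginfo=meta)

# Usage on a blank image, purely for illustration.
save_with_synthetic_flag(Image.new("RGB", (64, 64)), "labelled.png")
```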

The Guidance is open for consultation until 23 May 2025, and more information can be found here.