Self-harm and suicide one click away on search engines

Damning report from regulator Ofcom reveals how search engines Google, Microsoft Bing, DuckDuckGo, Yahoo! and AOL can act as gateways to harmful websites, images and videos. 

Communications regulator Ofcom warns that ‘content that glorifies or celebrates self injury’ is widely available and easily accessed online through the major search engines. This, of course, presents a huge challenge in efforts to keep children safe online.


Photo by Thomas Park

The report, ‘One Click Away: a study on the prevalence of non-suicidal self injury, suicide, and eating disorder content accessible by search engines’ (PDF, 1.2 MB), was commissioned from the Network Contagion Research Institute (NCRI).

To conduct the study, researchers entered common search queries for self-injurious content, as well as more cryptic terms that are typically used by online communities to conceal their real meaning. The team then analysed more than 37,000 links returned by the search engines. 

Across the five major search engines – Google, Microsoft Bing, DuckDuckGo, Yahoo! and AOL – results were broadly similar, with four key findings: 

  • Harmful content related to self-injury is prevalent and easily found. As much as 22% of top-five, page-one results led, in a single click, to content that ‘celebrates, glorifies, or offers instruction about non-suicidal self-injury, suicide or eating disorders.’ 
  • Image searches delivered a much higher proportion of harmful or extreme results (50%) than web pages (28%) or videos (22%). The report also notes that images may be more likely to inspire acts of self-injury, and that policing such content is hard to automate, because algorithms struggle to distinguish images glorifying self-harm from those used in a recovery or medical context. 
  • Cryptic search terms produce results with six times as much harmful content about self-injury. Again, the nature of these terms makes detection difficult. The report highlights the problem of ‘data voids’, where safe, reliable information is not available in response to certain obscure terms, so searches for them produce only or mostly extreme results. 
  • Despite these challenges, search engines also signpost help, support and educational content. Some 22% of search results were categorised as ‘preventative’, linking to resources such as mental health services or educational material. 

In conducting the study, the researchers did not employ the safety measures available on some search engines, such as ‘safe search’ settings and image blurring, which would of course affect the results. 

But in publishing the report, Ofcom underlines that search services must fulfil their requirements under the new Online Safety Act to minimise the chances of children encountering harmful content on their service. That specifically includes content promoting self-harm, suicide and eating disorders. 

Almudena Lara, Online Safety Policy Development Director at Ofcom, says: ‘Search engines are often the starting point for people’s online experience, and we’re concerned they can act as one-click gateways to seriously harmful self-injury content. 

Search services need to understand their potential risks and the effectiveness of their protection measures – particularly for keeping children safe online – ahead of our wide-ranging consultation due in spring.’ 

That consultation is on Ofcom’s proposed Protection of Children Codes of Practice.  

In related news:

£1.5m for digital factory hub to aid manufacturing in Wales

Apps really work on anxiety and depression, new analysis shows 

Interview: Keeping children safe online with Ontaro

