Expert steps down over platforms' failure to protect users from harmful content
- Mar 18, 2024
- 2 min read
Updated: Mar 20, 2024
A psychologist says Meta has been ‘turning a blind eye’ to posts that she believes pose further danger to vulnerable women

A leading psychologist, who used to act as an adviser to Meta on suicide prevention and self-harm, has stepped down from her role after accusing the tech giant of “turning a blind eye” to harmful content on Instagram.
Lotte Rubæk, who had been a member of Meta’s global expert group, told the Observer in an interview that Meta’s failure to remove images of self-harm is “triggering” vulnerable young women and girls to further harm themselves.
The Danish psychologist has since resigned from the group, claiming Meta does not care about its users’ wellbeing and safety.
She said the company is using harmful content to keep vulnerable young people hooked to their screens in the interest of company profit.
In her resignation letter, she wrote: “I can no longer be part of Meta’s SSI expert panel, as I no longer believe that our voice has a real positive impact on the safety of children and young people on your platforms.”
Ms Rubæk said in an interview with the Observer: “On the surface it seems like they care, they have these expert groups and so on, but behind the scenes there’s another agenda that is a higher priority for them.”
That agenda, she said, was “how to keep their users’ interaction and earn their money by keeping them in this tight grip on the screen, collecting data from them, selling the data and so on.”

Psychologist Lotte Rubæk says Meta ‘uses a lot of tricks’ to avoid removing content. (Photograph: Linda Kastrup/Ritzau Scanpix/Alamy).
A Meta spokesperson said: “Suicide and self-harm are complex issues and we take them incredibly seriously. We’ve consulted with safety experts, including those in our suicide and self-harm advisory group, for many years and their feedback has helped us continue to make significant progress in this space.
“Most recently we announced we’ll hide content that discusses suicide and self-harm from teens, even if shared by someone they follow, one of many updates we’ve made after thoughtful discussion with our advisers.”
Rubæk’s warning comes as new research by Ofcom, published last week, found that violent online content is “unavoidable” for children in the UK, many of whom are first exposed while still in primary school.
This article was shared with the consent of Lotte Rubæk.