Meta defends ‘safe’ Instagram posts seen by Molly Russell | Science & Tech News
Instagram content viewed by teenager Molly Russell before she took her own life was safe, the social media site’s head of health and wellbeing has told a court.
Elizabeth Lagone, a Meta executive, was taken through a number of posts the schoolgirl engaged with on the platform in the last six months of her life.
Meta is the parent company of Facebook, Instagram and WhatsApp.
Ms Lagone told the inquest at North London Coroner’s Court she thought it was “safe for people to be able to express themselves” – but conceded two of the posts shown to the court would have violated Instagram’s policies and offered an apology about some of the content.
Responding to questioning, she said: “We are sorry that Molly viewed content that violated our policies and we don’t want that on the platform.”
Molly, from Harrow in northwest London, was 14 when she died in November 2017, prompting her family to campaign for better internet safety.
During a heated exchange, the Russell family’s lawyer, Oliver Sanders KC, asked Ms Lagone “why on earth are you doing this?” over allowing children on its platforms.
Mr Sanders queried such access to the platform when it was “allowing people to put potentially harmful content on it” and suggested Meta “could just restrict it to adults”.
Ms Lagone said the topic of harm was an “evolving field” and that Instagram policies were designed with consideration to users aged 13 and over.
Referring to one post seen in May 2017, Mr Sanders asked: “Do you think it helped Molly to see this?”
Ms Lagone said: “I can’t speak to this.”
“Six months after seeing this, she was dead,” Mr Sanders continued.
“I can’t speak to the different factors that led to her tragic loss,” Ms Lagone responded.
The inquest was told that of the 16,300 posts Molly saved, shared or liked on Instagram in the six-month period before her death, 2,100 were related to depression, self-harm or suicide.
Mr Sanders spent around an hour taking Ms Lagone through Instagram posts liked or saved by Molly and asked if she believed each post “promoted or encouraged” suicide or self-harm.
She said the content was “nuanced and complicated”, adding it was “important to give people that voice” if they were expressing suicidal thoughts.
Posts were ‘cry for help’
Addressing Ms Lagone as she sat in the witness box, Mr Sanders asked: “Do you agree with us that this type of material is not safe for children?”
Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.
“Do you think this type of material is safe for children?” Mr Sanders continued.
Ms Lagone said: “I think it is safe for people to be able to express themselves.”
After Mr Sanders asked the same question again, Ms Lagone said: “Respectfully, I don’t find it a binary question.”
Coroner Andrew Walker interjected and asked: “So you are saying yes, it is safe or no, it isn’t safe?”
“Yes, it is safe,” Ms Lagone replied.
The coroner continued: “Surely it is important to know the effect of the material that children are viewing.”
Ms Lagone said: “Our understanding is that there is no clear research into that. We do know from research that people have reported a mixed experience.”
‘Who has given you the permission?’
Questioning why Instagram felt it could choose which material was safe for children to view, the coroner then asked: “So why are you given the entitlement to assist children in this way?
“Who has given you the permission to do this? You run a business.
“There are a great many people who are … trained medical professionals. What gives you the right to make the decisions about the material to put before children?”
Ms Lagone responded: “That’s why we work closely with experts. These aren’t decisions we make in a vacuum.”
Last week, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when Molly used it, saying he “deeply regrets” the posts she viewed before her death.
The inquest, due to last up to two weeks, continues.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org. Alternatively, letters can be mailed to: Freepost SAMARITANS LETTERS.