- A two-week inquest into the 2017 death of 14-year-old Molly Russell concluded Friday.
- It is “likely” that social media content viewed by Molly contributed to her death, the senior coroner said.
- Molly had been seeing graphic content about self-harm and suicide on Instagram and Pinterest in her last few weeks.
A courtroom in north London, England, was packed Friday with reporters awaiting the conclusion of a two-week inquest into the 2017 death of teenager Molly Russell.
Senior Coroner Andrew Walker told North London Coroner’s Court that Molly, 14, “died as an act of self-harm while suffering from depression and the negative effects of online content”.
The senior coroner said it was “likely” that social media content viewed by Molly, who was already suffering from a depressive illness, affected her mental health in a way that “contributed to her death in more than a minimal way”.
Walker added that he did not consider it “safe” to record suicide as his conclusion.
Molly saw content that “no 14-year-old should be able to see”
Molly took her own life in November 2017 after seeing 2,100 pieces of Instagram content about suicide, self-harm and depression in the last six months of her life, the inquest found. She had also saved 469 images on similar topics to her Pinterest boards.
Walker found that Molly signed up to a number of online sites and apps, including Instagram and Pinterest, which displayed content that was “not safe” for a teenager to view.
The algorithms of those websites and apps, Walker said, led to “binge periods” of images, videos and text of a disturbing nature, some of which Molly had not asked to see.
Walker said Molly had seen “extremely graphic” material online that “no 14-year-old should be able to see”.
On Tuesday, child psychiatrist Dr. Navin Venugopal told the inquest that the content Molly had viewed was “very disturbing, distressing” and had left him unable to sleep well “for a couple of weeks”.
Some of the content, Walker said in his conclusion, “romanticized” self-harm and discouraged seeking support from those who could help.
Some pieces of content depicted self-harm and suicide as an “inevitable consequence of a situation from which there was no recovery,” he continued.
After the inquest concluded, Molly’s father Ian Russell made a brief statement to reporters outside the coroner’s court.
He said: “In the last week, we’ve heard a lot about a tragic story – Molly’s story. Sadly, there are so many others who are similarly affected right now.
“At this point, I just want to say, as dark as it may seem, there is always hope, and if you’re struggling, talk to someone you trust or one of the many wonderful support organizations instead of engaging with potentially harmful content online.”
Speaking at a press conference later in the day, Russell criticized comments made Monday at the inquest by a senior Meta executive. Elizabeth Lagone, head of health and wellbeing policy at Instagram’s parent company Meta, said most of the posts Molly viewed were “safe” for children to see.
“We heard a senior Meta executive describe this deadly stream of content that the platform’s algorithms pushed to Molly as safe and not in violation of the platform’s policies,” Russell said. “If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive.”
Last week, a senior Pinterest executive admitted during the inquest that the platform was “not safe” when Molly was using it. Jud Hoffman, the company’s head of community operations, apologized and said he was “deeply sorry” that Molly was able to view graphic content on the platform.
The senior coroner will write to Pinterest and Meta, as well as regulator Ofcom and the UK Department for Digital, Culture, Media and Sport.
“The decision should send shockwaves through Silicon Valley”
Activists say the landmark decision could push social media companies to take responsibility for keeping children safe on their platforms.
Sir Peter Wanless, chief executive of the NSPCC, a UK child protection charity, said “the decision should send shockwaves through Silicon Valley”. He added that tech companies “should expect to be held accountable when they put children’s safety second to commercial decisions,” according to Sky News.
On Friday, Prince William tweeted: “Online safety for our children and young people should be a priority, not an afterthought.”
—The Prince and Princess of Wales (@KensingtonRoyal) September 30, 2022
In statements released after the inquest, Meta and Pinterest described how they plan to respond to the findings.
A Meta spokesperson said: “We are committed to ensuring Instagram is a positive experience for everyone, particularly teenagers, and will carefully review the full coroner’s report when it is provided.
“We will continue our work with the world’s leading independent experts to ensure that the changes we make offer the best possible protection and support for teenagers.”
A Pinterest spokesperson said in a statement that the company “listened very carefully” to what the coroner and Molly’s family had to say during the inquest.
The statement said: “Pinterest is committed to making continuous improvements to ensure the platform is safe for everyone, and the coroner’s report will be taken seriously.
“Over the past few years, we’ve continued to strengthen our policies around self-harm content, provided routes to compassionate support for those in need, and invested heavily in creating new technologies that automatically detect and take action on self-harm content.”