
Heated words after Instagram chief says posts viewed by Molly Russell were safe

Undated family handout file photo of Molly Russell. Social media content viewed by a teenager in the weeks before she took her own life is too disturbing for even an adult to look at for a long period of time, a coroner’s court has heard.

An Instagram executive was involved in a heated exchange about allowing children on the site during an inquest into the death of schoolgirl Molly Russell, in which the family’s lawyer shouted “why on earth are you doing this?”

Meta’s head of health and wellbeing, Elizabeth Lagone, said content viewed by the 14-year-old on the platform, which her family argued “encourages” suicide and self-harm, was safe.

Despite defending the site throughout the hearing, Ms Lagone apologised for content that “violated our policies” which was viewed by Molly before she died.

Elizabeth Lagone, Meta’s head of health and wellbeing (Beresford Hodge)

Towards the end of her evidence, the lawyer acting on behalf of Molly’s parents, Oliver Sanders KC, raised his voice before asking why Instagram permitted children on the platform when it was “allowing people to put potentially harmful content on it”.

Mr Sanders suggested Meta “could just restrict it to adults”, before forcefully putting down his folder and saying Instagram had “no right” to decide what content children could view.

He shouted: “You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this.”

Ms Lagone was taken through a number of posts the schoolgirl engaged with on the platform in the last six months of her life, which she described as “by and large, admissive”.

The senior executive told an inquest at North London Coroner’s Court she thought it was “safe for people to be able to express themselves”, but conceded a number of posts shown to the inquest would have violated Instagram’s policies.

Molly, from Harrow in north-west London, died in November 2017, prompting her family to campaign for better internet safety.

During Monday’s proceedings, videos the teenager accessed on Instagram were played to the court with the coroner once again warning the material had the “potential to cause great harm”.

He said the content “seeks to romanticise and in some way validate the act of harm to young people,” before urging anyone who wanted to leave the room to do so, with one person leaving.

The clips contained references to popular culture, with the witness saying they were allowed on the platform in 2017 because they were “fictional montages”.

Mr Sanders’ questioning of Ms Lagone in the witness box prompted an interjection from Meta’s lawyer, Caoilfhionn Gallagher KC, who asked the coroner to remind him of how witnesses should be questioned at an inquest.

Turning to Mr Sanders, the coroner said: “You have put your point.”

Before the interjection, Mr Sanders raised his voice and asked: “The point is this isn’t it, Instagram portrays itself as trying to strike this very, very difficult balance between who gets harmed by content you are posting, you are balancing and running risks and it really comes back to the question the coroner asked you; why on earth are you doing this?”

Ms Lagone told the inquest the topic of harm was an “evolving field” and that Instagram policies were designed with consideration to users aged 13 and over.

Earlier in her evidence, Mr Sanders said to the witness: “I suggest to you that it is inherently unsafe environment… dangerous and toxic for 13 to 14-year-olds alone in their bedrooms scrolling through this rubbish on their phones.”

“I respectfully disagree,” Ms Lagone responded.

The inquest was told out of the 16,300 posts Molly saved, shared or liked on Instagram in the six-month period before her death, 2,100 were depression, self-harm or suicide-related.

Referring to all the material viewed by the teenager the family considered to be “encouraging” suicide or self-harm, Mr Sanders continued: “Do you agree with us that this type of material is not safe for children?”

Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.

“Do you think this type of material is safe for children?” Mr Sanders continued.

Ms Lagone said: “I think it is safe for people to be able to express themselves.”

After Mr Sanders asked the same question again, Ms Lagone said: “Respectfully, I don’t find it a binary question.”

The coroner interjected and asked: “So you are saying yes, it is safe or no, it isn’t safe?”

“Yes, it is safe,” Ms Lagone replied.

The coroner continued: “So having created this environment, you then seek to make it safe?”

Ms Lagone replied: “Certainly, we take the safety of users very seriously…”

“What did people do before people created this environment for them?” the coroner asked.

“I’m not sure what you mean,” Ms Lagone said.

“You create the danger and then you take steps to lessen the risk and danger?” the coroner said.

Ms Lagone replied: “The technology has been developed and… we take our responsibility seriously to have the right policies and processes in place.”

Responding to questioning from Mr Sanders KC about whether she was sorry about the content Molly saw, Ms Lagone told the court: “We are sorry that Molly saw content that violated our policies and we don’t want that on the platform.”

The inquest, expected to last two weeks, continues.