
Children’s commissioner urges social media to support better content control

(Image: iStock) 30% of 18 to 24-year-olds quizzed said technology such as social media is causing them to feel lonely

SOCIAL MEDIA companies must take more responsibility for protecting children from disturbing content on their platforms, the Children’s Commissioner for England has said.

In an open letter to the platforms most used by children – YouTube, Facebook (which owns Instagram), Snapchat and Pinterest – Anne Longfield urged the companies to back a statutory duty of care and a Digital Ombudsman to act as an independent arbiter between users and the platforms.

She said the death of teenager Molly Russell, whose family later found she had viewed content on social media linked to anxiety, depression, self-harm and suicide before taking her own life in November 2017, highlighted the “horrific amount of disturbing content that children are accessing online”.

Ms Longfield said the rapid growth of social media raises the question of whether the firms still have control over the content that appears online.

“If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage,” she wrote.

“The recent tragic cases of young people who had accessed and drawn from sites that post deeply troubling content around suicide and self-harm, and who in the end took their own lives, should be a moment of reflection. I would appeal to you to accept there are problems and to commit to tackling them – or admit publicly that you are unable to.”

Ms Longfield’s letter noted that, by law, “I have the power to demand data pertaining to children from public bodies”, and although this does not cover social media firms, she asked them to provide information on the amount of self-harm related content on each platform, and data on the number of under-18s and under-13s accessing it – the latter in breach of most social media site age restrictions.

She also asked for information on what support the platforms offer to those who seek images of self-harm and what criteria are used to decide on removing content and people from each site.

The letter asked the firms to back the establishment of a Digital Ombudsman, which the commissioner said would be “able to respond to the concerns of children and parents by demanding greater transparency and action from internet companies so material that is detrimental to the wellbeing of children is removed quickly”.

“With great power comes great responsibility, and it is your responsibility to support measures that give children the information and tools they need growing up in this digital world – or to admit that you cannot control what anyone sees on your platforms,” the letter concludes.

A Snapchat spokesman said: “We work hard to keep Snapchat a safe and supportive place for everyone. From the outset we have sought to connect our community with content that is authoritative and credible and safeguard against harmful content and disinformation.

“All public content that appears on Discover is moderated and highly curated. It contains only content from leading media companies, the public accounts of well-known individuals and creators and Stories curated from users’ Snaps by our in-house news team or professional partners.”

A spokesman for Instagram and Facebook said: “We have a huge responsibility to make sure young people are safe on our platforms and working together with the Government, the Children’s Commissioner and other companies is the only way to make sure we get this right.

“Our thoughts are with Molly’s family and with the other families who have been affected by suicide or self-harm. We are undertaking a full review of our policies, enforcement and technologies and are consulting further with mental health experts to understand what more we can do.

“In the meantime, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags.”