Facebook Vice President Pledges to Do More to Prevent Self-Harm

Amy Coleman
Digital Communications Manager

Facebook vice-president Sir Nick Clegg has vowed to do “whatever it takes” to make the company’s platforms safer for young people and adults.

This follows several stories suggesting that Facebook and other social media platforms allowed young people access to online content about suicide and self-harm, and that this content contributed to several young people losing their lives.

One story from 2017, that of 14-year-old Molly Russell, hit the headlines last week when her father, Ian Russell, said he believed that distressing images of depression, suicide and self-harm on social media platforms, including Instagram and Pinterest, had fuelled his daughter’s decline.

These images were pushed to Molly through her searches for keywords and hashtags, with the platforms’ algorithms using her search history to target similar graphic content at her. Her father went on to say that he believes Instagram is partly responsible for his daughter’s death.

Every social platform carries an element of social responsibility to ensure the safety of its users. Granted, the majority make it clear that they are not responsible for the third-party content shared on their platforms, but if something suggests harm to a user, they provide tools and reporting mechanisms so that the content can be flagged and potentially removed.

A spokesperson for Instagram stated that the company does not allow content that promotes or encourages self-harm, suicide or eating disorders on the platform, and that it uses technology to find and remove such content.

But the debate goes on. Should social media platforms have more responsibility when it comes to the safety of their users? Whilst Instagram has announced that a full review of content on the platform will be undertaken, this week health secretary Matt Hancock suggested that social platforms could be banned in the UK if action isn’t taken.

In response to the case of Molly Russell, Sir Nick suggested that it can be wise to keep some images up on social platforms, as they enable people to find support and help for the problems they may be facing. He went on to say that all platforms need to investigate these cases, and that Facebook is already actively looking at ways to improve the safety of its users.

“We’re already taking steps soon to blur images, block a number of hashtags that have come to light, and thirdly to continue work… with the Samaritans and other organisations”.

With this in mind, and with Facebook having already hit the headlines last week over the Martin Lewis fake ads scandal, the hope is that other social platforms will follow suit to protect their online followers.
