Does social media harm its users?
In the six months before 14-year-old Molly Russell’s death in 2017, she interacted with 2,100 items on Instagram related to depression, suicide or self-harm.
Russell took her own life in November 2017, and at the time investigators recorded her death as suicide. But five years later a coroner has found that social media content was likely partly to blame. Senior coroner Andrew Walker said Russell died from “an act of self-harm whilst suffering from depression and the negative effects of online content”. The finding will put further pressure on social media platforms as their algorithms and business models attract growing scrutiny and criticism.
Social media platforms use software algorithms to identify the types of content a user interacts with and to serve up more of the same. Because Russell interacted with content related to depression, suicide and self-harm, more of it will have been pushed to her, without her necessarily having to seek it out. The coroner found this was likely to have contributed to her death.
What happens next?
Although the content has been found to be a likely contributory factor in Russell’s death, seemingly no one is being held accountable – yet. Social media sites have gone some way towards cleaning up their act, and the Children’s Code (introduced in September 2021 under the Data Protection Act 2018) sets out rules for the handling of children’s data. Meanwhile, the Online Safety Bill currently before Parliament will force platforms to tackle harmful content and introduce new child protection measures, although many say more is needed. Further regulation seems likely, and clients operating in this sphere will need to remain vigilant and proactive. For assistance in these endeavours, please contact [email protected]