
Is Social Media Monetising Misery?

Written by: Antony Bream, Executive Contributor

Executive Contributors at Brainz Magazine are handpicked and invited to contribute because of their knowledge and valuable insight within their area of expertise.

 

A recent inquest into the tragic suicide of Molly Russell, a 14-year-old from Harrow near London in the United Kingdom, has very much put the actions, principles, and profiteering of social media platforms such as Meta (previously known as Facebook), Snapchat, Instagram, Pinterest and others in the dock. Molly viewed thousands of posts on social media platforms covering depression, self-harm, and suicide in the months before she very tragically took her own life in 2017. Sue Maguire, Molly's headteacher, told North London Coroner's Court that social media presented challenges that did not exist 10 or 15 years ago.


Giving evidence, Ms. Maguire said: "Our experience of young people is that social media plays a huge, dominant role in their lives, and it causes no end of issues. There's a level where I want to say it's almost impossible to keep track of social media, but we have to try, and we have to respond to the information as we receive it."

A child psychiatrist asked to give evidence was "not able to sleep well" after seeing the self-harm material Molly viewed on social media, the inquest heard. Giving evidence at North London Coroner's Court, Dr. Venugopal said he saw no "positive benefit" to the material viewed by the teenager in the months before she died. Under questioning from coroner Andrew Walker, the psychiatrist said: "This material seems to romanticise, glamorise, and take the subject of self-harm, take it away from reality and make it seem almost unreal, take away from these terrible acts any kind of consequence." The coroner asked: "You have looked at the material, do you think that the material that Molly viewed had any impact on her state of mind?" Dr. Venugopal replied: "I suppose I will start off, I will talk about the effect the material had on my state of mind. I had to see it over a short period of time and it was very disturbing and distressing. There were periods where I was not able to sleep well for a few weeks, so bearing in mind that the child saw this over a period of months, I can only say that she was [affected] – especially bearing in mind that she was a depressed 14-year-old."

Representatives from some of the social media platforms that Molly used, including Twitter, were asked to give evidence on the morals, ethics and safety issues around the graphic images, videos, and posts Molly was able to search for, watch or read, and on her ability to connect with strangers on the networks who were allegedly or actually suffering from the same mental health issues. The Instagram posts about suicide and depression viewed by Molly before she took her own life "were safe," the inquest heard from Pinterest's global head of community operations, who said he was "not able to answer" how children could agree to potentially being exposed to content inappropriate for a child, yet he also admitted that the platform the content was stored on was not safe. Algorithms are built specifically to drive more 'related' content from a search term, such as self-harm or depression, so that users get hooked in.


Molly was being messaged directly by strangers allegedly suffering from the same conditions, encouraging her to look at more content and to think and act a certain way, and this ultimately drove her to see her life as worthless, to believe she would be better off dead for her parents' and family's sake, that she was a distraction, a failure, no longer worthy of their love. This was a vulnerable 14-year-old girl living in a secure and loving family home with a bright future ahead of her, brainwashed by social media and sucked into its trap of machine-learning algorithms with no controls or safety measures to stop her from being continually bombarded. Her family claims the content encouraged suicide and self-harm. Her father is now heading a campaign to remove content from social media promoting self-harm and has travelled to the USA to meet with other parents bereaved by suicide. They have come to similar conclusions: that social media has led to a significant rise in the probability that the effects of teenage depression will be worsened, because such content is easily searchable on the platforms with minimal and ineffective controls in place.

An executive from Meta, which also owns Instagram, giving evidence at the inquest said she believed it was "safe for people to be able to express themselves" online. She added the posts were "complex" and often a "cry for help." The inquest was told that of the 16,300 posts Molly saved, shared, or liked on Instagram in the six months before her death, 2,100 were depression, self-harm, or suicide-related.

The coroner leading the case concluded Molly died from an act of self-harm while suffering from depression and the negative effects of online content. He said the images of self-harm and suicide she viewed "shouldn't have been available for a child to see." Molly's father has called for urgent changes to make children safer online after the inquest found social media content contributed "more than minimally" to her death. After the hearing finished in September 2022, Molly's father, Ian Russell, said: "It's time to protect our innocent young people instead of allowing [social media] platforms to prioritise their profits by monetising the misery of children." In a statement from the National Society for the Prevention of Cruelty to Children (NSPCC), the UK's leading children's charity preventing abuse and helping those affected to recover, Chief Executive Sir Peter Wanless said: "This should send shockwaves through Silicon Valley ‒ tech companies must expect to be held to account when they put the safety of children second to commercial decisions. The magnitude of this moment for children everywhere cannot be understated." Further reaction to the inquest's findings came from the Prince of Wales. "No parent should ever have to endure what Ian Russell and his family have been through," he said in a statement. "They have been so incredibly brave. Online safety for our children and young people needs to be a prerequisite, not an afterthought."


Views are mixed, from the adults and experts working in children's and adolescents' mental health care to the parents, who very clearly see a link between Molly's suicide and her use of social media. The UK Government report 'The Mental Health of Children and Young People in England', published in December 2016, found that 50% of those with lifetime mental illness (excluding dementia) will experience symptoms of mental ill health by the age of 14, and that 12% of young people live with a long-term condition. The report links resilience to helping children and young people address, cope with and manage mental health conditions, but at the same time states that there is a serious problem with the commissioning and provision of children's and adolescents' mental health services, so that providing support has very much become a full-time role for parents on top of normal parenting responsibilities. This is compounded by the strong influence of social media platforms, with no legal controls or restrictions to manage the viewing and posting of damaging content, resulting in mental ill health or, at worst, suicide.

The statistics are compelling and worrying to read. The largest cause of death for males under 35 in the UK is suicide, and as of 2016 there were 695,000 children aged 5 to 16 years in England with clinically significant mental illness, a figure that does not include the impacts of the Covid pandemic and lockdowns and which probably means the number is now nearer to 1 million. How much of this is related to social media usage is unknown, but the new evidence suggests it could be a determining factor for many.

So, is this a moral or a legal dilemma in the case of social media controls, governance and compliance? Sports betting went through a similar phase: betting addicts were racking up unpayable debts because controls on bets and support networks were not in place, and the addiction both damaged mental health and was often driven by mental ill health. Yet that industry is now heavily regulated to address the issue, with controls placed on usage that would otherwise be deemed harmful. This begs the very big question: when we are talking about the lives and mental wellbeing of young children and teenagers, why aren't the social media platforms being put under the same level of scrutiny to address the very obvious and long-term damage they are causing without consequence or conscience?

Sadly, this case and many others will start to highlight and uncover the mental health risks social media poses to people who are, or can be, consumed by its content and messaging capabilities at stages in their life when they are vulnerable through age, low self-esteem or periods of mental ill health and depression, but for many it is already too late. What will happen next, now that the high-profile case of Molly Russell has shed light on the attitudes and ways of working of social media platforms? Often in these cases, new laws are tabled, debated and, in some cases, eventually passed. Unfortunately, they take time, are subject to political influence, and confront a powerful multi-billion-pound industry delivering significant tax revenues to the countries putting these laws forward.


In Molly's case, the UK is fortunate that her father, racked with grief, is starting to uncover the evidence needed to shine a light on what the coroner and associated child support groups have determined to be a major source of, and contributing factor to, self-harm and suicides globally. Maybe the tide is starting to turn, and much-needed, improved governance controls, laws and restrictions will be put in place to protect those currently using, and growing up to use, what has proven to be a very destructive channel of misery, driving profits for some and dire personal and family consequences for others.


RIP Molly Russell, aged 14.


Follow me on Facebook, LinkedIn, and visit my website for more info!


 

Antony Bream, Executive Contributor Brainz Magazine

Antony Bream is a business advisor and executive coach working closely alongside founders, boards and their teams to help them and their businesses take the leap to their next level. With a passion for understanding the processes and psychology behind how companies sell their products and how customers buy them, he formed Ribbit Consulting to bring that experience and knowledge to his customers and empower them to reach their full potential.

