With over half the world’s population using the internet regularly, creating a safe digital ecosystem is about more than simply banning harmful content. Following the publication of the Online Harms white paper, Elaine Bousfield (founder of XenZone) explains how social media companies can actively improve mental and emotional wellbeing.
The Department for Digital, Culture, Media and Sport (DCMS) has this month proposed an independent watchdog that will write a “code of practice” for tech companies, taking us one step closer to regulation of the internet. While it is unarguably necessary for the Government to look at ways in which it can police the digital realm to help keep young people safe online, social media companies must finally recognise that they have a real responsibility to children, and should make safeguarding part of their organisational DNA.
There is no doubt that digital safeguarding hasn’t been a priority for tech and social media companies thus far – but these are huge organisations with near-unlimited resources and extremely talented people using cutting-edge technology. If they choose to make a concerted attempt to solve this issue, they can.
And we think they should.
While policing tech companies may force that choice in some cases, it is not possible to police all online conversations, nor is it desirable. Sanitising interaction and censoring online content isn’t the answer (although creating a culture where trolling and bullying are frowned upon and called out by the community is not a bad idea). Instead, in the midst of our rush to implement a digital ‘SWAT team’ for tackling online harms, we must prioritise developing a safe digital ecosystem for those who need it. Social media companies need to become proactive in working with partners to create safe spaces that help young people support each other.
Via Kooth – the mental health and wellbeing platform for children and young people available in over 100 Clinical Commissioning Group (CCG) areas in England and Wales – we have shown that with the right focus it is possible to provide such a space. The primary purpose of this online, peer and therapist-led service is to support children and young people’s mental and emotional well-being, and to do so digitally. Kooth also maintains a strong online community where discussion forums and a virtual magazine provide peer support and self-help.
The social space provided on Kooth is similar to that of other social media platforms, in that it provides a feed of information, gives young people the ability to scroll through content they want to engage in, and offers them the chance to comment and contribute.
However, unlike those platforms, Kooth offers regulated safeguarding, which is built into the very fabric of the service. This ensures harmful elements present on other popular platforms are filtered out. It also sits within a platform where all users are anonymous, where viral sharing of personal imagery is not possible, and where there is no functionality for peer-to-peer private messaging.
Crucially, all content is pre-moderated, rather than immediately posted to the site. While pre-moderation potentially creates a time lag between creation and publication, it is undoubtedly the most effective way of protecting children and young people. Pre-moderation of all content prevents trolling from those who scour the internet to actively cause harm, and it safeguards against triggering content, such as content depicting graphic self-harm.
Content boundaries are also shared with young people, so they are aware that their content may be declined for publication if it is inappropriate or likely to cause harm. This act in itself models adherence to and respect for boundaries, as is necessary in daily life. In fact, much of what happens within Kooth reflects relational behaviour offline. This is particularly powerful when considering how young people engage supportively with a social media platform, and how this can be reflected in their behaviour across the wider web and beyond.
So the question is: do our safeguarding practices deter young people from visiting Kooth? On the contrary, the past year has seen Kooth receive record numbers of article views, with over 250,000 active users at any one time, and more than 1,700 logins every 24 hours. One young user, Chloe, even credits the platform with her being alive today, saying “Kooth has saved my life and I just wish I’d known about it years ago”. Chloe, who made multiple attempts to end her own life following a relentless campaign of bullying at her school, explains that the remote, screen-based way of communicating with other young people experiencing similar mental wellbeing issues, and with her Kooth therapist, has been a lifeline, enabling her to be open and honest without fear of being judged. “Thanks to Kooth, I have a future,” she says.
The digital world is now just as real as the physical world, with a growing proportion of children and young people having grown up as digital natives. Kooth recognises that virtual communities and social media engagement can offer validation, social connectedness, and improved knowledge as positive outcomes. If this is where young people are spending their time, we need to meet them in this space and provide them with the tools they require to remain safe and resilient. It’s time for social networking sites to take safeguarding seriously – to develop these tools and actively invest in safe spaces that nurture resilience and protect children and young people from harm.