Police warning over children in violent and racist WhatsApp groups
Police have issued a warning after children as young as nine were added to WhatsApp groups promoting self-harm, sexual violence and racism.
Schools in Tyneside said that pupils in years 5 and 6 (ages 9 to 11) had been added to the groups, some of which had 900 members and contained sexual images and pictures of mutilated bodies, a BBC investigation found. Northumbria Police has sent a warning to schools across its area.
On Thursday WhatsApp reduced its age limit from 16 to 13, a move criticised by MPs, campaign groups, teachers and academics.
The government is set to release a consultation this month on banning the sale of smartphones to under-16s and giving parents more power to control children's access to social media.
The plans would build on the Online Safety Act, being implemented by Ofcom, which forces tech companies to take down illegal content and shield children from harmful material.
Ofcom is proposing that services such as WhatsApp, which is owned by Meta, should have default settings that protect children more.
Mark Bunting, director of online safety strategy at the watchdog, told BBC Radio 4’s Today programme: “We’ve made proposals already about things that services can do to keep children safe. A lot of this is about the default settings that apply to child users.
“We’ve made recommendations that services shouldn’t prompt children to expand their network of friends, [should] not recommend children to other users, and crucially, [should] not allow people to send direct messages to children that they’re not already connected with.
“In the next phase of work, we’re going to be looking at these issues about risks that arise when children can be added to groups without their consent.”
Bunting said that Ofcom had the power to ultimately fine the companies that failed to comply with its rules, which will come into force next year. The act also has criminal sanctions for managers at companies that refuse to co-operate with Ofcom.
Rishi Sunak told the BBC that, as a father of two children, he believed it was “imperative that we keep them safe online”. He said the Online Safety Act was “one of the first anywhere in the world” and would be a step towards that goal.
“What it does is give the regulator really tough new powers to make sure that the big social media companies are protecting our children from this type of material,” he said.
“They shouldn’t be seeing it, particularly things like self-harm, and if they don’t comply with the guidelines that the regulator puts down, they will be in for very significant fines, because like any parent we want our kids to be growing up safely, out playing in fields or online.”
Meta's decision to lower the minimum age for WhatsApp users from 16 to 13 attracted criticism from teachers, MPs and campaigners, who accused the tech company of taking a "highly irresponsible" approach to child safety.
The campaign group Smartphone Free Childhood called on Meta, which also owns Facebook, to reverse a move it called “tone deaf”. Separately, a member of the Commons education select committee said the “unilateral decision” was reckless.
WhatsApp is the second most popular platform for children, with 55 per cent of those aged 3 to 17 using it, according to Ofcom research from last year. Closed messaging groups rarely receive the scrutiny applied to platforms such as TikTok or Instagram but, experts suggest, can be just as harmful.
Daisy Greenwell, a co-founder of Smartphone Free Childhood, said: “WhatsApp is putting shareholder profits first and children’s safety second. Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike.
“Among parents, WhatsApp is seen as the safest social media app, ‘because it’s just messaging, right?’ And in that way it works like a gateway drug for the rest of the social media apps. If you’re messaging your friends on WhatsApp, why not message them on Snapchat? WhatsApp is far from risk-free. It’s often the first platform where children are exposed to extreme content, bullying is rife and it’s the messaging app of choice for sexual predators due to its end-to-end encryption.”
Vicky Ford, a Conservative member of the education select committee, said: “Social media can be very damaging for young people. WhatsApp, because it’s end-to-end encrypted, is potentially even more dangerous, as illegal content cannot be easily removed. So for Meta to unilaterally decide to reduce the age recommendation for WhatsApp, without listening to affected parents, seems to me to be highly irresponsible.”
A 2020 study of 9 to 17-year-olds in Israel found that 56 per cent of the students experienced cyberbullying or victimisation in their WhatsApp classmate groups.
Mike Baxter, principal of the City of London Academy in Southwark, south London, said he had seen pupils invite other children to WhatsApp groups and subject them to a "barrage of abuse" before removing them, sometimes in the early hours. "That's happening at one in the morning," he said. "That's not conducive for any 12 or 13-year-old to sleep well."
Dr Kaitlyn Regehr, associate professor at University College London’s department of information studies, said that behaviour in closed groups can often be worse than on open social media platforms. She recently carried out research with the Association of School and College Leaders on misogyny and hate speech among young people on social media.
“What we saw is that young people are more likely to share harmful content and for that harmful content to increase in the extreme nature of it on private message apps. Through the high dosages that young people consume by way of social media, those ideologies become normalised for them,” she said.
Miriam Cates, a Tory MP who has backed a social media ban for under-16s, said: “It is increasingly clear that social media and messaging platforms are detrimental to children’s wellbeing.
“WhatsApp and other social media platforms should not allow users under the age of 16. It’s clear that the tech companies have no interest in voluntarily acting to protect children, so governments must.”
WhatsApp said that it lowered the age limit to bring it in line with the majority of countries around the world, and added that it had protections in place, such as letting users block someone who messages them for the first time.
Separately, Meta has announced that it is testing a nudity filter for Instagram that seeks to limit the growing threat of “sextortion”, whereby users are duped into sending compromising photos of themselves and then blackmailed. The filter will blur nude photos sent to Instagram users and warn people about sending them when it detects one.