Police data provided to the children’s charity NSPCC indicates that the messaging app Snapchat is the platform most commonly used for online grooming. More than 7,000 Sexual Communication with a Child offences were recorded across the UK in the year to March 2024, the highest number since the offence was created.

Snapchat accounted for almost half of the 1,824 cases in which police recorded the platform used for grooming.

The NSPCC said the data showed society was “still waiting for tech companies to make their platforms safe for children.”

Snapchat told the BBC it had “zero tolerance” for the sexual exploitation of young people and had put additional safety measures in place for teenagers and their parents.

Becky Riggs, child protection lead at the National Police Chiefs’ Council, described the figures as “shocking.” She added: “It is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow.”

Police did not consistently record the gender of victims in grooming offences, but where it was recorded, four in five victims were girls.

Nicki, whose real name is being withheld by the BBC, was eight years old when a groomer contacted her through a gaming app and persuaded her to move their conversation to Snapchat. Her mother, referred to as Sarah by the BBC, said: “I don’t need to explain details, but anything that you can imagine happening happened in those conversations – videos, pictures. Requests of certain material from Nicki, etcetera.”

Sarah later set up a fake Snapchat profile posing as her daughter; when the man sent messages to it, she alerted the police.
Sarah now checks her daughter’s devices and messages every week, despite her daughter’s objections. “It’s my responsibility as mum to ensure she is safe,” she told the BBC, adding that parents “cannot rely” on apps and games to keep their children safe.

Rani Govender, child safety online policy manager at the NSPCC, said that although Snapchat is a smaller social media platform in the UK, it is very popular with children and teenagers, which is “something that adults are likely to exploit when they’re looking to groom children.”

Ms Govender also highlighted “problems with the design of Snapchat which are also putting children at risk.” Messages and images on Snapchat disappear after 24 hours, she explained, making incriminating behaviour harder to track, and senders are notified if a recipient takes a screenshot of a message.

The NSPCC hears directly from children who name Snapchat as a source of concern, Ms Govender said. “When they make a report [on Snapchat], this isn’t listened to, and that they’re able to see extreme and violent content on the app as well,” she told the BBC.

A Snapchat spokesperson told the BBC that the sexual exploitation of young people was “horrific.” They added: “If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.”

Recorded grooming offences have risen since the Sexual Communication with a Child offence came into force in 2017, reaching a new peak of 7,062 in the year to March 2024. Of the 1,824 cases in the past year where the platform was identified, 48% were recorded on Snapchat. Reported grooming offences on WhatsApp rose slightly over the same period, according to the data.
By contrast, recorded cases on Instagram and Facebook have fallen in recent years. Meta owns all three platforms: WhatsApp, Instagram and Facebook.

WhatsApp told the BBC it has “robust safety measures” in place to protect users on its app.

Jess Phillips, the minister for safeguarding and violence against women and girls, said social media companies “have a responsibility to stop this vile abuse from happening on their platforms”. In a statement she added: “Under the Online Safety Act they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.”

The Online Safety Act places a legal duty on technology platforms to keep children safe. From December, major technology companies will have to publish their risk assessments of illegal harms on their platforms.

Ofcom, the media regulator responsible for enforcing the rules, said: “Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children.

“We’re prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes.”

Copyright 2024 BBC. All rights reserved.