An analyst dedicated to removing child sexual abuse material from the internet says she is constantly working to stay “one step ahead” of perpetrators.

Known by the pseudonym Mabel to protect her identity, she works for the Internet Watch Foundation (IWF), a charity based in Histon, Cambridgeshire. The IWF’s frontline staff detect and remove online child sexual abuse imagery worldwide. This year the charity has identified more than half a million victims, but it says technological advances are making its work harder.

“The bad guys always seem to be one step ahead,” said Mabel, a mother and grandmother who began working as an analyst four years ago. “You’re always trying to uncover what they’re doing next to get ahead of them.”

Mabel removes illicit content either by responding to reports from the public or by actively searching for such material herself. The methods criminals use to conceal content are “evolving all the time,” she said. “You might need to watch a video, for instance, to find a password to unlock another video somewhere else.”

To counter these techniques, the IWF has an on-site technology team that continually researches new software to help decrypt codes found on the web.

Mabel, a former police officer, described her work as demanding but also deeply fulfilling. “The thought that I can protect my younger grandchildren from seeing this stuff, or maybe even being lured down that road, it’s a huge sense of pride,” she said.

IWF staff who review imagery must attend monthly counselling sessions, with additional sessions available on request, and are given regular breaks and downtime during their shifts.
Access to emails is restricted to the office, which is heavily secured; one floor of the building is accessible only to authorized staff.

Dan Sexton, the IWF’s chief technology officer, said progress in artificial intelligence (AI) was “only going to make our work harder”, with new technologies constantly emerging and then being exploited by criminals. “With generative AI there’s the capability for people creating effectively an infinite amount of new child sexual abuse (CSA) content,” Mr Sexton said.

This year alone the organization has documented 563,590 child victims through images, most of them girls aged between seven and 10. Mr Sexton said the figures might alarm some people, but they did not surprise him. “Just this year we have more than 2.7m unique images of CSA that we’ve come across and that is only growing,” he said. He credited the removal of every image to “really incredibly hard-working staff” such as Mabel and other frontline personnel.

For those affected by these issues, the NHS offers information on available support. Anyone with concerns about a child is advised to contact the NSPCC helpline.