What it’s like to remove images of child sexual abuse from the internet





Peter* loves his job. He helps children and is proud of it. But every day after he finishes work, he goes through a ritual that involves putting up as big a wall as possible between work and the rest of his life.

Often he has to forget what he has just seen. So Peter puts on his motorbike leathers and begins the half-hour ride home along country lanes to his wife and children, who give him the boundary he needs.

“If I change my clothes and get on my bike, I can get rid of everything that concerns me pretty quickly,” he says.

“By the time I get home, the kids are jumping on me for dinner and you’re right back into family life. You don’t have time to think about anything else. I’m proud of what I do: I talk to my wife when I’ve had a terrible day, but I won’t go into details.”

Peter’s job is to go online to identify child sexual abuse material (CSAM) so that it can be taken off the internet. He is a senior analyst for the Internet Watch Foundation (IWF), a charity dedicated to identifying and removing such material, funded by 171 organizations, including Facebook, Google, Apple, TikTok, Amazon and Microsoft.

Automated systems, including an intelligent crawler bot, help search suspicious websites for CSAM. But there is also an essential human element in the foundation’s work: every page it removes has been reviewed by its human analysts.

Working out of a gleaming glass office building in a business park on the outskirts of Cambridge, the analysts come from a variety of previous careers, including the armed forces, the police, office work and the prison service.

Clara* was a high school teacher and a helpline volunteer for sexual abuse survivors before joining the IWF as an analyst five and a half years ago.

“We’re seeing the worst of the worst, in many ways, and you could come out of the average working week with a very bleak view of humanity,” she says.

But the seriousness of the job means the team bypasses the bickering and pettiness of ordinary offices and forges close bonds based on a mutual understanding of what they’re dealing with.


“I’m not saying the longer you do it the easier it gets because we’re not immune and we see things that are disturbing. We all have bad days when we are more vulnerable. But we do have a good support system for those bad days, to avoid becoming too insensitive or complacent.”

The charity’s 15-person team is trained to assess whether images and videos flagged by the public and police constitute CSAM or not.

Identified material is converted into hashes – unique strings of alphanumeric characters that act as digital fingerprints – and compiled into lists, which are shared with the industry to prevent the material from being uploaded, shared or hosted again. Another six analysts work on a separate task force categorizing the most serious material in the government’s Child Abuse Image Database.
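As a rough illustration of the idea (a minimal sketch only, not the IWF’s actual tooling; the file names and helper functions below are invented, and real hash lists typically also include perceptual hashes such as PhotoDNA rather than only plain cryptographic digests), a platform receiving an upload might compute a digest of the file and check it against the shared list:

import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file as a lowercase hex string."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def load_hash_list(path: Path) -> set[str]:
    """Load a shared hash list, assumed here to be one hex digest per line."""
    return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}

def is_on_hash_list(upload: Path, hash_list: set[str]) -> bool:
    """True if the uploaded file's digest matches an entry on the hash list."""
    return file_hash(upload) in hash_list

# Hypothetical usage: screen an incoming upload before it is stored or shared.
# known_hashes = load_hash_list(Path("shared_hash_list.txt"))
# if is_on_hash_list(Path("incoming_upload.jpg"), known_hashes):
#     ...  # reject the upload and report it, following the platform's own procedures

An exact digest like this only catches byte-identical copies, which is why perceptual hashes, designed to survive resizing and re-encoding, are the industry norm for this kind of matching.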

Worryingly, the amount of such material being detected by organizations around the world is increasing. Experts say the problem has been exacerbated by the pandemic and lockdowns. And while the increase can be partly attributed to improved detection methods, the evidence also points to a rise in online grooming and in the live streaming of abuse for payment.

The IWF identified 153,383 reports of verified child sexual abuse material in 2020, a 16 percent increase from the previous year, while the US National Center for Missing and Exploited Children’s 30 analysts processed 60,000 reports per day.

“When I was in the military I thought I had seen it all, but I had never seen anything like what we see here”

Peter*

Days at the IWF office follow a tightly regulated routine, based on strict seven-hour days with no overtime or out-of-hours contact. Every morning, the team goes through the list of URLs – images, videos and sites the foundation marked for removal after detecting CSAM the previous day – and checks whether the content has been taken offline by the host.
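As a purely hypothetical sketch of what an automated version of that morning re-check could look like (the article does not describe the IWF’s actual tools, and the file name and functions below are invented for illustration), a script might flag which entries on the previous day’s take-down list still appear to be live:

from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def still_online(url: str, timeout: float = 10.0) -> bool:
    """Return True if a flagged URL still responds with a successful status."""
    try:
        request = Request(url, method="HEAD")  # HEAD avoids fetching the content itself
        with urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 300
    except HTTPError:
        # 404, 410 and similar errors suggest the host has taken the page down.
        return False
    except URLError:
        # Unreachable host or DNS failure: treat as offline for this check.
        return False

def recheck(takedown_list: list[str]) -> list[str]:
    """Return the URLs that still appear to be live and need chasing up."""
    return [url for url in takedown_list if still_online(url)]

# Hypothetical usage, assuming one URL per line in yesterday's list:
# with open("takedown_list.txt") as f:
#     outstanding = recheck([line.strip() for line in f if line.strip()])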

Some members process reports submitted by the public or the police, while others proactively search the web for CSAM to add to the hashing database. The nature of the work meant that the team continued to travel to the office during the lockdown restrictions.

Analysts should never work alone, partly as a security measure to ensure proper procedures are followed, but also to ensure support is available if an analyst sees something upsetting.

“Even analysts who have been in the business for a long time can see things that surprise them, and they shouldn’t be doing it alone,” the IWF says.


It takes months of careful vetting and training to become an analyst, with successful candidates required to attend mandatory monthly counseling sessions. Over two decades as an image analyst with the RAF led Peter to believe he was prepared for the kind of graphic material he might encounter, an assumption he now realizes was incorrect.

“I was very ignorant when I started because I had never seen child abuse before,” he says. “It didn’t register that it was that common, because it’s not just hidden in dark corners… Being in the military, I thought I’d seen it all, but I’d never seen anything like what we see here.”

When asked why analysts choose to work in what many people consider to be the worst job imaginable, the answer is surprisingly simple. Clara says she felt that if she could stand assessing CSAM, then she should do it, describing the feeling as something close to duty.

“I do have days when I look at a particularly blatant series of images or a repeat offender and think, ‘How are you able to do this?’ But I balance that by looking around the room and realizing that I work with a bunch of really nice people who do really good things. It is very rewarding work and that feels a bit like a luxury. It is something wonderful, something very precious.”

Peter wasn’t always sure he was being asked to do the right thing in the military. But he doesn’t doubt his work now, even if the amount of content awaiting review can seem insurmountable.

Several years ago, he saved a 12-year-old girl from further abuse by tracing the gym clothes she was wearing to a school in the United Kingdom and alerting the Child Exploitation and Online Protection command, the National Crime Agency division that deals with child exploitation crimes.

“She was looked after for years. I will never forget the feeling of absolute joy that we had helped her. I have a young daughter of my own, and I am terrified for her.

“Every few months we seem to get more and more content and every year the amount we remove is growing, but we seem to be getting better and better at removing it. That’s exciting, I love being on the front line.”

*Names have been changed

According to the IWF, girls between the ages of 11 and 13 are most at risk of being targeted by sexual predators online.

“Self-generated” images captured by the children themselves account for nearly half (44 percent) of the 153,383 reports of CSAM in 2020.

This represents a 77 percent increase in the amount of self-generated CSAM detected compared with 2019, in what the IWF describes as an “exponential increase” in content captured by children who have been groomed, tricked or extorted into producing and sharing a sexual video or image of themselves.



