A person, and it’s a growing problem, because viewing that sort of material is damaging and traumatic. There have been strides in using AI to catalogue and identify “known” images or videos – a system can screen somebody’s hard drive, flag files that match images already on record from previous crimes, and spare a person from having to look at those images in order to make a case against the criminal.
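To give a sense of how that “known image” screening works at its simplest, here is a rough sketch of my own – not any agency’s actual tooling, and the hash list and paths are made up. Real systems rely on perceptual hashes (e.g. PhotoDNA) that survive resizing and re-encoding, but the core idea is comparing each file’s fingerprint against a database of hashes of previously identified material:

```python
# Minimal sketch (illustrative only): walk a directory and flag files whose
# cryptographic hash appears in a database of known material. Plain SHA-256
# only catches exact copies; real tooling uses perceptual hashing.
import hashlib
from pathlib import Path

# Hypothetical hash list; in practice this would come from a law-enforcement
# database of previously identified images.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_known_files(root: str) -> list[Path]:
    """Return files under `root` whose hash matches the known list."""
    return [
        p for p in Path(root).rglob("*")
        if p.is_file() and sha256_of(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in flag_known_files("/evidence/drive_image"):
        print("match:", match)
```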
But the economy of abuse material is built on incentivizing participants to constantly produce new and more extreme material. This excellent article talks about it more: [https://www.wired.com/story/tracers-in-the-dark-welcome-to-video-crypto-anonymity-myth/](https://www.wired.com/story/tracers-in-the-dark-welcome-to-video-crypto-anonymity-myth/). To access more material on the site it describes, users could either pay in cryptocurrency… or create and upload new material of their own.
However, close examination of the illegal material is sometimes necessary, unfortunately, as it can help to identify and rescue the children being exploited. There are law enforcement sites where images are posted of, say, an unusual jacket, carefully cropped out of a picture, in the hope that someone will recognize it and help narrow down the search for the child who was cropped out of that same picture. I saw a TED talk by a researcher whose work in biometrics is currently focused on identifying abusers by the scars, moles, and freckles on their hands – often all that’s visible of them on camera.
But you’re correct that, very frequently, a person has to review it. And very often that person is traumatized, and regulations and support for it are all over the place, and it’s not ideal.