Any service that hosts user-generated content and lets arbitrary IP addresses create unlimited accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.
If you actually cared about CSAM, you would want those posting it to self-incriminate and then face consequences in real life at the hands of actual authorities. Banning such posters only alerts them that they need to improve their tactics and gives them the opportunity to hide. Removing the offending content and alerting the authorities is what a website like Youtube should be doing.
Even if one does argue that CSAM should result in hardware and IP bans, there's no reason it can't be the sole exception to a wider prohibition on such bans.
> If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life
We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.
Well, work on making those resources available instead of, again, informing CSAM creators how to better hide their activities. I fail to see how repeatedly removing CSAM from a single IP address is more of a boon to CSAM distributors than playing whack-a-mole with multiple IP addresses. Wasting law enforcement resources on other things while CSAM producers are free to operate is a separate, and in my opinion much more pressing, issue.
> Wasting law enforcement resources on other things while CSAM producers are free to operate is a separate
It's been a long time since I had anything remotely to do with this (thankfully) but... I'm pretty sure there are lots of resources devoted to this, including the major (and even small) platforms working with various authorities to catch these people? Certainly to say they're "free to operate" requires some convincing.
Pick a lane. Either we have the resources to go after CSAM producers, in which case we should be using them, or we don't, in which case we should be acquiring them. In either scenario, banning IPs is a counterproductive strategy for combating CSAM, and CSAM is a terrible justification for permitting IP bans.
> Either we have the resources to go after CSAM producers, in which case we should be using them; or we don't, in which case we should be getting those resources
We don’t have the resources and we don’t want to divert them.
> banning IPs is a counterproductive strategy to combat CSAM and it is a terrible justification for permitting IP bans
The simple reason for banning Russian and Chinese IPs is the same as the reason I block texts from Vietnam. I don’t have any legitimate business there and they keep spamming me.
I'm not the one you were arguing with initially; I just wanted to address the idea that child abusers are free to do whatever they want and we're not doing anything about it.
> informing CSAM creators how to better hide their activities
This adds to their risks and costs. That tips the economic balance at the margin. Actually going after all creators would require an international law-enforcement effort for which, frankly, there isn't the political capital.
> This adds to their risks and costs. That tips the economic balance at the margin.
Charging would-be bank robbers a fee to do practice runs at breaking into a vault adds to their costs; somehow that doesn't seem like an effective security measure.
> Actually going after all creators would require an international law-enforcement effort for which, frankly, there isn't political capital.
I'm not talking about going after all creators, just the ones you have identifying information for, the ones continuously pumping out such quantities of CSAM that it's impossible to stop the firehose by removing the content.
If you don't have the political capital to go after them, again you have bigger issues to deal with.
We're talking about websites like Youtube implementing hardware and IP bans. If your argument is that these are easily circumventable by CSAM distributors, that seems like all the more reason not to use them to combat CSAM.
I know that some services do this in addition to an account ban.