from the no-easy-answers dept
The failure of the NCMEC CyberTipline to combat child sexual abuse material (CSAM) as effectively as it could is extremely frustrating. But as you look at the details, you realize there just aren’t any particularly easy fixes. While there are a few areas that could improve things at the margins, the deeper you look, the more challenging the whole setup appears. There aren’t any easy answers.
And that sucks, because Congress and the media often expect easy answers to complex problems, and here those easy answers may simply not exist.
This is the second post about the Stanford Internet Observatory’s report on the NCMEC CyberTipline, the somewhat useful, but tragically limited, main channel through which investigations of online CSAM happen. In the first post, we discussed the structure of the system, and how the incentive structure around law enforcement is a big part of what makes the system less impactful than it otherwise might be.
In this post, I want to dig in a little more about the specific challenges in making the CyberTipline work better.
The Constitution
I’m not saying that the Constitution is a problem, but it does present a challenge here. In the first post, I briefly mentioned Jeff Kosseff’s important article about how the Fourth Amendment and the structure of NCMEC make things tricky, but it’s worth digging in a bit to understand the details.
The US government set up NCMEC as a private non-profit in part because, if a government agency were doing this work, there would be serious questions about whether the evidence it receives was collected without a warrant, in violation of the Fourth Amendment. If NCMEC were a government agency, the law could not require companies to hand over that information without a warrant.
So, Congress did a kind of two-step dance here: it set up this “private” non-profit, and then created a law that requires companies that come across CSAM online to report it to the organization. And all of this seems to rely on a kind of fiction: that if we pretend NCMEC isn’t a government agent, there’s no Fourth Amendment issue.
From the Stanford report:
So, it’s quite important that the service providers finding and reporting CSAM are not seen as agents of the government. If they were, it would destroy the ability to use that evidence in prosecuting cases. That’s important. And, as the report notes, it’s also why it would be a terrible idea to require social media companies to proactively hunt down CSAM. If the government required it, it would effectively light all that evidence on fire and prevent it from being used in prosecutions.
That said, the courts (including in a ruling by Neil Gorsuch while he was on the appeals court) have made it pretty damn clear that, while the platforms may not be government agents, NCMEC and the CyberTipline are. And that creates some difficulties.
This is all pretty important in making sure that the whole system stays on the right side of the Fourth Amendment. As much as some people really want to force social media companies to proactively search for and report CSAM, mandating that creates real problems under the Fourth Amendment.
As for the NCMEC and law enforcement side of things, the requirement to get a warrant for unopened communications remains important. But, as noted below, sometimes law enforcement doesn’t want to get a warrant. If you’ve been reading Techdirt for any length of time, this shouldn’t surprise you. We see all sorts of areas where law enforcement refuses to take that basic step of getting a warrant.
Understanding that framing is important to understanding the rest of this, including where each of the stakeholders falls down. Let’s start with the biggest problem of all: where law enforcement fails.
Law Enforcement
In the first article on this report, we noted that the incentive structure has pushed law enforcement to try to evade this entire process. Officers often don’t want to go through the process of getting warrants. They don’t want to associate with the ICAC task forces, because they feel it puts too much of a burden on them, and figure that if they don’t take care of something, someone else on the task force will. And sometimes they don’t want to deal with CyberTipline reports at all, because they’re afraid that if they’re too slow to act after getting a report, they might face liability.
Most of these issues seem to boil down to law enforcement not wanting to do its job.
But the report details some of the other challenges for law enforcement. And it starts with just how many reports are coming in:
And of course, making social media platforms more liable doesn’t fix much here. If anything, it makes things worse, because it encourages even more reporting by the platforms, which only further overloads law enforcement.
Given all those reports the cops are receiving, you’d hope they had a good system for managing them. But your hope would not be fulfilled:
That seems less than ideal.
Another problem, though, is that a lot of the reports are not prosecutable at all. Because of the incentives discussed in the first post, apparently certain known memes get reported to the CyberTipline quite frequently, and police feel they just clog up the system. But because the platforms fear significant liability if they don’t report those memes, they keep reporting them.
At best, this seems to annoy law enforcement, but it’s a function of how the system works:
Again, this all seems pretty messy. Of course you want companies to report anything they find that might be CSAM. And, of course, you want NCMEC to pass those reports on to law enforcement. But the end result is overwhelmed law enforcement with no clear process for triage, dealing with a lot of reports that were filed out of an abundance of caution but which are not at all useful.
And, of course, there are other challenges that policymakers probably don’t think about. For example: how do you deal with hacked accounts? How much information is it right for the company to share with law enforcement?
The report details how local prosecutors are also loath to bring cases, because it’s tricky to seat a jury that can handle a CSAM case:
There are also issues with law enforcement outside the US. As noted in the first article, NCMEC has become the de facto global reporting center, because so many companies are based in the US and report there. The CyberTipline tries to share reports with foreign law enforcement too, but that’s difficult:
And in lower income countries, the problems can be even worse, including confusion about how the entire CyberTipline process works.
NCMEC Itself
The report also details some of the limitations of NCMEC and the CyberTipline itself, some of which are legally required (and where it seems like the law should be updated).
There appears to be a big issue with repeat reports, where NCMEC needs to “deconflict” them, but has limited technology to do so:
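To make the deconfliction problem a bit more concrete, here’s a minimal sketch of what grouping duplicate reports might look like. To be clear, the field names and the use of a plain SHA-256 file hash are my own assumptions for illustration; NCMEC’s actual tooling (and the perceptual hashing systems like PhotoDNA the industry relies on) works differently and isn’t spelled out in the report.

```python
# Toy sketch of report "deconfliction": grouping reports that reference the
# same underlying file so investigators see one case instead of dozens of
# duplicates. Field names and the exact-match SHA-256 fingerprint are
# illustrative assumptions, not NCMEC's actual pipeline.
import hashlib
from collections import defaultdict

def file_fingerprint(file_bytes: bytes) -> str:
    """Exact-match fingerprint of a reported file."""
    return hashlib.sha256(file_bytes).hexdigest()

def deconflict(reports: list[dict]) -> dict[str, list[dict]]:
    """Group incoming reports by the fingerprint of the file they reference."""
    grouped: dict[str, list[dict]] = defaultdict(list)
    for report in reports:
        grouped[file_fingerprint(report["file_bytes"])].append(report)
    return grouped

if __name__ == "__main__":
    reports = [
        {"report_id": 1, "platform": "ExampleSocial", "file_bytes": b"same-image"},
        {"report_id": 2, "platform": "ExampleChat", "file_bytes": b"same-image"},
        {"report_id": 3, "platform": "ExampleSocial", "file_bytes": b"different-image"},
    ]
    for fingerprint, group in deconflict(reports).items():
        print(fingerprint[:12], "->", [r["report_id"] for r in group])
```

Even this toy version hints at why the real thing is hard: an exact hash only catches byte-identical copies, so re-encoded or cropped versions of the same image slip through, and doing better requires exactly the kind of technology (and engineering staff) the report says NCMEC struggles to fund.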
As the report notes, there are a variety of challenges, both economic and legal, in enabling NCMEC to upgrade its technology:
Platforms
And, yes, there are some concerns about the platforms. But while public discussion seems to focus almost exclusively on where people think that platforms have failed to take this issue seriously, the report suggests the failures of platforms are much more limited.
The report notes that it’s a bit tricky to get platforms up and running with CyberTipline reporting, and that while NCMEC will do some onboarding, that help is kept very limited to avoid the Fourth Amendment concerns discussed above.
And, again, some of the problem with onboarding is due to outdated tech on NCMEC’s side. I mean… XML? Really?
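For a sense of what that complaint is about, here’s a purely hypothetical sketch of a platform assembling an XML-style report payload. The element names and structure are invented for illustration and are not NCMEC’s actual schema or API; the point is just the shape of the thing a small platform has to learn to generate during onboarding.

```python
# Hypothetical XML-style report payload -- invented element names,
# NOT the actual CyberTipline schema or API.
import xml.etree.ElementTree as ET

def build_report_xml(reporting_company: str, incident_type: str,
                     account_id: str, ip_address: str) -> bytes:
    """Assemble a made-up XML report document and return it as bytes."""
    root = ET.Element("report")
    ET.SubElement(root, "reportingCompany").text = reporting_company
    ET.SubElement(root, "incidentType").text = incident_type
    reported_user = ET.SubElement(root, "reportedUser")
    ET.SubElement(reported_user, "accountId").text = account_id
    ET.SubElement(reported_user, "ipAddress").text = ip_address
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    print(build_report_xml("ExamplePlatform", "CSAM", "user-12345", "203.0.113.7").decode())
```

None of this is rocket science, but every schema quirk is one more thing a tiny trust & safety team has to get right before its reports are actually useful to anyone downstream.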
There are also challenges under the law about what needs to be reported. As noted above and in the first article, that can often lead to over-reporting. But it can also make things difficult for companies trying to make determinations.
And, of course, as has been widely discussed elsewhere, it’s not great that platforms have to hire human beings and expose them to this kind of content.
However, the biggest issue with reporting seems not to be companies’ unwillingness to report, but how much information they pass along. And again, the issue here is not so much an unwillingness of the companies to cooperate as it is the incentives.
This is all pretty fascinating, and suggests that while there may be ways to improve things, it’s difficult to structure things right and make the incentives align properly.
And, again, the same incentives pressure the platforms to just overreport, no matter what:
All in all, the real lesson to take from this report is that this shit is super complicated, like all of trust & safety, and tradeoffs abound. But here it’s way more fraught than in most cases, in terms of the seriousness of the issue, the potential for real harm, and the potentially destructive criminal penalties involved.
The report has some recommendations, though they mostly deal with things at the margins: increase funding for NCMEC, allow it to update its technology (and hire the staff to do so), and provide more information to help platforms get set up.
Of course, what’s notable is that this list does not include things like “make platforms liable for any mistake they make.” That’s because, as the report shows, most platforms already seem to take this stuff pretty seriously, and the liability is already so clear that they often over-report to avoid it, which actually makes the results worse by overwhelming both NCMEC and law enforcement.
All in all, this report is a hugely important contribution to this discussion, and provides a ton of real-world information about the CyberTipline that was previously known basically only to the people working on it, while observers, the media, and policymakers were left in the dark.
It would be nice if Congress read this report and understood the issues. However, when it comes to things like CSAM, expecting anyone to bother reading a big report and grappling with the tradeoffs and nuances is probably asking too much.