As Congress Grandstands Nonsense ‘Kid Safety’ Bills, Senator Wyden Reintroduces Legislation That Would Actually Help Deal With Kid Exploitation Online

from the no-one-will-pay-attention,-because-this-is-useful,-but-boring dept

As you’ve likely heard, this morning the Senate held one of its semi-regular hearings in which it drags tech CEOs in front of clueless Senators, who make nonsense pronouncements in hopes of landing a viral clip on the very social media they’re pretending to demonize, and which they rely on to convince their base that they’re leading the culture war moral panic against those same platforms.
Meanwhile, Senator Ron Wyden has (yet again) released a bill that will get little (if any) attention, but which actually seeks to help protect children. Reps. Eshoo and Fitzpatrick have introduced the companion bill in the House.
As we’ve discussed multiple times, all evidence suggests that the internet companies are actually doing an awful lot to stop child exploitation online: tracking it down, reporting it to NCMEC, and putting in place tools to automatically detect and block such exploitative content before it ever sees the light of day. The real problem seems to be that after the content is reported to NCMEC, nothing happens.
Wyden’s bill aims to fix that part. The actual part where the system falls down and fails to protect kids online. The part about what happens after the companies report such content, and NCMEC and the DOJ fail to take any action.
The bill includes a ton of clear, obvious, common sense approaches to dealing with the actual crimes going on and actually stepping in to protect children, rather than just grandstanding about it and pretending that if only Mark Zuckerberg nerded harder, he’d magically prevent child exploitation.
Of course, doing basic stuff like this isn’t the kind of thing that gets headlines, and so it won’t get even a fraction of the attention that terrible, unconstitutional, problematic bills like KOSA, EARN IT, STOP CSAM and others will get. Indeed, after doing a quick search online, I can find exactly no articles about Wyden’s bill. Dealing with actual problems isn’t the kind of thing this Congress does, nor something that the media cares about.
Having a show trial to pretend that terrible bills are great makes headlines. Actually presenting a bill that provides real tools to help… gets ignored.
Embedded: Invest in Child Safety Act bill text (118th Congress)

from the this-is-why-we-can't-have-nice-things dept

Last November, Maine residents voted overwhelmingly (83 percent) to pass a new state right to repair law designed to make auto repairs easier and more affordable. More specifically, the law requires that automakers standardize on-board diagnostic systems and provide remote access to those systems and mechanical data to consumers and third-party independent repair shops.
But as we’ve seen with other states that have passed right to repair laws (most notably New York), passing the law isn’t the end of the story. Corporate lobbyists have had great success watering these laws down not just before passage, but after voters approve them.
In Maine, automakers have been trying to convince Republican and Democratic lawmakers alike to introduce new amendments modifying (read: weakening) Maine’s law. While automakers and some lawmakers insist the point of the changes is to protect consumer privacy, right to repair advocates say the real point is to water the bill down to the point of near-uselessness.
The original law mandates the creation of a new portal that car owners and independent mechanics can access to reset car security systems. Automakers must also create a “motor vehicle telematics system notice” informing new car owners how access will work. The bill also mandates that the state Attorney General create an oversight board to ensure automakers are complying with data sharing requests.
But automakers are now trying to claim that this new system would somehow be a threat to consumer privacy. So they’ve convinced some lawmakers to push for amendments that would eliminate the standardized database and the oversight entity as part of a total rewrite of the bill.
Here’s the thing: automakers demonstrably don’t actually give a shit about consumer privacy. If you recall, a recent Mozilla study showcased how most modern vehicles are a privacy nightmare, collecting no end of data on users (and their phones, when connected via Bluetooth), without that collection being transparent to users or that data being properly secured and encrypted by the manufacturers.
Most lawmakers couldn’t care less about those privacy violations by automakers either, and are nowhere to be found when consumers demand reform or companies require oversight. Yet here they are suddenly professing a deep interest in consumer privacy, only after voters approve a law making vehicles easier to repair. Pushing changes activists don’t like and voters didn’t approve. Funny, that.
Companies that oppose right to repair have a very long history of hiding behind consumer safety and privacy issues when trying to defend their profitable repair monopolies. It’s fairly standard operating procedure. In New York, that resulted in a bill with giant loopholes that exempted the biggest offenders. Hopefully Maine’s right to repair law, the fourth in the country, doesn’t suffer the same fate.

from the sometimes-tech-companies-exaggerate dept

One of the things we talk about quite a lot on Techdirt is how the “easy” policy ideas that many people have aren’t quite so easy, because everything has tradeoffs. You want strict privacy laws? Well, that might create issues for free speech and competition. You want stronger liability on social media services? Well, that’s going to limit competition.
Lately there have been some debates regarding interoperability and privacy/security. One of many examples is Apple blocking Beeper, which had reverse engineered iMessage to let iPhone users communicate more securely with Android users. Apple claimed it had to do this for security reasons, and said as much in its statement at the time.
Except that didn’t pass the sniff test. As noted, Beeper was actually increasing the security of iMessage users by making sure that their messages to Android users were end-to-end encrypted, as opposed to the current setup, in which those messages are far less secure.
There are similar examples of this as well throughout the years. Right to repair laws are often lobbied against by big tech companies claiming they’ll create security and privacy problems. Companies like Facebook and LinkedIn have sued third parties for building new interfaces, claiming they were security risks.
Right before the holidays last year, however, the FTC (which I often criticize, but which sometimes does the right thing) came out with a very interesting note, warning tech companies that it would be more carefully scrutinizing claims that they need to block interoperability in the name of security and privacy.
As the announcement points out, interoperability is important.
And then it notes that, obviously, security and privacy are also super important. But the key point is that the FTC says it’s not just going to accept tech companies’ claims that they need to block interoperability for security and privacy reasons without at least something more to substantiate those claims.
This sounds like a smart, thoughtful, balanced, and nuanced approach. Obviously, there may be some cases where privacy and security are legitimately put at risk through reverse engineering or other kinds of “adversarial” interoperability. But, historically, those risks have been far more limited than companies would have you believe.
The FTC making it clear that it won’t just accept such claims from companies on blind faith seems like the correct approach. We should live in a world where the default expectation is interoperability, right to repair, and the like. If there are real security and privacy concerns, companies should raise them, but we shouldn’t take those claims at face value, because the companies have billions of reasons to exaggerate those risks.
It’s good that the FTC is making it clear that it’s going to scrutinize such claims more closely.