EU Has Lost the Plot, Will Ban Encryption — Think of the Children
The European Union “is failing to protect children.” This stunning admission came from socialist “commissioner” Ylva Johansson (pictured). She says something must be done—and, yes, what they’re proposing is indeed something.
In essence, the proposed regulations would require all internet services to seek permission from European authorities before they can operate. The EU’s “CSAM risk assessment” process is the 21st-century equivalent of the Stasi’s “Papiere, bitte.”
They do realize criminals don’t follow the law, right? In today’s SB Blogwatch, we promulgate privacy for plebs.
Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: What if Wes Anderson directed The Sopranos?
Here We Go Again
What’s the craic? Ryan Browne reports—“Europe has a plan to fight online child abuse”:
“Erode encrypted communications”
[The EU] unveiled tough new proposals that would require online platforms to more aggressively screen and remove child abuse online. … A new EU Centre on Child Sexual Abuse … will maintain a database with digital “indicators” of child sexual abuse material.
…
“We are failing to protect children today,” Ylva Johansson, the EU commissioner for home affairs, said. … She called the plan a “groundbreaking proposal” that would make Europe a global leader in the fight.
…
[But] privacy activists fear the new EU bill may undermine end-to-end encryption [and they] believe measures to erode encrypted communications would be ineffective. [The EU] warned the consequences of leaving end-to-end encryption out of the requirements would be “severe” for children.
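The database of digital “indicators” described above is, in deployed systems of this kind, a set of hashes of known abuse imagery that services compare uploads against. Here is a minimal sketch of that matching step, assuming a hypothetical blocklist of SHA-256 digests—the names and data are illustrative, not from the proposal:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files.
# (Real systems use perceptual hashes, which tolerate resizing and
# re-encoding; exact hashes are used here only for illustration.)
BLOCKLIST = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",  # sha256(b"foo")
}

def is_flagged(payload: bytes) -> bool:
    """Return True if the payload's digest appears in the blocklist."""
    return hashlib.sha256(payload).hexdigest() in BLOCKLIST

print(is_flagged(b"foo"))  # in the blocklist
print(is_flagged(b"bar"))  # not in the blocklist
```

Note the obvious limitation: changing a single bit of the file changes its exact hash completely, which is why real matching relies on fuzzier perceptual hashing—and why the proposal’s ambitions go well beyond known-material lookup.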
Wait, what? Natasha Lomas loads the fact cannon—“Europe’s CSAM scanning plan unpicked”:
“General monitoring obligations”
[The] draft legislation…will create a framework which could obligate digital services to use automated technologies to detect and report existing or new CSAM, and also identify and report grooming activity. … The Commission also cites a claim that 60%+ of sexual abuse material…is hosted in the EU.
…
In essence, the Commission’s proposal seeks to normalize CSAM mitigation. [It] looks intended by EU lawmakers to encourage services to proactively adopt a robust security- and privacy-minded approach towards users to better safeguard any minors from abuse/predatory attention.
…
EU law contains a prohibition on placing general monitoring obligations on platforms because of the risk of interfering with…privacy. But the Commission’s proposal aims to circumvent that…by setting out what the regulation’s preamble describes as “targeted measures that are proportionate.”
Ahhh, so the ends justify the means? Got it. Ross Anderson says it’ll fail—“European Commission prefers breaking privacy to protecting kids”:
“Designed to circumvent EU laws”
It’s really stupid for the European Commission to mandate centralised takedown by a police agency for the whole of Europe. This will make everything really hard to fix once they find out that it doesn’t work.
…
[Yes,] this approach does not work. … We have a reasonably good understanding of why this is the case: … Police forces want to use their own forms, and expect everyone to follow police procedure. … Police forces also focus on process more than outcome; they have difficulty hiring and retaining staff to do detailed technical clerical work; and they’re not much good at dealing with foreigners.
…
The covert purpose … is to enable the new agency to undermine end-to-end encryption. [It’s] an attack on cryptography, designed to circumvent EU laws against bulk surveillance by using a populist appeal to child protection, [which will] harm children instead.
It’s not just social media and messaging apps. Rejo Zenger says it’s ISPs, too—“European Commission wants to eliminate online confidentiality”:
“What can go wrong, will”
Of course, that’s not literally what the proposal says [but] the European Commission wants to cancel encryption. … Internet service providers can also be ordered to monitor their customers’ internet traffic. But the Commission omits, quite cleverly…just how they should do so. Effectively, her message for companies is: “Do the impossible—you get to decide how.”
…
It is technically impossible to filter your internet as the European Commission would like providers to do. That is – unless we simply abolish encryption. … The proposed measures would severely undermine the confidentiality of our communications, and are being presented in the context of an incredibly sensitive and important topic, which is the protection of children.
…
Yes, there are a few safeguards built into the proposal. [But] years of experience unfortunately lead us to interpret proposed legislation from the most pessimistic of viewpoints. What can go wrong, will sooner or later go wrong.
…
It goes without saying that everyone, including us, considers the fight against the sexual abuse of children to be extremely important. That’s precisely why it’s crucial to focus on effective and sustainable measures. This proposal does not meet those standards.
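The “technically impossible” point above rests on a simple property: an intermediary that doesn’t hold the key sees only ciphertext, so any keyword or hash scan it runs is scanning noise. A toy illustration using a one-time-pad XOR—not a real messaging protocol, purely to show the intermediary’s view:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: the same operation encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet me at the usual place"
key = secrets.token_bytes(len(message))  # known only to the two endpoints
ciphertext = xor(message, key)

# The ISP's position: it carries ciphertext, not the message.
print(b"usual" in message)               # endpoints can scan their own plaintext
print(b"usual" in ciphertext)            # almost certainly False: nothing to match
print(xor(ciphertext, key) == message)   # only the key holders recover the text
```

Hence the dilemma the proposal dodges: either the provider scans plaintext before encryption (client-side scanning, i.e. a backdoor by another name) or it holds keys it was never supposed to have.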
tl;dr? Syonyk gets to the point:
[There are] some particularly horrifying chunks regarding “scanning of conversations for content.” Plus, not only known, existing CSAM, but also identifying new material.
…
[Do] you trust machine learning that can’t tell the difference between a cat and a ferret to know what is and isn’t CSAM? That’s a very real path to, “You have a photo of your child in the bathtub and the police smash down your door.”
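Syonyk’s worry can be put in numbers. Assume—purely as an illustration, these figures are not from the proposal—a classifier with a 0.1% false-positive rate scanning 10 billion images a day, almost none of which are actually CSAM. The false alarms alone would swamp any human review process:

```python
# Back-of-envelope base-rate arithmetic (all figures are illustrative assumptions).
images_scanned_per_day = 10_000_000_000  # assumed daily volume across EU services
false_positive_rate = 0.001              # assumed 99.9%-specific classifier

false_alarms_per_day = images_scanned_per_day * false_positive_rate
print(f"{false_alarms_per_day:,.0f} innocent images flagged per day")
# -> 10,000,000 innocent images flagged per day
```

This is the base-rate problem: when the thing you’re hunting for is vanishingly rare, even a very accurate classifier produces flags that are overwhelmingly false—each one a bathtub photo and a knock on someone’s door.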
Sounds like China. Matthias Pfau agrees—“Total surveillance in the name of child protection”:
“Back door only for the good guys”
[It would be] the most sophisticated mass surveillance apparatus ever deployed outside China. … Once again, the EU Commission is using child protection as a pretext to introduce mass surveillance.
…
We are all to be secretly monitored – all the time. … What could possibly go wrong? … The list can be expanded on demand. In the beginning, the laws will say that providers must scan for child pornography – this is what politicians always claim. … But in the next step, the authorities will also look for other things: terrorists, human traffickers, drug dealers, gang criminals…opposition members…journalists.
…
An important issue – and one that is completely neglected by the European Commission. Cybersecurity. … It must be clear to all of us that a “back door only for the good guys” is not possible.
It’s 2022. Why are we still trying to make this work? Ask AskJarv:
I thought the Apple approach was dubious [but] this is completely bonkers. I sometimes think the EU are doing a solid job in a fast moving field that is incredibly technical, but this offering makes me sincerely doubt the technical teams they lean on to get such legislation put forward.
Where else could the EU apply this strategy? Merk42 thinks of the children:
How about the EU puts a rule to randomly search vehicles on the road, in case they have kidnapped children in them?
Meanwhile, xeeeeeeeeeeenu pours sauce for the gander:
I don’t think this proposal is quite enough, we can do better. … The homes of the people who champion this proposal should be thoroughly searched for child abuse materials—every week. Naturally, their personal electronic devices should be taken for analysis. I understand this may cause them some inconvenience, but think of the children!
And Finally:
Hat tip: jonbob
You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or [email protected]. Ask your doctor before reading. Your mileage may vary. E&OE. 30.
Image sauce: European Union (cc:by-sa; leveled and cropped)