“That being so there is only one logical solution: client-side scanning where the content is examined when it is decrypted on the user’s device for them to view/read,” Woodward says. Last year, Apple announced it would introduce client-side scanning—scanning done on people’s iPhones rather than Apple’s servers—to check photos for known CSAM being uploaded to iCloud. The move prompted anger at potential surveillance from civil rights groups to Edward Snowden, and led to Apple pausing its plans a month after initially announcing them. (Apple declined to comment for this story.)
For tech companies, detecting CSAM on their platforms and scanning some communications is not new. Companies operating in the United States are required to report any CSAM they find, or that users report to them, to the National Center for Missing and Exploited Children (NCMEC), a US-based nonprofit. More than 29 million reports, containing 39 million images and 44 million videos, were made to NCMEC last year alone. Under the new EU rules, the EU Center will receive CSAM reports from tech companies.
“A lot of companies are not doing the detection today,” Johansson said in a press conference introducing the legislation. “This is not a proposal on encryption, this is a proposal on child sexual abuse material,” Johansson said, adding that the law is “not about reading communication” but detecting illegal abuse content.
At the moment, tech companies find CSAM online in different ways. And the amount of CSAM found is increasing as tech companies get better at detecting and reporting abuse—although some are much better than others. In some cases, AI is being used to hunt down previously unseen CSAM. Duplicates of existing abuse photos and videos can be detected using “hashing systems,” where abuse content is assigned a fingerprint that can be spotted when it is uploaded to the web again. More than 200 companies, from Google to Apple, use Microsoft’s PhotoDNA hashing system to scan millions of files shared online. However, to do this, systems need to have access to the messages and files people are sending, which is not possible when end-to-end encryption is in place.
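The idea behind hash matching can be sketched in a few lines. This is a simplified illustration, not PhotoDNA itself: PhotoDNA is a proprietary *perceptual* hash designed to survive resizing and re-encoding, whereas the cryptographic hash used here (SHA-256) only matches byte-identical copies. The function names and database are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse material.
# Real systems like Microsoft's PhotoDNA use perceptual hashes that
# tolerate resizing and re-encoding; SHA-256 here is a stand-in that
# only catches exact byte-for-byte duplicates.
KNOWN_HASHES: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint (hash) for a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def register_known_content(data: bytes) -> None:
    """Add a known file's fingerprint to the database."""
    KNOWN_HASHES.add(fingerprint(data))

def check_upload(data: bytes) -> bool:
    """Return True if an uploaded file matches a known fingerprint."""
    return fingerprint(data) in KNOWN_HASHES
```

The key point the article makes holds in the sketch too: `check_upload` needs the plaintext bytes of the file, which a server cannot see once a message is end-to-end encrypted.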
“In addition to detecting CSAM, obligations will exist to detect the solicitation of children (‘grooming’), which can only mean that conversations will need to be read 24/7,” says Diego Naranjo, head of policy at civil liberties group European Digital Rights. “This is a disaster for confidentiality of communications. Companies will be asked (via detection orders) or incentivized (via risk mitigation measures) to offer less secure services for everyone if they want to comply with these obligations.”
Discussions around protecting children online, and how this can be done when end-to-end encryption is in place, are hugely complex and technical, and they are entangled with the horrors of the crimes against vulnerable young people. Research from Unicef, the UN’s children’s fund, published in 2020 says encryption is needed to protect people’s privacy—including children’s—but adds that it “impedes” efforts to remove content and identify the people sharing it. For years, law enforcement agencies around the world have pushed to create ways to bypass or weaken encryption. “I’m not saying privacy at any cost, and I think we can all agree child abuse is abhorrent,” Woodward says, “but there needs to be a proper, public, dispassionate debate about whether the risks of what might emerge are worth the true effectiveness in fighting child abuse.”
Increasingly, researchers and tech companies have been focusing on safety tools that can exist alongside end-to-end encryption. Proposals include using metadata from encrypted messages—the who, how, what, and why of messages, not their content—to analyze people’s behavior and potentially spot crime. One recent report by nonprofit group Business for Social Responsibility (BSR), which was commissioned by Meta, found that end-to-end encryption is an overwhelmingly positive force for upholding people’s human rights. It makes 45 recommendations for how encryption and safety can go together without involving access to people’s communications. When the report was published in April, Lindsey Andersen, BSR’s associate director for human rights, told WIRED: “Contrary to popular belief, there actually is a lot that can be done even without access to messages.”
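To make the metadata idea concrete, here is a minimal sketch of what content-free behavioral signals could look like: aggregating who messages whom, and how often, without ever touching message bodies. The record fields and feature names are illustrative assumptions, not drawn from the BSR report or any real platform.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical metadata record: sender, recipient, and time of a
# message, with no access to the encrypted content itself.
@dataclass
class MessageMeta:
    sender: str
    recipient: str
    timestamp: float  # seconds since epoch

def behavioral_features(events: list[MessageMeta]) -> dict:
    """Aggregate simple per-sender features from metadata alone."""
    stats = defaultdict(lambda: {"messages": 0, "recipients": set()})
    for e in events:
        s = stats[e.sender]
        s["messages"] += 1
        s["recipients"].add(e.recipient)
    # Counts like these could feed downstream risk models without
    # requiring any decryption of message content.
    return {
        sender: {
            "messages": s["messages"],
            "distinct_recipients": len(s["recipients"]),
        }
        for sender, s in stats.items()
    }
```

Whether such signals are effective, and how they interact with privacy expectations around metadata itself, is exactly the kind of question the debate above turns on.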