How Cops Use Apple's NEW Tech to Spy on Your Phone
Source: Jeff Hampton, "The People's Lawyer" (YouTube)
Channel: Jeff Hampton
Introduction: The Threat of Client-Side Scanning
The video opens with a scenario where a parent photographs a child's medical condition to send to a doctor, only to have police arrive because the phone flagged the image. This illustrates client-side scanning—a technology where devices scan content locally before encryption occurs, effectively allowing surveillance of messages, photos, and files without a warrant. Unlike server-side scanning that checks content in the cloud after upload, client-side scanning analyzes everything on the device itself, before messages are sent or files are backed up.
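The on-device matching described above can be sketched in a few lines. This is an illustrative simplification, not Apple's or Google's actual implementation: real deployments use perceptual hashes (such as Apple's NeuralHash or Microsoft's PhotoDNA) so that resized or re-encoded copies of an image still match, whereas this sketch uses SHA-256 purely to stay self-contained. The blocklist contents here are hypothetical.

```python
import hashlib

# Hypothetical blocklist of known-image digests. Real systems ship a
# database of perceptual hashes of known illegal images; SHA-256 is a
# stand-in that only matches byte-identical files.
KNOWN_HASHES = {
    # SHA-256 of the empty byte string, used as a demo entry
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_before_send(image_bytes: bytes) -> bool:
    """Return True if the image matches the blocklist.

    In a client-side scanning design this check runs on the device
    itself, before the content is encrypted or uploaded -- which is
    why end-to-end encryption alone does not defeat it.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# A file whose hash is in the database is flagged before it ever
# leaves the device; everything else passes through.
assert scan_before_send(b"") is True
assert scan_before_send(b"family photo") is False
```

The key design point the video highlights is the placement of this check: because it runs before encryption, the provider (or a government compelling the provider) can inspect content that the network itself can never read.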
How Major Tech Companies Have Built the Infrastructure
Apple announced a system in 2021 to scan photos on-device, matching them against a database of hashes of known child sexual abuse images before upload to iCloud. After public backlash, Apple paused the rollout but left the underlying software and hardware infrastructure in place. The operating system still runs AI services that analyze photos and texts locally—officially for features like photo recognition—but the same infrastructure could be repurposed for broader surveillance with a software update.
Microsoft launched Windows Recall in 2024, which takes screenshots of everything users do and uses AI to make it searchable. After criticism, Microsoft made it opt-in only. Google quietly implemented Android System SafetyCore, which powers "sensitive content warnings" in Google Messages. The system scans images in chats—including end-to-end encrypted ones—to detect and blur content deemed inappropriate. Google frames this as privacy protection since detection happens on-device, but the pipeline for flagging and reporting content to authorities exists.
Legislative Efforts to Mandate Scanning
The European Union has proposed "Chat Control" legislation that would require scanning all private messages, including those in encrypted apps, using client-side scanning. Platforms would be forced to report any flagged content directly to law enforcement.
The United Kingdom passed the Online Safety Act, giving regulators power to demand that platforms detect and remove illegal content, even within encrypted services. The government acknowledged that enforcement will wait until client-side scanning is "technically feasible." Non-compliance carries fines of up to 10% of global revenue. Signal and WhatsApp have stated they would rather exit the UK market than compromise their encryption.
In the United States, the EARN IT Act takes a different approach: rather than mandating scanning directly, it threatens to strip legal protections (Section 230 liability shields) from platforms that don't use "government-approved scanning tools," effectively coercing compliance.
Constitutional and Practical Concerns
The presenter raises Fourth Amendment concerns: if devices are forced to scan all data and report questionable content, this constitutes a warrantless search operating 24/7. The government's defense—that the provider conducts the search, not the government—represents a legal workaround to warrant requirements.
A real case illustrates the dangers: Mark, a father in the United States, photographed his toddler's infection to send to a doctor. Google flagged the image, permanently banned his account, reported him to police, and he faced a criminal investigation—despite doing exactly what a responsible parent should do.
European studies found that most AI-generated abuse reports were false positives. Each false flag means platform moderators reviewing intimate private content that users never intended anyone to see. Even when police are never involved, the system itself creates opportunities for abuse.
Recommended Defenses
The presenter recommends two fronts for resistance. For personal privacy practices: use truly encrypted messaging apps like Signal or WhatsApp that oppose scanning mandates; disable surveillance features like Windows Recall; consider privacy-focused operating systems; use full disk encryption with strong passphrases; be cautious about AI assistants requesting broad access; and use VPNs when appropriate.
For political action: contact lawmakers to oppose legislation that trades privacy for security through client-side scanning; demand explicit protections for strong encryption without backdoors; and spread awareness that mandatory CSAM scanning means every phone scanning all content, all the time.
Conclusion
The video concludes that privacy dies not because bad laws exist, but because people don't object to them. The presenter directs viewers to a related video about facial recognition technology and surveillance.