Matt Martin: Tech companies have ability to save children

Published 12:00 am Thursday, January 9, 2020

By Matt Martin

If you had the power to stop a child from being sexually exploited and a court order giving law enforcement access to information to take down the perpetrators, would you do it?

The answer to that question should be obvious. But regrettably, when it comes to today’s technology and social media companies, it is not.

Our office recently sought to protect two such victims, minors under the age of 10, from exploitation. Our attorneys prosecuted a 38-year-old man in federal court for distribution of child pornography. The man used a social media application on his Google Pixel smartphone to distribute extremely disturbing videos of very young girls being sexually exploited. A member of an online group known to traffic in child exploitation images, he claimed to have produced images of himself sexually assaulting two young girls. He was eventually convicted in state court of sexually assaulting one of them.

What was on his Google Pixel smartphone? Which other predators was he communicating with? What children might be rescued from the nightmare of sexual abuse? Sadly, we still don’t know, because his phone was encrypted and therefore “warrant-proof.” We obtained a search warrant from a federal judge to search the phone, but even a federal judge’s order cannot bypass the encryption that certain tech companies have implemented on their products.

Tragically, we will never know the extent of this man’s crimes against these two (and potentially other) children, because the phone’s encryption prevented the FBI from conducting a search. Thankfully, the FBI’s valiant efforts produced enough evidence to send this man to federal prison for over 13 years and to rescue two minors, but our inability to gain lawful access to the phone likely allowed him to shield other illegal behavior.

Had the search warrant been for a house where a child could be saved and evidence gathered against the abuser, we would have knocked down the door if necessary. Unfortunately, the same rules do not apply in the virtual world because of warrant-proof encryption. Criminals know it.

A second, similar challenge involves so-called “end-to-end” encrypted communication. This is the technology employed by messaging apps like Apple iMessage and Facebook’s WhatsApp that prevents anyone but a message’s sender and recipient from seeing its contents. Because each message is encrypted on the sender’s device and can be decrypted only on the recipient’s, not even the company that operates the service can read it.

This second category is equally dangerous because of the high volume of child exploitation material that passes through these messaging applications each year. In 2018, the National Center for Missing and Exploited Children’s CyberTipline received over 18 million reports containing more than 45 million pieces of exploitative content: 23 million images and 22 million videos. To their credit, many of these reports exist only because certain social media companies have found ways to filter the communications on their systems for child exploitation while still keeping their platforms secure from hackers and criminals.

Notably, of those 18 million reports, the vast majority come from messaging programs that do not employ end-to-end encryption. By comparison, very few reports are generated by end-to-end encrypted messaging programs such as Apple’s iMessage. Together, device encryption and end-to-end encryption create lawless spaces where child predators can produce and distribute unspeakable horrors beyond the reach of search warrants and the law.

Please do not misunderstand: we should celebrate the incredible encryption technology that American businesses have developed in recent years. It ensures our online privacy and security, so that we can make online purchases safely, communicate privately and store data securely. Our office regularly prosecutes criminals who steal electronic data and hack computer networks. We believe strongly in cybersecurity and privacy.

But there must be a balance. Fortunately, the Constitution’s Fourth Amendment offers a time-tested formula. It protects our “right … to be secure in [our] persons, houses, papers and effects,” allowing law enforcement searches only “upon probable cause.” That standard requires law enforcement to show a judge that a crime is likely afoot. It has proven to be a brilliant protection against police overreach while still allowing police to keep us safe.

We have a choice to make. Should technology and social media companies design their products so that they can honor search warrants and protect children?

Facebook has announced plans to move its Messenger platform, which had over 1 billion monthly users last year, to end-to-end encryption. As a result, Facebook’s 12 million annual CyberTipline reports will certainly dwindle. Already, the end-to-end encryption of Apple iMessage, Facebook’s WhatsApp and others has created a lawless environment. Facebook and other companies may soon go further in the same direction.

Technology companies also have a choice to make. Surely, the developers at technology and social media companies can create a solution to allow lawful access when law enforcement presents a warrant, while still protecting user privacy and cybersecurity. Returning to the question posed: they have the power to save children. What will they choose to do?

Matt Martin is the U.S. attorney for the Middle District of North Carolina.