Category Archives: Encryption

Encryption actually protects law-abiding Canadian citizens - Toronto Star 20160710

When it comes to policing and national security, far too often Canadians are asked to let fear trump their rights.

Recently, the front page of the Toronto Star featured the headline, “Encryption creating a barrier for police ...,” potentially convincing some readers that the technology’s only purpose is to aid criminals. Rarely do we see headlines such as, “Encryption protects thousands of Canadians’ credit card information,” or “Encryption enables secure communications for every Canadian,” or even the aspirational, “Canada leads the way in cybersecurity for its citizens.”

Increasingly, when we hear about encryption in the media, or from public safety officials, it’s presented as a danger — something that prevents those whose job it is to keep us safe from fulfilling their role. However, in the vast majority of transactions online by ordinary, law-abiding citizens, encryption is a good thing that makes personal, sensitive data harder to capture and decipher. Indeed, if more data were stored in encrypted form, sensational breaches of privacy — like the one that drove some Ashley Madison users to suicide — could be avoided.

Acknowledging that encryption can be a good thing for society doesn’t erase police concerns about data access; it contextualizes them. We at the Canadian Civil Liberties Association (CCLA) have long been supporters of warrants, the process by which police can go before a judge to demonstrate that their need to intercept a suspect’s private communications is reasonable and proportionate.

While we understand that warrants aren’t helpful if data can’t be decrypted, reports indicate police now have the tools, and are working with technology companies, to gain access to even the most complex of encrypted data. For example, as we learned from the Project Clemenza investigation, police can now decrypt BlackBerry communications and are making extensive use of Stingray technology, which allows for the mass interception of cellphone data.

We also know the FBI has developed a hack to intercept messages on Tor networks, which are designed for secure, private communications. Even the infamous Apple v. FBI case ended with the FBI getting what it wanted.

An increasing lack of public trust that invasive technologies will be used proportionately by security and law enforcement agencies is often attributed to excessive attention to privacy rights, encouraged by privacy advocates. What we hear from concerned citizens, however, is not that they prioritize privacy over all else, not that they don’t value security, and not that they don’t appreciate the need for police to use new technologies to deal with new threats.

Rather, they tell us, there is way too much secrecy and way too little accountability surrounding the ways these technologies are used. This is not an invention concocted by privacy advocates, such as CCLA; it’s the result of an increasing disjunction between the stories people hear and their expectations of appropriate conduct in the name of public safety.

For example, when the Communications Security Establishment used information from the free internet service at a major Canadian airport to track the wireless devices of thousands of ordinary airline passengers for days after they left the terminal, many Canadians felt intuitively it was intrusive and wondered if it was illegal. But it wasn’t. That’s the kind of situation that erodes the trust that is fundamentally necessary for the social license law enforcement needs to function effectively.

Another example is the aforementioned Stingray technology, which apparently has been quietly used in Canada for a number of years. Police maintain that secrecy gives them the edge they need against increasingly sophisticated criminals. However, Canadians have legitimate concerns that when a powerful technology is used in secret, it’s impossible to ascertain whether it’s being used wisely and proportionately, and if necessary safeguards are in place.

While it would clearly be more convenient for police to have instant access to all the information they want, it wouldn’t ensure crimes are investigated justly, or with respect for the innocent bystanders whose data gets swept up, and that matters too.

A recent survey on Canadian identity, published in October by the national statistics agency, found that the Charter of Rights and Freedoms was chosen as Canada’s most important national symbol, with 93 per cent support.

In other words, Canadians consider rights protection to be core to our sense of who we are as a people. Thus, it’s time to stop looking at rights, the technologies that protect them, and the people who argue for them, as barriers.

Indeed, it’s time we talked about public safety, new technologies, and reasonable expectations in a way that rebuilds trust and provides a solid foundation for a Canada in which our persons, property and rights all have strong and effective protection.

Dr. Brenda McPhail is the director of the privacy, technology and surveillance project at the Canadian Civil Liberties Association.

Chen, John - Protecting customer privacy is a core BlackBerry principle - 20160418


When it comes to doing the right thing in difficult situations, BlackBerry’s guiding principle has been to do what is right for the citizenry, within legal and ethical boundaries. We have long been clear in our stance that tech companies as good corporate citizens should comply with reasonable lawful access requests. I have stated before that we are indeed in a dark place when companies put their reputations above the greater good.

This very belief was put to the test in an old case that recently resurfaced in the news, which speculated on and challenged BlackBerry’s corporate and ethical principles. In the end, the case resulted in a major criminal organization being dismantled. Regarding BlackBerry’s assistance, I can reaffirm that we stood by our lawful access principles. Furthermore, at no point was BlackBerry’s BES server involved. Our BES continues to be impenetrable – also without the ability for backdoor access – and is the most secure mobile platform for managing all mobile devices. That’s why we are the gold standard in government and enterprise-grade security.

For BlackBerry, there is a balance between doing what’s right, such as helping to apprehend criminals, and preventing government abuses such as invasions of citizens’ privacy, as when we refused to give Pakistan access to our servers. We have been able to find this balance even as governments have pressured us to change our ethical grounds. Despite these pressures, our position has been unwavering, and our actions are proof of our commitment to these principles.

Threatpost - Blackberry CEO defends lawful access principles, supports phone hack - 20160419


BlackBerry’s CEO made the company’s stance on lawful access requests clear this week and is defending actions to provide Canadian law enforcement with what it needed to decrypt communications between devices.

The company’s CEO, John Chen, penned a statement on Monday reiterating that customer privacy is one of BlackBerry’s core principles, but also acknowledging that BlackBerry stood by its “lawful access principles” in a recently publicized criminal investigation in which it was alleged that BlackBerry assisted law enforcement in retrieving data from a phone.

“We have long been clear in our stance that tech companies as good corporate citizens should comply with reasonable lawful access requests,” Chen said. Then, in a thinly veiled jab at Apple, Chen added, “I have stated before that we are indeed in a dark place when companies put their reputations above the greater good.”

Speculation around the inner workings of the case, which deals with a mafia-related murder in Montreal, has intensified over the last week following a Vice report on Thursday. According to the news outlet, the Royal Canadian Mounted Police (RCMP) – the country’s federal police force – successfully intercepted and decrypted over one million BlackBerry messages relating to the case between 2010 and 2012.

Reporters combed through thousands of court documents that strongly suggest that both BlackBerry and Rogers, a Canadian communications company, cooperated with law enforcement to do so. Particularly telling was a reference in the documents to a “decryption key” that deals with “BlackBerry interception.”

The RCMP oversees a server in Ottawa that “simulates a mobile device that receives a message intended for [the rightful recipient]” according to court filings. In another document, an affidavit, RCMP Sergeant Patrick Boismenu said the server is referred to by the RCMP as a “BlackBerry interception and processing system,” and that it “performs the decryption of the message using the appropriate decryption key.”

BlackBerry has long used a global encryption key – a PIN that it uses to decrypt messages – for its consumer devices.

It’s unclear how exactly the RCMP secured access to a BlackBerry decryption key, or for that matter whether it still has the key, but BlackBerry “facilitated the interception process,” according to court transcripts of testimony from RCMP inspector Mark Flynn.

Defense lawyers believe the technology the RCMP is using to target BlackBerry devices mimics a cell phone tower and can be manipulated to intercept devices and forward information to police. Largely known as Stingray tracking devices or International Mobile Subscriber Identity (IMSI) catchers, the RCMP refers to the devices as “mobile device identifiers” or “MDIs.” The Globe and Mail did a deep dive on the technology on Monday, noting the technology has been in use in Canada since 2011 and is capable of knocking people calling 911 offline.

If the RCMP is still in possession of the global key, it’s likely that Mounties could still use it to decrypt PIN-to-PIN communications on consumer devices.
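The structural risk described here can be sketched in a few lines of code. BlackBerry has never published the details of its consumer PIN-to-PIN scheme, so the cipher below is a toy SHA-256 keystream, not BlackBerry's actual algorithm, and the key value is invented; the point is only that whoever holds the one shared global key can read the traffic of every consumer device, not just a single suspect's.

```python
import hashlib
from itertools import count

def keystream(key: bytes, nonce: bytes):
    """Toy SHA-256 counter-mode keystream (illustrative only)."""
    for i in count():
        block = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        yield from block

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; the same call encrypts and decrypts."""
    ks = keystream(key, nonce)
    return bytes(b ^ next(ks) for b in data)

# One key baked into every consumer handset (hypothetical value):
GLOBAL_KEY = b"example-global-pin-key"

msg_a = xor_crypt(GLOBAL_KEY, b"device-A", b"meet at the usual place")
msg_b = xor_crypt(GLOBAL_KEY, b"device-B", b"shipment arrives tuesday")

# Anyone holding GLOBAL_KEY -- the vendor, or an agency the key is shared
# with -- recovers the plaintext from *every* device's ciphertext:
plain_a = xor_crypt(GLOBAL_KEY, b"device-A", msg_a)
plain_b = xor_crypt(GLOBAL_KEY, b"device-B", msg_b)
```

Contrast this with a per-device or per-user key: compromising one key would then expose one device, not the whole installed base.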

While Chen didn’t get into specifics around his company’s move, he lauded it on Monday.

“Regarding BlackBerry’s assistance, I can reaffirm that we stood by our lawful access principles,” Chen wrote, further likening it to doing the right thing in a difficult situation and boasting that it helped lead to a “major criminal organization being dismantled.”

Conversely, privacy experts questioned Chen’s statement and pondered whether it could signal the beginning of the end for the company.

“I think Chen is traveling down a very dangerous path here,” Richard Morochove, a computer forensics investigator with Toronto-based computer consulting firm Morochove & Associates, said Tuesday on Canada’s Business News Network. “With this announcement he’s just pounded a big nail into BlackBerry’s coffin.”

BlackBerry uses a global key for its consumer devices, but Chen insists that the company’s BlackBerry Enterprise Server (BES) was not involved in the case and that messages sent from corporate BlackBerry phones cannot be decrypted.

“Our BES continues to be impenetrable – also without the ability for backdoor access – and is the most secure mobile platform for managing all mobile devices,” Chen wrote.

While that means that many of the company’s higher-end clientele, government workers and corporations, are protected, any consumers who own BlackBerry devices may have been open, or could still be open, to spying by the Canadian police.

Chen’s position of course marks a stark delineation between BlackBerry and Apple, another company that’s been waging its own battle with the government over granting access to customer information.

While Apple refused to break its own crypto to let the FBI bypass the iPhone’s encryption, it sounds like all law enforcement has to do to break into a BlackBerry is ask.

The Senate’s Draft Encryption Bill Is ‘Ludicrous, Dangerous, Technically Illiterate' - Wired 20160408


As Apple battled the FBI for the last two months over the agency’s demands that Apple help crack its own encryption, both the tech community and law enforcement hoped that Congress would weigh in with some sort of compromise solution. Now Congress has spoken on crypto, and privacy advocates say its “solution” is the most extreme stance on encryption yet.

On Thursday evening, the draft text of a bill called the “Compliance with Court Orders Act of 2016,” authored by the offices of Senators Dianne Feinstein and Richard Burr, was published online by the Hill.[1] It’s a nine-page piece of legislation that would require people to comply with any authorized court order for data—and if that data is “unintelligible,” the legislation would demand that it be rendered “intelligible.” In other words, the bill would make illegal the sort of user-controlled encryption that’s in every modern iPhone, in all billion devices that run WhatsApp’s messaging service, and in dozens of other tech products. “This basically outlaws end-to-end encryption,” says Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology. “It’s effectively the most anti-crypto bill of all anti-crypto bills.”


Kevin Bankston, the director of the New America Foundation’s Open Technology Institute, goes even further: “I gotta say in my nearly 20 years of work in tech policy this is easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen,” he says.

The bill, Hall and Bankston point out, doesn’t specifically suggest any sort of backdoored encryption or other means to even attempt to balance privacy and encryption, and actually claims to not require any particular design limitations on products. Instead, it states only that communications firms must provide unencrypted data to law enforcement or the means for law enforcement to grab that data themselves. “To uphold the rule of law and protect the security and interests of the United States, all persons receiving an authorized judicial order for information or data must provide, in a timely manner, responsive and intelligible information or data, or appropriate technical assistance to obtain such information or data.”

Hall describes that as a “performance standard. You have to provide this stuff, and we’re not going to tell you how to do it,” he says. George Washington Law School professor Orin Kerr points out on Twitter that the text doesn’t even limit tech firms’ obligations to “reasonable assistance” but rather “assistance as is necessary,” a term that means the bill goes beyond current laws that the government has used to try to compel tech firms to help with data access such as the All Writs Act.

Even more extreme, the draft bill also includes the requirement that “license distributors” ensure all “products, services, applications or software” they distribute provide that same easy access for law enforcement. “Apple’s app store, Google’s play store, any platform for software applications somehow has to vet every app to ensure they have backdoored or little enough security to comply,” says Bankston. That means, he says, that this would “seem to also be a massive internet censorship bill.”

I could spend all night listing the various ways that Feinstein-Burr is flawed & dangerous. But let's just say, "in every way possible."

— matt blaze (@mattblaze) April 8, 2016

If Grandpa Simpson was a Senator who was afraid of and confused by encryption, I think he'd write something like the Feinstein/Burr bill.

— Kevin Bankston (@KevinBankston) April 8, 2016

It's not hard to see why the White House declined to endorse Feinstein-Burr. They took a complex issue, arrived at the most naive solution.

— Matthew Green (@matthew_d_green) April 8, 2016

Burr and Feinstein’s bill disappoints its privacy critics in part because it seems to entirely ignore the points already made in a debate that’s raged for well over a year, and has its roots in the crypto wars of the 1990s. Last summer, for instance, more than a dozen of the world’s top cryptographers published a paper warning of the dangers of weakening encryption on behalf of law enforcement. They cautioned that any backdoor created to give law enforcement access to encrypted communications would inevitably be used by sophisticated hackers and foreign cyberspies. And privacy advocates have also pointed out that any attempt to ban strong encryption in American products would only force people seeking law-enforcement-proof data protection to use encryption software created outside the U.S., of which there is plenty to choose from. Apple, in its lengthy, detailed arguments with the FBI in front of Congress and in legal filings, has called that weakening of Americans’ security a “unilateral disarmament” in its endless war with hackers to protect its users’ privacy.

Tom Mentzer, a spokesman for Senator Feinstein, told WIRED in a statement on behalf of both bill sponsors that “we’re still working on finalizing a discussion draft and as a result can’t comment on language in specific versions of the bill. However, the underlying goal is simple: when there’s a court order to render technical assistance to law enforcement or provide decrypted information, that court order is carried out. No individual or company is above the law. We’re still in the process of soliciting input from stakeholders and hope to have final language ready soon.”

The Burr/Feinstein draft text may in fact be so bad for privacy that it’s good for privacy: Privacy advocates point out that it has almost zero likelihood of making it into law in its current form. The White House has already declined to publicly support the bill. And Adam Schiff, the top Democratic congressman on the House of Representatives’ intelligence committee, gave WIRED a similarly ambivalent comment on the upcoming legislation yesterday. “I don’t think Congress is anywhere near a consensus on the issue,” Schiff said, “given how difficult it was to legislate the relatively easy [Cyber Information Sharing Act], and this is comparatively far more difficult and consequential.”

Bankston puts it more simply. “The CCOA is DOA,” he says, coining an acronym for the draft bill. But he warns that privacy activists and tech firms should be careful nonetheless not to underestimate the threat it represents. “We have to take this seriously,” he says. “If this is the level of nuance and understanding with which our policymakers are viewing technical issues we’re in a profoundly worrisome place.”

[1] Correction 4/8/2016 1:00pm EST: A previous version of this story stated that the draft bill text had been released by the senators, which a spokesperson for Senator Burr has since said in a statement to WIRED she didn’t “believe was consistent with the facts.”

Apple Motion to Vacate - 20160225



This is not a case about one isolated iPhone. Rather, this case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of hundreds of millions of individuals around the globe. The government demands that Apple create a back door to defeat the encryption on the iPhone, making its users' most confidential and personal information vulnerable to hackers, identity thieves, hostile foreign agents, and unwarranted government surveillance. The All Writs Act, first enacted in 1789 and on which the government bases its entire case, "does not give the district court a roving commission" to conscript and commandeer Apple in this manner. Plum Creek Lumber Co. v. Hutton, 608 F.2d 1283, 1289 (9th Cir. 1979). In fact, no court has ever authorized what the government now seeks, no law supports such unlimited and sweeping use of the judicial process, and the Constitution forbids it.

Since the dawn of the computer age, there have been malicious people dedicated to breaching security and stealing stored personal information. Indeed, the government itself falls victim to hackers, cyber-criminals, and foreign agents on a regular basis, most famously when foreign hackers breached Office of Personnel Management databases and gained access to personnel records, affecting over 22 million current and former federal workers and family members. In the face of this daily siege, Apple is dedicated to enhancing the security of its devices, so that when customers use an iPhone, they can feel confident that their most private personal information—financial records and credit card information, health information, location data, calendars, personal and political beliefs, family photographs, information about their children—will be safe and secure. To this end, Apple uses encryption to protect its customers from cyber-attack and works hard to improve security with every software release because the threats are becoming more frequent and sophisticated. Beginning with iOS 8, Apple added additional security features that incorporate the passcode into the encryption system. It is these protections that the government now seeks to roll back by judicial decree.

There are two important and legitimate interests in this case: the needs of law enforcement and the privacy and personal safety interests of the public. In furtherance of its law enforcement interests, the government had the opportunity to seek amendments to existing law, to ask Congress to adopt the position it urges here. But rather than pursue new legislation, the government backed away from Congress and turned to the courts, a forum ill-suited to address the myriad competing interests, potential ramifications, and unintended consequences presented by the government's unprecedented demand. And more importantly, by invoking "terrorism" and moving ex parte behind closed courtroom doors, the government sought to cut off debate and circumvent thoughtful analysis.

The order demanded by the government compels Apple to create a new operating system—effectively a "back door" to the iPhone—that Apple believes is too dangerous to build. Specifically, the government would force Apple to create new software with functions to remove security features and add a new capability to the operating system to attack iPhone encryption, allowing a passcode to be input electronically. This would make it easier to unlock the iPhone by "brute force," trying thousands or millions of passcode combinations with the speed of a modern computer. In short, the government wants to compel Apple to create a crippled and insecure product. Once the process is created, it provides an avenue for criminals and foreign agents to access millions of iPhones. And once developed for our government, it is only a matter of time before foreign governments demand the same tool.

The government says: "Just this once" and "Just this phone." But the government knows those statements are not true; indeed the government has filed multiple other applications for similar orders, some of which are pending in other courts. And as news of this Court's order broke last week, state and local officials publicly declared their intent to use the proposed operating system to open hundreds of other seized devices—in cases having nothing to do with terrorism. If this order is permitted to stand, it will only be a matter of days before some other prosecutor, in some other important case, before some other judge, seeks a similar order using this case as precedent. Once the floodgates open, they cannot be closed, and the device security that Apple has worked so tirelessly to achieve will be unwound without so much as a congressional vote. As Tim Cook, Apple's CEO, recently noted: "Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks—from restaurants and banks to stores and homes. No reasonable person would find that acceptable." Declaration of Nicola T. Hanna ("Hanna Decl."), Ex. D [Apple Inc., A Message to Our Customers (Feb. 16, 2016)].

Despite the context of this particular action, no legal principle would limit the use of this technology to domestic terrorism cases—but even if such limitations could be imposed, it would only drive our adversaries further underground, using encryption technology made by foreign companies that cannot be conscripted into U.S. government service—leaving law-abiding individuals shouldering all of the burdens on liberty, without any offsetting benefit to public safety. Indeed, the FBI's repeated warnings that criminals and terrorists are able to "go dark" behind end-to-end encryption methods prove this very point. See Hanna Decl. Ex. F [FBI, Operational Technology, Going Dark Issue (last visited Feb. 23, 2016) ("FBI, Going Dark")].

Finally, given the government's boundless interpretation of the All Writs Act, it is hard to conceive of any limits on the orders the government could obtain in the future. For example, if Apple can be forced to write code in this case to bypass security features and create new accessibility, what is to stop the government from demanding that Apple write code to turn on the microphone in aid of government surveillance, activate the video camera, surreptitiously record conversations, or turn on location services to track the phone's user? Nothing.

As FBI Director James Comey expressly recognized:

Democracies resolve such tensions through robust debate. . . . It may be that, as a people, we decide the benefits [of strong encryption] outweigh the costs and that there is no sensible, technically feasible way to optimize privacy and safety in this particular context, or that public safety folks will be able to do their job well enough in the world of universal strong encryption. Those are decisions Americans should make, but I think part of my job is [to] make sure the debate is informed by a reasonable understanding of the costs.

Hanna Decl. Ex. G [James Comey, Encryption, Public Safety, and "Going Dark," Lawfare (July 6, 2015, 10:38 AM) ("Comey, Going Dark")]; see also Hanna Decl. Ex. H [James Comey, We Could Not Look the Survivors in the Eye if We Did Not Follow This Lead, Lawfare (Feb. 21, 2016, 9:03 PM) ("Comey, Follow This Lead")] (reiterating that the tension between national security and individual safety and privacy "should not be resolved by the FBI, which investigates for a living[, but rather] . . . by the American people . . . ."). The government, by seeking an order mandating that Apple create software to destabilize the security of the iPhone and the law-abiding citizens who use it to store data touching on every facet of their private lives, is not acting to inform or contribute to the debate; it is seeking to avoid it.

Apple strongly supports, and will continue to support, the efforts of law enforcement in pursuing justice against terrorists and other criminals—just as it has in this case and many others. But the unprecedented order requested by the government finds no support in the law and would violate the Constitution. Such an order would inflict significant harm—to civil liberties, society, and national security—and would preempt decisions that should be left to the will of the people through laws passed by Congress and signed by the President. Accordingly, the Court should vacate the order and deny the government's motion to compel.

II. BACKGROUND

A. Apple's Industry-Leading Device Security.
Apple is committed to data security. Encryption provides Apple with the strongest means available to ensure the safety and privacy of its customers against threats known and unknown. For several years, iPhones have featured hardware- and software-based encryption of their password-protected contents. Declaration of Erik Neuenschwander ("Neuenschwander Decl.") ¶ 8. These protections safeguard the encryption keys on the device with a passcode designated by the user during setup. Id. ¶ 9. This passcode immediately becomes entangled with the iPhone's Unique ID ("UID"), which is permanently assigned to that one device during the manufacturing process. Id. ¶ 13. The iPhone's UID is neither accessible to other parts of the operating system nor known to Apple. See generally Hanna Decl. Ex. K [Apple Inc., iOS Security: iOS 9.0 or later (September 2015)]. These protections are designed to prevent anyone without the passcode from accessing encrypted data on iPhones. Neuenschwander Decl. ¶ 8.

Cyber-attackers intent on gaining unauthorized access to a device could break a user-created passcode, if given enough chances to guess and the ability to test passwords rapidly by automated means. To prevent such "brute-force" attempts to determine the passcode, iPhones running iOS 8 and higher include a variety of safeguards. Id. ¶ 10. For one, Apple uses a "large iteration count" to slow attempts to access an iPhone, ensuring that it would take years to try all combinations of a six-character alphanumeric passcode. Id. ¶ 11. In addition, Apple imposes escalating time delays after the entry of each invalid passcode. Id. ¶ 12. Finally, Apple also includes a setting that—if activated—automatically deletes encrypted data after ten consecutive incorrect attempts to enter the passcode. Id. This combination of security features protects users both from attackers and in cases where, for example, the user loses the device.
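The two key ideas in the passages above, entangling the passcode with a per-device UID and making each guess expensive, can be sketched with a standard key-derivation function. This is a minimal illustration, not Apple's implementation: the UID value, iteration count, and delay schedule below are all invented for the example, and on a real iPhone the UID is fused into the hardware AES engine and is never readable by software.

```python
import hashlib

# Hypothetical per-device secret; on real hardware this never leaves the chip.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(passcode: str, uid: bytes = DEVICE_UID,
               iterations: int = 200_000) -> bytes:
    """Entangle the user passcode with the device UID via PBKDF2-HMAC-SHA256.

    Because the UID is device-bound, guesses cannot be farmed out to faster
    offline hardware, and the large iteration count slows each on-device try.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, iterations)

def lockout_delay(failed_attempts: int) -> float:
    """Escalating delay (seconds) before the next attempt is allowed.

    Illustrative schedule only; iOS can additionally wipe the keys after
    ten consecutive failures if the user enables that setting.
    """
    schedule = [0, 0, 0, 0, 0, 60, 300, 900, 1800, 3600]
    if failed_attempts < len(schedule):
        return schedule[failed_attempts]
    return float("inf")

# The same passcode on a *different* device yields a different key,
# so a key extracted from one phone is useless against another:
other_uid = bytes.fromhex("ffeeddccbbaa99887766554433221100")
assert derive_key("123456") != derive_key("123456", uid=other_uid)
```

The software the government sought would not break this derivation directly; it would remove the delay and auto-wipe safeguards so that passcodes could be tried electronically at full speed, which is why Apple characterized it as an attack on the encryption system as a whole.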
B. The Government Abandoned Efforts To Obtain Legal Authority For Mandated Back Doors.
Some in the law enforcement community have disparaged the security improvements by Apple and others, describing them as creating a "going dark" problem in which law enforcement may possess the "legal authority to intercept and access communications and information pursuant to court orders" but lack the "technical ability to carry out those orders because of a fundamental shift in communications services and technologies." As a result, some officials have advanced the view that companies should be required to maintain access to user communications and data and provide that information to law enforcement upon satisfaction of applicable legal requirements. This would give the government, in effect, a back door to otherwise encrypted communications—which would be precisely the result of the government's position in this case.

(Feb. 17, 2016)]; cf. Hanna Decl. Ex. J [Damian Paletta, How the U.S. Fights Encryption—and Also Helps Develop It, Wall St. J. (Feb. 22, 2016)] (describing funding by the U.S. government of stronger encryption technologies).

Apple and other technology companies, supported by leading security experts, have disagreed with law enforcement's position, observing that any back door enabling government officials to obtain encrypted data would also create a vulnerability that could be exploited by criminals and foreign agents, weakening critical security protections and creating new and unforeseen access to private information. For these reasons, Apple and others have strongly opposed efforts to require companies to enable the government to obtain encrypted information, arguing that this would compromise the security offered to its hundreds of millions of law-abiding customers in order to weaken security for the few who may pose a threat.

As leading former national security officials have made clear, Apple's "resistance to building in a back door" in whatever form it may take is well-justified, because "the greater public good is a secure communications infrastructure protected by ubiquitous encryption at the device, server and enterprise level without building in means for government monitoring."

In recent years, however, the government, led by the Department of Justice, has considered legislative proposals that would have mandated such a back door. Those proposals sought to significantly expand the reach of the Communications Assistance for Law Enforcement Act ("CALEA"), 47 U.S.C. § 1001 et seq., in which Congress defined the circumstances under which private companies must assist law enforcement in executing authorized electronic surveillance and the nature of—and limits on—the assistance such companies must provide. In addressing the twin needs of law enforcement and privacy, Congress, through CALEA, specified when a company has an obligation to assist the government with decryption of communications, and made clear that a company has no obligation to do so where, as here, the company does not retain a copy of the decryption key. 47 U.S.C. § 1002(b)(3). Congress, keenly aware of and focusing on the specific area of dispute here, thus opted not to provide authority to compel companies like Apple to assist law enforcement with respect to data stored on a smartphone they designed and manufactured.

The government's proposed changes to CALEA would have dramatically expanded the law's scope by mandating that companies install back doors into their products to ensure that authorities can access encrypted data when authorized to do so. In the face of this proposal—commonly referred to as "CALEA II"—leading technology companies, including Apple, as well as public interest organizations like the ACLU and Human Rights Watch, urged President Obama to "reject any proposal that U.S. companies deliberately weaken the security of their products . . . [and] instead focus on developing policies that will promote rather than undermine the wide adoption of strong encryption technology."

The Executive Branch ultimately decided not to pursue CALEA II, and Congress has left CALEA untouched, meaning that Congress never granted the authority the government now asserts. Moreover, members of Congress have recently introduced three pieces of legislation that would affirmatively prohibit the government from forcing private companies like Apple to compromise data security. On October 8, 2015, FBI Director Comey confirmed that the Obama Administration would not seek passage of CALEA II at that time. Instead, Director Comey expressed his view that the "going dark" debate raises issues that "to a democracy should be very, very concerning" and therefore the issue is "worthy of a larger public conversation." President Obama has also remarked that it is "useful to have civil libertarians and others tapping us on the shoulder in the midst of this process and reminding us that there are values at stake as well," noting further that he "welcome[s] that kind of debate." As the President has recognized, these issues are part of "a public conversation that we should end up having."
C. Apple's Substantial Assistance In The Government's Investigation
Apple was shocked and saddened by the mindless savagery of the December 2, 2015 terrorist attack in San Bernardino. In the days following the attack, the FBI approached Apple for help in its investigation. Apple responded immediately, and devoted substantial resources on a 24/7 basis to support the government's investigation of this heinous crime. Declaration of Lisa Olle ("Olle Decl.") ¶¶ 5-9.
Apple promptly provided all data that it possessed relating to the attackers' accounts and that the FBI formally requested via multiple forms of legal process, in keeping with Apple's commitment to comply with all legally valid subpoenas and search warrants that the company receives. Id. Additionally, Apple has furnished valuable informal assistance to the government's investigation—participating in teleconferences, providing technical assistance, answering questions from the FBI, and suggesting potential alternatives for the government to attempt to obtain data from the iPhone at issue. Id. ¶ 6.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attackers' accounts, foreclosing the possibility that the phone would initiate an automatic iCloud back-up of its data once connected to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus the need for the extraordinary order the government now seeks. Had the FBI consulted Apple first, this litigation may not have been necessary.

D. The Government's Ex Parte Application Under The All Writs Act, And This Court's Order
On February 16, 2016, the government filed an ex parte application and proposed order asking the Court to compel Apple to assist in the government's investigation under the authority of the All Writs Act, codified at 28 U.S.C. § 1651.

With no opposition or other perspectives to consider, the Court granted the government's request and signed the government's proposed order, thereby compelling Apple to create new software that would allow the government to hack into an iPhone 5c used by one of the attackers. Order Compelling Apple Inc. to Assist Agents in Search, In the Matter of the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, Cal. License Plate 35KGD203, No. ED 15-0451M (Feb. 16, 2016), Dkt. 19 (the "Order").

The Order directs Apple to provide "reasonable technical assistance to assist law enforcement agents in obtaining access to the data" on the device. Id. ¶ 1. The Order further defines this "reasonable technical assistance" to include creating custom software that can be loaded on the iPhone to accomplish three goals: (1) bypass or disable the iPhone's "auto-erase" function, designed to protect against efforts to obtain unauthorized access to the device's encrypted contents by deleting encrypted data after ten unsuccessful attempts to enter the iPhone's passcode, (2) enable the FBI to electronically submit passcodes to the device for testing, bypassing the requirement that passcodes be manually entered, and (3) remove any time delays between entering incorrect passcodes. Id. ¶ 2. Because the government proceeded ex parte, Apple had no opportunity to weigh in on whether such assistance was "reasonable," and thus the government's request was assumed to be.
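The combined effect of the three requested changes can likewise be sketched with rough arithmetic. This is an illustration under stated assumptions (an ~80 ms per-attempt key-derivation cost and a four-digit numeric passcode), not a description of the actual software:

```python
# Illustrative sketch (assumptions, not from the record): with the
# auto-erase limit disabled, the inter-attempt delays removed, and
# passcodes submitted electronically, only the assumed ~80 ms
# key-derivation cost bounds a brute-force search.

def worst_case_minutes(alphabet_size: int, length: int,
                       secs_per_attempt: float = 0.08) -> float:
    """Worst-case minutes to try every passcode of the given length."""
    return alphabet_size ** length * secs_per_attempt / 60

# A common four-digit numeric passcode (10,000 combinations):
print(round(worst_case_minutes(10, 4), 1))  # -> 13.3
```

On those assumptions, disabling auto-erase and the delays while permitting electronic passcode entry would reduce the worst case from an effectively unbounded risk of data loss to minutes of automated guessing.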

The software envisioned by the government simply does not exist today. Thus, at bottom, the Order would compel Apple to create a new version of the iPhone operating system designed to defeat the critical security features noted previously for the specific purpose of accessing the device's contents in unencrypted form—in other words, to write new software to create a back door to the device's encrypted data.

E. The Resources And Effort Required To Develop The Software Demanded By The Government
The compromised operating system that the government demands would require significant resources and effort to develop. Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks. Neuenschwander Decl. ¶ 22. Members of the team would include engineers from Apple's core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer. Id.
No operating system currently exists that can accomplish what the government wants, and any effort to create one will require that Apple write new code, not just disable existing code functionality. Id. ¶¶ 24-25. Rather, Apple will need to design and implement untested functionality in order to allow the capability to enter passcodes into the device electronically in the manner that the government describes. Id. ¶ 24. In addition, Apple would need to either develop and prepare detailed documentation for the above protocol to enable the FBI to build a brute-force tool that is able to interface with the device to input passcode attempts, or design, develop and prepare documentation for such a tool itself. Id. ¶ 25. Further, if the tool is utilized remotely (rather than at a secure Apple facility), Apple will also have to develop procedures to encrypt, validate, and input into the device communications from the FBI. Id. This entire development process would need to be logged and recorded in case Apple's methodology is ever questioned, for example in court by a defense lawyer for anyone charged in relation to the crime. Id. ¶ 28.

Once created, the operating system would need to go through Apple's quality assurance and security testing process. Id. ¶ 29. Apple's software ecosystem is incredibly complicated, and changing one feature of an operating system often has ancillary or unanticipated consequences. Id. ¶ 30. Thus, quality assurance and security testing would require that the new operating system be tested on multiple devices and validated before being deployed. Id. Apple would have to undertake additional testing efforts to confirm and validate that running this newly developed operating system to bypass the device's security features will not inadvertently destroy or alter any user data. Id. ¶ 31. To the extent problems are identified (which is almost always the case), solutions would need to be developed and re-coded, and testing would begin anew. Id. ¶ 32. As with the development process, the entire quality assurance and security testing process would need to be logged, recorded, and preserved. Id. ¶ 33. Once the new custom operating system is created and validated, it would need to be deployed onto the subject device, which would need to be done at an Apple facility. Id. ¶¶ 34-35. And if the new operating system has to be destroyed and recreated each time a new order is issued, the burden will multiply. Id. ¶¶ 44-45.

A. The All Writs Act Does Not Provide A Basis To Conscript Apple To Create Software Enabling The Government To Hack Into iPhones.

The All Writs Act (or the "Act") does not provide the judiciary with the boundless and unbridled power the government asks this Court to exercise. The Act is intended to enable the federal courts to fill in gaps in the law so they can exercise the authority they already possess by virtue of the express powers granted to them by the Constitution and Congress; it does not grant the courts free-wheeling authority to change the substantive law, resolve policy disputes, or exercise new powers that Congress has not afforded them. Accordingly, the Ninth Circuit has squarely rejected the notion that "the district court has such wide-ranging inherent powers that it can impose a duty on a private party when Congress has failed to impose one. To so rule would be to usurp the legislative function and to improperly extend the limited federal court jurisdiction." Plum Creek, 608 F.2d at 1290 (emphasis added).
Congress has never authorized judges to compel innocent third parties to provide decryption services to the FBI. Indeed, Congress has expressly withheld that authority in other contexts, and this issue is currently the subject of a raging national policy debate among members of Congress, the President, the FBI Director, and state and local prosecutors. Moreover, federal courts themselves have never recognized an inherent authority to order non-parties to become de facto government agents in ongoing criminal investigations. Because the Order is not grounded in any duly enacted rule or statute, and goes well beyond the very limited powers afforded by Article III of the Constitution and the All Writs Act, it must be vacated.

1. The All Writs Act Does Not Grant Authority To Compel Assistance Where Congress Has Considered But Chosen Not To Confer Such Authority.
The authority the government seeks here cannot be justified under the All Writs Act because law enforcement assistance by technology providers is covered by existing laws that specifically omit providers like Apple from their scope. The All Writs Act authorizes courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law," 28 U.S.C. § 1651(a), but as the Supreme Court has held, it "does not authorize [courts] to issue ad hoc writs whenever compliance with statutory procedures appears inconvenient or less appropriate," Pa. Bureau of Corr. v. U.S. Marshals Serv., 474 U.S. 34, 38, 43 (1985) (holding that the Act did not confer power on the district court to compel non-custodians to bear the expense of producing the prisoner-witnesses); see also In the Matter of an Application of U.S. of Am. for an Order Authorizing Disclosure of Location Info. of a Specified Wireless Tel., 849 F. Supp. 2d 526, 578 (D. Md. 2011) (holding that the Act does not authorize an "end run around constitutional and statutory law"). The Ninth Circuit likewise has emphasized that the "All Writs Act is not a grant of plenary power to federal courts. Rather, it is designed to aid the courts in the exercise of their jurisdiction." Plum Creek, 608 F.2d at 1289 (holding that the Act "does not give the district court a roving commission to order a party subject to an investigation to accept additional risks at the bidding" of the government); see also Ex parte Bollman, 8 U.S. 75 (1807) ("[C]ourts which are created by written law, and whose jurisdiction is defined by written law, cannot transcend that jurisdiction.").

Thus, in another pending case in which the government seeks to compel Apple to assist in obtaining information from a drug dealer's iPhone, Magistrate Judge Orenstein issued an order stating that while the Act may be appropriately invoked "to fill in a statutory gap that Congress has failed to consider," it cannot be used to grant the government authority "Congress chose not to confer." In re Order Requiring Apple, Inc. to Assist in the Execution of a Search Warrant Issued by this Court ("In re Order"), No. 15-MC-1902, 2015 WL 5920207, at *2 (E.D.N.Y. Oct. 9, 2015).

Congress knows how to impose a duty on third parties to facilitate the government's decryption of devices. Similarly, it knows exactly how to place limits on what the government can require of telecommunications carriers and also on manufacturers of telephone equipment and handsets. And in CALEA, Congress decided not to require electronic communication service providers, like Apple, to do what the government seeks here. Contrary to the government's contention that CALEA is inapplicable to this dispute, Congress declared via CALEA that the government cannot dictate to providers of electronic communications services or manufacturers of telecommunications equipment any specific equipment design or software configuration.

In the section of CALEA entitled "Design of features and systems configurations," 47 U.S.C. § 1002(b)(1), the statute says that it "does not authorize any law enforcement agency or officer—
(1) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.
(2) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.
Apple unquestionably serves as a provider of "electronic communications services" through the various messaging services it provides to its customers through iPhones.
See Quon v. Arch Wireless Operating Co., Inc., 529 F.3d 892, 901 (9th Cir. 2008). Apple also makes mobile phones. As such, CALEA does not allow a law enforcement agency to require Apple to implement any specific design of its equipment, facilities, services or system configuration. Yet, that is precisely what the government seeks here. Thus, CALEA's restrictions are directly on point.
Moreover, CALEA also intentionally excludes "information services providers," like Apple, from the scope of its mandatory assistance provisions. This exclusion precludes the government from using the All Writs Act to require Apple to do that which Congress eschewed. But even if Apple were covered by CALEA, the law does not require covered telecommunication carriers (which Apple is not) to be responsible for "decrypting, or ensuring the government's ability to decrypt, any communication encrypted by a subscriber or customer unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication." 47 U.S.C. § 1002(b)(3) (emphasis added).

Thus, here again, CALEA makes a specific choice to allow strong encryption (or any other security feature or configuration) with keys chosen by end users to be deployed, and prevents the government from mandating that such encryption schemes contain a "back door." See also H.R. Rep. 103-827(I), at 24, 1994 U.S.C.C.A.N. 3489, 3504 (emphasizing that CALEA does not "prohibit a carrier from deploying an encryption service for which it does not retain the ability to decrypt communications for law enforcement access"; "[n]or does the Committee intend this bill to be in any way a precursor to any kind of ban or limitation on encryption technology. To the contrary, [§ 1002] protects the right to use encryption.").

Similarly, outside of CALEA, Congress also knows how to require third parties to provide "technical assistance," see Wiretap Act, 18 U.S.C. § 2518(4) (providing that upon the lawful execution of a wiretap, the government can seek an order compelling a third party to furnish "all information, facilities, and technical assistance necessary to accomplish the interception"); Pen/Trap Statute, id. § 3123(b)(2) (similar), but Congress has intentionally opted not to compel third parties' assistance in retrieving stored information on devices. That Congress, confronted over the years with the contentious debate about where to draw the lines among competing security and privacy interests, made this decision, "indicates a deliberate congressional choice with which the courts should not interfere." Cent. Bank of Denver, N.A. v. First Interstate Bank of Denver, N.A., 511 U.S. 164, 184 (1994). The Executive Branch, having considered and then declined to urge Congress to amend CALEA to enable it to compel the type of assistance demanded here, cannot seek that same authority via an ex parte application for a court order under the Act.

For the courts to use the All Writs Act to expand sub rosa the obligations imposed by CALEA as proposed by the government here would not just exceed the scope of the statute, but it would also violate the separation-of-powers doctrine. Just as "Congress may not exercise the judicial power to revise final judgments," Clinton v. Jones, 520 U.S. 681, 699 (1997) (citing Plaut v. Spendthrift Farm, Inc., 514 U.S. 211 (1995)), courts may not exercise the legislative power by repurposing statutes to meet the evolving needs of society, see Clark v. Martinez, 543 U.S. 371, 391 (2005) (court should "avoid inventing a statute rather than interpreting one") (citation, quotation marks, and alterations omitted); see also Alzheimer's Inst. of Am. Inc. v. Elan Corp., 2013 WL 8744216, at *2 (N.D. Cal. Jan. 31, 2013) (Congress alone has authority "to update" a "technologically antiquated" statute "to address the new and rapidly evolving era of computer and cloud-stored, processed and produced data"). Nor does Congress lose "its exclusive constitutional authority to make laws necessary and proper to carry out the powers vested by the Constitution" in times of crisis (whether real or imagined). Youngstown Sheet & Tube Co. v. Sawyer, 343 U.S. 579, 588-89 (1952). Because a "decision to rearrange or rewrite [a] statute falls within the legislative, not the judicial prerogative[,]" the All Writs Act cannot possibly be deemed to grant to the courts the extraordinary power the government seeks. Xi v. INS, 298 F.3d 832, 839 (9th Cir. 2002).

If anything, whether companies like Apple should be compelled to create a back door to their own operating systems to assist law enforcement is a political question, not a legal one. See Baker v. Carr, 369 U.S. 186, 217 (1962) (holding that a case is a nonjusticiable political question if it is impossible to decide "without an initial policy determination of a kind clearly for nonjudicial discretion"); see also Vieth v. Jubelirer, 541 U.S. 267, 277-290 (2004) (plurality opinion) (dismissing claims of political gerrymandering under the political question doctrine because there was no "judicially discoverable and manageable standard for resolving" them); Diamond v. Chakrabarty, 447 U.S. 303, 317 (1980) ("The choice [the court is] urged to make is a matter of high policy for resolution within the legislative process after the kind of investigation, examination, and study that legislative bodies can provide and courts cannot."); Saldana v. Occidental Petroleum Corp., 774 F.3d 544, 552 (9th Cir. 2014) (per curiam) (affirming district court's holding that the claims were "inextricably bound to an inherently political question" and thus were "beyond the jurisdiction of our courts").

In short, a decision to "short-circuit public debate on this controversy seems fundamentally inconsistent with the proposition that such important policy issues should be determined in the first instance by the legislative branch after public debate—as opposed to having them decided by the judiciary in sealed, ex parte proceedings." In re Order, 2015 WL 5920207, at *3 n.1. Such an important decision with such widespread global repercussions goes well beyond the purview of the All Writs Act, which merely provides courts with a limited grant of ancillary authority to issue orders "in aid of their respective jurisdictions." 28 U.S.C. § 1651(a).
2. New York Telephone Co. And Its Progeny Confirm That The All Writs Act Does Not Authorize Courts To Compel The Unprecedented And Unreasonably Burdensome Conscription Of Apple That The Government Seeks.

The government relies heavily on the Supreme Court's decision in United States v. New York Telephone Co., 434 U.S. 159 (1977), to assert that the All Writs Act permits the Court to compel private third parties like Apple to assist the government in effectuating a search warrant by writing new software code that would undermine the security of its own product. The government misapplies this case.
In New York Telephone Co., the district court compelled the company to install a simple pen register device (designed to record dialed numbers) on two telephones where there was "probable cause to believe that the [c]ompany's facilities were being employed to facilitate a criminal enterprise on a continuing basis." 434 U.S. at 174. The Supreme Court held that the order was a proper writ under the Act, because it was consistent with Congress's intent to compel third parties to assist the government in the use of surveillance devices, and it satisfied a three-part test imposed by the Court.

First, the Court found that the company was not "so far removed from the underlying controversy that its assistance could not be permissibly compelled." Id. Second, the assistance sought was "meager," and as a public utility, the company did not "ha[ve] a substantial interest in not providing assistance." Id. Third, "after an exhaustive search," the FBI was unable to find a suitable location to install its own pen registers without tipping off the targets, and thus there was "no conceivable way in which the surveillance authorized by the District Court could have been successfully accomplished" without the company's meager assistance. Id. at 175. Applying these factors to this case confirms that the All Writs Act does not permit the Court to compel the unprecedented and unreasonably burdensome assistance that the government seeks.

a. Apple's Connection To The Underlying Case Is "Far Removed" And Too Attenuated To Compel Its Assistance
Nothing connects Apple to this case such that it can be drafted into government service to write software that permits the government to defeat the security features on Apple's standard operating system. Apple is a private company that does not own or possess the phone at issue, has no connection to the data that may or may not exist on the phone, and is not related in any way to the events giving rise to the investigation. This case is nothing like New York Telephone Co., where there was probable cause to believe that the phone company's own facilities were "being employed to facilitate a criminal enterprise on a continuing basis." Id. at 174.

The government relies on United States v. Hall, 583 F. Supp. 717 (E.D. Va. 1984), and In re Application of U.S. of Am. for an Order Directing X to Provide Access to Videotapes ("Videotapes"), 2003 WL 22053105 (D. Md. Aug. 22, 2003), but these cases involved mere requests to produce existing business records, not the compelled creation of intellectual property. In Hall, the court found that the All Writs Act permitted an order compelling a credit card company to produce the credit card records of a federal fugitive's former girlfriend, because the government had reason to believe that she was harboring and supporting the fugitive, and thus potentially using her credit card to perpetrate an ongoing crime. 583 F. Supp. at 720 (reasoning that a credit card issuer "has an interest" in a transaction "when a credit card is used for an illegal purpose even though the act itself be not illegal"). Similarly, in Videotapes, the court compelled an apartment complex to provide access to videotape surveillance footage of a hallway in the apartment to assist with executing an arrest warrant on a fugitive. 2003 WL 22053105, at *3. This case is nothing like Hall and Videotapes, where the government sought assistance effectuating an arrest warrant to halt ongoing criminal activity, since any criminal activity linked to the phone at issue here ended more than two months ago when the terrorists were killed.

Further, unlike a telecommunications monopoly, Apple is not a "highly regulated public utility with a duty to serve the public." New York Telephone Co., 434 U.S. at 174; see also Application of U.S. of Am. for an Order Authorizing an In-Progress Trace of Wire Commc'ns over Tel. Facilities ("Mountain Bell"), 616 F.2d 1122, 1132 (9th Cir. 1980) (discussing New York Telephone Co. and noting that its
ruling compelling assistance under the All Writs Act relied "[t]o a great extent . . . upon the highly regulated, public nature" of the phone company); In re Order, 2015 WL 5920207, at *4-5. Whereas public utilities have no "substantial interest in not providing assistance" to the government, 434 U.S. at 174, and "enjoy a monopoly in an essential area of communications," Mountain Bell, 616 F.2d at 1131, Apple is a private company that believes that encryption is crucial to protect the security and privacy interests of citizens who use and store their most personal data on their iPhones, "from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going." Hanna Decl. Ex. D at 1 [Apple Inc., A Message to Our Customers (Feb. 16, 2016)].

That Apple "designed, manufactured and sold the SUBJECT DEVICE, and wrote and owns the software that runs the phone," Memorandum of Points and Authorities in Support of Government's Ex Parte Application for Order Compelling Apple Inc. to Assist Agents in Search, In the Matter of the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, Cal. License Plate 35KGD203, No. ED 15-0451M (Feb. 16, 2016), Dkt. 18 at 11 (the "Ex Parte App."), is insufficient to establish the connection mandated by New York Telephone Co. The All Writs Act does not allow the government to compel a manufacturer's assistance merely because it has placed a good into the stream of commerce. Apple is no more connected to this phone than General Motors is to a company car used by a fraudster on his daily commute. Moreover, that Apple's software is "licensed, not sold," Ex Parte App. at 5, is "a total red herring," as Judge Orenstein already concluded, Hanna Decl. Ex. DD at 42:4-10 [In re Order Requiring Apple Inc. to Assist in the Execution of a Search Warrant Issued by the Court, E.D.N.Y. No. 15-MC-1902, Dkt. 19 ("October 26, 2015 Transcript")]. A licensing agreement no more connects Apple to the underlying events than a sale. The license does not permit Apple to invade or control the private data of its customers. It merely limits customers' use and redistribution of Apple's software. Indeed, the government's position has no limits and, if accepted, would eviscerate the "remoteness" factor entirely, as any company that offers products or services to consumers could be conscripted to assist with an investigation, no matter how attenuated their connection to the criminal activity. This is not, and never has been, the law.

b. The Order Requested By The Government Would Impose An Unprecedented And Oppressive Burden On Apple And Citizens Who Use The iPhone.
An order pursuant to the All Writs Act "must not [1] adversely affect the basic interests of the third party or [2] impose an undue burden." Hall, 583 F. Supp. at 719. The Order violates both requirements by conscripting Apple to develop software that does not exist and that Apple has a compelling interest in not creating. The government's request violates the first requirement—that the Act "must not adversely affect the basic interests of the third party"—because Apple has a strong interest in safeguarding its data protection systems that ensure the security of hundreds of millions of customers who depend on and store their most confidential data on their iPhones. An order compelling Apple to create software that defeats those safeguards undeniably threatens those systems and adversely affects Apple's interests and those of iPhone users around the globe. See id.

The government's request violates the second requirement—that the Act "must not . . . impose an undue burden"—because the government's unprecedented demand forces Apple to develop new software that destroys the security features that Apple has spent years building. As discussed supra in section II.E, no operating system currently exists that can accomplish what the government wants, and any effort to create one would require that Apple write new code, not just disable existing functionality. Neuenschwander Decl. ¶¶ 23-25. Experienced Apple engineers would have to design, create, test, and validate the compromised operating system, using a hyper-secure isolation room within which to do it, and then deploy and supervise its operation by the FBI to brute-force crack the phone's passcode. Id. ¶¶ 21-43; Olle Decl. ¶ 14. The system itself would have to be tested on multiple devices to ensure that the operating system works and does not alter any data on the device. Neuenschwander Decl. ¶¶ 30-31. All aspects of the development and testing processes would need to be logged and recorded in case Apple's methodology is ever questioned. Id. ¶¶ 28, 33.

Moreover, the government's flawed suggestion to delete the program and erase every trace of the activity would not lessen the burden; it would actually increase it, since there are hundreds of demands to create and utilize the software waiting in the wings. Id. ¶¶ 38-45. If Apple creates new software to open a back door, other federal and state prosecutors—and other governments and agencies—will repeatedly seek orders compelling Apple to use the software to open the back door for tens of thousands of iPhones. Indeed, Manhattan District Attorney Cyrus Vance, Jr., has made clear that the federal and state governments want access to every phone in a criminal investigation. See Hanna Decl. Ex. Z [Cyrus R. Vance, Jr., No Smartphone Lies Beyond the Reach of a Judicial Search Warrant, N.Y. Times (Feb. 18, 2016)]; Hanna Decl. ¶ 5 at 18:28 [Charlie Rose, Television Interview of Cyrus Vance (Feb. 18, 2016)] (Vance stating "absolutely" that he "want[s] access to all those phones that [he thinks] are crucial in a criminal proceeding"). This enormously intrusive burden—building everything up and tearing it down for each demand by law enforcement—finds no support in the cases relied on by the government; indeed, no such cases exist.

The alternative—keeping and maintaining the compromised operating system and everything related to it—imposes a different but no less significant burden, i.e., forcing Apple to take on the task of unfailingly securing against disclosure or misappropriation the development and testing environments, equipment, codebase, documentation, and any other materials relating to the compromised operating system. Id. ¶ 47. Given the millions of iPhones in use and the value of the data on them, criminals, terrorists, and hackers will no doubt view the code as a major prize and can be expected to go to considerable lengths to steal it, risking the security, safety, and privacy of customers whose lives are chronicled on their phones. Indeed, as the Supreme Court has recognized, "[t]he term 'cell phone' is itself misleading shorthand; . . . these devices are in fact minicomputers" that "could just as easily be called cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums, televisions, maps, or newspapers." Riley v. California, 134 S. Ct. 2473, 2488-89 (2014) (observing that equating the "data stored on a cell phone" to "physical items" "is like saying a ride on horseback is materially indistinguishable from a flight to the moon"). By forcing Apple to write code to compromise its encryption defenses, the Order would impose substantial burdens not just on Apple, but on the public at large. And in the meantime, nimble and technologically savvy criminals will continue to use other encryption technologies, while the law-abiding public endures these threats to their security and personal liberties—an especially perverse form of unilateral disarmament in the war on terror and crime. See n.4 supra (describing ISIS's shift to more secure communication methods).

In addition, compelling Apple to create software in this case will set a dangerous precedent for conscripting Apple and other technology companies to develop technology to do the government's bidding in untold future criminal investigations. If the government can invoke the All Writs Act to compel Apple to create a special operating system that undermines important security measures on the iPhone, it could argue in future cases that the courts should compel Apple to create a version to track the location of suspects, or secretly use the iPhone's microphone and camera to record sound and video. And if it succeeds here against Apple, there is no reason why the government could not deploy its new authority to compel other innocent and unrelated third parties to do its bidding in the name of law enforcement. For example, under the same legal theories advocated by the government here, the government could argue that it should be permitted to force citizens to do all manner of things "necessary" to assist it in enforcing the laws, like compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant, or requiring a journalist to plant a false story in order to help lure out a fugitive, or forcing a software company to insert malicious code in its auto-update process that makes it easier for the government to conduct court-ordered surveillance. Indeed, under the government's formulation, any party whose assistance is deemed "necessary" by the government falls within the ambit of the All Writs Act and can be compelled to do anything the government needs to effectuate a lawful court order. While these sweeping powers might be nice to have from the government's perspective, they simply are not authorized by law and would violate the Constitution.

Moreover, responding to these demands would effectively require Apple to create full-time positions in a new "hacking" department to service government requests and to develop new versions of the back door software every time iOS changes, and it would require Apple engineers to testify about this back door as government witnesses at trial. See, e.g., United States v. Cameron, 699 F.3d 621, 643-44 (1st Cir. 2012) (holding that reports generated by an Internet provider were testimonial, and thus could not be admitted without "giving [defendant] the opportunity to cross-examine the [provider's] employees who prepared the [] [r]eports"). Nothing in federal law allows the courts, at the request of prosecutors, to coercively deputize Apple and other companies to serve as a permanent arm of the government's forensics lab. Indeed, the government fails to cite any case—because none exists—to support its incorrect contention that courts have invoked the All Writs Act to conscript a company like Apple "to write some amount of code in order to gather information in response to subpoenas or other process." Ex Parte App. at 15.

The burden imposed on Apple is thus in sharp contrast to New York Telephone Co., where the public utility was compelled to provide "meager assistance" in setting up a pen register—a step which "required minimal effort on the part of the [c]ompany and no disruption to its operations." 434 U.S. at 174-75 (noting that the company routinely employed pen registers without court order for purposes of checking billing operations and detecting fraud); see also Mountain Bell, 616 F.2d at 1132 (order compelling the phone company to use a tracing technique akin to a pen register did not impose a substantial burden because it "was extremely narrow in scope," and "prohibit[ed] any tracing technique which required active monitoring by company personnel"). The very limited orders in those cases thus "should not be read to authorize the wholesale imposition upon private, third parties of duties pursuant to search warrants." Id.

The other cases the government relies on involve similarly inconsequential burdens where third parties were asked to turn over records that were already in their possession or readily accessible, Videotapes, 2003 WL 22053105, at *3 (directing apartment complex owner to share surveillance footage "maintained in the ordinary course of business"); Hall, 583 F. Supp. at 722 (directing bank to produce credit card records), or where the third party provided minimal assistance to effect a lawful wiretap, In re Application of U.S. of Am. for an Order Directing a Provider of Commc'n Servs. to Provide Tech. Assistance to Agents of the U.S. Drug Enf't Admin., 2015 WL 5233551, at *5 (D.P.R. Aug. 27, 2015). But unlike those cases, where the government directed a third party to provide something that already existed or sought assistance with a minimal and routine service, here the government wants to compel Apple to deploy a team of engineers to write and test software code and create a new operating system that undermines the security measures it has worked so hard to establish—and then to potentially do that over and over again as other federal, state, local and foreign prosecutors make demands for the same thing.

The government's reliance on two phone "unlocking" cases is similarly misplaced. Ex Parte App. at 9 (citing United States v. Navarro, No. 13-CR-5525 (W.D. Wash. Nov. 13, 2013), ECF No. 39; In re Order Requiring [XXX], Inc. to Assist in the Execution of a Search Warrant Issued by This Court by Unlocking a Cellphone, 2014 WL 5510865, at *2 (S.D.N.Y. Oct. 31, 2014) ("Order Requiring [XXX]")). As an initial matter, the Navarro order is a minute order that does not contain any analysis of the All Writs Act, and it is unclear whether its limitations were ever raised or considered. The Navarro order is also distinguishable because it involved the government's request to unlock an iPhone on an older operating system that did not require the creation of any new software. Order Requiring [XXX], which was also issued without the benefit of adversarial briefing, is equally unavailing. 2014 WL 5510865, at *3 (granting ex parte application to compel a third party to bypass a lock screen on a phone to effectuate a search warrant). Although the court purported to apply New York Telephone Co., it did not analyze all of the factors set forth in that case, such as whether the All Writs Act could be used to compel third parties to hack into phones, whether the cellphone company was "too far removed" from the matter, or whether hacking into the phone adversely affected the company's interests. Rather, the court simply concluded the technical service sought was not "burdensome," akin to "punching a few buttons" or installing a pen register. 2014 WL 5510865, at *2 (internal quotation marks omitted). As Apple has explained, the technical assistance sought here requires vastly more than simply pressing a "few buttons."

The government has every right to reasonably involve the public in the law enforcement process. Indeed, each year Apple complies with thousands of lawful requests for data and information by law enforcement, and on many occasions has extracted data from prior versions of its operating system for the FBI's use. See Olle Decl. ¶¶ 15-16. But compelling minimal assistance to surveil or apprehend a criminal (as in most of the cases the government cites), or demanding testimony or production of things that already exist (akin to exercising subpoena power), is vastly different, and significantly less intrusive, than conscripting a private company to create something entirely new and dangerous. There is simply no parallel or precedent for it.

c. The Government Has Not Demonstrated Apple's Assistance Was Necessary To Effectuate The Warrant.
A third party cannot be compelled to assist the government unless the government is authorized to act and the third party's participation is imperative. The order in New York Telephone Co. satisfied that requirement because the court had authorized surveillance, and "there [was] no conceivable way" to accomplish that surveillance without the company's assistance. 434 U.S. at 175 (noting that FBI had conducted "an exhaustive search" for a way to install a pen register in an undetectable location). The order compelling the phone company's assistance was therefore necessary "to prevent nullification of the court's warrant" and "to put an end to this venture." Id. at 174, 175 & n.23; see also Mountain Bell, 616 F.2d at 1129 (holding that an order compelling a third party to assist with tracing was necessary to carry out a wiretap and halt ongoing criminal activity); Mich. Bell Telephone Co. v. United States, 565 F.2d 385, 389 (6th Cir. 1977) (concluding that telephone company was "the only entity that c[ould] effectuate the order of the district court to prevent company-owned facilities from being used in violation of both state and federal laws").

Here, by contrast, the government has failed to demonstrate that the requested order was absolutely necessary to effectuate the search warrant, including that it exhausted all other avenues for recovering information. Indeed, the FBI foreclosed one such avenue when, without consulting Apple or reviewing its public guidance regarding iOS, the government changed the iCloud password associated with an attacker's account, thereby preventing the phone from initiating an automatic iCloud back-up. See supra II.C. Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks. See Hanna Decl. Ex. DD at 34-36 [October 26, 2015 Transcript] (Judge Orenstein asking the government "to make a representation for purposes of the All Writs Act" as to whether the "entire Government," including the "intelligence community," did or did not have the capability to decrypt an iPhone, and the government responding that "federal prosecutors don't have an obligation to consult the intelligence community in order to investigate crime"). As such, the government has not demonstrated that "there is no conceivable way" to extract data from the phone. New York Tel. Co., 434 U.S. at 174.

3. Other Cases The Government Cites Do Not Support The Type Of Compelled Action Sought Here.
The government does not cite a single case remotely approximating the demand it makes here; indeed, its cases only confirm the wild overreach of the Order.

The government relies, for example, on cases compelling a criminal defendant to take certain actions—specifically, United States v. Fricosu, 841 F. Supp. 2d 1232 (D. Colo. 2012) and United States v. Catoggio, 698 F.3d 64 (2d Cir. 2012) (per curiam)—but those cases say nothing about the propriety of compelling an innocent third party to do so. In Fricosu the government moved to require the defendant to produce the "unencrypted contents" of her laptop computer. 841 F. Supp. 2d at 1235. This order placed no undue burden on the defendant because she could access the encrypted contents on her computer, and the court preserved her Fifth Amendment rights by not compelling the password itself, which was testimonial in nature. See id. at 1236-38. By contrast, the government's request here creates an unprecedented burden on Apple and violates Apple's First Amendment rights against compelled speech, as discussed below. And unlike the compelled creation of a compromised operating system for iOS devices, the order in Fricosu merely required the defendant to hand over her own personal files, and thus posed no risk to third parties' privacy or security interests.

The government's reliance on Catoggio, which involved the seizure of defendant's property, is also inapt. Though the district court had not invoked the All Writs Act, the appellate court cited the Act in affirming the district court's order retaining a convicted defendant's property in anticipation of a restitution order. 698 F.3d at 68-69. But whereas courts have uniformly held that the Act enables a court to restrain a convicted defendant's property pending a restitution order, id. at 67, no court has ever held that the All Writs Act permits the government to conscript a private company to build software for it.

Finally, the government relies on the Ninth Circuit's decision in Plum Creek—but that case only serves to illustrate the government's vast overreach under the All Writs Act. There, the Ninth Circuit affirmed the district court's order declining OSHA's request to compel an employer to rescind a company policy forbidding employees from wearing OSHA air-quality and noise-level testing devices, so that OSHA could more efficiently investigate the company's premises. 608 F.2d at 1289-90. The court reasoned that a government agency's interest in conducting an efficient investigation is not grounds for issuing a writ requiring a company to comply with the government's demands. Id. at 1290. This was particularly true where OSHA "c[ould] not guarantee that these devices would [not] cause" industry accidents, and the company bore the costs of those accidents. Id. at 1289 & n.4 (internal quotation marks omitted). Even though the investigation would take five times as long to complete without the use of the equipment OSHA sought to compel, the court could not compel their use absent a law requiring it. Id. at 1289 & n.6. The court held that the All Writs Act "does not give the district court a roving commission to order a party subject to an investigation to accept additional risks at the bidding of OSHA inspectors." Id. at 1289. Plum Creek thus provides no support for the government's attempt to compel Apple to create new software "when Congress has failed to impose" such a duty on Apple. Id. at 1290. Forcing Apple to write software that would create a back door to millions of iOS devices would not only "usurp the legislative function," id., but also unconstitutionally compel speech and expose Apple iPhone users to exceptional security and privacy risks.

B. The Order Would Violate The First Amendment And The Fifth Amendment's Due Process Clause.
1. The First Amendment Prohibits The Government From Compelling Apple To Create Code

The government asks this Court to command Apple to write software that will neutralize safety features that Apple has built into the iPhone in response to consumer privacy concerns. Order ¶ 2. The code must contain a unique identifier "so that [it] would only load and execute on the SUBJECT DEVICE," and it must be "'signed' cryptographically by Apple using its own proprietary encryption methods." Ex Parte App. at 5, 7. This amounts to compelled speech and viewpoint discrimination in violation of the First Amendment.

Under well-settled law, computer code is treated as speech within the meaning of the First Amendment. See, e.g., Universal City Studios, Inc. v. Corley, 273 F.3d 429, 449 (2d Cir. 2001); Junger v. Daley, 209 F.3d 481, 485 (6th Cir. 2000); 321 Studios v. Metro Goldwyn Mayer Studios, Inc., 307 F. Supp. 2d 1085, 1099-1100 (N.D. Cal. 2004); United States v. Elcom Ltd., 203 F. Supp. 2d 1111, 1126 (N.D. Cal. 2002); Bernstein v. Dep't of State, 922 F. Supp. 1426, 1436 (N.D. Cal. 1996).

The Supreme Court has made clear that where, as here, the government seeks to compel speech, such action triggers First Amendment protections. As the Court observed in Riley v. Nat'l Fed'n of the Blind of N.C., Inc., 487 U.S. 781, 796 (1988), while "[t]here is certainly some difference between compelled speech and compelled silence, . . . in the context of protected speech, the difference is without constitutional significance." Compelled speech is a content-based restriction subject to exacting scrutiny, id. at 795, 797-98, and so may only be upheld if it is narrowly tailored to obtain a compelling state interest, see Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 662 (1994).

The government cannot meet this standard here. Apple does not question the government's legitimate and worthy interest in investigating and prosecuting terrorists, but here the government has produced nothing more than speculation that this iPhone might contain potentially relevant information. Hanna Decl. Ex. H [Comey, Follow This Lead] ("Maybe the phone holds the clue to finding more terrorists. Maybe it doesn't."). It is well known that terrorists and other criminals use highly sophisticated encryption techniques and readily available software applications, making it likely that any information on the phone lies behind several other layers of non-Apple encryption. See Hanna Decl. Ex. E [Coker, Tech Savvy] (noting that the Islamic State has issued to its members a ranking of the 33 most secure communications applications, and "has urged its followers to make use of [one app's] capability to host encrypted group chats").

Even more problematically, the Court's Order discriminates on the basis of Apple's viewpoint. When Apple designed iOS 8, it wrote code that announced the value it placed on data security and the privacy of citizens by omitting a back door that bad actors might exploit. See, e.g., Hanna Decl. Ex. AA [Apple Inc., Privacy, Government Information Requests]. The government disagrees with this position and asks this Court to compel Apple to write new software that advances its contrary views. This is, in every sense of the term, viewpoint discrimination that violates the First Amendment. See Members of City Council v. Taxpayers for Vincent, 466 U.S. 789, 804 (1984).

Finally, the FBI itself foreclosed what would have likely been a promising and vastly narrower alternative to this unprecedented order: backing up the iPhone to iCloud. Apple has extensively cooperated and assisted law enforcement officials in the San Bernardino investigation, but the FBI inadvertently foreclosed a ready avenue by changing the iCloud password, which precluded the iCloud back-up option.

To avoid the serious First Amendment concerns that the government's request to compel speech presents, this Court should vacate the Order.

2. The Fifth Amendment's Due Process Clause Prohibits The Government From Compelling Apple To Create The Requested Code.
In addition to violating the First Amendment, the government's requested order, by conscripting a private party with an extraordinarily attenuated connection to the crime to do the government's bidding in a way that is statutorily unauthorized, highly burdensome, and contrary to the party's core principles, violates Apple's substantive due process right to be free from "'arbitrary deprivation of [its] liberty by government.'" Costanich v. Dep't of Soc. & Health Servs., 627 F.3d 1101, 1110 (9th Cir. 2010) (citation omitted); see also, e.g., Cnty. of Sacramento v. Lewis, 523 U.S. 833, 845-46 (1998) ("We have emphasized time and again that '[t]he touchstone of due process is protection of the individual against arbitrary action of government,' . . . [including] the exercise of power without any reasonable justification in the service of a legitimate governmental objective." (citations omitted)); cf. id. at 850 ("Rules of due process are not . . . subject to mechanical application in unfamiliar territory.").

Apple has great respect for the professionals at the Department of Justice and FBI, and it believes their intentions are good. Moreover, Apple has profound sympathy for the innocent victims of the attack and their families. However, while the government's desire to maximize security is laudable, the decision of how to do so while also protecting other vital interests, such as personal safety and privacy, is for American citizens to make through the democratic process. Indeed, examples abound of society opting not to pay the price for increased and more efficient enforcement of criminal laws. For example, society does not tolerate violations of the Fifth Amendment privilege against self-incrimination, even though more criminals would be convicted if the government could compel their confessions. Nor does society tolerate violations of the Fourth Amendment, even though the government could more easily obtain critical evidence if given free rein to conduct warrantless searches and seizures.

At every level of our legal system—from the Constitution, to our statutes, common law, rules, and even the Department of Justice's own policies—society has acted to preserve certain rights at the expense of burdening law enforcement's interest in investigating crimes and bringing criminals to justice. Society is still debating the important privacy and security issues posed by this case. The government's desire to leave no stone unturned, however well intentioned, does not authorize it to cut off debate and impose its views on society.

28 See, e.g., U.S. Const. amend. IV (limitations on searches and seizures), amend. V (limitations on charging; prohibition on compelling testimony of accused).

29 See, e.g., 18 U.S.C. § 3282 (prohibition on prosecuting crimes more than five years old); CALEA (limitations on ability to intercept communications).

30 E.g., attorney-client privilege, spousal privilege, reporter's privilege, and priest-penitent privilege, all of which limit the government's ability to obtain evidence.

31 See, e.g., Fed. R. Evid. 404 (limitations on use of character evidence), 802 (limitations on use of hearsay).

32 See, e.g., U.S. Attorneys' Manual §§ 9-13.200 (limitations on communicating with witnesses represented by counsel), 9-13.400 (limitations on subpoenaing news media), 9-13.410 (limitations on subpoenaing attorneys), 9-13.420 (limitations on searches of attorneys' offices).

  1. See, e.g., Hanna Decl. Ex. A [Ellen Nakashima, Hacks of OPM Databases Compromised 22.1 Million People, Federal Authorities Say, Wash. Post (July 9, 2015)] (explaining that hackers used stolen logins and passwords to gain access to federal employee records databases for six months before detection).
  2. Hanna Decl. Ex. B [Letter to Court, In re Order Requiring Apple, Inc. to Assist in the Execution of a Search Warrant Issued by this Court, E.D.N.Y. No. 15-MC-1902, Dkt. 27].
  3. E.g., Hanna Decl. Ex. C [Seung Lee, The Murder Victim Whose Phone Couldn't Be Cracked and Other Apple Encryption Stories, Newsweek (Feb. 19, 2016)] (Manhattan District Attorney Cyrus Vance stating that he has "155 to 160" devices that he would like to access, while officials in Sacramento have "well over 100" devices for which they would like Apple to produce unique software so that they can access the devices' contents); Hanna Decl. ¶ 5 at 18:28 [Charlie Rose, Television Interview of Cyrus Vance (Feb. 18, 2016)] (Vance stating "absolutely" that he "want[s] access to all those phones that [he thinks] are crucial in a criminal proceeding").

What Claims Did the Intelligence Community Make about the Paris Attack to Get the White House to Change on Encryption? - EmptyWheel 20160225

I’m going to do a series of posts laying out the timeline behind the Administration’s changed approach to encryption. In this, I’d like to make a point about when the National Security Council adopted a “decision memo” more aggressively seeking to bypass encryption. Bloomberg reported on the memo last week, in the wake of the FBI’s demand that Apple help it brute force Syed Rizwan Farook’s work phone.

But note the date: The meeting at which the memo was adopted was convened “around Thanksgiving.”

Silicon Valley celebrated last fall when the White House revealed it would not seek legislation forcing technology makers to install “backdoors” in their software — secret listening posts where investigators could pierce the veil of secrecy on users’ encrypted data, from text messages to video chats. But while the companies may have thought that was the final word, in fact the government was working on a Plan B.

In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision.

The approach was formalized in a confidential National Security Council “decision memo,” tasking government agencies with developing encryption workarounds, estimating additional budgets and identifying laws that may need to be changed to counter what FBI Director James Comey calls the “going dark” problem: investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Details of the memo reveal that, in private, the government was honing a sharper edge to its relationship with Silicon Valley alongside more public signs of rapprochement. [my emphasis]

That is, the meeting was convened in the wake of the November 13 ISIS attack on Paris.

We know that last August, Bob Litt had recommended keeping options open until such time as a terrorist attack presented the opportunity to revisit the issue and demand that companies back door encryption.

Privately, law enforcement officials have acknowledged that prospects for congressional action this year are remote. Although “the legislative environment is very hostile today,” the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, “it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.”

There is value, he said, in “keeping our options open for such a situation.”

Litt was commenting on a draft paper prepared by National Security Council staff members in July, which also was obtained by The Post, that analyzed several options. They included explicitly rejecting a legislative mandate, deferring legislation and remaining undecided while discussions continue.

It appears that is precisely what happened — that the intelligence community, in the wake of a big attack on Paris, went to the White House and convinced them to change their approach.

So I want to know what claims the intelligence community made about the use of encryption in the attack that convinced the White House to change approach. Because there is nothing in the public record that indicates encryption was important at all.

It is true that a lot of ISIS associates were using Telegram; shortly after the attack Telegram shut down a bunch of channels they were using. But reportedly Telegram’s encryption would be easy for the NSA to break. The difficulty with Telegram — which the IC should consider seriously before they make Apple back door its products — is that its offshore location probably made it harder for our counterterrorism analysts to get the metadata.

It is also true that an ISIS recruit whom French authorities had interrogated during the summer (and who warned them very specifically about attacks on sporting events and concerts) had been given an encryption key on a thumb drive.

But it’s also true the phone recovered after the attack — which the attackers used to communicate during the attack — was not encrypted. It’s true, too, that French and Belgian authorities knew just about every known participant in the attack, especially the ringleader. From reports, it sounds like operational security — the use of a series of burner phones — was more critical to his ability to move unnoticed through Europe. There are also reports that the authorities had a difficult time translating the dialect of (probably) Berber the attackers used.

From what we know, though, encryption is not the reason authorities failed to prevent the French attack. And a lot of other tools that are designed to identify potential attacks — like the metadata dragnet — failed.

I hate to be cynical (though comments like Litt’s — plus the way the IC used a bogus terrorist threat in 2004 to get the torture and Internet dragnet programs reauthorized — invite such cynicism). But it sure looks like the IC failed to prevent the November attack, and immediately used their own (human, unavoidable) failure to demand a new approach to encryption.

Update: In testimony before the House Judiciary Committee today, Microsoft General Counsel Brad Smith repeated a claim MSFT witnesses have made before: they provided Parisian law enforcement email from the Paris attackers within 45 minutes. That implies, of course, that the data was accessible under PRISM and not encrypted.

Former CIA Agent Says Edward Snowden Revelations Emboldened Apple to Push Back Against FBI - Democracy Now 20160225

BARRY EISLER, former CIA agent and author of several books, most recently, The God’s Eye View.

We speak with former CIA agent Barry Eisler about the role of Edward Snowden in raising public awareness about encryption and privacy ahead of the FBI’s push for Apple to break the encryption of the iPhone of one of the San Bernardino shooters. "So much of Snowden’s revelations were about this very thing. And the fact that the public knows about corporate cooperation with the government now is in part, I think, what has emboldened Apple to push back," Eisler says. "If we didn’t know about these things, I would expect that Apple would be quietly cooperating. There would be no cost to their doing so." Eisler also discusses his new novel, "The God’s Eye View," which he says is "grounded in things that are actually happening in the world. … I realized I was not going nearly far enough in what I had imagined."


This is a rush transcript. Copy may not be in its final form.

NERMEEN SHAIKH: Well, let’s turn to NSA whistleblower Edward Snowden. In a 2013 interview with The Guardian just after his identity was revealed, Snowden explained why he risked his career to leak the documents.

EDWARD SNOWDEN: I think that the public is owed an explanation of the motivations behind the people who make these disclosures that are outside of the democratic model. When you are subverting the power of government, that that’s a fundamentally dangerous thing to democracy. And if you do that in secret consistently, you know, as the government does when it wants to benefit from a secret action that it took, it will kind of get its officials a mandate to go, "Hey, you know, tell the press about this thing and that thing, so the public is on our side." But they rarely, if ever, do that when an abuse occurs. That falls to individual citizens. But they’re typically maligned. You know, it becomes a thing of these people are against the country, they’re against the government. But I’m not. I’m no different from anybody else. I don’t have special skills. I’m just another guy who sits there, day to day, in the office, watches what happening—what’s happening, and goes, "This is something that’s not our place to decide. The public needs to decide whether these programs and policies are right or wrong." And I’m willing to go on the record to defend the authenticity of them and say, "I didn’t change these. I didn’t modify the story. This is the truth. This is what’s happening. You should decide whether we need to be doing this."

NERMEEN SHAIKH: That was NSA whistleblower Edward Snowden speaking in 2013. So, Barry Eisler, you begin your book with Edward Snowden. Could you talk about that decision to talk about Edward Snowden?


NERMEEN SHAIKH: And also, the title of your book is The God’s Eye View.

BARRY EISLER: Yeah. So, like all my fiction, but especially so in this case, it’s grounded in things that are actually happening in the world. And when I first had the idea for this book, by the way, I had a notion for a pretty far-reaching surveillance program, and I thought it would make the good basis—a good basis for a novel. My concern was that what I had in mind was going to be too far-reaching. And because my brand has a lot to do with realism, I thought people might say, "Come on, Barry, the government’s not really doing all that." And while I was working on my previous book but just kind of thinking about this next one, I was actually in Tokyo doing research. June 2013 is when Glenn Greenwald and Laura Poitras first broke—first started breaking stories with The Guardian based on Snowden’s revelations. And I was immediately electrified, and I realized, "Oh, my god! I was not going nearly far enough in what I had imagined."

AMY GOODMAN: Now, I mean, what’s interesting is you’re a fiction writer here.


AMY GOODMAN: But your background is CIA—


AMY GOODMAN: —in covert operations.


AMY GOODMAN: So, you begin this book with Poitras and Greenwald meeting with Snowden in Hong Kong, complete catastrophe for the intelligence agency here.


AMY GOODMAN: They’re woken up in the middle of the night: "What do we do?" And the discussion of just taking them all out—


AMY GOODMAN: —including Ewen MacAskill, who is from The Guardian, who was with them.


AMY GOODMAN: Taking them out.


AMY GOODMAN: But you do have a background in reality, which is in covert operations.

BARRY EISLER: Right, yeah. So, if you’re asking me, have I ever heard the government give an order to kill a journalist, the answer is no. But I do know that, increasingly, the government equates journalism with terrorism—I mean, explicitly. And if we’re using certain tools and tactics against terrorists, then it makes sense those things are going to migrate to other enemies of the state, right? In fact, I think that when you think about terrorist groups like ISIS and then power centers in countries like America, a group like ISIS is nothing but ISIS—is nothing but upside to any political establishment figure who wants to increase his or her budget, or up fear among the public so that the politician can gain more power, and means more profits for corporations involved in the war machine—really, nothing but upside.

But dissident groups—student groups, antiwar groups, civil rights activists—represent a lot of downside. And this is—and journalists, most of all, who are relying—real journalists, who are relying on whistleblowers and real leaks to carry out their journalism. So, when I imagine how is the government going to respond to this kind of thing, I’m thinking, well, how does the government respond to—what sort of tools has it developed and has it deployed against the ostensible enemies of the government, and how is it going to deploy them against real enemies, like journalists?

AMY GOODMAN: How were you deployed in the CIA?

BARRY EISLER: I didn’t do very much with the CIA. And whenever I say that, people are like, "Yeah, right." It’s like when a presidential candidate says, you know, "I have no intention of running for president," and people are like, "Oh, come on, man! You know, when are you really going to"—it really—I was there for barely three years. It was mostly training. It was a super-interesting experience. I’m glad I had it. And it definitely informs everything I write about, and hopefully makes the sort of spycraft, countersurveillance, surveillance, all sorts of tradecraft and the mentality of spies, that sort of thing that I depict in my novels—hopefully, the experience I had with the CIA makes all that as realistic as possible. But I wish I could say on Democracy Now! that I was involved in numerous coups and assassinations. It would be a really cool segment we’re doing. But alas, it was mostly training.

NERMEEN SHAIKH: But one of the things that you’ve said that you learned at the CIA was that sometimes it pays to cover up the commission of a serious crime—


NERMEEN SHAIKH: —by confessing to a lesser one.

BARRY EISLER: Yes. And it’s funny how often I see that sort of thing played out in national headlines. There are things they say at the CIA that are said partly in jest, but only partly. So, another one is it is better to seek forgiveness than ask permission. And I see the government doing this sort of thing constantly. And yeah, so, some of these things you really do—another one is deny everything, admit nothing, make counter-accusations. Keep that one in mind. You will see the government doing it, you’ll see politicians doing it all the time.

AMY GOODMAN: So, The God’s Eye View, what is it?

BARRY EISLER: Well, I don’t want to give away any spoilers, but God’s Eye is a program of far-reaching surveillance in the book. And—

AMY GOODMAN: And this is a fiction book?

BARRY EISLER: Well, I have an 18-page bibliography at the end of the book, because I want people to know that if you don’t follow these things closely, you might have—you might have that reaction I was talking about earlier, which is, you’ll read the book and say, "Well, that was—that was a fun, entertaining thriller, but come on. Can the government really do these things? Is it really doing these things?" And reviews thus far have been really gratifying for me, because people do respond to the book the way I would hope they would respond to fiction: They enjoy it, it’s gripping, they’re on the edge of their seats, and then they say, "And then I came to the bibliography, and I realized, oh, my god, all these things are real." They are real. I speculated a little bit about how a program like God’s Eye would be deployed, but all the technologies I describe in the book—and there are some that will make your hair stand up—they’re real, and they’re actually deployed.


BARRY EISLER: Can you hack into a car? Can you turn a microphone on, not just on a phone, but on all these personal assistant devices that are getting deployed in people’s homes—baby cameras, closed-circuit television, all over the—

AMY GOODMAN: [inaudible]

BARRY EISLER: Yeah, heart devices, pacemakers, these sorts of things. In my very first book, which I started writing in 1993—it was published in 2002—A Clean Kill in Tokyo, my half-American, half-Japanese assassin, John Rain, shorted out a guy’s pacemaker wirelessly. And at the time, there was no Bluetooth, it was all—I was using infrared, but it was wireless. And I checked with a Harvard cardiac pacemaker specialist, a guy I lived with in college, actually. I said, "Could you really do this?" And he’s like, "Well, I guess so. Why would you want to?" as people always asked me at the time. Now they know I’m a novelist. Anyway, yeah, it turns out you can absolutely do this, so much so that Dick Cheney, in his memoirs, acknowledged that he had his heart doctor turn off the wireless feature on his pacemaker because of concerns that somebody might try to turn it off. I was going to say "terrorist," but there are probably a lot of people who, at one time or another, considered pressing the button on that.

NERMEEN SHAIKH: Well, very quickly, before we conclude, I want to ask you about something you said earlier, that the government increasingly equates journalism with terrorism.


NERMEEN SHAIKH: How did you learn that?

BARRY EISLER: Just by observing. I remember when David Miranda was detained at Heathrow. I guess this was about two years ago.

AMY GOODMAN: Glenn Greenwald’s partner.

BARRY EISLER: Right. And I wondered, why is the government doing this? I mean, they must know. It’s not like he’s got the only copy of whatever it is they’re looking for, so why do something like that? And I realized, look, this is basically a deny-and-disrupt operation. I mean, why does the government want terrorists to know that it’s into their cellphones, that it can track you by your cellphone use and probably put in a drone strike based on that information? In fact, we know that sort of thing goes on, whether it’s a so-called signature strike, where they don’t know your identity, or where they do. Why do they do that? Is it to make terrorists unable to communicate? No, it won’t have that effect. It’s to make it more difficult for them to communicate, to make plotting whatever the terrorists are trying to plot slower and harder. And that’s why the government does these things. So I thought, why detain this guy? Well, it’s—

AMY GOODMAN: Under the Terrorism Act.

BARRY EISLER: Yeah, I know. So, this is another—thank you. That is the quintessential example of like, "Oh, wow, so journalists really are literally terrorists," in this case to the U.K. government, but in cooperation with the American government. So, they’re not going to stop journalists from communicating by this, but it’s a kind of signal. They know that for the most sensitive things that journalists are working on, journalists don’t trust their cellphones, and they’re using human couriers. So what do you do? You let them know even human couriers are not going to be safe for you. Deny and disrupt.

AMY GOODMAN: I wanted to end by asking you a question that our colleague, Jeremy Scahill, asked you last night—


AMY GOODMAN: —in a Q&A that you had here in New York. And it’s about something that the Clinton campaign is trying to make a big deal of, something Bernie Sanders said decades ago—


AMY GOODMAN: —calling the CIA a dangerous institution that has got to go. He said this in 1974.

BARRY EISLER: Yeah, yeah.

AMY GOODMAN: Now, he is not saying that today. These days, he talks about more oversight for the agency.


AMY GOODMAN: But as a former person in covert operations at the CIA—


AMY GOODMAN: —do you share his view more, the one he expresses today, more oversight, or the one he expressed 40 years ago, do away with the CIA?

BARRY EISLER: Yeah, I would say, first, that people need to understand Sanders is not an outlier in calling for the abolishment of the CIA. President Truman said he’d get rid of the agency. John F. Kennedy famously said he would shatter the organization into a thousand pieces and scatter it to the winds. This is following the Bay of Pigs. A lot—Daniel Moynihan said, "Oh, my god! We’re not getting our money’s worth. This thing is doing more harm than good."

AMY GOODMAN: The senator.

BARRY EISLER: Yeah, Senator—former Senator Moynihan. So there have been a lot of prominent people who have had—who have been in a position to weigh the costs and benefits of the CIA’s existence, and have come out thinking that, on balance, national security would be improved if we actually just didn’t have a CIA. So that is a perfectly defensible and respectable position. That’s one thing.

The other thing is, look, at a minimum, if I were advising Sanders today, I would say, at a minimum, you’ve got to detach the covert action arm from the intelligence gathering and analysis arm. It’s just—these are two things that inherently don’t function well together. Just as the NSA is tasked with, on the one hand, destroying encryption, and on the other hand, there’s another part of the organization that is tasked ostensibly with improving encryption, you can’t put—that’s like putting a humidifier and a dehumidifier in the room and telling them to battle it out. It’s just like—it doesn’t work. You’re not going to get good results. And so, with the agency, the covert military arm of the agency really should be put in the military. It shouldn’t be in a position to interfere with the objectivity of intelligence analysis that our policymakers rely on.

Rogers and Alcatel-Lucent Proposed an Encryption Backdoor for Police - Motherboard 20160212

As telecom companies prepare for the day when phone calls are counted in megabytes and not minutes, yet another contentious encryption debate is looming: how to secure subscribers' voice conversations while preserving law enforcement’s ability to eavesdrop when needed.

For Canadian telecom company Rogers and equipment maker Alcatel-Lucent (now Nokia), one option was a so-called backdoor, a secret key of sorts that could decrypt otherwise secure communications, and that theoretically only law enforcement could use.

In 2012, the two companies came up with a lawful interception proposal for a next-generation voice encryption protocol, known as MIKEY-IBAKE. The protocol was designed to protect conversations end-to-end—that is, no one sitting in the middle of a call's network connection could eavesdrop on what was being said.

Unless you were law enforcement, that is. For them, there was an exception, a backdoor. But there’s a problem with this scenario: a backdoor for law enforcement has the potential to be exploited by others, which is why, amongst security professionals, backdoors are so vehemently opposed.

"In the US, this has been the debate. Are we going to backdoor communications? We simply haven't had that debate here," said Christopher Parsons, a post-doctoral researcher at the Citizen Lab, which belongs to the University of Toronto’s Munk School for Global Affairs. "It seems as though we have carriers and vendors who are looking for ways to subvert that without bothering to deal with the politicians."

The documents detailing the Rogers and Alcatel-Lucent proposal are related to documents analyzed last month by Steven Murdoch, a Royal Society University Research Fellow in the Information Security Research Group of University College London. Murdoch’s analysis described an encryption protocol related to MIKEY-IBAKE that had been modified—backdoored—by the UK intelligence agency GCHQ.

An excerpt from one of the documents describing Rogers and Alcatel-Lucent's proposal. Image: Screenshot/3GPP

On the one hand, telecom providers have no choice but to opt for stronger encryption (and, to be clear, this is a good thing). At present, "land-line calls are almost entirely unencrypted, and cellphone calls are also unencrypted except for the radio link between the handset and the phone network," wrote Murdoch, in his recent analysis of GCHQ’s backdoored cellular encryption scheme.

On the other, more widespread use of encryption has drawn the ire of law enforcement. The FBI famously described Apple and Google’s efforts to increase user data protections as making evidence go “dark.” And because various jurisdictions—including Canada and the US—include wiretap provisions as a condition of having access to wireless spectrum, employing protections that also stymie law enforcement isn't so cut and dry.

"These lawful intercept requirements are harming security,” Murdoch said in an interview. “They're preventing the deployment of security in order to facilitate surveillance, and that's not really a debate that's been discussed."

The Rogers and Alcatel-Lucent proposal was introduced during a meeting of the 3rd Generation Partnership Project's lawful interception working group in 2012. The 3GPP is an organization that develops standards that dictate how much of the world's cellular infrastructure works, including 4G and LTE (draft documents of the proposal are available on its website, but the final proposal is not).

At that meeting, which was held in Barcelona, Rogers and Alcatel-Lucent proposed an approach to encryption where, instead of protecting communications using a random number generator, the system would use a pre-defined "pseudo-random number generator," or a secret number, that only a telecom provider or network operator would know.

Because all messages would be encrypted using this pre-determined number, anyone that discovered the number could decrypt any message they wanted.
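The technical details of the proposal are not public, but the general weakness is easy to illustrate. The toy sketch below (illustrative only, not any real protocol; it uses Python's non-cryptographic `random.Random` and a simple XOR stream cipher) shows why a keystream derived from a pre-defined secret seed means that anyone who learns the seed can decrypt every message:

```python
import random


def keystream(seed: int, length: int) -> bytes:
    """Derive a keystream from a seed. Anyone who knows the seed
    can regenerate the identical keystream."""
    rng = random.Random(seed)  # deterministic PRNG, for illustration only
    return bytes(rng.randrange(256) for _ in range(length))


def xor_cipher(data: bytes, seed: int) -> bytes:
    """XOR the data with the seed-derived keystream.
    The same call both encrypts and decrypts."""
    ks = keystream(seed, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))


SECRET_SEED = 0xC0FFEE  # hypothetical value held by the operator

ciphertext = xor_cipher(b"hello, world", SECRET_SEED)

# A third party who obtains the seed recovers the plaintext in full:
assert xor_cipher(ciphertext, SECRET_SEED) == b"hello, world"
```

With a genuinely random, per-call key, an eavesdropper has nothing to steal; with a fixed secret seed, that seed becomes a single point of failure for all traffic encrypted under it.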

The proposal was described by Parsons and fellow Citizen Lab researcher Andrew Hilts last year, in a report for the Telecom Transparency Project (Parsons is its founder), but received little notice at the time.

"The Rogers/Alcatel-Lucent solution would let a [telecom service provider] either decrypt traffic in real time or retroactively decrypt traffic that had been encrypted using the [pseudo-random number generator]," the pair wrote in their 2015 report on telecommunications surveillance. "As such, their proposal would effectively undermine the core security design decisions that were ‘baked’ into MIKEY-IBAKE."

"This should be a public discussion. This shouldn't be something that's buried away in a pretty cloistered standards environment,” said Parsons, who called the proposal “worrying.” Canadian Parliament has yet to engage in the sort of encryption debate currently taking place in the US.

“We're talking about fundamental aspects of how law enforcement interacts with our communications, that the extent to which we can trust the security provided to us by telecommunications providers,” Parsons continued. “And this all comes after Canada has passed numerous legislature that deals with security and surveillance, none of which, to my mind, explicitly clarify whether or not this kind of decryption on the fly would be required."

The encryption protocol proposed by Rogers and Alcatel-Lucent was actually previously rejected by the UK government's spy agency GCHQ for being too difficult to eavesdrop on. Instead, GCHQ proposed an alternate standard, MIKEY-SAKKE, which can be more readily intercepted. The UK government has been promoting adoption of the standard in both government and commercial products.

MIKEY-IBAKE, meanwhile, does not appear to have been implemented. Leonard Pesheck, a spokesperson for Nokia (which recently purchased Alcatel-Lucent), wrote in an email that "the MIKEY-IBAKE proposal we submitted to 3GPP SAE for standardization was not accepted and we therefore did not pursue product plans."

Rogers spokesperson Jennifer Kett also confirmed the company brought forward the MIKEY-IBAKE proposal, but "ultimately that proposal was not adopted."

"As you can appreciate, in order to best protect our customers and as a condition of our licenses, we don’t publicly disclose our security practices," Kett wrote in an email.

If those practices include backdoors, however, it’s only a matter of time before others disclose them first.

The New York Times - New Technologies Give Government Ample Means to Track Suspects, Study Finds - 20160131

The F.B.I. director, James B. Comey, and other Justice Department officials have said moves by technology firms to encrypt data have choked off critical ways to monitor suspects.

For more than two years the F.B.I. and intelligence agencies have warned that encrypted communications are creating a “going dark” crisis that will keep them from tracking terrorists and kidnappers.

Now, a study in which current and former intelligence officials participated concludes that the warning is wildly overblown, and that a raft of new technologies — like television sets with microphones and web-connected cars — are creating ample opportunities, many of them worrying, for the government to track suspects.

“ ‘Going dark’ does not aptly describe the long-term landscape for government surveillance,” concludes the study, to be published Monday by the Berkman Center for Internet and Society at Harvard.

The study argues that the phrase ignores the flood of new technologies “being packed with sensors and wireless connectivity” that are expected to become the subject of court orders and subpoenas, and are already the target of the National Security Agency as it places “implants” into networks around the world to monitor communications abroad.

The products, ranging from “toasters to bedsheets, light bulbs, cameras, toothbrushes, door locks, cars, watches and other wearables,” will give the government increasing opportunities to track suspects and in many cases reconstruct communications and meetings.

The study, titled, “Don’t Panic: Making Progress on the ‘Going Dark’ Debate,” is among the sharpest counterpoints yet to the contentions of James B. Comey, the F.B.I. director, and other Justice Department officials, mostly by arguing that they have defined the issue too narrowly.

Over the past year, they have repeatedly told Congress that the move by Apple to automatically encrypt data on its iPhone, and similar steps by Google and Microsoft, are choking off critical abilities to track suspects, even with a court order.

President Obama, however, concluded last fall that any effort to legislate a government “back door” into encrypted communications would probably create a pathway for hackers — including those working for foreign governments like Russia, China and Iran — to gain access as well, and create a precedent for authoritarian governments demanding similar access.

Most Republican candidates for president have demanded that technology companies create a way for investigators to unlock encrypted communications, and on the Democratic side, Hillary Clinton has taken a tough line on Silicon Valley companies, urging them to join the fight against the Islamic State.

Apple’s chief executive, Timothy D. Cook, has led the charge on the other side. He recently told a group of White House officials seeking technology companies’ voluntary help to counter the Islamic State that the government’s efforts to get the keys to encrypted communications would be a boon for hackers and put legitimate business transactions, financial data and personal communications at greater risk.

The Harvard study, funded by the Hewlett Foundation, was unusual because it involved technical experts, civil libertarians and officials who are, or have been, on the forefront of counterterrorism. Larry Kramer, the former dean of Stanford Law School, who heads the foundation, noted Friday that until now “the policy debate has been impeded by gaps in trust — chasms, really — between academia, civil society, the private sector and the intelligence community” that have impeded the evolution of a “safe, open and resilient Internet.”

Among the chief authors of the report is Matthew G. Olsen, who was a director of the National Counterterrorism Center under Mr. Obama and a general counsel of the National Security Agency.

Two current senior officials of the N.S.A. — John DeLong, the head of the agency’s Commercial Solutions Center, and Anne Neuberger, the agency’s chief risk officer — are described in the report as “core members” of the group, but did not sign the report because they could not act on behalf of the agency or the United States government in endorsing its conclusions, government officials said.

“Encryption is a real problem, and the F.B.I. and intelligence agencies are right to raise it,” Mr. Olsen said Sunday. But he noted that in their testimony officials had not described the other technological breaks that are falling their way, nor had they highlighted cases in which they were able to exploit mistakes made by suspects in applying encryption to their messages.

Jonathan Zittrain, a professor of law and computer science at Harvard who convened the group, said in an interview that the goal was “to have a discussion among people with very different points of view” that would move “the state of the debate beyond its well-known bumper stickers. We managed to do that in part by thinking of a larger picture, specifically in the unexpected ways that surveillance might be attempted.”

He noted that in the current stalemate there was little discussion of the “ever-expanding ‘Internet of things,’ where telemetry from teakettles, televisions and light bulbs might prove surprisingly, and worryingly, amenable to subpoena from governments around the world.”

Those technologies are already being exploited: The government frequently seeks location data from devices like cellphones and EZ Passes to track suspects.

The study notes that such opportunities are expanding rapidly. A Samsung “smart” television contains a microphone meant to relay back to Samsung voice instructions to the TV — “I want to see the last three ‘Star Wars’ movies” — and a Hello, Barbie brought out by Mattel last year records children’s conversations with the doll, processes them over the Internet and sends back a response.

The history of technology shows that what is invented for convenience can soon become a target of surveillance. “Law enforcement or intelligence agencies may start to seek orders compelling Samsung, Google, Mattel, Nest or vendors of other networked devices to push an update or flip a digital switch to intercept the ambient communications of a target,” the report said.

These communications, too, may one day be encrypted. But Google’s business model depends on picking out key words from emails to tailor advertisements for specific users of Gmail, the popular email service. Apple users routinely back up the contents of their phones to iCloud — a service that is not encrypted and now is almost a routine target for investigators or intelligence agencies. So are the tracking and mapping systems for cars that rely on transmitted global positioning data.

“I think what this report shows is that the world today is like living in a big field that is more illuminated than ever before,” said Joseph Nye, a Harvard government professor and former head of the National Intelligence Council. “There will be dark spots — there always will be. But it’s easy to forget that there is far more data available to governments now than ever before.”

Pfefferkorn, Riana - James Comey's default encryption bogeyman - 20160115

FBI Director James Comey recently told the Senate Judiciary Committee that encryption routinely poses a problem for law enforcement. He stated that encryption has “moved from being available [only] to the sophisticated bad guy to being the default. So it’s now affecting every criminal investigation that folks engage in.”

This assertion may reflect a shift in the Director’s approach to trying to convince lawmakers to regulate the commercial availability of strong encryption. To date, the principal argument has been that encryption interferes with counterterrorism efforts. Federal officials asking for legislative intervention, or seeking to shame companies into maintaining security architectures that would not interfere with surveillance, generally invoke the fear of terrorist attacks. Such attacks, or the threat of them, can provoke cooperation or legislative action that would otherwise be difficult to effectuate. In August, for example, the intelligence community’s top lawyer suggested that a terror attack could be exploited to turn legislative opinion against strong encryption. And Comey’s testimony last month raised the specter of ISIL. He and other members of the intelligence community immediately mounted a full-court press against strong crypto following the tragedies in Paris and San Bernardino, even before investigators could conclude whether encrypted communications or devices played any role in either attack.

Proponents of strong encryption have long been suspicious of the claim that encryption interferes with counterterrorism investigations. Terrorism is quite rare in the US and encryption has never yet been shown to have thwarted investigations into any terrorist attacks that have taken place on US soil. This includes the May 2015 shooting in Garland, Texas that Comey has invoked. Comey points to the fact that one shooter exchanged encrypted text messages with “an overseas terrorist” shortly before the attack, but the FBI had already been monitoring one of the perpetrators for years and warned local authorities about him before the shooting. Plus, the FBI’s powerful ability to collect (unencrypted) metadata is the reason Comey knows the shooter sent those text messages.

Comey may be starting to recognize that his rationale for weakening encryption needs to hit closer to home if he hopes to persuade lawmakers and the American public. To that end, it looks like he, along with Manhattan District Attorney Cyrus Vance, is ready to argue that regular criminals — the kind more likely to predate on the general population — are getting away because of encryption.

What crimes, then, are law enforcement officials invoking in their latest calls for weakening encryption? If encryption affects “every” criminal investigation as Comey claims, you’d think that law enforcement would encounter encryption in investigations of the crimes it spends the most time and money working on. If so, then the majority of cases in which law enforcement encounters encryption should be drug cases. Statistically, the War on Drugs, not the War on Terror, would likely be the principal context in which mandatory encryption access for law enforcement would be used.

However, law enforcement’s anti-crypto advocacy hasn’t been focused on the War on Drugs. Much like Comey’s invocation of ISIL, other law enforcement leaders have asserted that the worst of the worst are the beneficiaries of strong security, focusing on murderers and sex offenders. Vance’s recent whitepaper, which calls for federal legislation mandating law enforcement access to encrypted devices, claims that iPhone device encryption using iOS 8 (which Apple cannot bypass) stymied the execution of around 111 search warrants in the space of a year. According to the report, those cases involved offenses including “homicide, attempted murder, sexual abuse of a child, sex trafficking, assault, and robbery.”

Vance’s list (which may or may not be comprehensive) is surprising. There is little overlap between the types of crimes where Vance claims Manhattan prosecutors encountered encryption, and the crimes which local and state law enforcement probably deal with most frequently. According to a newly-released FBI report, larceny, theft, assault, and drug offenses are the crimes most commonly reported by state and local law enforcement. Of those, only assault is on the Manhattan DA’s list. Drug crimes are not, even though drug arrests alone accounted for nearly a quarter of all arrests in Manhattan last year. By comparison, the other offenses on his list — homicide, robbery, sex crimes, and trafficking offenses — account for only a small fraction of reported crimes, according to the FBI report.

Not only are drug crimes common in the state and local context, they dominate the federal courts. Drug defendants are often arrested by local police, but prosecuted federally (which might help account for the absence from Vance’s list). Drug offenses top the federal courts’ most recent 12-month report on numbers of federal criminal defendants charged, by offense, which covers 17 offense categories. (The report doesn’t reflect investigations that are closed without a prosecution.) Similarly, the 2014 wiretap report, also issued by the federal courts, notes that a whopping 89 percent of all wiretaps (including 91 percent of federal wiretaps and 88 percent of state wiretaps) were for drug offenses. Homicide and assault (a combined category in the wiretap report) came in a distant second, at four percent. So one would expect that if there’s widespread use of encryption, it would proportionately impact drug crimes, and the homicide, assault, and other cases would be far behind.

State and federal wiretap statistics, combined with federal prosecution statistics, demonstrate that drug offenses are very high on law enforcement’s agenda — even as homicide clearance rates languish. And according to the FBI crime statistics report, drug offenses are one of the most commonly reported types of crime.

As more and more people carry smartphones that are encrypted by default, encountering device encryption becomes more likely to affect investigations where the crime is both common and a top law enforcement priority. That means drug offenses — and yet they are absent from Vance’s list. If you have concerns about the War on Drugs — and many people do because it is expensive, ineffectual, and disproportionately affects minorities, among other reasons — the War on Crypto is likely to make it worse.

We need more information about the facts underpinning the Manhattan DA’s report before we can say whether Vance has established a pressing law enforcement need for legislation. The report said that the office “was unable to execute” around 111 search warrants due to iOS 8 encryption. While 111 frustrated warrants may sound like a lot, that number doesn’t tell the full story. The report conspicuously fails to mention several important facts, such as whether prosecutors successfully pursued those cases using other evidence; the total number of search warrants issued for smartphones during the period cited; how many of those devices turned out to be encrypted; and of those, how many warrants were successfully executed nevertheless. If criminal investigations can succeed despite encryption, then device encryption’s detrimental impact on the public is marginal.

That’s already true for encryption of communications. The 2014 statistics for judicially authorized wiretaps (which collect the contents of unencrypted phone calls and text messages in transit) show almost no adverse impact from encryption. Officials encountered encryption in 22 state court wiretaps out of a total of 2,275 — a sharp drop from 2013, when state officials came across encryption 41 times — and were unable to decipher plaintext in only two of the 22. For federal wiretaps, investigators encountered encryption in three wiretaps out of 1,279 total, of which two could not be decrypted.
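To put those wiretap figures in perspective, a quick back-of-the-envelope calculation (using only the 2014 numbers quoted above; the percentages are illustrative, not from the report itself) shows just how small a share of wiretaps encryption actually affected:

```python
# 2014 Wiretap Report figures as cited above.
state_total, state_encrypted, state_thwarted = 2275, 22, 2
federal_total, federal_encrypted, federal_thwarted = 1279, 3, 2

def rate(part, whole):
    """Return part/whole as a percentage, rounded to two decimal places."""
    return round(100 * part / whole, 2)

print(rate(state_encrypted, state_total))    # state wiretaps that hit encryption: 0.97 (%)
print(rate(state_thwarted, state_total))     # state wiretaps actually thwarted:   0.09 (%)
print(rate(federal_thwarted, federal_total)) # federal wiretaps actually thwarted: 0.16 (%)
```

In other words, even counting every encounter with encryption, fewer than one percent of state wiretaps were affected, and well under two-tenths of one percent were actually defeated.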

When it comes to communications, Comey’s claim that encryption “affects every criminal investigation” is plainly an exaggeration. He and his colleagues have yet to show that the situation for devices is any different. So long as encryption has a negligible effect on law enforcement’s ability to do its job, proposals to regulate encryption amount to a “solution” for a problem that doesn’t exist.

In the end, it’s the War on Drugs and other routine criminal investigations, not counterterrorism or “worst of the worst” criminal cases, that stand to benefit the most if Director Comey gets his wish for guaranteed access to the data on Americans’ encrypted smartphones. Yet officials cannily highlight ISIL recruitment, sex trafficking, and murder to promote their demands for weaker crypto, obscuring the lack of evidence that strong crypto in fact poses a significant problem for them.

This post draws a number of inferences from imperfect information, because comprehensive data about device encryption’s impact on law enforcement are simply not available. We don’t have the full picture of how law enforcement and intelligence agencies seek to compel or persuade tech companies to decrypt information for them (and on what legal authority), influence encryption standards, cooperate to share tools for bypassing crypto, or investigate crime by other means, including hacking tools. I’m researching these issues as part of the Stanford Center for Internet and Society’s Crypto Policy Project, and maybe they’ll also be considered by the crypto commission Congress plans to convene.

As Director Comey himself recently said, “without information, every single conversation in this country about policing and reform and justice is uninformed, and that is a very bad place to be.” Those words apply with equal force to the national conversation about encryption and law enforcement.