Category Archives: Apple

Apple vs FBI: A Socratic dialogue on privacy and security - Diplo 20160317 - 20160329


Diplo’s webinar on the Apple-FBI case, on 17 March (watch the recording), evolved into a Socratic dialogue on the core concepts and underlying assumptions of the case. The lively debate inspired us to create a series of posts that argue the main dilemmas, played out by three fictitious characters, Privarius, Securium, and Commercias. The first starts with the main facts.

The Apple-FBI case has triggered many questions for which we do not have ‘correct’ or clear answers. Responses often trigger new questions. Join us in the debate with your comments and questions.

Securium: Everyone is talking about it! The 16 February ruling, by a US federal judge in Riverside, California, which ordered Apple to assist the FBI in unlocking an iPhone, triggered a global debate. The iPhone is not just any phone: it belongs to one of the attackers who killed 14 people in San Bernardino in December 2015.

Commercias: A global debate indeed. Especially after Apple’s strong reaction. Declaring opposition to the order, Apple is arguing that complying with the request would create a dangerous precedent and seriously undermine the privacy and security of its users. Other technology companies (such as Microsoft, Amazon, Google, Facebook, and Twitter), as well as civil rights activists, have expressed support for Apple.

Privarius: Activists are deeply involved in this debate. The ruling, and its eventual outcome, could have very serious implications and repercussions. Encryption is a strong safeguard, and companies should not be forced to weaken the security of their own products. Forced decryption should not be allowed.

Securium: Is it for companies to decide? US President Barack Obama has already objected to the creation of undecryptable black boxes, stating the need for a balance between security and privacy that would enable law enforcement authorities to continue doing their job. The outcome of this case is still unclear.

Commercias: Unclear indeed. Today’s court hearing was postponed, as the FBI said it may have found a way to unlock the phone without Apple's assistance…

Privarius: This particular case may be nearing an end, but the main issues remain open. For example, how can there possibly be a balance between privacy and security if phones are rendered decryptable? After the Snowden revelations, it became clear that we can no longer fully rely on government agencies to ensure our privacy, which is now in the hands of technology companies.

Commercias: Even the UN High Commissioner for Human Rights issued a statement, asking the US authorities to proceed with caution, as the case 'could have extremely damaging implications for the human rights of many millions of people, including their physical and financial security’. The UN Special Rapporteur for freedom of expression also asked for caution, noting that the FBI request risks violating the International Covenant on Civil and Political Rights.

Securium: Whatever the outcome, one thing is clear: even if a solution may have been found today, this does not resolve the main dilemmas. So let’s see what the issues at stake are, starting with security...

The next post - published next Thursday, 24 March - tackles the security aspect.

II. Apple vs FBI: It’s just one phone - or is it?

Commercias: ...If Apple were to help the FBI unlock this one phone, in adherence to the court order, other courts in the USA and elsewhere are likely to issue similar requests for future cases.

Securium: Isn’t this far-fetched? The FBI’s requests are about one single iPhone: ‘... The Court’s order is modest. It applies to a single iPhone, and it allows Apple to decide the least burdensome means of complying. As Apple well knows, the Order does not compel it to unlock other iPhones or to give the government a universal “master key” or “back door”.’

Commercias: The order may not refer to other phones, but if we look around us, we can see, for example, that the Manhattan district attorney has already indicated that there are currently 175 iPhones which investigators could not unlock, and he has confirmed that he would want access to all phones that are part of a criminal investigation, should the government prevail in the San Bernardino case. Apple would very likely be compelled to use this technique to unlock iPhones in police custody all over America and beyond. Apple’s attorneys reported a list of nine other cases, involving 12 iPhones of different models, in which law enforcement authorities had asked Apple to help crack a device; none of them involved terrorism. We cannot run this risk.

Privarius: Apple needs to create new software to open this phone, and this software could potentially unlock any iPhone. There is no guarantee that the FBI will not use this software - or master key - again, and if it falls into the wrong hands, the software can be misused by third parties. One case will be followed by another and there won’t be an end.

Securium: We should focus on the case at hand. The order is a targeted one ‘that will produce a narrow, targeted piece of software capable of running on just one iPhone, in the security of Apple’s corporate headquarters. That iPhone belongs to the County of San Bernardino, which has consented to its being searched.’ We must also not forget that the phone was used by a terrorist who took innocent lives. Crucial information surrounding the case may be stored on this device. With this fact in mind, the court order is pretty reasonable!

Commercias: No, the fact that the court issued an order doesn’t necessarily mean it is reasonable. The fact is that Apple has been assisting the FBI in previous cases as well as in this one: it has provided the backup of the phone stored in iCloud (though, unfortunately, the last backup doesn’t contain the most recent files from the day of the shooting). The Internet industry has always been cooperative when court orders were issued (and even without court orders, as we learned from Snowden). This time, what the court is requesting has crossed the line.

Securium: There are no red lines when it comes to protecting users and citizens worldwide.

Commercias: There are. The company has been asked to decrease its security level - which by the way is its corporate advantage - which helps keep users secure. If the court forces Apple to make a patch, this would reduce the security level of its system. And although the FBI has asked Apple to unlock only one iPhone, this might not be possible without affecting the privacy of all other iPhones, making them less secure in the process. Besides, do you really think that the FBI won’t use this ‘back door’? Once the privacy door is open, it will never be closed.

Securium: It is speculation. Let us not be abstract. Why would other phones be endangered?

Commercias: Technically speaking, Apple would need to create a software patch for its iOS and install it on this particular phone. This could likely be done within Apple’s headquarters, with the FBI accessing only this particular phone (even remotely) and without access to the software patch itself. However, since the phone needs to be handed over to investigators, there is a possibility of it being reverse-engineered. In addition, misuses and abuses cannot be fully controlled once the firmware is out.

Privarius: But let’s say, Apple creates a software patch to unlock the phone: authorities may still submit requests for hundreds of other phones to be unlocked, and requests could possibly come from other jurisdictions. In this case, Apple would need to have its teams constantly available. Moreover, future versions of the iOS would also need to have an updated patch. Ultimately, Apple might find it easier and cheaper to simply develop a real backdoor in its products, or to give up on the stronger security-by-design approach.

Securium: On the other hand, if Apple wins this court case - if the case is resumed - it can create a new precedent.

Privarius:... and a major win for privacy!

III. Apple vs FBI: A case for encryption

Commercias: ...A win for Apple - among other issues - is also a win for privacy...

Privarius: The question is, can Apple damage privacy by claiming to protect it? In making extreme claims, they could be pushing the pendulum too far, and risk provoking a counter-reaction by endangering privacy protection. As President Obama recently said at a South by Southwest (SXSW) conference, ‘after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that are dangerous and not thought through.’

Commercias: On the other hand, we may say that it was the FBI who was, in fact, pushing too much. Apple and similar companies have cooperated by giving investigators all the data they have about the suspects; yet the FBI is asking them to go an extra step, and in the process, weaken the products’ encryption. The fact is that the FBI has already acquired large amounts of evidence about this case thanks to digital forensics and the support of the Internet industry (including Apple). Today, a user’s digital communications are not only saved on his/her phone, but also stored in the cloud by service providers such as Facebook or Google, which readily cooperate with the FBI to provide the data its investigators need.

Privarius: This also raises several questions: Was there really such a need to break into the phone? Does this justify setting a precedent? Is the benefit of this request proportional to its consequences?

Commercias: Furthermore, security experts such as the former US anti-terror chief claim that the FBI could have turned to the NSA for help, since this case may be related to terrorism; it is likely that the NSA has advanced techniques that can break the code. This can lead us to conclude that there might not have been a real need for the FBI to push Apple; yet the FBI chose a case linked to terrorism to push its limits and try to set a precedent.

Privarius: One positive aspect, if you may, is that as a result, encryption technology is flourishing. There are dozens of unbreakable encryption applications online, readily available mostly for free. There are complete solutions, integrating hardware, OS, and software. More importantly, hardware development has led to motherboard chips, such as Intel’s SGX, that incorporate encryption within the silicon itself; such chips will soon become a common feature in products, with little possibility (if any) for anyone to unlock them with a software or hardware patch. The outcome will affect how users choose their products, and may lead them to switch to products with tighter encryption, or to install their own encryption software. This will leave law enforcement with even less control.

Commercias: But even with less control, law enforcement agencies may still be able to carry out their investigations without breaking encrypted communications - such as by using metadata, digital forensics, offline means, etc - right?

Privarius: Yes, they can. While there is little evidence of the usefulness of metadata (zero successes, according to the NSA) or of access to encrypted materials in preventing terrorist attacks (prior to the Paris attacks, the terrorists used unencrypted SMS), most criminal cases now require digital forensics as a critical part of the investigation. I would, however, distinguish surveillance for national security purposes and for combating terrorism from digital forensics for combating crime (and not only cybercrime).

Commercias: True. Law enforcement has many digital forensics tools at its disposal. I would add geolocation, data from telecom companies, and access to service providers’ cloud storage through court orders and other legal means. Besides, recent research (such as that by the Berkman Center) foresees that cyberspace is unlikely to ‘go dark’, for many reasons, and there will still be many sources of digital evidence without the need to break into encrypted spaces. This would mean that Apple can retain its strong stand on privacy. …

IV. Apple vs FBI: A matter of trust

Commercias: In the past few days we saw how the situation took a surprising U-turn when the FBI announced it may have found a way to unlock the phone without Apple's assistance. In a way, it seems that Apple has managed to stand its ground so far.

Securium: Let’s face it though, has Apple really been advocating for the rights of its users, or is this more of a business strategy through which it has tried to regain the users’ trust?

Privarius: While it looks like Apple is in fact supporting privacy, we must not forget that companies are primarily driven by commercial interests. Many - including the FBI - have argued that Apple’s position is more about its business model than the protection of human rights.

Commercias: Even if companies have commercial interests, they can still work hard to protect human rights, including privacy.

Privarius: True. But can we expect businesses to always serve the good cause? Will the protection of human rights always fit into their model, and what if profits drive them to support other causes?

Commercias: It is also a matter of trust. If we look at the Internet business model, we realise how important users’ trust is. Arguably, obeying the court order may lead to diminished trust in Apple, and could provide a market advantage to other products offering strong built-in encryption solutions.

Privarius: So perhaps, if we had to identify Apple’s position in the triangular model, we might say that Apple is both a vendor (selling tech products), and an intermediary (storing users’ data).

Commercias: Indeed. This is probably why Apple took such a strong position in challenging the authorities. Apple’s business model could be seen as somewhat more diverse than, for example, that of Google, Facebook, and Twitter, which depend heavily on data. The data-driven Internet industry is quite vulnerable to major policy ‘earthquakes’, such as the Snowden revelations, or the ongoing Apple/FBI controversy. Microsoft is another company that challenged the US authorities (in the court case on authority over data stored in Ireland). Just like Apple, Microsoft has a more diverse business model than typical Internet industry companies.

Privarius: And yet, if Apple loses this case, it will further erode the users’ trust in companies too, and not just the security sector. As Edward Snowden tweeted recently: 'The @FBI is creating a world where citizens rely on #Apple to defend their rights, rather than the other way around.'

Securium: As a result, users will try to find their own ways to protect themselves - through alternative and niche products, online software, etc. In such an environment, only the more skilful users will be protected, while less skilful users will be further endangered by criminals and terrorists, who are becoming more and more tech-savvy. We should rather aim for a minimum level of security for everyone, and to achieve this, end users should not be left to protect themselves through the use of cipher protection….

Privarius: And yet, if governments cannot protect the security and human rights of their citizens - which is the basis of any social contract - citizens should be allowed to protect themselves.

Commercias: Exactly… In real life, by using guns; in cyberspace, by using cipher protection. This is interesting: gun lobbyists and cipher enthusiasts may share an underlying logic for their actions.

Privarius: The analogy with guns is incorrect; encryption protects, it doesn’t cause damage to others. Connected devices - computers, smartphones, tablets - can do both. Encryption prevents criminals from misusing users’ computers (90% of attacks are based on social engineering, using access to private data to fine-tune the attacks for phishing or spear-phishing). Encryption also strengthens the security of protocols and online communications in general, making attacks such as ‘man in the middle’ attacks much harder. Not to mention that encryption can save lives - as the UN Commissioner for Human Rights rightly mentioned - the lives of activists, journalists, and whistleblowers around the world. Rather than trying to reduce cybercrime by weakening encryption, the security community needs to look into how encryption can contribute to a more secure Internet...
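To make the ‘man in the middle’ point concrete, here is a minimal, illustrative sketch using only Python’s standard library. It shows one building block of encrypted protocols - message authentication with a shared secret key - which lets a recipient detect whether an intermediary has altered a message in transit. The function names and the toy packet layout are our own for illustration; real protocols such as TLS or Signal combine authentication with encryption and key exchange in far more elaborate ways.

```python
import hmac
import hashlib
import secrets

# Sender and recipient share a secret key. (In real protocols this key is
# negotiated via a key exchange such as Diffie-Hellman, not pre-installed.)
key = secrets.token_bytes(32)

def sign(message: bytes) -> bytes:
    """Prepend an HMAC-SHA256 tag so the recipient can verify integrity."""
    tag = hmac.new(key, message, hashlib.sha256).digest()  # 32-byte tag
    return tag + message

def verify(packet: bytes) -> bytes:
    """Return the message, or raise if the tag does not match its contents."""
    tag, message = packet[:32], packet[32:]
    expected = hmac.new(key, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # constant-time comparison
        raise ValueError("tampering detected")
    return message

packet = sign(b"meet at noon")
assert verify(packet) == b"meet at noon"      # untouched packet passes

# A man in the middle rewrites the body but cannot forge a valid tag
# without the key, so verification fails:
tampered = packet[:32] + b"meet at 6 pm"
try:
    verify(tampered)
except ValueError:
    print("man-in-the-middle alteration detected")
```

Without the shared key, an eavesdropper who intercepts and edits the packet cannot produce a matching tag; this is the sense in which stronger cryptography makes interception-and-alteration attacks ‘much harder’, rather than causing harm itself.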

Securium: Or maybe, we should let the courts decide on the next steps. ...

V. Apple vs FBI: Towards an Internet social contract?

Securium: Until last week, everyone was thinking that if Apple won this court case it would create a new precedent. For the time being, it seems like the case has been resolved, since the US Department of Justice has just declared it is now able to unlock the iPhone thanks to the assistance of a third party.

Privarius: Although it seems the case is settled, the main dilemmas have not yet been resolved. Whether this will happen immediately, or in the near future, society may eventually need to make some hard choices regarding privacy and security, among others, and gradually create new models of consensus. [Read the editorial, on page 3, of Issue 8 of the Geneva Digital Watch newsletter]

Commercias: Even if the recent developments have shown that the government did manage to unlock the phone, a new social contract could tackle one of the essential arguments in the debate: whether devices should be impermeable, or ‘undecryptable’. This may be the only way to keep them safe from intrusion from both criminals and authorities.

Securium: It is not the only way. Let us take a hypothetical situation: assuming that unlocking a mobile phone is essential to preventing a nuclear attack and saving many lives, would you argue that the privacy of a mobile phone user is more important than the survival of innocent people?

Privarius: Well, it is an abstract and unrealistic situation.

Securium: We can argue at length as to whether this is possible or probable. The point is that the principle of undecryptability of mobile devices creates an important implicit decision: that of placing privacy above other human rights or security considerations...

Privarius: I still do not think this is a dangerous risk; on the other hand, allowing access to this specific mobile and setting a dangerous precedent is a very concrete risk. If Apple gives in now, how can it resist future demands from the USA and abroad?

Securium: In the USA, had the case gone forward, it would have been decided either by the courts (setting a precedent) or by Congress. Either way, the US legislative framework would have been determinative. The democratic system preserves security by allowing judicial authorities to issue orders that weaken privacy protections. President Obama was right in objecting to the creation of undecryptable black boxes. It is, after all, what happens in the offline world, when law enforcement agencies obtain the right to enter private property as part of investigations, for example.

Privarius: The difference is that online or data searches can be automated, and it is easy to imagine searches being implemented without due process. It is simply not the same as physically knocking on 100 doors.

Commercias: More importantly, if Apple or any other company had to create a patch to break into a phone, what is the likelihood that criminals would not try to gain access or exploit any vulnerabilities? Equally important is the fact that the legal basis for the FBI’s request and the court order is uncertain and has been widely disputed - which shows that there is no political or social agreement, as yet, on how to deal with this and similar cases that may come up...

The Senate’s Draft Encryption Bill Is ‘Ludicrous, Dangerous, Technically Illiterate' - Wired 20160408


As Apple battled the FBI for the last two months over the agency’s demands that Apple help crack its own encryption, both the tech community and law enforcement hoped that Congress would weigh in with some sort of compromise solution. Now Congress has spoken on crypto, and privacy advocates say its “solution” is the most extreme stance on encryption yet.

On Thursday evening, the draft text of a bill called the “Compliance with Court Orders Act of 2016,” authored by the offices of Senators Dianne Feinstein and Richard Burr, was published online by the Hill.[1] It’s a nine-page piece of legislation that would require people to comply with any authorized court order for data—and if that data is “unintelligible,” the legislation would demand that it be rendered “intelligible.” In other words, the bill would make illegal the sort of user-controlled encryption that’s in every modern iPhone, in all billion devices that run WhatsApp’s messaging service, and in dozens of other tech products. “This basically outlaws end-to-end encryption,” says Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology. “It’s effectively the most anti-crypto bill of all anti-crypto bills.”


Kevin Bankston, the director of the New America Foundation’s Open Technology Institute, goes even further: “I gotta say in my nearly 20 years of work in tech policy this is easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen,” he says.

The bill, Hall and Bankston point out, doesn’t specifically suggest any sort of backdoored encryption or other means to even attempt to balance privacy and encryption, and actually claims to not require any particular design limitations on products. Instead, it states only that communications firms must provide unencrypted data to law enforcement or the means for law enforcement to grab that data themselves. “To uphold the rule of law and protect the security and interests of the United States, all persons receiving an authorized judicial order for information or data must provide, in a timely manner, responsive and intelligible information or data, or appropriate technical assistance to obtain such information or data.”

Hall describes that as a “performance standard. You have to provide this stuff, and we’re not going to tell you how to do it,” he says. George Washington Law School professor Orin Kerr points out on Twitter that the text doesn’t even limit tech firms’ obligations to “reasonable assistance” but rather “assistance as is necessary,” a term that means the bill goes beyond current laws that the government has used to try to compel tech firms to help with data access such as the All Writs Act.

Even more extreme, the draft bill also includes the requirement that “license distributors” ensure all “products, services, applications or software” they distribute provide that same easy access for law enforcement. “Apple’s app store, Google’s play store, any platform for software applications somehow has to vet every app to ensure they have backdoored or little enough security to comply,” says Bankston. That means, he says, that this would “seem to also be a massive internet censorship bill.”

I could spend all night listing the various ways that Feinstein-Burr is flawed & dangerous. But let's just say, "in every way possible."

— matt blaze (@mattblaze) April 8, 2016

If Grandpa Simpson was a Senator who was afraid of and confused by encryption, I think he'd write something like the Feinstein/Burr bill.

— Kevin Bankston (@KevinBankston) April 8, 2016

It's not hard to see why the White House declined to endorse Feinstein-Burr. They took a complex issue, arrived at the most naive solution.

— Matthew Green (@matthew_d_green) April 8, 2016

Burr and Feinstein’s bill disappoints its privacy critics in part because it seems to entirely ignore the points already made in a debate that’s raged for well over a year, and has its roots in the crypto wars of the 1990s. Last summer, for instance, more than a dozen of the world’s top cryptographers published a paper warning of the dangers of weakening encryption on behalf of law enforcement. They cautioned that any backdoor created to give law enforcement access to encrypted communications would inevitably be used by sophisticated hackers and foreign cyberspies. And privacy advocates have also pointed out that any attempt to ban strong encryption in American products would only force people seeking law-enforcement-proof data protection to use encryption software created outside the US, of which there is plenty to choose from. Apple, in its lengthy, detailed arguments with the FBI in front of Congress and in legal filings, has called that weakening of Americans’ security a “unilateral disarmament” in its endless war with hackers to protect its users’ privacy.

Tom Mentzer, a spokesman for Senator Feinstein, told WIRED in a statement on behalf of both bill sponsors that “we’re still working on finalizing a discussion draft and as a result can’t comment on language in specific versions of the bill. However, the underlying goal is simple: when there’s a court order to render technical assistance to law enforcement or provide decrypted information, that court order is carried out. No individual or company is above the law. We’re still in the process of soliciting input from stakeholders and hope to have final language ready soon.”

The Burr/Feinstein draft text may in fact be so bad for privacy that it’s good for privacy: Privacy advocates point out that it has almost zero likelihood of making it into law in its current form. The White House has already declined to publicly support the bill. And Adam Schiff, the top Democratic congressman on the House of Representatives’ intelligence committee, gave WIRED a similarly ambivalent comment on the upcoming legislation yesterday. “I don’t think Congress is anywhere near a consensus on the issue,” Schiff said, “given how difficult it was to legislate the relatively easy [Cyber Information Sharing Act], and this is comparatively far more difficult and consequential.”

Bankston puts it more simply. “The CCOA is DOA,” he says, coining an acronym for the draft bill. But he warns that privacy activists and tech firms should be careful nonetheless not to underestimate the threat it represents. “We have to take this seriously,” he says. “If this is the level of nuance and understanding with which our policymakers are viewing technical issues we’re in a profoundly worrisome place.”

[1] Correction 4/8/2016 1:00pm EST: A previous version of this story stated that the draft bill text had been released by the senators, which a spokesperson for Senator Burr has since said in a statement to WIRED she didn’t “believe was consistent with the facts.”

"Black People Need Encryption," No Matter What Happens in the Apple-FBI Feud - Mother Jones 20160322


Here's why civil rights activists are siding with the tech giant.

Last night, the FBI, saying that it may be able to crack an iPhone without Apple's help, convinced a federal judge to delay the trial over its encryption dispute with the tech company. In February, you may recall, US magistrate judge Sheri Pym ruled that Apple had to help the FBI access data from a phone used by one of the San Bernardino shooters. Apple refused, arguing that it would have to invent software that amounts to a master key for iPhones—software that doesn't exist for the explicit reason that it would put the privacy of millions of iPhone users at risk. The FBI now has two weeks to determine whether its new method is viable. If it is, the whole trial could be moot.


That would be a mixed blessing for racial justice activists, some of them affiliated with Black Lives Matter, who recently wrote to Judge Pym and laid out some reasons she should rule against the FBI. The letter—one of dozens sent by Apple supporters—cited the FBI's history of spying on civil rights organizers and shared some of the signatories' personal experiences with government overreach.

"One need only look to the days of J. Edgar Hoover and wiretapping of Rev. Martin Luther King, Jr. to recognize the FBI has not always respected the right to privacy for groups it did not agree with," they wrote. (Targeted surveillance of civil rights leaders was also a focus of a recent PBS documentary on the Black Panther Party.) Nor is this sort of thing ancient history, they argued: "Many of us, as civil rights advocates, have become targets of government surveillance for no reason beyond our advocacy or provision of social services for the underrepresented."

Black Lives Matter organizers have good reason to be concerned. Last summer, I reported that a Baltimore cyber-security firm had identified prominent Ferguson organizer (and Baltimore mayoral candidate) Deray McKesson as a "threat actor" who needed "continuous monitoring" to ensure public safety. The firm—Zero Fox—briefed members of an FBI intelligence partnership program about the data it had collected on Freddie Gray protest organizers. It later passed the information along to Baltimore city officials.

Department of Homeland Security emails, meanwhile, have indicated that Homeland tracked the movements of protesters and attendees of a black cultural event in Washington, DC, last spring. Emails from New York City's Metropolitan Transit Authority and the Metro-North Railroad showed that undercover police officers monitored the activities of known organizers at Grand Central Station police brutality protests. The monitoring was part of a joint surveillance effort by MTA counter-terrorism agents and NYPD intelligence officers. (There are also well-documented instances of authorities spying on Occupy Wall Street activists.)


In December 2014, Chicago activists, citing a leaked police radio transmission, alleged that city police used a surveillance device called a Stingray to intercept their texts and phone calls during protests over the death of Eric Garner. The device, designed by military and space technology giant Harris Corporation, forces all cell phones within a given radius to connect to it, reroutes communications through the Stingray, and allows officers to read texts and listen to phone calls—as well as track a phone's location. (According to the ACLU, at least 63 law enforcement agencies in 21 states use Stingrays in police work—frequently without a warrant—and that's probably an underestimate, since departments must sign agreements saying they will not disclose their use of the device.)

In addition to the official reports, several prominent Black Lives organizers in Baltimore, New York City, and Ferguson, Missouri, shared anecdotes of being followed and/or harassed by law enforcement even when they weren't protesting. One activist told me how a National Guard Humvee had tailed her home one day in 2014 during the Ferguson unrest, matching her diversions turn for turn. Another organizer was greeted by dozens of officers during a benign trip to a Ferguson-area Wal-Mart, despite having never made public where she was going.

In light of the history and their own personal experiences, many activists have been taking extra precautions. "We know that lawful democratic activism is being monitored illegally without a warrant," says Malkia Cyril, director of the Center for Media Justice in Oakland and a signatory on the Apple-FBI letter. "In response, we are using encrypted technologies so that we can exercise our democratic First and Fourth Amendment rights." Asked whether she believes the FBI's promises to use any software Apple creates to break into the San Bernardino phone only, Cyril responds: "Absolutely not."

"I don't think it's any secret that activists are using encryption methods," says Lawrence Grandpre, an organizer with Leaders of a Beautiful Struggle in Baltimore. Grandpre says he and others used an encrypted texting app to communicate during the Freddie Gray protests. He declined to name the app, but said it assigns a PIN to each phone that has been approved to access messages sent within a particular group of people. If an unapproved device tries to receive a message, the app notifies the sender and blocks the message from being sent. Grandpre says he received these notifications during the Freddie Gray protests: "Multiple times we couldn't send text messages because the program said there's a possibility of interception."
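The gate Grandpre describes can be modeled in a few lines. Everything below is hypothetical — he declined to name the app, and these PIN values and function names are invented — but it captures the behavior he reports: refuse to transmit if any receiving device is not on the approved list.

```python
# Hypothetical sketch of the approval check described above: a message is
# transmitted only if every recipient device presents a PIN the group has
# approved. PINs and names are invented for illustration.
approved_pins = {"4821", "9305"}  # PINs issued to vetted devices

def try_send(message: str, recipient_pins: list[str]) -> str:
    unapproved = [p for p in recipient_pins if p not in approved_pins]
    if unapproved:
        # Mirrors the warning Grandpre received: block and notify the sender.
        return f"blocked: possible interception ({len(unapproved)} unknown device(s))"
    return f"sent: {message}"

print(try_send("march at noon", ["4821"]))          # sent: march at noon
print(try_send("march at noon", ["4821", "0000"]))  # blocked: possible interception (1 unknown device(s))
```

The design choice worth noting is that the check happens on the sender's side before anything leaves the phone, which is why Grandpre's group saw a refusal rather than a silent leak.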

Cyril says "all of the activists I know" use a texting and call-encryption app called Signal to communicate, and that the implication of a court verdict in favor of the FBI would be increased surveillance of the civil rights community. "It's unprecedented for a tech company—for any company—to be compelled in this way," Cyril says.

Apple has prepared for an epic fight. But if the FBI is able to crack an iPhone without a key, the BLM crowd will have one more thing to worry about. As Cyril put it in a tweet this past February, "In the context of white supremacy and police violence, Black people need encryption."

As Apple Doubles Down on Encrypted Phones, Google Sits on Its Design for a “Digital Safe” - MIT Technology Review 20160322

A memory card developed by Google that upgrades a smartphone with strongly encrypted messaging and storage could improve security and trouble law enforcement.

As Apple faces criticism from the FBI for refusing to help law enforcement break into iPhones, rival Google is sitting on technology that would upgrade existing mobile devices with an encrypted “digital safe” that secures data, messages, and video and voice calls.

The technology, known as Project Vault, was created by a team led by Peiter Zatko, a hacker and security expert also known as Mudge, who has since left Google. This month he called on Google to release the technology to underline its support for Apple’s refusal to open phones for the FBI and other law enforcement agencies. Google spokeswoman Victoria Cassady declined to say whether the project is still active, but she hinted that there might be updates at Google’s annual developer conference in May. Zatko said he is not permitted to comment on Google’s plans.

Project Vault was introduced at the developer conference last year by Regina Dugan, leader of Google’s Advanced Technology and Projects group and previously head of DARPA, the Pentagon research agency. She showed attendees what looked like an ordinary memory card the size of a fingernail. It contained a tiny computer and storage system that instantly upgraded a device with advanced security features, such as strongly encrypted storage, messaging, video, and voice calls. Two phones were shown using Project Vault prototypes to exchange encrypted messages.

“Project Vault is your digital mobile safe,” Dugan said at the time. She said that it would initially be tested and developed with corporations before being offered to consumers. Google said it was already testing 500 of the devices internally, and it released code and documentation for Project Vault’s hardware and software online.

Were Project Vault to be released, it could pull Google deeper into the argument between the tech industry and law enforcement over encryption technology.

Apple’s faceoff with the FBI was triggered by its decision to build iPhones that encrypt all stored data, and then to refuse to help investigators working on December’s San Bernardino shootings get around that protection. Similarly, the encryption method used by Facebook’s WhatsApp program and Apple’s iMessage service—a system that prevents even the companies providing the services from reading the messages—has angered authorities in Brazil, and is reported to also trouble the U.S. Department of Justice.

Project Vault is designed to upgrade a mobile device with both encrypted data storage and messaging. Because the code and digital keys used to encrypt messages and calls never leave the secure memory card, it could be even more resistant to eavesdropping or hacking than iMessage or WhatsApp, which operate as conventional apps.
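That isolation property, secrets that never cross the card boundary, can be illustrated with a toy model. The class below is invented for illustration and uses an HMAC as a stand-in for the card's cryptographic operations; a real Vault-style device enforces the boundary in tamper-resistant hardware, not in a Python object.

```python
import hashlib
import hmac
import os

class SecureElement:
    """Toy model of a Vault-style card: the key is generated inside and
    never exported; the host may only request operations on it."""

    def __init__(self):
        self._key = os.urandom(32)  # lives only inside the element

    def tag(self, message: bytes) -> bytes:
        # Authenticate a message without ever revealing the key to the host.
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.tag(message), tag)

card = SecureElement()
t = card.tag(b"meet at 6")
print(card.verify(b"meet at 6", t))   # True
print(card.verify(b"meet at 7", t))   # False
```

Because the host only ever sees the tags, even a fully compromised phone cannot extract the key, which is the design choice that would make such hardware more resistant to eavesdropping than an app holding its keys in ordinary memory.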

Even if Google doesn’t move forward with Project Vault, it may still help other companies strengthen their security because its design is open source, meaning others can use it. Zatko says that some large companies, including financial institutions, are experimenting with pieces of what Google released to protect high-value customers against fraud.

Making the design open source would also help keep Project Vault trustworthy if it is released, by allowing outside experts and researchers to probe its security, says Simha Sethumadhavan, an associate professor at Columbia University who works on hardware security.

Timm, Trevor - Congress showed it's willing to fight the FBI on encryption. Finally - 20160301

Members of Congress did something almost unheard of at Tuesday’s hearing on the brewing battle over encryption between Apple and the FBI: their job. Both Democrats and Republicans grilled FBI director Jim Comey about his agency’s unprecedented demand that Apple weaken the iPhone’s security protections to facilitate surveillance. This would have dire implications for smartphone users around the globe.

Normally, congressional committee hearings featuring Comey are contests among the members over who can shower the FBI director with the most fawning compliments in their five-minute allotted time frame. Hard questions about the agency’s controversial tactics are avoided at all costs. But on Tuesday, in rare bipartisan fashion, virtually every member of the House judiciary committee asked Comey pointed questions and politely ripped apart his arguments against Apple.

One judiciary member questioned how the FBI managed to mess up so badly during the San Bernardino investigation and reset the shooter’s password, which is what set this whole controversy and court case in motion in the first place. And if the case was such an emergency, why did they wait 50 days to go to court? Another member questioned what happens when China inevitably asks for the same extraordinary powers the FBI is demanding now. Others questioned whether the FBI had really used all the resources available to break into the phone without Apple’s help. For example, why hasn’t the FBI attempted to get the NSA’s help to get into the phone, since hacking is their job?

Comey readily admitted that the San Bernardino case could set a precedent for countless others after it, and that it won’t just be limited to one phone, as the FBI tried to suggest in the days after the filing became public. Comey said the FBI has so many encrypted phones in its possession that he doesn’t know the number (that’s not including the hundreds of local police forces that are itching to force Apple to create software to decrypt those as well). Comey also admitted under questioning that terrorists would just move to another encrypted device if Apple was forced to do what the government is asking, and that there are companies all over the world offering similar products.

More than anything, though, the members of Congress expressed anger that the FBI director didn’t follow through earlier on his stated intention to engage in a debate in Congress and the public about the proper role for encryption in society. Instead, he decided to circumvent that debate altogether and quietly go to court to get a judge to do what the legislative branch has so far refused to do.

This all comes on the heels of a judge in New York strongly rebuking the FBI and Department of Justice in a court decision on Monday. (The New York case is different from the high profile San Bernardino situation that has garnered more media attention.) Comey, despite knowing he would testify on Tuesday, decided not to read the opinion from the previous day. He didn’t give a reason for why he didn’t, but given the judge thoroughly dismantled every argument the government put forward, maybe he couldn’t stomach it.

The court hearing in the San Bernardino case is in two weeks, and there is no doubt that this is really only the beginning of the debate. But, for the first time, it seems like Congress has finally opened its eyes to the long-term effects of designing vulnerabilities into our communications systems and forcing tech companies to become investigative arms of the government.

EFF - Why Forcing Apple to Write and Sign Code Violates the First Amendment


Deep Dive: Why Forcing Apple to Write and Sign Code Violates the First Amendment

EFF filed an amicus brief today in support of Apple's fight against a court order compelling the company to create specific software to enable the government to break into an iPhone. The brief is written on behalf of 46 prominent technologists, security researchers, and cryptographers who develop and rely on secure technologies and services that are central to modern life. It explains that the court’s unprecedented order would violate Apple’s First Amendment rights. That’s because the right to free speech prohibits the government from compelling unwilling speakers to speak, and the act of writing and, importantly, signing computer code is a form of protected speech. So by forcing Apple to write and sign an update to undermine the security of its iOS software, the court is also compelling Apple to speak—in violation of the First Amendment.

On February 16, a federal magistrate judge in southern California ordered Apple to write and sign the new code in support of the FBI’s ongoing investigation of last December’s San Bernardino shooting. The court granted the government’s request to require Apple to provide software to help unlock an iPhone 5c used by one of the shooters. The phone is encrypted with a passcode and protected by additional iOS security features the government says it cannot bypass. In an unprecedented move, the order requires Apple to create a brand new version of its operating system with intentionally weakened security features, which the government can then use to get into the phone.

On February 25, Apple filed a motion to vacate the Judge’s order. Apple argued that compelling it to create and sign code is an extraordinary expansion of the All Writs Act, the law the government is relying on in this case. Earlier this week, a judge in New York—in a different iPhone unlocking case involving an older version of iOS—denied a request under the All Writs Act that would have forced Apple to bypass the lock screen of a seized iPhone. The judge recognized that forcing Apple to unlock the phone would require an absurd interpretation of the All Writs Act.

But what the government is asking Apple to do in this case—i.e., force Apple and its programmers to write and sign the code necessary to comply with the judge’s order—is not just an unprecedented expansion of the All Writs Act that puts the security and privacy of millions of people at risk. It is also a violation of the First Amendment.

As we explain in our amicus brief, digital signatures are a powerful way of communicating the signer’s endorsement of the signed document—in this case, the custom iOS code. Due to the mathematical properties of digital signatures—invented in part by signers of our brief, including Martin Hellman and Ron Rivest—it would be very difficult to impersonate Apple without possessing the company’s secret signing key. Apple has chosen to build its iOS in such a way that its devices only accept iOS code signed by Apple, a design it believes best ensures user trust and strengthens the security of these devices. Since over 3 million phones were stolen in 2013 alone, the protections Apple is providing are important. By requiring Apple to sign code that undermines the security features Apple has included in iOS, the court’s order directly compels the company’s strong and verifiably authentic endorsement of the weakened code.
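A rough sketch of the mathematics the brief invokes, using textbook RSA with deliberately tiny demo parameters (insecure, and real code signing uses large keys and padded signature schemes): only the holder of the private exponent can produce a signature that verifies under the public key, which is what makes the signature an endorsement.

```python
import hashlib

# Textbook RSA with tiny demo primes p=61, q=53 -- for illustration only.
n, e, d = 3233, 17, 2753  # public modulus, public exponent, private exponent

def digest(code: bytes) -> int:
    # Reduce a SHA-256 hash of the code into the modulus range.
    return int.from_bytes(hashlib.sha256(code).digest(), "big") % n

def sign(code: bytes) -> int:
    # Requires the private exponent d -- in the analogy, Apple's signing key.
    return pow(digest(code), d, n)

def verify(code: bytes, signature: int) -> bool:
    # Anyone holding the public key (n, e) can check the endorsement.
    return pow(signature, e, n) == digest(code)

update = b"iOS build"
sig = sign(update)
print(verify(update, sig))            # True
print(verify(update, (sig + 1) % n))  # False: a forged signature fails
```

Because forging a valid signature without d is computationally infeasible at real key sizes, a device that checks signatures this way will only ever run code its maker has verifiably endorsed — which is exactly why the order compels Apple's signature and not merely its code.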

This is where the First Amendment comes in. The Constitution clearly prevents the government from forcing people to endorse positions they do not agree with. Whether that endorsement takes the form of raising your hand, signing a loyalty oath, putting a license plate motto on your car or, as here, implementing an algorithm that creates a digital signature, the problem is the same. As the Supreme Court noted in a case involving whether the government could force private parade organizers to include viewpoints they disagreed with, “[W]hen dissemination of a view contrary to one’s own is forced upon a speaker intimately connected with the communication advanced, the speaker’s right to autonomy over the message is compromised.” As a result, government mandates requiring people to speak are subject to strict scrutiny—the most stringent standard of judicial review in the United States.

Of course, the fact that Apple expresses its beliefs in the language of computer code and in digital signatures verifying its code implicates a set of cases where EFF pioneered the recognition that writing computer code is a form of, well, writing. In Bernstein v. DOJ and later in Universal City Studios, Inc. v. Corley, the courts agreed with us that, just like musical scores and recipes, computer code “is an expressive means for the exchange of information and ideas.” The fact that the expression comes in the form of code may implicate the level of regulation the government can apply, but not whether the code is in fact expressive.

Here, the problem is even more acute. Apple is being forced to actually write and endorse code that it—rightly—believes is dangerous. And in doing so, it is being forced to undermine the trust it has established in its digital signature. The order is akin to the government forcing Apple to write a letter in support of backdoors and sign its forgery-resistant name at the bottom. This is a clear violation of Apple’s First Amendment rights, in addition to being a terrible outcome for all the rest of us who rely on digital signatures and trustworthy updates to keep our lives secure.

The court will hear argument on Apple’s motion to vacate at 1:00 p.m. on March 22, 2016, in Riverside.  We hope the judge reconsiders this dangerous and unconstitutional order.

UN human rights chief: Lives could be in danger if the FBI forces Apple to help unlock iPhone - The Washington Post 20160304

The top human rights authority at the United Nations warned Friday that if the FBI succeeds in forcing Apple to unlock an iPhone used by one of the San Bernardino attackers, it could have “tremendous ramifications” around the world and “potentially [be] a gift to authoritarian regimes, as well as to criminal hackers.”

The statement came a day after a deluge of technology companies and other groups publicly backed Apple in the fight, and it echoed what many of these firms and groups said in arguing that the FBI’s demands could have a devastating impact on digital privacy going forward.

“In order to address a security-related issue related to encryption in one case, the authorities risk unlocking a Pandora’s Box that could have extremely damaging implications for the human rights of many millions of people, including their physical and financial security,” Zeid Ra’ad Al Hussein, the U.N. High Commissioner for Human Rights, said in a statement Friday.

If the FBI prevails, Hussein argued, it would set a precedent that could make it impossible to fully protect privacy worldwide.

[Relatives of San Bernardino victims, tech groups take sides in FBI-Apple fight]

“Encryption tools are widely used around the world, including by human rights defenders, civil society, journalists, whistle-blowers and political dissidents facing persecution and harassment,” Hussein said.

Apple is fighting a judge’s order directing the company to help the FBI unlock an iPhone found after the Dec. 2 attack in San Bernardino, Calif. While the Justice Department and other law enforcement groups argue that the demands here are specific and focused on one investigation, Apple and other tech firms are arguing that an FBI victory here could be utilized in countless other cases.

The locked iPhone 5C belonged to Syed Rizwan Farook, who along with his wife, Tashfeen Malik, fatally shot 14 people and wounded 22 others during the attack. Both attackers, who pledged loyalty to the Islamic State, were killed hours after the shooting, and investigators say they are still looking into whether the pair had any ties to groups or people operating overseas.

Federal authorities obtained a magistrate judge’s order directing Apple to write software that would disable a feature that deletes the data on the iPhone after 10 incorrect password attempts. The phone is owned by San Bernardino County and was given to Farook in his job as a health inspector.

Apple has fought the FBI’s order — in court, on Capitol Hill and through public statements — and this week, the company received the backing of dozens of other companies, groups and individuals.

A ream of major tech companies — including Google, Amazon, Facebook, Yahoo, Twitter, Snapchat and Microsoft — signed on to court briefs that warned of “a dangerous precedent” for digital security if Apple was forced to act “against their will.” These calls were joined by groups including the American Civil Liberties Union, several trade and policy groups and dozens of technologists, security researchers and cryptographers.

The Justice Department received the backing of relatives of some of the people killed in the San Bernardino attack as well as briefs of support from law enforcement groups representing officers in California and across the country.

Hussein said that the United Nations fully supported the FBI’s investigation into the “abominable crime,” but argued against viewing this as an isolated case.

He pointed back to a report his office released last year saying that strong encryption and digital privacy are important to human rights, stating: “It is neither fanciful nor an exaggeration to say that, without encryption tools, lives may be endangered.”

In trying to glean information on the locked iPhone, authorities could “end up enabling a multitude of other crimes all across the world, including in the United States,” he said.

Doctorow, Cory - Apple vs FBI: The privacy disaster is inevitable, but we can prevent the catastrophe - Boing Boing 20160304

My new Guardian column, Forget Apple's fight with the FBI – our privacy catastrophe has only just begun, explains how surveillance advocates have changed their arguments: 20 years ago, they argued that the lack of commercial success for privacy tools showed that the public didn't mind surveillance; today, they dismiss Apple's use of cryptographic tools as a "marketing stunt" and treat the proportionality of surveillance as a settled question.

The privacy disaster is inevitable. Personal information is the CO2 of the surveillance economy, and we've pumped so much of it into giant, leaky, immortal databases that huge, life-destroying breaches are a given. The fight over Apple and the FBI, over privacy versus surveillance (not "privacy vs security," thank you very much) is about what happens after the coming privacy tsunamis make landfall: will we continue to pump out privacy-smog that turns the storm-of-the-century disasters into storm-of-the-millennium catastrophes?

Or will these privacy disasters prompt us to take action? Will the FBI let us decarbonize the Internet's surveillance economy, or will they demand that we build surveillance coal power-plants into every phone, computer, and Internet of Things gadget from now on?

Companies started to sell the idea of privacy. Apple and Microsoft sought to differentiate themselves from Facebook and Google by touting the importance of not data-mining to their bottom lines. Google started warning users when it looked like governments were trying to hack into their emails. Facebook set up a hidden service on Tor’s darknet. Everybody jumped on the two-factor authentication bandwagon, then the SSL bandwagon, then the full-disk encryption bandwagon.

The social proof of privacy’s irrelevance vanished, just like that. If Apple – the second most profitable company in the world – thinks that customers will buy its products because no one, not even Apple, can break into the data stored on them, what does it say about the privacy zeitgeist?

Seamlessly, the US Department of Justice switched tacks: Apple’s encryption is a “marketing stunt”. The company has an obligation to backdoor its products to assist law enforcement. Please, let’s not dredge up the old argument about whether it’s OK to spy on everyone – we settled that argument already, by pointing out the fact that no one was making any money by making privacy promises. Now that someone is making money from privacy tech, they’re clearly up to no good.

Doctorow, Cory - Forget Apple's fight with the FBI – our privacy catastrophe has only just begun - The Guardian 20160304

The privacy crisis is a disaster of our own making – and now the tech firms who gathered our data are trying to make money out of privacy

The smog of personal data is the carbon dioxide of privacy. We’ve emitted far too much of it over the past decades, refusing to contemplate the consequences.

For privacy advocates, the Apple-FBI standoff over encryption is deja vu all over again.

In the early 1990s, they fought and won a pitched battle with the Clinton administration over the Clipper chip, a proposal to add mandatory backdoors to the encryption in telecommunications devices.

Soon after that battle was won, it moved overseas: in the UK, the Blair government brought in the Regulation of Investigatory Powers Act (RIPA). Privacy advocates lost that fight: the bill passed in 2000, enabling the government to imprison people who refused to reveal their cryptographic keys.

The privacy fight never stopped. In the years since, a bewildering array of new fronts have opened up on the battlefield: social media, third-party cookies, NSA/GCHQ mass surveillance, corporate espionage, mass-scale breaches, the trade in zero-day vulnerabilities that governments weaponise to attack their adversaries, and Bullrun and Edgehill, the secret programmes of security sabotage revealed by whistleblower Edward Snowden.

Who really cares about surveillance?

The first line of defense for surveillance advocates – whether private sector or governmental – is to point out just how few people seem to care about privacy. What can it matter that the government is harvesting so much of our data through the backdoor, when so many of us are handing over all that and more through the front door, uploading it to Facebook and Google and Amazon and anyone who cares to set a third-party cookie on the pages we visit?

Why is it so hard to convince people to care about privacy?

Painting the pro-privacy side as out-of-step loonies, tinfoil-hatted throwbacks in the post-privacy era was a cheap and effective tactic. It made the pro-surveillance argument into a pro-progress one: “Society has moved on. Our data can do more good in big, aggregated piles than it can in atomized fragments on your device and mine. The private data we exhaust when we move through the digital world is a precious resource, not pollution.”

It’s a powerful argument. When companies that promise to monetize your surveillance beat companies that promise to protect your privacy, when people can’t even be bothered to tick the box to block tracking cookies, let alone install full-disk encryption and GPG to protect their email, the pro-surveillance camp can always argue that they’re doing something that no one minds very much.

From the perennial fights over national ID cards to the fights over data retention orders, the lack of any commercial success for privacy tech was a great way to shorthand: “Nothing to see here – just mountains being made from molehills.”

And then ... companies started selling privacy

But a funny thing happened on the way to the 21st century: we disclosed more and more of our information, or it was taken from us.

As that data could be used in ever-greater frauds, the giant databases storing our personal details became irresistible targets. Pranksters, criminals and spies broke the databases wide open and dumped them: the IRS, the Office of Personnel Management, Target and, of course, Ashley Madison. Then the full impact of the Snowden revelations set in, and people started to feel funny when they texted something intimate to a lover or typed a potentially embarrassing query into a search box.

Companies started to sell the idea of privacy. Apple and Microsoft sought to differentiate themselves from Facebook and Google by touting the importance of not data-mining to their bottom lines. Google started warning users when it looked like governments were trying to hack into their emails. Facebook set up a hidden service on Tor’s darknet. Everybody jumped on the two-factor authentication bandwagon, then the SSL bandwagon, then the full-disk encryption bandwagon.

The social proof of privacy’s irrelevance vanished, just like that. If Apple – the second most profitable company in the world – thinks that customers will buy its products because no one, not even Apple, can break into the data stored on them, what does it say about the privacy zeitgeist?

The privacy catastrophe has only just begun

Seamlessly, the US Department of Justice switched tacks: Apple’s encryption is a “marketing stunt”. The company has an obligation to backdoor its products to assist law enforcement. Please, let’s not dredge up the old argument about whether it’s OK to spy on everyone – we settled that argument already, by pointing out the fact that no one was making any money by making privacy promises. Now that someone is making money from privacy tech, they’re clearly up to no good.

The smog of personal data is the carbon dioxide of privacy. We’ve emitted far too much of it over the past decades, refusing to contemplate the consequences until the storms came. Now they’ve arrived, and they’ll only get worse, because the databases that haven’t been breached yet are far bigger, and more sensitive than those that have.

Like climate change, the privacy catastrophes of the next two decades are already inevitable. The problem we face is preventing the much worse catastrophes of the following decades.

And as computers are integrated into the buildings and vehicles and cities we inhabit, as they penetrate our bodies, the potential harms from breaches will become worse.

ACLU - Why we're defending Apple - 20160302

The stakes of the fight between Apple and the FBI could not be higher for digital security and privacy. If the government has its way, then it will have won the authority to turn American tech companies against their customers and, in the process, undercut decades of advances in our security and privacy.

Today, the ACLU — joined by its affiliates in California — is filing an amicus brief in support of Apple’s challenge to FBI efforts to compel the company to help break into an iPhone as part of the investigation into the 2015 San Bernardino shootings. We filed a similar brief several months ago in support of Apple’s parallel fight in Brooklyn. But in this case, the stakes are even higher, because the FBI wants to force Apple to write new computer code to disable security features on one of its devices.

If the government gets its way, the legal precedent set by this case will reverberate far beyond this particular investigation and the single phone at issue. (In fact, just Monday, the magistrate judge overseeing the parallel case in Brooklyn noted the gravity of the government’s legal theory in issuing a comprehensive rejection of it.)

The government’s request relies on the All Writs Act, a gap-filling law passed in 1789, in its bid to compel Apple to create and authenticate software so that the FBI can hack into an individual’s iPhone. That law gives courts the authority to issue orders necessary for them to fulfill their judicial role and enforce their decisions. It does not, however, permit courts to give law enforcement new investigative tools that Congress has not authorized. In this case, the Act can’t be used by law enforcement to give itself the unprecedented power to conscript an innocent third party into government service against its will. The use of this law is made all the more sweeping considering the vast cybersecurity and privacy implications of what the government wants to be able to do.

What the government wants here goes beyond the well-established duties of citizens to aid law enforcement — by, for example, turning over evidence or giving testimony — because Apple doesn’t actually possess the information on the iPhone that the government seeks. The order the government has proposed would also violate the Fifth Amendment, which imposes a limit on the assistance that law enforcement may compel of innocent third parties who don’t actually have the information the government is after — a limit the government has crossed in this case. Think of it this way: Could the government get a court order compelling you to spy on your neighbor, or perhaps compelling the friend of a Black Lives Matter organizer to seek out information and report on that person’s plans for a peaceful protest? We don’t think so, and the Fifth Amendment is what defines the outer bounds of law enforcement’s authority to conscript us all into investigative service.

Though the legal arguments may seem esoteric, the power the government aims to establish here would set a troubling and dangerous precedent that would undermine everyone’s digital privacy and security. For example, if the courts uphold the government’s interpretation of the law, the FBI could force Apple to authenticate and deliver malware to a target’s devices using Apple’s automatic-update system. That would put us all at risk when you consider the implications for the rest of our devices. Automatic updates are the vaccinations of the digital world: They only work if they’re taken, and they’re only taken if they’re trusted. Consumers will have little incentive to install automatic updates if they believe they could be government-mandated malware masquerading as security fixes. As the array of mobile devices and web-connected appliances grows, so does the need for regular security updates. The government’s legal theory would undermine this system and the security of the Internet across the board.
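The vaccination analogy above can be made concrete with a minimal sketch. The payloads and pinned digest below are invented; the point is that the device's check rejects anything the vendor never endorsed, which is precisely why compelled signing, rather than outside hacking, is the pressure point the ACLU warns about.

```python
import hashlib

# Minimal sketch of the trust gate behind automatic updates: install a
# payload only if it matches an endorsement the device already trusts
# (modeled here, hypothetically, as a pinned SHA-256 digest).
trusted_digest = hashlib.sha256(b"security fix v2").hexdigest()

def auto_update(payload: bytes) -> str:
    if hashlib.sha256(payload).hexdigest() != trusted_digest:
        return "rejected: unendorsed update"
    return "installed"

print(auto_update(b"security fix v2"))   # installed
print(auto_update(b"mandated malware"))  # rejected: unendorsed update
```

If a court can compel the vendor to endorse the malicious payload itself, the gate waves it through, and users who disable updates in response lose the security fixes the system was built to deliver.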

This case did not arise in a vacuum. Over the last few years, Congress has considered and declined to compel tech companies to build backdoor access to encrypted data. And, despite the pitched battle in court, Apple and the government agree that Congress is best positioned to grapple with this enormously important question in the first place. Indeed, the House Judiciary Committee held a hearing just yesterday on Apple’s iPhone encryption. Notwithstanding its commitment to public debate on this question, the government has sought to compel Apple to assist law enforcement in mobile device unlocking more than 80 times, largely through secret court filings. (We’ve filed a FOIA request to learn more about these cases and the policy behind this effort.)

In the Brooklyn case, Magistrate Judge James Orenstein ruled, in a thorough 50-page opinion, that the government may not rely on the All Writs Act to compel Apple to assist in unlocking an iPhone. Judge Orenstein recognized a critical flaw in the government’s All Writs Act theory: It contains no “principled limit on how far a court may go in requiring a person or company to violate the most deeply-rooted values.” It doesn’t take a constitutional scholar to understand that there is a limit on the government’s power to conscript third parties into the service of law enforcement. That’s the kind of limit that distinguishes a democratic government from a police state.

As the ACLU told the court in California today, the government has crossed that limit.