Category Archives: FBI

Apple vs FBI: A Socratic dialogue on privacy and security - Diplo 20160317 - 20160329

Diplo’s webinar on the Apple-FBI case, on 17 March (watch the recording), evolved into a Socratic dialogue on the core concepts and underlying assumptions of the case. The lively debate inspired us to create a series of posts that argue the main dilemmas, played out by three fictitious characters, Privarius, Securium, and Commercias. The first starts with the main facts.

The Apple-FBI case triggered so many questions for which we do not have ‘correct’ or clear answers. Responses often trigger new questions. Join us in the debate with your comments and questions.

Securium: Everyone is talking about it! The 16 February ruling, by a US federal judge in Riverside, California, which ordered Apple to assist the FBI in unlocking an iPhone, triggered a global debate. The iPhone is not just any phone: it belongs to one of the attackers who killed 14 people in San Bernardino in December 2015.

Commercias: A global debate indeed. Especially after Apple’s strong reaction. Declaring its opposition to the order, Apple argues that complying with the request would create a dangerous precedent and seriously undermine the privacy and security of its users. Other technology companies (such as Microsoft, Amazon, Google, Facebook, and Twitter), as well as civil rights activists, have expressed support for Apple.

Privarius: Activists are also involved in this debate. The ruling, and the eventual outcome, can have very serious implications and repercussions. Encryption is a strong safeguard, and companies should not be made to weaken the security of their own products. Decryption should not be allowed.

Securium: Is it for companies to decide? US President Barack Obama has already objected to the creation of undecryptable black boxes, stating the need for a balance between security and privacy that would enable law enforcement authorities to continue doing their job. The outcome of this case is still unclear.

Commercias: Unclear indeed. Today’s court hearing was postponed, as the FBI said it may have found a way to unlock the phone without Apple's assistance…

Privarius: This particular case may be nearing an end, but the main issues remain open. For example, how can there possibly be a balance between privacy and security if phones are rendered decryptable? After the Snowden revelations, it became clear that we can no longer completely rely on government agencies to ensure our privacy, which is now in the hands of technology companies.

Commercias: Even the UN High Commissioner for Human Rights issued a statement, asking the US authorities to proceed with caution, as the case 'could have extremely damaging implications for the human rights of many millions of people, including their physical and financial security’. The UN Special Rapporteur for freedom of expression also asked for caution, noting that the FBI request risks violating the International Covenant on Civil and Political Rights.

Securium: Whatever the outcome may be, one thing is clear: even if a solution may have been found today, this does not solve the main dilemmas. So let’s see what the issues at stake are, starting with security...

The next post - published next Thursday, 24th March - tackles the security aspect.

II. Apple vs FBI: It’s just one phone - or is it?

Commercias: ...If Apple were to help the FBI unlock this one phone, in adherence to the court order, other courts in the USA and elsewhere are likely to issue similar requests for future cases.

Securium: Isn’t this farfetched? The FBI’s requests are about one single iPhone: ‘... The Court’s order is modest. It applies to a single iPhone, and it allows Apple to decide the least burdensome means of complying. As Apple well knows, the Order does not compel it to unlock other iPhones or to give the government a universal “master key” or “back door”.’

Commercias: The order may not be referring to other phones, but if we take a look around us, we can see, for example, that the Manhattan district attorney has already indicated that there are currently 175 iPhones which investigators could not unlock, and he further confirmed that he would want access to all phones which are part of a criminal investigation, should the government prevail in the San Bernardino case. Apple is very likely to be compelled to use this technique to unlock iPhones in police custody all over America and beyond. Apple's attorneys reported a list of nine other cases involving 12 different iPhones of different models that law-enforcement authorities had asked Apple to help crack, and none of them involved terrorism. We cannot run this risk.

Privarius: Apple needs to create new software to open this phone, and this software could potentially unlock any iPhone. There is no guarantee that the FBI will not use this software - or master key - again, and if it falls into the wrong hands, the software can be misused by third parties. One case will be followed by another and there won’t be an end.

Securium: We should focus on the case at hand. The order is a targeted one ‘that will produce a narrow, targeted piece of software capable of running on just one iPhone, in the security of Apple’s corporate headquarters. That iPhone belongs to the County of San Bernardino, which has consented to its being searched.’ We must also not forget that the phone was used by a terrorist who took innocent lives. Crucial information surrounding the case may be stored on this device. With this fact in mind, the court order is pretty reasonable!

Commercias: No, the fact that the court issued an order doesn’t necessarily mean it is reasonable. The fact is that Apple has been assisting the FBI in previous cases as well as in this one in particular - it has provided the backup of the phone stored in iCloud (though, unfortunately, the last backup doesn’t contain the most recent files from the day of the shooting). The Internet industry has always been cooperative when court orders were issued (and even without a court order, as we learned from Snowden). This time, what the court is requesting has crossed the line.

Securium: There are no red lines when it comes to protecting users and citizens worldwide.

Commercias: There are. The company has been asked to lower its security level - which, by the way, is its competitive advantage - and which helps keep users secure. If the court forces Apple to make a patch, this would reduce the security level of its system. And although the FBI has asked Apple to unlock only one iPhone, this might not be possible without affecting the privacy of all other iPhones, making them less secure in the process. Besides, do you really think that the FBI won’t use this ‘back door’? Once the privacy door is open, it will never be closed.

Securium: That is speculation. Let us not be abstract. Why would other phones be endangered?

Commercias: Technically speaking, Apple would need to create a software patch for its iOS and install it on this particular phone. This could likely be done within Apple’s headquarters, with the FBI accessing only this particular phone (even remotely) and without being able to access the software patch itself. However, since the phone needs to be handed over to investigators, there is a possibility of it being reverse-engineered. In addition, misuses and abuses cannot be fully controlled once the firmware is out.

Privarius: But let’s say Apple creates a software patch to unlock the phone: authorities may still submit requests for hundreds of other phones to be unlocked, and requests could possibly come from other jurisdictions. In this case, Apple would need to have its teams constantly available. Moreover, future versions of iOS would also need an updated patch. Ultimately, Apple might find it easier and cheaper to simply develop a real backdoor in its products, or to give up on the stronger security-by-design approach.

Securium: On the other hand, if Apple wins this court case - if the case is resumed - it can create a new precedent.

Privarius:... and a major win for privacy!

III. Apple vs FBI: A case for encryption

Commercias: ...A win for Apple - among other issues - is also a win for privacy...

Privarius: The question is, can Apple damage privacy by claiming to protect it? In making extreme claims, it could be pushing the pendulum too far and risk provoking a counter-reaction that would endanger privacy protection. As President Obama recently said at a South by Southwest (SXSW) conference, ‘after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that are dangerous and not thought through.’

Commercias: On the other hand, we may say that it was the FBI who was, in fact, pushing too much. Apple and similar companies have cooperated by giving investigators all the data they have about the suspects; yet the FBI is asking them to go an extra step, and in the process, weaken the products’ encryption. The fact is that the FBI has already acquired large amounts of evidence about this case thanks to digital forensics and the support of the Internet industry (including Apple). Today, a user’s digital communications are not only saved on their phone, but are also stored in the cloud by service providers such as Facebook or Google, which readily cooperate with the FBI to provide the data its investigators need.

Privarius: This also raises several questions: Was there really such a need to break into the phone? Does this justify setting a precedent? Is the benefit of this request proportional to its consequences?

Commercias: Furthermore, security experts such as the former US anti-terror chief claim that the FBI could have turned to the NSA for help, since this case may be related to terrorism; it is likely that the NSA has advanced techniques that can break the code. This can lead us to conclude that there might not have been a real need for the FBI to push Apple; yet the FBI chose a case linked to terrorism to push its limits and try to set a precedent.

Privarius: One positive aspect, if you may, is that as a result, encryption technology is flourishing. There are dozens of unbreakable encryption applications online, readily available and mostly free. There are complete solutions, integrating hardware, OS, and software. More importantly, hardware development has led to encryption being built into chips themselves, such as Intel’s SGX, which incorporates encryption within the silicon; such features will soon become common in products, with little possibility (if any) for anyone to unlock them with any software or hardware patch. The outcome will affect how users choose their products, and may lead them to switch to other products with tighter encryption, or to install their own encryption software. This will leave law enforcement with even less control.

Commercias: But even with less control, law enforcement agencies may still be able to carry out their investigations without breaking encrypted communications - such as by using metadata, digital forensics, offline means, etc - right?

Privarius: Yes, they can. While there is little evidence of the usefulness of metadata (zero successes, according to the NSA) or of access to encrypted materials in preventing terrorist attacks (prior to the Paris attacks, the terrorists used unencrypted SMS), most criminal cases now require digital forensics as a critical part of the investigation. I would, however, distinguish surveillance for national security and counter-terrorism purposes from digital forensics for combating crime (and not only cybercrime).

Commercias: True. Law enforcement has many digital forensics tools at its disposal. I would add geolocation, data from telecom companies, and access to service providers’ cloud storage through court orders and other legal means. Besides, recent research (such as that by the Berkman Center) foresees that cyberspace is unlikely to ‘go dark’, for many reasons, and there will still be many sources of digital evidence without the need to break into encrypted spaces. This would mean that Apple can retain its strong stance on privacy. …

IV. Apple vs FBI: A matter of trust

Commercias: In the past few days we saw how the situation took a surprising U-turn when the FBI announced it may have found a way to unlock the phone without Apple's assistance. In a way, it seems that Apple has managed to stand its ground so far.

Securium: Let’s face it though, has Apple really been advocating for the rights of its users, or is this more of a business strategy through which it has tried to regain the users’ trust?

Privarius: While it looks like Apple is in fact supporting privacy, we must not forget that companies are primarily driven by commercial interests. Many - including the FBI - have argued that Apple’s position is more about its business model than the protection of human rights.

Commercias: Even if companies have commercial interests, they can still work hard to protect human rights, including privacy.

Privarius: True. But can we expect businesses to always serve the good cause? Will the protection of human rights always fit into their model, and what if profits drive them to support other causes?

Commercias: It is also a matter of trust. If we look at the Internet business model, we realise how important users’ trust is. Arguably, obeying the court order may lead to diminished trust in Apple, and could provide a market advantage to other products offering strong built-in encryption solutions.

Privarius: So perhaps, if we had to identify Apple’s position in the triangular model, we might say that Apple is both a vendor (selling tech products), and an intermediary (storing users’ data).

Commercias: Indeed. This is probably why Apple took such a strong position in challenging the authorities. Apple’s business model could be seen as somewhat more diverse than, for example, that of Google, Facebook, and Twitter, which depend heavily on data. The data-driven Internet industry is quite vulnerable to major policy ‘earthquakes’, such as the Snowden revelations, or the ongoing Apple/FBI controversy. Microsoft is another company that has challenged the US authorities (in the court case on authority over data stored in Ireland). Just like Apple, Microsoft has a more diverse business model than typical Internet industry companies.

Privarius: And yet, if Apple loses this case, it will further erode users’ trust in companies too, not just in the security sector. As Edward Snowden tweeted recently: 'The @FBI is creating a world where citizens rely on #Apple to defend their rights, rather than the other way around.'

Securium: As a result, users will try to find their own ways to protect themselves - through alternative and niche products, online software, etc. In such an environment, only the more skillful users will be protected, while less skillful users will be further endangered by criminals and terrorists, who are becoming more and more tech-savvy. We should rather aim for a minimum level of security for everyone; to achieve this, end users should not be left to protect themselves through the use of cipher protection….

Privarius: And yet, if governments cannot protect the security and human rights of their citizens - which is the basis of any social contract - citizens should be allowed to protect themselves.

Commercias: Exactly… In real life, by using guns; in cyberspace, by using cipher protection. This is interesting: gun lobbyists and cipher-enthusiasts may share an underlying logic for their actions.

Privarius: The analogy with guns is incorrect; encryption protects, it doesn’t cause damage to others. Connected devices - computers, smartphones, tablets - can do both. Encryption prevents criminals from misusing users’ computers (90% of attacks are based on social engineering, using access to private data to fine-tune the attacks for phishing or spear-phishing). Encryption also strengthens the security of protocols and online communications in general, making ‘man in the middle’ attacks much harder. Not to mention that encryption can save lives - as the UN High Commissioner for Human Rights rightly noted - the lives of activists, journalists, and whistleblowers around the world. Rather than trying to reduce cybercrime by weakening encryption, the security community needs to look into how encryption can contribute to a more secure Internet...
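
As a minimal illustration of this point (not part of the original dialogue), the sketch below assumes the third-party Python cryptography package and uses its Fernet recipe for authenticated encryption: an intercepted message is unreadable without the key, and any modification by a ‘man in the middle’ is rejected on decryption.

```python
# Minimal sketch of authenticated encryption with the third-party
# "cryptography" package (pip install cryptography). Illustrative only.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # shared secret between the two communicating parties
cipher = Fernet(key)

token = cipher.encrypt(b"meet at the usual place at 6pm")
print(token)                  # opaque bytes to anyone who does not hold the key

# A man-in-the-middle who alters even one byte is detected on decryption.
tampered = bytearray(token)
tampered[10] ^= 0x01
try:
    cipher.decrypt(bytes(tampered))
except InvalidToken:
    print("tampering detected: message rejected")

# The legitimate recipient, holding the key, recovers the plaintext.
print(cipher.decrypt(token).decode())
```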

Securium: Or maybe, we should let the courts decide on the next steps. ...

V. Apple vs FBI: Towards an Internet social contract?

Securium: Until last week, everyone was thinking that if Apple won this court case it would create a new precedent. For the time being, it seems like the case has been resolved, since the US Department of Justice has just declared it is now able to unlock the iPhone thanks to the assistance of a third party.

Privarius: Although it seems the case is settled, the main dilemmas have not yet been resolved. Whether this will happen immediately, or in the near future, society may eventually need to make some hard choices regarding privacy and security, among others, and gradually create new models of consensus. [Read the editorial, on page 3, of Issue 8 of the Geneva Digital Watch newsletter]

Commercias: Even if the recent developments have shown that the government did manage to unlock the phone, a new social contract could tackle one of the essential arguments in the debate: whether devices should be impermeable, or ‘undecryptable’. This may be the only way to keep them safe from intrusion from both criminals and authorities.

Securium: It is not the only way. Let us take a hypothetical situation: assuming that unlocking a mobile phone is essential to preventing a nuclear attack and saving many lives, would you argue that the privacy of a mobile phone user is more important than the survival of innocent people?

Privarius: Well, it is an abstract and unrealistic situation.

Securium: We can argue at length as to whether this is possible or probable. The point is that the principle of undecryptability of mobile devices creates an important implicit decision: that of placing privacy above other human rights or security considerations...

Privarius: I still do not think this is a realistic risk; on the other hand, allowing access to this specific phone and setting a dangerous precedent is a very concrete risk. If Apple gives in now, how can it resist future demands from the USA and abroad?

Securium: In the USA, had the case gone forward, it would have been decided either by the courts (setting a precedent) or by Congress. Either way, the US legislative framework would have been determinative. The democratic system preserves security by allowing judicial authorities to issue orders that weaken privacy protections. President Obama was right in objecting to the creation of undecryptable black boxes. It is, after all, what happens in the offline world, when law enforcement agencies obtain the right to enter private property as part of investigations, for example.

Privarius: The difference is that online or data searches can be automated, and it is easy to imagine searches being implemented without due process. It is simply not the same as physically knocking on 100 doors.

Commercias: More importantly, if Apple or any other company had to create a patch to break into a phone, what is the likelihood that criminals would not try to gain access to it or exploit any vulnerabilities? Equally important is the fact that the legal basis for the FBI’s request and the Court order is uncertain and has been widely disputed - which proves that there is no political or social agreement, as yet, on how to deal with this and similar cases that may come up...

FBI Quietly Admits to Multi-Year APT Attack, Sensitive Data Stolen - Threatpost 20160407

The FBI issued a rare bulletin admitting that a group named Advanced Persistent Threat 6 (APT6) hacked into US government computer systems as far back as 2011 and for years stole sensitive data. The FBI alert was issued in February and went largely unnoticed. Nearly a month later, security experts are now shining a bright light on the alert and the mysterious group behind the attack.

“This is a rare alert and a little late, but one that is welcomed by all security vendors as it offers a chance to mitigate their customers and also collaborate further in what appears to be an ongoing FBI investigation,” said Deepen Desai, director of security research at the security firm Zscaler in an email to Threatpost.

Details regarding the actual attack and what government systems were infected are scant. Government officials said they knew the initial attack occurred in 2011, but are unaware of who specifically is behind the attacks. “Given the nature of malware payload involved and the duration of this compromise being unnoticed – the scope of lateral movement inside the compromised network is very high possibly exposing all the critical systems,” Deepen said.

In its February bulletin, the FBI wrote: “The FBI has obtained and validated information regarding a group of malicious cyber actors who have compromised and stolen sensitive information from various government and commercial networks.” The FBI said the “group of malicious cyber actors” (known as APT6 or 1.php) used dedicated top-level domains in conjunction with its command-and-control servers to deliver “customized malicious software” to government computer systems. A list of the domains is included in the bulletin.

“These domains have also been used to host malicious files – often through embedded links in spear phish emails. Any activity related to these domains detected on a network should be considered an indication of a compromise requiring mitigation and contact with law enforcement,” wrote the FBI in its bulletin.
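
As a rough illustration of the mitigation step the bulletin describes, a network defender might sweep outbound DNS or proxy logs against the published indicator domains. The sketch below is a generic example, not taken from the FBI bulletin; the file names indicators.txt and dns_requests.log are hypothetical, and it assumes a plain-text log with one requested domain per line.

```python
# Minimal sketch: flag log entries whose domain matches (or is a subdomain of)
# any indicator domain published in an advisory. File names are hypothetical.

def load_indicators(path):
    """Read one indicator domain per line, ignoring blanks and comments."""
    with open(path) as f:
        return {line.strip().lower() for line in f
                if line.strip() and not line.startswith("#")}

def matches_indicator(domain, indicators):
    """True if the domain equals an indicator or is a subdomain of one."""
    domain = domain.strip().lower().rstrip(".")
    return any(domain == ioc or domain.endswith("." + ioc) for ioc in indicators)

def scan_log(log_path, indicators):
    """Yield (line_number, domain) pairs for hits that warrant investigation."""
    with open(log_path) as f:
        for lineno, line in enumerate(f, start=1):
            domain = line.strip()
            if domain and matches_indicator(domain, indicators):
                yield lineno, domain

if __name__ == "__main__":
    iocs = load_indicators("indicators.txt")                    # hypothetical IOC list
    for lineno, domain in scan_log("dns_requests.log", iocs):   # hypothetical log file
        print(f"line {lineno}: possible contact with indicator domain {domain}")
```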

When asked for attack specifics, the FBI declined Threatpost’s request for an interview. Instead, FBI representatives issued a statement calling the alert a routine advisory aimed at notifying system administrators of persistent cyber criminals.

“The release was important to add credibility and urgency to the private sector announcements and ensure that the message reached all members of the cyber-security information sharing networks,” wrote the FBI. Deepen told Threatpost the group has been operating since at least 2008 and has targeted experts on China-US relations, Defense Department entities, and geospatial groups within the federal government.

According to Deepen, APT6 has been using spear phishing in tandem with malicious PDF and ZIP attachments, or links to malware-infected websites that contain a malicious SCR file. The payload, Deepen said, is often the Poison Ivy remote access tool/Trojan or similar. He said the group has varied its command-and-control check-in behavior, but it is typically web-based and sometimes over HTTPS.

Experts believe that the attacks are widespread and not limited to US federal government systems. “The same or similar actors are compromising numerous organizations in order to steal sensitive intellectual property,” wrote Zscaler in a past report on APT6. In December 2014, US government systems were compromised by hackers who broke into the Office of Personnel Management computer systems. That data breach, in which 18 million people had their personally identifiable information stolen, didn’t come to light until months later, in June 2015.

The Senate’s Draft Encryption Bill Is ‘Ludicrous, Dangerous, Technically Illiterate' - Wired 20160408

AS APPLE BATTLED the FBI for the last two months over the agency’s demands that Apple help crack its own encryption, both the tech community and law enforcement hoped that Congress would weigh in with some sort of compromise solution. Now Congress has spoken on crypto, and privacy advocates say its “solution” is the most extreme stance on encryption yet.

On Thursday evening, the draft text of a bill called the “Compliance with Court Orders Act of 2016,” authored by the offices of Senators Dianne Feinstein and Richard Burr, was published online by the Hill.1 It’s a nine-page piece of legislation that would require people to comply with any authorized court order for data—and if that data is “unintelligible,” the legislation would demand that it be rendered “intelligible.” In other words, the bill would make illegal the sort of user-controlled encryption that’s in every modern iPhone, in all billion devices that run WhatsApp’s messaging service, and in dozens of other tech products. “This basically outlaws end-to-end encryption,” says Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology. “It’s effectively the most anti-crypto bill of all anti-crypto bills.”

Kevin Bankston, the director of the New America Foundation’s Open Technology Institute, goes even further: “I gotta say in my nearly 20 years of work in tech policy this is easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen,” he says.

The bill, Hall and Bankston point out, doesn’t specifically suggest any sort of backdoored encryption or other means to even attempt to balance privacy and encryption, and actually claims to not require any particular design limitations on products. Instead, it states only that communications firms must provide unencrypted data to law enforcement or the means for law enforcement to grab that data themselves. “To uphold the rule of law and protect the security and interests of the United States, all persons receiving an authorized judicial order for information or data must provide, in a timely manner, responsive and intelligible information or data, or appropriate technical assistance to obtain such information or data.”

Hall describes that as a “performance standard. You have to provide this stuff, and we’re not going to tell you how to do it,” he says. George Washington Law School professor Orin Kerr points out on Twitter that the text doesn’t even limit tech firms’ obligations to “reasonable assistance” but rather “assistance as is necessary,” a term that means the bill goes beyond current laws, such as the All Writs Act, that the government has used to try to compel tech firms to help with data access.

Even more extreme, the draft bill also includes the requirement that “license distributors” ensure all “products, services, applications or software” they distribute provide that same easy access for law enforcement. “Apple’s app store, Google’s play store, any platform for software applications somehow has to vet every app to ensure they have backdoored or little enough security to comply,” says Bankston. That means, he says, that this would “seem to also be a massive internet censorship bill.”

I could spend all night listing the various ways that Feinstein-Burr is flawed & dangerous. But let's just say, "in every way possible."

— matt blaze (@mattblaze) April 8, 2016

If Grandpa Simpson was a Senator who was afraid of and confused by encryption, I think he'd write something like the Feinstein/Burr bill.

— Kevin Bankston (@KevinBankston) April 8, 2016

It's not hard to see why the White House declined to endorse Feinstein-Burr. They took a complex issue, arrived at the most naive solution.

— Matthew Green (@matthew_d_green) April 8, 2016

Burr and Feinstein’s bill disappoints its privacy critics in part because it seems to entirely ignore the points already made in a debate that’s raged for well over a year, and has its roots in the crypto wars of the 1990s. Last summer, for instance, more than a dozen of the world’s top cryptographers published a paper warning of the dangers of weakening encryption on behalf of law enforcement. They cautioned that any backdoor created to give law enforcement access to encrypted communications would inevitably be used by sophisticated hackers and foreign cyberspies. And privacy advocates have also pointed out that any attempt to ban strong encryption in American products would only force people seeking law-enforcement-proof data protection to use encryption software created outside the U.S., of which there is plenty to choose from. Apple, in its lengthy, detailed arguments with the FBI in front of Congress and in legal filings, has called that weakening of Americans’ security a “unilateral disarmament” in its endless war with hackers to protect its users’ privacy.

Tom Mentzer, a spokesman for Senator Feinstein, told WIRED in a statement on behalf of both bill sponsors that “we’re still working on finalizing a discussion draft and as a result can’t comment on language in specific versions of the bill. However, the underlying goal is simple: when there’s a court order to render technical assistance to law enforcement or provide decrypted information, that court order is carried out. No individual or company is above the law. We’re still in the process of soliciting input from stakeholders and hope to have final language ready soon.”

The Burr/Feinstein draft text may in fact be so bad for privacy that it’s good for privacy: Privacy advocates point out that it has almost zero likelihood of making it into law in its current form. The White House has already declined to publicly support the bill. And Adam Schiff, the top Democratic congressman on the House of Representatives’ intelligence committee, gave WIRED a similarly ambivalent comment on the upcoming legislation yesterday. “I don’t think Congress is anywhere near a consensus on the issue,” Schiff said, “given how difficult it was to legislate the relatively easy [Cyber Information Sharing Act], and this is comparatively far more difficult and consequential.”

Bankston puts it more simply. “The CCOA is DOA,” he says, coining an acronym for the draft bill. But he warns that privacy activists and tech firms should be careful nonetheless not to underestimate the threat it represents. “We have to take this seriously,” he says. “If this is the level of nuance and understanding with which our policymakers are viewing technical issues we’re in a profoundly worrisome place.”

1. Correction 4/8/2016 1:00pm EST: A previous version of this story stated that the draft bill text had been released by the senators, which a spokesperson for Senator Burr has since said in a statement to WIRED she didn’t “believe was consistent with the facts.”

The Government Has Used the All Writs Act on Android Phones At Least 9 Times - Motherboard 20160330

The federal government has asked Google for technical assistance to help it break into a locked Android smartphone using the All Writs Act at least nine times, according to publicly available court documents discovered by the American Civil Liberties Union.

The ACLU released the Google court documents along with 54 court cases in which the feds asked Apple for assistance obtaining information from a locked iPhone. The revelations show that many agencies have been using the All Writs Act, a 1789 law that the government says allows it to compel third party companies to help it in criminal investigations.

The law was at the heart of a recent legal battle between the FBI and Apple in San Bernardino, and this is the first time it’s been confirmed that Google has also received these sorts of orders. The FBI and Apple have an ongoing legal battle over the issue in New York.

The cases all appear to be closed, are in seven separate states, and involve the Department of Homeland Security, the FBI, Customs and Border Protection, the Secret Service, and, interestingly, the Bureau of Land Management. Google is believed to have complied with all of the orders; however, the company tells Motherboard that none of the cases required it to write new software for the federal government.

"We carefully scrutinize subpoenas and court orders to make sure they meet both the letter and spirit of the law,” a Google spokesperson told me. “However, we've never received an All Writs Act order like the one Apple recently fought that demands we build new tools that actively compromise our products' security. As our amicus shows, we would strongly object to such an order."

Google, Microsoft, Facebook, and several other major tech companies filed a legal brief in support of Apple in its recently-ended legal battle with the federal government, which said the companies are “united in their view that the government’s order to Apple exceeds the bounds of existing law and, when applied more broadly, will harm Americans’ security in the long run.”

In many of the cases found by the ACLU on publicly available law databases, Google was required to reset the password of an Android smartphone so that the government could gain access. Passcode and password resets of this kind are not possible on iPhones.

Google’s switch to default device encryption happened only with the Marshmallow version of Android, which was released in October but is still not available for many Android phones. Android phones are notoriously slow to get Google’s security and software updates; just 2.3 percent of Android phones are running Marshmallow, according to Google. It’s hard to say for sure, but it seems possible that Google has dealt with fewer of these orders because most of the Android phones out in the wild are likely susceptible to the federal government’s forensic tools.

Google has been asked to assist the Bureau of Land Management in the investigation of an alleged marijuana grow operation in Oregon; the Department of Homeland Security in an investigation of an alleged child pornographer in California; the FBI in the investigation of an alleged cocaine dealer named “Grumpy” in New Mexico; and the Secret Service in an unknown case in North Carolina. It has been asked to reset the passwords or bypass the lock screens of Samsung, Kyocera, Alcatel, and HTC phones, among several other unidentified devices.

“These cases show that the government has an interest in getting this kind of assistance from tech companies in a wide variety of cases,” ACLU attorney Esha Bhandari told me. “The government and law enforcement in general have an interest in using the All Writs Act in a wide variety of investigations, including criminal investigations.”

Court documents for the cases are available here:

https://www.aclu.org/sites/default/files/field_document/Or_1-12-mc-305.pdf

https://www.aclu.org/sites/default/files/field_document/ED_Cal_2-15-sw-00568.pdf

https://www.aclu.org/sites/default/files/field_document/ED_Cal_1-15-sw-00073.pdf

https://www.aclu.org/sites/default/files/field_document/ED_Cal_2-15-sw-00602.pdf

https://www.aclu.org/sites/default/files/field_document/NM_1-13-mr-115.pdf

https://www.aclu.org/sites/default/files/field_document/SD_4-14-mc-97.pdf

https://www.aclu.org/sites/default/files/field_document/ND%203-12-mj-132.pdf

https://www.aclu.org/sites/default/files/field_document/NM_1-13-mr-115_0.pdf

"Black People Need Encryption," No Matter What Happens in the Apple-FBI Feud - Mother Jones 20160322

"Black People Need Encryption," No Matter What Happens in the Apple-FBI Feud - Mother Jones 20160322

Here's why civil rights activists are siding with the tech giant.

Last night, the FBI, saying that it may be able to crack an iPhone without Apple's help, convinced a federal judge to delay the trial over its encryption dispute with the tech company. In February, you may recall, US magistrate judge Sheri Pym ruled that Apple had to help the FBI access data from a phone used by one of the San Bernardino shooters. Apple refused, arguing that it would have to invent software that amounts to a master key for iPhones—software that doesn't exist for the explicit reason that it would put the privacy of millions of iPhone users at risk. The FBI now has two weeks to determine whether its new method is viable. If it is, the whole trial could be moot.

"One need only look to the days of J. Edgar Hoover...to recognize the FBI has not always respected the right to privacy for groups it did not agree with."

That would be a mixed blessing for racial justice activists, some of them affiliated with Black Lives Matter, who recently wrote to Judge Pym and laid out some reasons she should rule against the FBI. The letter—one of dozens sent by Apple supporters—cited the FBI's history of spying on civil rights organizers and shared some of the signatories' personal experiences with government overreach.

"One need only look to the days of J. Edgar Hoover and wiretapping of Rev. Martin Luther King, Jr. to recognize the FBI has not always respected the right to privacy for groups it did not agree with," they wrote. (Targeted surveillance of civil rights leaders was also a focus of a recent PBS documentary on the Black Panther Party.) Nor is this sort of thing ancient history, they argued: "Many of us, as civil rights advocates, have become targets of government surveillance for no reason beyond our advocacy or provision of social services for the underrepresented."

Black Lives Matter organizers have good reason to be concerned. Last summer, I reported that a Baltimore cyber-security firm had identified prominent Ferguson organizer (and Baltimore mayoral candidate) Deray McKesson as a "threat actor" who needed "continuous monitoring" to ensure public safety. The firm—Zero Fox—briefed members of an FBI intelligence partnership program about the data it had collected on Freddie Gray protest organizers. It later passed the information along to Baltimore city officials.

Department of Homeland Security emails, meanwhile, have indicated that Homeland tracked the movements of protesters and attendees of a black cultural event in Washington, DC, last spring. Emails from New York City's Metropolitan Transit Authority and the Metro-North Railroad showed that undercover police officers monitored the activities of known organizers at Grand Central Station police brutality protests. The monitoring was part of a joint surveillance effort by MTA counter-terrorism agents and NYPD intelligence officers. (There are also well-documented instances of authorities spying on Occupy Wall Street activists.)

In December 2014, Chicago activists, citing a leaked police radio transmission, alleged that city police used a surveillance device called a Stingray to intercept their texts and phone calls during protests over the death of Eric Garner. The device, designed by military and space technology giant Harris Corporation, forces all cell phones within a given radius to connect to it, reroutes communications through the Stingray, and allows officers to read texts and listen to phone calls—as well as track a phone's location. (According to the ACLU, at least 63 law enforcement agencies in 21 states use Stingrays in police work—frequently without a warrant—and that's probably an underestimate, since departments must sign agreements saying they will not disclose their use of the device.)

In addition to the official reports, several prominent Black Lives organizers in Baltimore, New York City, and Ferguson, Missouri, shared anecdotes of being followed and/or harassed by law enforcement even when they weren't protesting. One activist told me how a National Guard humvee had tailed her home one day in 2014 during the Ferguson unrest, matching her diversions turn for turn. Another organizer was greeted by dozens of officers during a benign trip to a Ferguson-area Wal-Mart, despite having never made public where she was going.

In light of the history and their own personal experiences, many activists have been taking extra precautions. "We know that lawful democratic activism is being monitored illegally without a warrant," says Malkia Cyril, director of the Center for Media Justice in Oakland and a signatory on the Apple-FBI letter. "In response, we are using encrypted technologies so that we can exercise our democratic First and Fourth Amendment rights." Asked whether she believes the FBI's promises to use any software Apple creates to break into the San Bernardino phone only, Cyril responds: "Absolutely not."

"I don't think it's any secret that activists are using encryption methods," says Lawrence Grandpre, an organizer with Leaders of a Beautiful Struggle in Baltimore. Grandpre says he and others used an encrypted texting app to communicate during the Freddie Gray protests. He declined to name the app, but said it assigns a PIN to each phone that has been approved to access messages sent within a particular group of people. If an unapproved device tries to receive a message, the app notifies the sender and blocks the message from being sent. Grandpre says he received these notifications during the Freddie Gray protests: "Multiple times we couldn't send text messages because the program said there's a possibility of interception."

Cyril says "all of the activists I know" use a texting and call-encryption app called Signal to communicate, and that the implication of a court verdict in favor of the FBI would be increased surveillance of the civil rights community. "It's unprecedented for a tech company—for any company—to be compelled in this way," Cyril says.

Apple has prepared for an epic fight. But if the FBI is able to crack an iPhone without a key, the BLM crowd will have one more thing to worry about. As Cyril put it in a tweet this past February, "In the context of white supremacy and police violence, Black people need encryption."

FBI is fighting back against Judge's Order to reveal TOR Exploit Code - The Hacker News 20160329

Last month, the Federal Bureau of Investigation (FBI) was ordered to reveal the complete source code for the TOR exploit it used to hack visitors of the world’s largest dark web child pornography site, PlayPen.

Federal judge Robert J. Bryan ordered the FBI to hand over the TOR browser exploit code so that the defence could better understand how the agency hacked over 1,000 computers and whether the evidence gathered fell within the scope of the warrant.

Now, the FBI is pushing back against the federal judge’s order.

On Monday, the Department of Justice (DOJ) and the FBI filed a sealed motion asking the judge to reconsider his ruling, saying that revealing the exploit used to bypass the Tor Browser protections is not necessary for the defence in this or other cases.

In previous filings, the defence has argued that the offensive operation used in the case was "gross misconduct by government and law enforcement agencies," and that the Network Investigative Technique (NIT) conducted additional functions beyond the scope of the warrant.

The Network Investigative Technique or NIT is the FBI's terminology for a custom hacking tool designed to penetrate TOR users.

This particular case concerns Jay Michaud, one of the accused from Vancouver, Washington, who was arrested last year after the FBI seized a dark web child sex abuse site and ran it from the agency’s own servers for 13 days.

During this period, the FBI deployed the NIT against users who visited particular child pornography threads, grabbing their real IP addresses among other details. This led to the arrest of Michaud, among others.

Vlad Tsyrklevich, the malware expert retained by the defence to analyse the NIT, said that he received only parts of the NIT to analyse, but not the sections that would ensure that the identifier attached to the suspect's NIT infection was unique.

"He is wrong," Special Agent Daniel Alfin writes. "Discovery of the 'exploit' would do nothing to help him determine if the government exceeded the scope of the warrant because it would explain how the NIT was deployed to Michaud's computer, not what it did once deployed."

In a separate case, the Tor Project has accused the FBI of paying Carnegie Mellon University (CMU) at least $1 million to disclose a technique it had discovered that could help unmask Tor users and reveal their IP addresses. The FBI, though, denies the claims.

Crockford, Kade - Keep Fear Alive - The bald-eagle boondoggle of the terror wars - The Baffler 20160311

“If you’re submitting budget proposals for a law enforcement agency, for an intelligence agency, you’re not going to submit the proposal that ‘We won the war on terror and everything’s great,’ cuz the first thing that’s gonna happen is your budget’s gonna be cut in half. You know, it’s my opposite of Jesse Jackson’s ‘Keep Hope Alive’—it’s ‘Keep Fear Alive.’ Keep it alive.”
—Thomas Fuentes, former assistant director, FBI Office of International Operations

Can we imagine a free and peaceful country? A civil society that recognizes rights and security as complementary forces, rather than polar opposites? Terrorist attacks frighten us, as they are designed to. But when terrorism strikes the United States, we’re never urged to ponder the most enduring fallout from any such attack: our own government’s prosecution of the Terror Wars.

This failure generates all sorts of accompanying moral confusion. We cast ourselves as good, but our actions show that we are not. We rack up a numbing litany of decidedly uncivil abuses of basic human rights: global kidnapping and torture operations, gulags in which teenagers have grown into adulthood under “indefinite detention,” the overthrow of the Iraqi and Libyan governments, borderless execution-by-drone campaigns, discriminatory domestic police practices, dragnet surveillance, and countless other acts of state impunity.

The way we process the potential cognitive dissonance between our professed ideals and our actual behavior under the banner of freedom’s supposed defense is simply to ignore things as they really are.

They hate us for our freedom, screech the bald-eagle memes, and so we must solemnly fight on. But what, beneath the official rhetoric of permanent fear, explains the collective inability of the national security overlords to imagine a future of peace?

Incentives, for one thing. In a perverse but now familiar pattern, what we have come to call “intelligence failures” produce zero humility, and no promise of future remedies, among those charged with guarding us. Instead, a new array of national security demands circulate, which are always rapidly met. In America, the gray-haired representatives of the permanent security state say their number one responsibility is to protect us, but when they fail to do so, they go on television and growl. To take but one recent example, former defense secretary Donald Rumsfeld appeared before the morally bankrupt pundit panel on MSNBC’s Morning Joe to explain that intractable ethnic, tribal, and religious conflict has riven the Middle East for more than a century—the United States, and the West at large, were mere hapless bystanders in this long-running saga of civilizational decay. This sniveling performance came, mind you, just days after Politico reported that, while choreographing the run-up to the 2003 invasion of Iraq, Rumsfeld had quietly buried a report from the Joint Chiefs of Staff indicating that military intelligence officials had almost no persuasive evidence that Saddam Hussein was maintaining a serious WMD program. Even after being forced to resign in embarrassment over the botched Iraq invasion a decade ago, Rumsfeld continues to cast himself as an earnestly out-manned casualty of Oriental cunning and backbiting while an indulgent clutch of cable talking heads nods just as earnestly along.

And the same refrain echoes throughout the echelons of the national security state. Self-assured and aloof as the affluenza boy, the FBI, CIA, and NSA fuck up, and then immediately apply for a frenzied transfer of ever more money, power, and data in order to do more of what they’re already doing. Nearly fifteen years after the “Global War on Terror” began, the national security state is a trillion-dollar business. And with the latest, greatest, worst-ever terrorist threat always on the horizon, business is sure to keep booming.

The paradox produces a deep-state ouroboros: Successful terrorist attacks against the West do not provoke accountability reviews or congressional investigations designed to truly understand or correct the errors of the secret state. On the contrary, arrogant spies and fearful politicians exploit the attacks to cement and expand their authority. This permits them, in turn, to continue encroaching on the liberties they profess to defend. We hear solemn pledges to collect yet more information, to develop “back doors” to decrypt private communications, to keep better track of Muslims on visas, send more weapons to unnamed “rebel groups,” drop more cluster bombs. Habeas corpus, due process, equal protection, freedom of speech, and human rights be damned. And nearly all the leaders in both major political parties play along, like obliging extras on a Morning Joe panel. The only real disagreement between Republican and Democratic politicians on the national stage is how quickly we should dispose of our civil liberties. Do we torch the Bill of Rights à la Donald Trump and Dick Cheney, or apply a scalpel, Obama-style?

Safety Last

Both Democrats and Republicans justify Terror War abuses by telling the public, either directly or indirectly, that our national security hangs in the balance. But national security is not the same as public safety. And more: the things the government has done in the name of preserving national security—from invading Iraq to putting every man named Mohammed on a special list—actually undermine our public safety.

That’s because, as David Talbot demonstrates in The Devil’s Chessboard, his revelatory Allen Dulles biography and devastating portrait of a CIA run amok, national security centers on “national interests,” which translates, in the brand of Cold War realpolitik that Dulles pioneered, into the preferred policy agendas of powerful corporations.

Public safety, on the other hand, is concerned with whether you live or die, and how. Any serious effort at public safety requires a harm-reduction approach acknowledging straight out that no government program can foreclose the possibility of terroristic violence. The national security apparatus, by contrast, grows powerful in direct proportion to the perceived strength of the terrorist (or in yesterday’s language, the Communist) threat—and requires that you fear this threat so hysterically that you release your grip on reason. Reason tells you government cannot protect us from every bad thing that happens. But the endlessly repeated national security meme pretends otherwise, though the world consistently proves it wrong.

When it comes to state action, the most important distinction between what’s good for public safety (i.e., your health) and what’s good for national security (i.e., the health of the empire, markets, and prominent corporations) resides in the concept of the criminal predicate. This means, simply, that an agent of the government must have some reasonable cause to believe you are involved with a crime before launching an investigation into your life. When the criminal predicate forms the basis for state action, police and spies are required to focus on people they have reason to believe are up to no good. Without the criminal predicate, police and spies are free to monitor whomever they want. Police action that bypasses criminal predicates focuses on threats to people and communities that threaten power—regardless of whether those threats to power are fully legal and legitimate.

We can see the results of this neglect everywhere the national security state has set up shop. Across the United States right now, government actors and private contractors paid with public funds are monitoring the activities of dissidents organizing to end police brutality and the war on drugs, Israeli apartheid and colonization in Palestine, U.S. wars in the Middle East, and Big Oil’s assault on our physical environment. In the name of fighting terrorism, Congress created the Department of Homeland Security, which gave state and local law enforcement billions of dollars to integrate police departments into the national intelligence architecture. As a result, we now have nearly a million cops acting as surrogates for the FBI. But as countless studies have shown, the “fusion centers” and intelligence operations that have metastasized under post-9/11 authorities do nothing to avert the terror threat. Instead, they’ve targeted dissidents for surveillance, obsessive documentation, and even covert infiltration. When government actors charged with protecting us use their substantial power and resources to track and disrupt Black Lives Matter and Earth First! activists, they are not securing our liberties; they’re putting them in mortal peril.

Things weren’t always like this. Once upon a time, America’s power structure was stripped naked. When the nation saw the grotesque security cancer that had besieged the body politic in the decades after World War II (just as Harry Truman had warned it would) the country’s elected leadership reasserted control, placing handcuffs on the wrists of the security agencies. This democratic counterattack on the national security state not only erected a set of explicit protocols to shield Americans from unconstitutional domestic political policing, but also advanced public safety.

Mission Creeps

As late as the 1970s, the FBI was still universally thought to be a reputable organization in mainstream America. The dominant narrative held that J. Edgar Hoover’s capable agents, who had to meet his strict height, weight, and dress code requirements, were clean-cut, straight-laced men who followed the rules. Of course, anyone involved with the social movements of that age—anti-war, Communist, Black Power, American Indian, Puerto Rican Independence—knew a very different FBI, but they had no evidence to prove what they could see and feel all around them. And since this was the madcap 1970s, the disparity between the FBI’s glossy reputation as honest crusaders and its actual dirty fixation on criminalizing the exercise of domestic liberties drove a Pennsylvania college physics professor and anti-war activist named William Davidon to take an extraordinary action. On the night of the Muhammad Ali vs. Joe Frazier fight of March 8, 1971, Davidon and some friends broke into an FBI office in Media, Pennsylvania. They stole every paper file they could get their hands on. In communiqués to the press, to which they attached some of the most explosive of the Hoover files, they called themselves the Citizens’ Commission to Investigate the FBI.

Not one of the costly post-9/11 surveillance programs based on suspicionless, warrantless monitoring stopped Tsarnaev from blowing up the marathon.

When Davidon and his merry band of robbers broke into the FBI office, they blew the lid off of decades of secret—and sometimes deadly—police activity that targeted Black and Brown liberation organizers in the name of fighting the Soviet red menace. According to Noam Chomsky, the Citizens’ Commission concluded that the vast majority of the files at the FBI’s Media, Pennsylvania, office concerned political spying rather than criminal matters. Of the investigative files, only 16 percent dealt with crimes. The rest described FBI surveillance of political organizations and activists—overwhelmingly of the left-leaning variety—and Vietnam War draft resisters. As Chomsky wrote, “in the case of a secret terrorist organization such as the FBI,” it was impossible to know whether these Pennsylvania figures were representative of the FBI’s national mandate. But for Bill Davidon and millions of Americans—including many in Congress who were none too pleased with the disclosures—these files shattered Hoover’s image as a just-the-facts G-man. They proved that the FBI was not a decent organization dedicated to upholding the rule of law and protecting the United States from foreign communist threats, but rather a domestic political police primarily concerned with preserving the racist, sexist, imperialist status quo.

In a cascade of subsequent transparency efforts, journalists, activists, and members of Congress all probed the darker areas of the national security state, uncovering assassination plots against foreign leaders, dragnet surveillance programs, and political espionage targeting American dissidents under the secret counterintelligence program known as COINTELPRO. Not since the birth of the U.S. deep state, with the 1947 passage of the National Security Act, had the activities of the CIA, FBI, or NSA been so publicly or thoroughly examined and contested.

Subsequent reforms included the implementation of new attorney general’s guidelines for domestic investigations, which, for the first time in U.S. history, required FBI agents to suspect someone of a crime before investigating them. Under the 1976 Levi guidelines, named for their author, Ford attorney general Edward Levi, the FBI could open a full domestic security investigation against someone only if its agents had “specific and articulable facts giving reason to believe that an individual or group is or may be engaged in activities which involve the use of force or violence.” The criminal predicate was now engraved in the foundations of the American security state—and the Levi rules prompted a democratic revolution in law enforcement and intelligence circles. It would take decades and three thousand dead Americans for the spies to win back their old Hoover-era sense of indomitable mission—and their investigative MO of boundless impunity.

False Flags

In the years following the 9/11 attacks, the Bush administration began Hoovering up our private records in powerful, secret dragnets. When we finally learned about the warrantless wiretapping program in 2005, it was a national scandal. But just as important, and much less discussed, was the abolition of Levi’s criminal predicate requirement. So-called domestic terrorism investigations would be treated principally as intelligence or espionage cases—not criminal ones. This shift has had profound, if almost universally ignored, implications.

Michael German, who spent sixteen years as an FBI agent working undercover in white supremacist organizations to identify and arrest terrorists, saw firsthand what the undoing of the 1970s intelligence reforms meant for the FBI. And German argues, persuasively, that the eradication of the criminal predicate didn’t just put Americans at risk of COINTELPRO 2.0. It also threatened public safety. The First and Fourth Amendments, which protect, respectively, our rights to speech and association and our right to privacy, don’t just create the conditions for political freedom; they also help law enforcement focus, laser-like, on people who have the intent, the means, and the plans to harm the rest of us.

Think of it like this, German told me: You’re an FBI agent tasked with infiltrating a radical organization that promotes violence as a means of achieving its political goals—the Ku Klux Klan, for example. KKK members say horrible and disgusting things. But saying disgusting things isn’t against the law; nor, as numerous studies have shown, is it a reliable predictor of whether the speaker will commit an act of political violence. When surrounded by white supremacists constantly spouting hate speech, a law enforcement officer has to block it out. If he investigates people based on their rhetoric, his investigations will lead nowhere. After all, almost no white supremacist seriously intending to carry out a terrorist attack is all that likely to broadcast that intent in public. (Besides, have you noticed how many Americans routinely say disgusting things?)

Today, more than a decade after it shrugged off the Levi guidelines, the FBI conducts mass surveillance directed at the domestic population. But dragnet surveillance, however much it protects “national security,” doesn’t increase public safety, as two blue-ribbon presidential studies have in recent years concluded. Indeed, the Boston bombings, the Paris attacks, and the San Bernardino and Planned Parenthood shootings have all made the same basic point in the cold language of death. The national security state has an eye on everyone, including the people FBI director James Comey refers to as “the bad guys.” But despite its seeming omniscience, the Bureau does not stop those people from killing the rest of us in places where we are vulnerable.

The curious case of Boston Marathon bomber Tamerlan Tsarnaev demonstrates the strange consequences of sidelining criminal investigations for national security needs. In 2011, about eighteen months before the bombings, Tsarnaev’s best friend and two other men were murdered in a grisly suburban scene in Waltham, Massachusetts—their throats slashed, marijuana sprinkled on their mutilated corpses. These murders were never solved. But days after the marathon bombings, law enforcement leaked that they had forensic and cellphone location evidence tying Tamerlan Tsarnaev to those unsolved crimes. Not one of the costly post-9/11 surveillance programs based on suspicionless, warrantless monitoring stopped Tsarnaev from blowing up the marathon. But if the police leaks were correct in assigning him responsibility for the 2011 murders, plain old detective work likely would have.

If security agencies truly want to stop terrorism, they should eliminate all domestic monitoring that targets people who are not suspected of crimes. This would allow agents to redirect space and resources now devoted to targeting Muslims and dissidents into serious investigations of people actually known to be dangerous. It’s the only reasonable answer to the befuddling question: Why is it that so many of these terrorists succeed in killing people even though their names are on government lists of dangerous men?

After the terrorist attacks in November, the French government obtained greater emergency powers in the name of protecting a fearful public. Besides using those powers to round up hundreds of Muslims without evidence or judicial oversight, French authorities also put at least twenty-four climate activists on house arrest ahead of the Paris Climate Change Conference—an approach to squashing dissent that didn’t exactly scream liberté, and had nothing to do with political violence. As with the Boston Marathon and countless other attacks on Western targets, the men who attacked the Bataclan were known to intelligence agencies. In May 2015, months before the attacks in Paris, French authorities gained sweeping new surveillance powers authorizing them to monitor the private communications of suspected terrorists without judicial approval. The expanded surveillance didn’t protect the people of Paris. In France, as in the United States, the devolution of democratic law enforcement practice has opened up space that’s filled with political spying and methods of dragnet monitoring that enable social and political control. This is not only a boondoggle for unaccountable administrators of mass surveillance; it also obstructs the kind of painstaking detective work that might have prevented the attacks on the Bataclan and the marathon.

Our imperial government won’t ever admit this, but we must recognize that the best method for stopping terrorism before it strikes is to stop engaging in it on a grand scale. Terrorist attacks are the price we pay for maintaining a global empire—for killing a million Iraqis in a war based on lies, for which we have never apologized or made reparations, and for continuing to flood the Middle East with weapons. No biometrics program, no database, no algorithm, no airport security system will protect us from ourselves.

Timm, Trevor - Congress showed it's willing to fight the FBI on encryption. Finally - 20160301


Members of Congress did something almost unheard of at Tuesday’s hearing on the brewing battle over encryption between Apple and the FBI: their job. Both Democrats and Republicans grilled FBI director Jim Comey about his agency’s unprecedented demand that Apple weaken the iPhone’s security protections to facilitate surveillance. This would have dire implications for smartphone users around the globe.

Normally, congressional committee hearings featuring Comey are contests among the members over who can shower the FBI director with the most fawning compliments in their five-minute allotted time frame. Hard questions about the agency’s controversial tactics are avoided at all costs. But on Tuesday, in rare bipartisan fashion, virtually every member of the House judiciary committee asked Comey pointed questions and politely ripped apart his arguments against Apple.

One judiciary member questioned how the FBI managed to mess up so badly during the San Bernardino investigation and reset the shooter’s password, which is what set this whole controversy and court case in motion in the first place. And if the case was such an emergency, why did they wait 50 days to go to court? Another member questioned what happens when China inevitably asks for the same extraordinary powers the FBI is demanding now. Others questioned whether the FBI had really used all the resources available to break into the phone without Apple’s help. For example, why hasn’t the FBI attempted to get the NSA’s help to get into the phone, since hacking is their job?

Comey readily admitted that the San Bernardino case could set a precedent for countless others after it, and that it won’t just be limited to one phone, as the FBI tried to suggest in the days after the filing became public. Comey said the FBI has so many encrypted phones in its possession that he doesn’t know the number (that’s not including the hundreds of local police forces that are itching to force Apple to create software to decrypt those as well). Comey also admitted under questioning that terrorists would just move to another encrypted device if Apple were forced to do what the government is asking, and that there are companies all over the world offering similar products.

More than anything, though, the members of Congress expressed anger that the FBI director didn’t follow through earlier on his stated intention to engage in a debate in Congress and the public about the proper role for encryption in society. Instead, he decided to circumvent that debate altogether and quietly go to court to get a judge to do what the legislative branch has so far refused to do.

This all comes on the heels of a judge in New York strongly rebuking the FBI and Department of Justice in a court decision on Monday. (The New York case is different from the high profile San Bernardino situation that has garnered more media attention.) Comey, despite knowing he would testify on Tuesday, decided not to read the opinion from the previous day. He didn’t say why, but given that the judge thoroughly dismantled every argument the government put forward, maybe he couldn’t stomach it.

The court hearing in the San Bernardino case is in two weeks, and there is no doubt that this is really only the beginning of the debate. But, for the first time, it seems like Congress has finally opened its eyes to the long-term effects of designing vulnerabilities into our communications systems and forcing tech companies to become investigative arms of the government.

EFF - Deep Dive: Why Forcing Apple to Write and Sign Code Violates the First Amendment

EFF filed an amicus brief today in support of Apple’s fight against a court order compelling the company to create specific software to enable the government to break into an iPhone. The brief is written on behalf of 46 prominent technologists, security researchers, and cryptographers who develop and rely on secure technologies and services that are central to modern life. It explains that the court’s unprecedented order would violate Apple’s First Amendment rights. That’s because the right to free speech prohibits the government from compelling unwilling speakers to speak, and the act of writing and, importantly, signing computer code is a form of protected speech. So by forcing Apple to write and sign an update to undermine the security of its iOS software, the court is also compelling Apple to speak—in violation of the First Amendment.

On February 16, a federal magistrate judge in southern California ordered Apple to write and sign the new code in support of the FBI’s ongoing investigation of last December’s San Bernardino shooting. The court granted the government’s request to require Apple to provide software to help unlock an iPhone 5c used by one of the shooters. The phone is encrypted with a passcode and protected by additional iOS security features the government says it cannot bypass. In an unprecedented move, the order requires Apple to create a brand new version of its operating system with intentionally weakened security features, which the government can then use to get into the phone.

On February 25, Apple filed a motion to vacate the judge’s order. Apple argued that compelling it to create and sign code is an extraordinary expansion of the All Writs Act, the law the government is relying on in this case. Earlier this week, a judge in New York—in a different iPhone unlocking case involving an older version of iOS—denied a request under the All Writs Act that would have forced Apple to bypass the lock screen of a seized iPhone. The judge recognized that forcing Apple to unlock the phone would require an absurd interpretation of the All Writs Act.

But what the government is asking Apple to do in this case—i.e., force Apple and its programmers to write and sign the code necessary to comply with the judge’s order—is not just an unprecedented expansion of the All Writs Act that puts the security and privacy of millions of people at risk. It is also a violation of the First Amendment.

As we explain in our amicus brief, digital signatures are a powerful way of communicating the signer’s endorsement of the signed document—in this case, the custom iOS code. Due to the mathematical properties of digital signatures—invented in part by signers of our brief, including Martin Hellman and Ron Rivest—it would be very difficult to impersonate Apple without possessing the company’s secret signing key. Apple has chosen to build its iOS in such a way that its devices only accept iOS code signed by Apple, a design it believes best ensures user trust and strengthens the security of these devices. Since over 3 million phones were stolen in 2013 alone, the protections Apple is providing are important. By requiring Apple to sign code that undermines the security features Apple has included in iOS, the court’s order directly compels the company’s strong and verifiably authentic endorsement of the weakened code.
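
To make the mechanics concrete, the following is a minimal, purely illustrative sketch of how signed code distribution works in general, written with the Python cryptography library. It is not Apple’s actual signing scheme; the key names and the firmware payload are placeholders. The point is simply that only the holder of the secret signing key can produce a signature the device will accept, which is why a valid signature functions as the vendor’s endorsement of the code.

```python
# Illustrative sketch only (not Apple's code-signing implementation).
# Shows the general principle: a vendor signs a payload with a secret key,
# and a device accepts the payload only if the signature verifies against
# the vendor's public key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the private signing key never leaves the vendor.
vendor_private_key = Ed25519PrivateKey.generate()
vendor_public_key = vendor_private_key.public_key()  # shipped inside every device

firmware_update = b"...bytes of an operating system update..."  # placeholder payload
signature = vendor_private_key.sign(firmware_update)  # the vendor's endorsement

# Device side: refuse any code that is unsigned or has been tampered with.
try:
    vendor_public_key.verify(signature, firmware_update)
    print("Signature valid: update accepted")
except InvalidSignature:
    print("Signature invalid: update rejected")
```

Because verification needs only the public key, every device can check authenticity on its own, while forging a valid signature without the private key is computationally infeasible. That asymmetry is why the brief treats a compelled signature as a compelled endorsement.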

This is where the First Amendment comes in. The Constitution clearly prevents the government from forcing people to endorse positions they do not agree with. Whether that endorsement takes the form of raising your hand, signing a loyalty oath, putting a license plate motto on your car or, as here, implementing an algorithm that creates a digital signature, the problem is the same. As the Supreme Court noted in a case involving whether the government could force private parade organizers to include viewpoints they disagreed with, “[W]hen dissemination of a view contrary to one’s own is forced upon a speaker intimately connected with the communication advanced, the speaker’s right to autonomy over the message is compromised.” As a result, government mandates requiring people to speak are subject to strict scrutiny—the most stringent standard of judicial review in the United States.

Of course, the fact that Apple expresses its beliefs in the language of computer code and in digital signatures verifying its code implicates a set of cases where EFF pioneered the recognition that writing computer code is a form of, well, writing. In Bernstein v. DOJ and later in Universal City Studios, Inc. v. Corley, the courts agreed with us that, just like musical scores and recipes, computer code “is an expressive means for the exchange of information and ideas.” The fact that the expression comes in the form of code may implicate the level of regulation the government can apply, but not whether the code is in fact expressive.

Here, the problem is even more acute. Apple is being forced to actually write and endorse code that it—rightly—believes is dangerous. And in doing so, it is being forced to undermine the trust it has established in its digital signature. The order is akin to the government forcing Apple to write a letter in support of backdoors and sign its forgery-resistant name at the bottom. This is a clear violation of Apple’s First Amendment rights, in addition to being a terrible outcome for all the rest of us who rely on digital signatures and trustworthy updates to keep our lives secure.

The court will hear argument on Apple’s motion to vacate at 1:00 p.m. on March 22, 2016, in Riverside.  We hope the judge reconsiders this dangerous and unconstitutional order.

UN human rights chief: Lives could be in danger if the FBI forces Apple to help unlock iPhone - The Washington Post 20160304

The top human rights authority at the United Nations warned Friday that if the FBI succeeds in forcing Apple to unlock an iPhone used by one of the San Bernardino attackers, it could have “tremendous ramifications” around the world and “potentially [be] a gift to authoritarian regimes, as well as to criminal hackers.”

The statement came a day after a deluge of technology companies and other groups publicly backed Apple in the fight, and it echoed what many of these firms and groups said in arguing that the FBI’s demands could have a devastating impact on digital privacy going forward.

“In order to address a security-related issue related to encryption in one case, the authorities risk unlocking a Pandora’s Box that could have extremely damaging implications for the human rights of many millions of people, including their physical and financial security,” Zeid Ra’ad Al Hussein, the U.N. High Commissioner for Human Rights, said in a statement Friday.

If the FBI prevails, Hussein argued, it would set a precedent that could make it impossible to fully protect privacy worldwide.

“Encryption tools are widely used around the world, including by human rights defenders, civil society, journalists, whistle-blowers and political dissidents facing persecution and harassment,” Hussein said.

Apple is fighting a judge’s order directing the company to help the FBI unlock an iPhone found after the Dec. 2 attack in San Bernardino, Calif. While the Justice Department and other law enforcement groups argue that the demands here are specific and focused on one investigation, Apple and other tech firms are arguing that an FBI victory here could be utilized in countless other cases.

The locked iPhone 5C belonged to Syed Rizwan Farook, who along with his wife, Tashfeen Malik, fatally shot 14 people and wounded 22 others during the attack. Both attackers, who pledged loyalty to the Islamic State, were killed hours after the shooting, and investigators say they are still looking into whether the pair had any ties to groups or people operating overseas.

Federal authorities obtained a magistrate judge’s order directing Apple to write software that would disable a feature that deletes the data on the iPhone after 10 incorrect password attempts. (The phone is owned by San Bernardino County and was given to Farook in his job as a health inspector.)
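
For readers unfamiliar with the feature at issue, here is a purely illustrative sketch, in Python, of the kind of retry-limit logic being described. It is not Apple’s firmware; the attempt limit and the wipe behavior are simplified placeholders. It shows why investigators could not simply guess passcodes at will: too many wrong guesses and the data is gone.

```python
# Illustrative sketch only (not Apple's firmware): a passcode guard that
# erases the protected data after too many failed attempts. MAX_ATTEMPTS
# and the wipe behavior are simplified placeholders.
MAX_ATTEMPTS = 10

class PasscodeGuard:
    def __init__(self, correct_passcode: str):
        self._correct = correct_passcode
        self._failed_attempts = 0
        self.wiped = False  # once True, the data is treated as unrecoverable

    def try_unlock(self, entered: str) -> bool:
        if self.wiped:
            return False  # nothing left to unlock
        if entered == self._correct:
            self._failed_attempts = 0
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True  # in a real device: discard the encryption keys
        return False
```

With a guard like this in place, brute-forcing the passcode risks destroying the evidence, which is why the order sought custom software that removes the limit rather than an ordinary attempt to guess the code.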

Apple has fought the FBI’s order — in court, on Capitol Hill and through public statements — and this week, the company received the backing of dozens of other companies, groups and individuals.

[Photo: Bruce Sewell, general counsel at Apple, before testifying on Capitol Hill this week. (Andrew Harrer/Bloomberg)]

A raft of major tech companies — including Google, Amazon, Facebook, Yahoo, Twitter, Snapchat and Microsoft — signed on to court briefs that warned of “a dangerous precedent” for digital security if Apple was forced to act “against their will.” These calls were joined by groups including the American Civil Liberties Union, several trade and policy groups and dozens of technologists, security researchers and cryptographers.

The Justice Department received the backing of relatives of some of the people killed in the San Bernardino attack as well as briefs of support from law enforcement groups representing officers in California and across the country.

Hussein said that the United Nations fully supported the FBI’s investigation into the “abominable crime,” but argued against viewing this as an isolated case.

He pointed back to a report his office released last year saying that strong encryption and digital privacy are important to human rights, stating: “It is neither fanciful nor an exaggeration to say that, without encryption tools, lives may be endangered.”

In trying to glean information on the locked iPhone, authorities could “end up enabling a multitude of other crimes all across the world, including in the United States,” he said.