Tag Archives: Ars Technica

Privacy Shield doesn’t do enough to curtail US surveillance, say EU data watchdogs - Ars Technica 20160413

"Great step forward," but still work to do, say privacy experts.

Exceptions in the proposed EU-US Privacy Shield framework that would allow the US to carry out mass surveillance of EU citizens are "not acceptable," the Article 29 Working Party of EU data protection authorities said today in a press conference.

The group's chair, Isabelle Falque-Pierrotin, explained that the Article 29 Working Party would follow with "great interest" the forthcoming ruling by the Court of Justice of the European Union (CJEU) on whether mass surveillance of EU citizens could be legal. If the CJEU finds that the surveillance carried out by GCHQ is unlawful, it would have a big impact on the national security exceptions included in Privacy Shield.

Falque-Pierrotin said that the data protection authorities also had some concerns about the independence and effectiveness of the Privacy Shield ombudsperson who will deal with complaints from Europeans about how their data has been used by the NSA.

However, the Article 29 Working Party called the proposed Privacy Shield in general a "great step forward" compared to the Safe Harbour framework it is designed to replace. But Falque-Pierrotin said "it is rather difficult to understand all the documents and annexes, as they are complex and not consistent." She went on: "we believe it would have been better to have something simpler and less complex."

Falque-Pierrotin pointed out that the imminent arrival of new data protection rules in the EU meant that the Privacy Shield needed some kind of review mechanism to allow it to be updated. Currently, there is no provision to do this.


The Article 29 Data Protection Working Party, which was set up under the 1995 Directive on the protection of personal data, is purely advisory, and the European Commission is not obliged to follow its advice.

Before making a final decision whether to proceed with the Privacy Shield framework, the Commission will wait to hear from another group set up under the 1995 Directive. The Article 31 Committee consists of representatives of the Member States, and therefore follows their policies, which are broadly in favour of Privacy Shield. The Article 31 Committee is expected to consider the Privacy Shield arrangement at meetings on April 29 and May 19 before issuing its opinion.

The European Commission must then decide whether to try to modify the current Privacy Shield proposal in the light of the Article 29 Working Party's comments, plus any made by the Article 31 Committee. The Commission told Ars that it is hopeful it will be able to give the go-ahead for Privacy Shield in June, which would then come into immediate effect. The European Parliament does not have a vote on this issue, which lies purely within the competence of the Commission.

Until then, the alternative transfer mechanisms, such as standard contractual clauses and binding corporate rules, can still be used for personal data transfers to the US. Falque-Pierrotin said that the Article 29 Working Party would not be considering whether these were valid until after the European Commission had produced the final version of Privacy Shield.

Brussels terror attacks: Why ramping up online surveillance isn’t the answer - Ars Technica 20160402

I am in Brussels. And I am scared. Very scared… of the probable security backlash following last month’s terrorist attacks.

I don’t want to live in a city where everyone is viewed with suspicion by the authorities, because it won’t stop there. Because suspicion is infectious. When misappropriated and misdirected, that sort of suspicion can very easily become racism and prejudice—two of the key ingredients that led to the awful attacks on the morning of Tuesday, March 22.

ISIL is not only fighting a cultural war; it's fighting a media one. For that reason maybe we should really stop talking about it as though it were a “real” war. As though there were valiant warriors on both sides. As though those responsible for the Brussels bombings are anything more than common murderers, plain and simple. Truthfully, the only community the Brussels attackers belong to now is the criminal community.

It is high time to strip terrorists of their mystique. We must stop playing their game. Statistically, I am not any less safe today than I was on the Monday before the attacks. Yet if many politicians have their way, my activities will be monitored a great deal more.

Two days after the attacks, EU ministers met in Brussels at an emergency justice and home affairs council, and predictably demanded more access to our Internet histories, more powers to track people, and more ways to break into our private communications.

The European People's Party has reportedly said it wants personal data on everyone who takes a train to be stored. Meanwhile, a so-called Passenger Name Record is in the works for all airline passengers.

And, even before the terrorist attacks, Belgian officials were mulling the expansion of the country's data collecting and storing laws. Never mind that the European Court of Justice and the Belgian Constitutional Court have ruled that data retention is illegal. My adopted country also plans new surveillance legislation that would allow intelligence agencies more freedom to eavesdrop on cross-border communications: “Hello Mum, nice to talk to you… and everyone else listening in.”

Turning leaky tap on secure apps

On a European level, the ePrivacy Directive is up for review this year, and there will be no prizes for predicting that secure online communications services, such as WhatsApp and Telegram, and even Viber, Skype, and Facebook Messenger could all be in the cross-hairs. Will there be anywhere left if you want to have a private conversation online?

Like anyone, I believe those who carried out the attacks in Brussels should be caught and brought to justice, but not at any cost. And certainly not at the cost of ordinary citizens’ freedom and way of life.


That is even supposing these new measures would work to prevent future attacks: I’ve seen no evidence—and I've asked the question among many Brussels-based folk—to support that view. Does taking a plane or a train make you more likely to be a terrorist—sorry—murderer? Are overwhelmed police forces really able to cope with combing through that amount of data? Experience suggests that having access to everyone's travel histories would have made little difference in the Brussels case. Europe’s security problem is not too little information, but too little sharing and understanding of that information.

Just as with beefed-up physical security, increased surveillance powers generally don’t make us any safer. The reality is that when we see souped-up security guards everywhere, we don’t feel more secure. Often the effect is the opposite. Security theatre isn’t even effective as good theatre.

I completely understand the desire to do something—anything!—after such a horrible atrocity. One of the most difficult emotions I had to cope with, as the horror unfolded, was feeling useless. But sometimes, especially when we are shocked, doing nothing is the better option.

Knee-jerk reactions are almost always the most ill-thought-out ones. Four days before the attacks, the European Data Protection Supervisor, Giovanni Buttarelli, put out a press release saying that legislative proposals to fight cross-border crime including terrorism were too rushed and too weak to do the job, anyway. Now, post-March 22, hasty decisions are even more likely.

That I am appalled at some of the reaction to the attacks is not a surprise. My view has always been that more “security,” more surveillance, and more data retention, not only won't work, but will undermine our rights. My opinion on this point is not new. What is new is that it has been tested. As someone who walks the streets past Maelbeek metro station every day, I feel I have a valid insight on what will and won't make me feel safe, and how much of my privacy I am willing to give up for it.

Existing Belgian powers didn't prevent these attacks

Here in Belgium, investigators already have the power to get a court order on telecoms operators to track a suspect’s SIM card down to the nearest phone tower location, as they did with Salah Abdeslam, the man suspected of being behind the Paris attacks. Of course terrorists must be expected to keep their phone with them at all times. Imagine if they learned sophisticated counter-intelligence techniques like, say, leaving it at home!

Yes, I have sat down and cried at what has happened in my city. I swallowed fear as I anxiously waited to hear from loved ones. I felt powerless, and grateful to strangers for support in the wake of the attacks. Today, I feel angry that cynical opportunists will twist this to their own ends.

In my ideal world there would be a moratorium on new security or surveillance laws for at least three months following a terror atrocity. That won’t happen because, as bad as knee-jerk reactions are, there are others who will have waited for just this sort of event to push their own agenda. That, as much as everything else that has happened in the days since the attacks on Brussels, makes me want to weep.

Predictably, even those further afield had an opinion: US presidential candidate Hillary Clinton said “we have to toughen our surveillance, our interception of communication.” Presumably “we” in this instance is those already with huge amounts of power, and control.

Meanwhile, would-be US president Donald Trump reportedly said he would use torture to combat terrorism. I read sensible, reasonable replies from decent people explaining why torture doesn’t work or is unreliable for gathering information. And—for a moment—this seems a reasonable conversation. Before I jolt back to myself and realise this is torture we are talking about. Surely any decent human being opposes it on principle. What point have we reached, where we are even in a position of discussing this?


I am well aware that collecting PNR data is not comparable to torture. And I am not opposed to proportionate and specific surveillance. My opposition, on principle, is to mass unjustified collection of personal information "just in case." Just in case of what? In case we're all closet terrorists?

Predictably, Europol Director Rob Wainwright blamed encryption. He told POLITICO: “Encrypted communication via the Internet and smartphones are a part of the problems investigators face in these instances. We have to find a more constructive legislative solution for this problem of encryption.”

Since when is encryption a problem? Encryption is what allows us to bank online, book holidays, and buy birthday presents. Using crypto tools doesn’t mean you are a terrorist. Weakening encryption will just create vulnerabilities that will be exploited by the very criminals and terrorists we want to stop. This is as good a reason as any to defend encryption.

But again I shake my head and wonder why we can’t just expect privacy on principle. I am frustrated and saddened that the default position of treating other humans as decent law-abiding folk is changing to one where the assumption is that we are all potential terrorists.

The terrorist’s weapon of choice is fear. When we fear them, they have won a battle. When we start to fear each other—the woman in a headscarf on the metro, the man with the large bag at the airport, the teenager with his hands in his pockets—they are one step closer to winning the war. Let’s not play into their hands.

Senator: let’s fix “third-party doctrine” that enabled NSA mass snooping - Ars Technica 20160403

Q&A: Ars sits down with Oregon's outspoken advocate of strong crypto, Sen. Ron Wyden.

This past week hundreds of lawyers, technologists, journalists, activists, and others from around the globe descended upon a university conference center to try to figure out the state of digital rights in 2016. The conference, appropriately dubbed "RightsCon," featured many notable speakers, including Edward Snowden via video-conference, but relatively few from those inside government.

Sen. Ron Wyden (D-Oregon), however, was an exception. On the first day of the conference, he gave an in-person speech, in which he argued for a "New Compact for Security and Liberty."

The Oregon senator is likely familiar to Ars readers: he’s been one of the most consistently critical voices of the expansion of government surveillance in recent years. We last spoke with him in October 2014 when he made the case that expanded active spying hurts the American economy. In December 2014, Wyden introduced the "Secure Data Act" in the United States Senate, which aims to shut down government-ordered backdoors into digital systems. However, that bill hasn’t even made it to committee yet, over a year later.

On Thursday, the day after his address, Wyden sat down with Ars at a downtown Peet’s Coffee, where we chatted in more detail about his proposal. What follows is a transcript of our conversation, lightly edited for clarity.

Ars: What does your compact mean in terms of new legislation? Because some of these items outlined in your speech, like the third-party doctrine, Congress doesn’t have the authority to overturn that.


A: Well, Congress could pass a law. But let’s begin at the beginning. What I wanted to do yesterday in this speech was to refocus the debate. More than anything else, that’s what the talk was about. I can tell you—and I don’t have an exact count—but my guess is that there have been thousands upon thousands of articles written in the last few months and they invariably start with the phrase: "In the ongoing debate between security and privacy, the following happened today... "
And I want to make clear that I don’t think that’s what the debate is all about. It is not about security versus privacy. In my view, this debate is about less security versus more security. My view is that at a time when millions of Americans have their life wrapped up in a smartphone—their medical records, their financial records, they might be tracking their child to make sure their child isn’t molested—strong encryption is the must-have go-to security tool for millions of Americans and the communities in which they live. So I want to re-focus the debate along those lines.

Are we to understand that what you're calling a compact will evolve into actual legislation?

Let’s take some of these devices one by one. As your readers know, for weeks now we’ve been told that there is going to be a Burr-Feinstein bill in the United States Senate that in fact would be a piece of legislation that would, in effect, mandate that a private company weaken the security of their products so they would be able to comply with a court order. The first thing that I want to do as part of our strategy is to block that legislation. And I’m going to argue that it should be blocked on the grounds that it will weaken the security of millions of Americans. The second thing that I want to do after we block that bill is pass affirmative legislation that I’ve introduced called the Secure Data Act, where we wouldn’t be talking about blocking legislation, but talking about affirmative action to ensure the security of the data of millions of Americans. So those would clearly be two steps that would be very relevant to today’s discussion.

Beyond that, with respect to the third-party doctrine: I think that when people enter into a private business relationship, they don’t expect that that’s going to be public. And particularly now, in an age of digital services, I think it’s important that that law be rewritten: that law stems from a decision that’s decades old. And I’m encouraged that even people like Justice Sotomayor think it ought to be rewritten. So that’s the third area.


A fourth area would be that we’re more vigilant with respect to administrative actions that might be taken that again, instead of a win-win in which we’ll have more security and more liberty, there will be a lose-lose. Yesterday I talked about Rule 41, which is something that the Justice Department wants to do, where in effect, they could get one warrant and in effect get access to scores and scores of computers outside that one jurisdiction. And I think that’s a mistake.
And finally I talked about the need for more talent. I take a position that challenges the intelligence agencies to adapt to new times. That’s why I went through the Miranda decision and how people thought "Oh my goodness, we’ll never get a confession!" Obviously law enforcement adapted to those new challenges. I think having talented people, some of whom have been in the room at RightsCon, would be a very good way to adapt. So those are, kind of, the four or five areas where a combination of elected officials who block unwise measures and affirmatively move to pass legislation to update our laws makes sense.

We at Ars struggle, as I think a lot of people struggle, not only to understand tools like PGP, but also to put them into practice and use them. For example, at Ars, I’m one of six people who have a publicly listed PGP key. I’d be curious to find out from you what kinds of tools you use in your office, what kinds of tools are used in the Senate more generally, and what that experience has been like.

First of all, I think that those who are using a smartphone are counting on encryption. And that is a basic security measure. But for me, the important way to assess your question is that when legislators make policy, the big mistakes come when they are reacting, particularly when there has been a horrible tragedy and someone makes a knee-jerk reaction. When you get a chance to reflect on it, instead of what I call a win-win—security and liberty—too often you get a lose-lose. For example you weaken strong encryption, the first thing that’s going to happen is people who seek encryption are going to go overseas where there are hundreds of products and there’s even less control over them.

You’re somebody who pops up in the news a lot, talking about these issues of privacy. You obviously care a lot about them. I’d love to hear how you plan to convince your colleagues of the importance of these issues. I think it can be hard for people who aren't as steeped in these issues to wrap their brains around them. So I’d love to hear what that process has been like for you.


First of all, we’ve come quite a ways. Back when I started in 1996, I wrote the law that ensured that a website owner would not be held personally liable for something that was posted on the site. We wrote the digital signatures law and the ban on tax discrimination, for example, so that people who needed Internet access to get education and employment opportunities wouldn’t face problems. It’s been a pretty amazing ride since then. All the way to the time when the NSA overreached with respect to metadata. When we started, there were only a handful of us trying to rein in that overreach. By the time we were done, we had plenty of Republican votes and what had been a secret interpretation of the Patriot Act was gone. Education efforts take time.
One of my favorite accounts was that there was a law that came out of Intel Comm [Senate Select Committee on Intelligence] that passed 14-1 written by Sen. Feinstein (D-California) to deal with so-called overly broad leaks, and I knew the bill was a turkey from the very beginning, and I didn’t even know how bad it was. After it got out of committee, we had a chance to learn more about it, educate ourselves, we all talked about it, and by the time we were done, the senators who had written it didn’t want anything to do with it and we were able to get rid of it. So education efforts can take more time. But we’ve had a fair number of successes and of course nothing matches the campaign of SOPA and PIPA.

You talked about hiring more technologists. What would that look like in your mind?

Obviously I think it's very valuable for individual House and Senate offices to have a go-to person who is knowledgeable about the technology. I was talking yesterday mostly about agencies like the FBI and the government.

Would that involve hiring from the private sector and bringing them on in these types of cases [that involve cryptography]? Because obviously the FBI already has people...

I’d like them to be in a position to get leadership positions and permanent positions on the basis of their knowledge and expertise and the kinds of issues that people were talking about at RightsCon.

Are there any cases or issues that we in the public should be aware of in Oregon that maybe haven't hit the national stage yet?


Let’s put it this way: when the FBI said that they had been able to access the Apple San Bernardino phone, it was clear that was not the end of the debate. In fact, this debate is just starting. And we’ve heard about other jurisdictions that purportedly are looking at it. I’ve been very troubled and it will be something I'll be following up on. The FBI has said this just involves one phone. We’re talking about re-creating code, so it’s not about one. And then later the district attorney in New York talked about scores and scores of phones.

Is there anything that you'll be taking from your experience at RightsCon back to Oregon or to Washington?

I was hoping that it was a two-way street, and it was. My goal was to make sure that the people there who play these leadership roles in so many grassroots organizations, that they had a sense of what I as one elected official thought the challenge was all about. That’s why I said right at the beginning: I see our job as trying to convince politicians it’s not about security versus privacy, it’s about more security versus less security. And I think as we went we had a lot of good conversation. I think there was a lot of interest at RightsCon about what’s coming next—[people were asking] what does he think is coming next—and there was a lot of interest in Rule 41.

Last time we spoke, you’d mentioned that before the president leaves office that you wanted to play basketball with him. Is that going to happen?

It’d better happen soon! There are a few priorities for Oregon that I may see if I can get on the court. He’s been very gracious and he’s invited me multiple times and I think I indicated that I was saving it for something big that Oregon needs, and we’re heading into the home stretch.

Anything else that I didn’t ask you about that you’d like to add?

I think that this is going to be a very busy few months. People have asked what’s next, and we’re going to have some classified briefings, I assume, to try to figure out what the details are with respect to how the process went forward, with respect to accessing the data on the [San Bernardino] phone, and there are zero-day issues that we’re talking about and looking at.

UC Berkeley profs lambast new “black box” network monitoring hardware - Ars Technica 20160205

University of California administration says it's just going after "bad actors."

Days after a group of concerned professors raised alarm bells over a new network monitoring system installed at the University of California, Berkeley and the other nine campuses of the University of California system, a separate committee of system-wide faculty has now given its blessing. However, some Berkeley faculty remain concerned that their academic freedom is threatened by the new full packet capture system that sits at each campus network’s edge. They say that retaining such information could be used as a way to constrain legitimate discussion or research on controversial topics.

Last summer, the University of California Office of the President (UCOP) ordered that a Fidelis XPS system be installed at all 10 campuses at a total estimated cost of at least a few million dollars. The Fidelis hardware and software is designed to "detect attacks" and analyze "every single packet that traverses the network."

The move came in response to a July 2015 attack against the University of California Los Angeles Health System, which resulted in 4.5 million records being stolen. Following that attack, University of California President Janet Napolitano, the former Secretary of Homeland Security, moved quickly to bring more digital monitoring onto the campuses, which stretch from Berkeley to San Diego. The UC Regents, the governing board of the entire UC system, now face 17 separate lawsuits as a result of the breach at UCLA. Similar network monitoring hardware has also been installed at other universities nationwide.

"We recognize that the essential openness of the University represents a cybersecurity challenge," David G. Kay, a University of California Irvine computer science professor and head of the UC-wide committee, wrote in the Monday letter to the UC Academic Senate. "We have been informed that the monitoring of communications looked only for ‘malware signatures’ and Internet traffic patterns. As neither message content nor browsing activity were monitored, we believe this level of monitoring can be appropriate."

But exactly how the Fidelis XPS operates and what data is being retained and scanned is unknown.
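In broad strokes, though, the signature matching the Kay letter describes means comparing traffic against known byte patterns. A minimal sketch, with made-up signature names and patterns (Fidelis's actual rule set is proprietary and unknown):

```python
# Hypothetical byte signatures; real appliances ship large proprietary rule sets.
SIGNATURES = {
    "eicar-test": b"EICAR-STANDARD-ANTIVIRUS-TEST-FILE",
    "fake-beacon": b"\xde\xad\xbe\xef",
}

def scan_packet(payload: bytes) -> list[str]:
    """Return the names of any known signatures found in a packet payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern in payload]

print(scan_packet(b"GET /index.html HTTP/1.1"))   # -> []
print(scan_packet(b"..\xde\xad\xbe\xef.."))       # -> ['fake-beacon']
```

The faculty's worry is precisely that a full packet capture appliance sees far more than this sketch needs: whether it merely scans payloads in flight or retains them for later inspection is the unanswered question.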

Kate Moser, a UCOP spokeswoman, refused to answer Ars’ specific questions, referring us simply to prepared statements from both Napolitano and Executive Vice President Rachael Nava. She also pointed us to a new website, security.ucop.edu, which states, "UC is taking appropriate steps to prevent cyber attacks by advanced persistent threat actors." That site also touts the new Cyber-Risk Governance Committee, which acts as an umbrella group for the affected campuses, the Lawrence Berkeley National Laboratory, and the UCOP’s own network.


The recent dust-up arose when Ethan Ligon, a member of a Berkeley Information Technology committee, began alerting other faculty that the UCOP had "intrusive hardware" installed on the campus, "over the objections of our campus IT and security experts." That e-mail went out several days after the UCOP formally rejected the Berkeley group’s request to shut the Fidelis system down.

"It's a black box," Ligon, a professor of agricultural economics, told Ars over coffee at a campus-adjacent café this week. "Our own IT staff don't have any access to it. It's not like their IT guys are better qualified than our IT guys."

He said that many IT staff are concerned about speaking out for fear of losing their jobs—few of them have the kind of job security that Ligon and other tenured professors have.

Ligon shared with Ars a slide deck that he prepared for a committee meeting earlier this week. The economics professor also pointed Ars to a 2005 UCOP policy document which, while it certainly allows the administration to do network monitoring, mandates the "least invasive degree of inspection," a provision he argues the new system does not comport with.

Previously, campus-monitoring log files were deleted as a matter of course unless there was a specific reason to retain them. Ligon and his colleagues argue that this level of monitoring goes far beyond that policy, usurping the normal autonomy granted to each campus.

After being shown the Kay letter on Thursday, Ligon called this a "small move in the right direction" but said he hopes the UCOP will do more to acknowledge its role in perceived overreach. "The limited progress is the statement that UCOP's behavior constituted a ‘serious failure of shared governance,’" he added by e-mail. "But note that this is faculty saying this; we still don't have any acknowledgement of this from UCOP. Still, this seems to set the stage for UCOP to at least tacitly acknowledge that they misbehaved. Finally, the document doesn't say anything about stopping the monitoring, and indeed goes out of its way to suggest that it was justified."

The concern has extended beyond academia as well: Rep. Ted Lieu (D-Calif.), who represents western Los Angeles—including UCLA—has weighed in. Lieu is one of a handful of computer science majors in Congress, and he is also a Lieutenant Colonel in the United States Air Force Reserves.

It’s official - NSA did keep its e-mail metadata program after it “ended” in 2011 - Ars Technica 20151120

The New York Times gets a new NSA doc confirming what some had long suspected.

Though it was revealed by Edward Snowden in June 2013, the National Security Agency's (NSA) infamous secret program to domestically collect Americans’ e-mail metadata in bulk technically ended in December 2011. Or so we thought. A new document obtained through a lawsuit filed by The New York Times confirms that this program effectively continued under the authority of different government programs with less scrutiny from the Foreign Intelligence Surveillance Court (FISC).

The bulk electronic communications metadata program was initially authorized by the government under the Pen Register and Trap and Trace (PRTT) provision, also known as Section 402 of the Foreign Intelligence Surveillance Act. The Times document, a previously top-secret National Security Agency Inspector General (NSA IG) report from January 2007, contains a lot of intelligence jargon but crucially notes: "Other authorities can satisfy certain foreign intelligence requirements that the PRTT program was designed to meet."

While such a theory had been pushed previously by some national security watchers, including Marcy Wheeler, this admission had yet to be officially confirmed. Wheeler argued that not only do the post-PRTT programs achieve the same goal, but she believed they were in fact more expansive than what was previously allowed.

The bulk metadata program, which began in secret under authorization from the FISC in 2004, allowed the NSA to collect all domestic e-mail metadata including to, from, date, and time. When this program was revealed by the Snowden leaks in The Guardian, the government said that the PRTT program had been shut down 18 months earlier for "operational and resource reasons."
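To see what "metadata" means in practice, the sketch below uses Python's standard email module to pull exactly those envelope fields from a message while ignoring the body entirely (the addresses and message are invented):

```python
from email.parser import Parser

RAW = """\
From: alice@example.org
To: bob@example.com
Date: Mon, 05 Dec 2011 09:30:00 +0000
Subject: lunch?

Shall we meet at noon?
"""

def extract_metadata(raw_message: str) -> dict:
    # Collect only the envelope information the PRTT program is described
    # as gathering: who mailed whom, and when -- never the message body.
    msg = Parser().parsestr(raw_message)
    return {h: msg[h] for h in ("From", "To", "Date")}

print(extract_metadata(RAW))
# -> {'From': 'alice@example.org', 'To': 'bob@example.com',
#     'Date': 'Mon, 05 Dec 2011 09:30:00 +0000'}
```

Even without a single line of content, records like these collected at scale reveal a person's entire social graph, which is why the distinction between metadata and content matters less than it first appears.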

It was believed that the FISC imposed a number of restrictions on the PRTT program, according to the Office of the Director of National Intelligence (ODNI) itself.

The databases could be queried using an identifier such as an email address only when an analyst had a reasonable and articulable suspicion that the email address was associated with certain specified foreign terrorist organizations that were the subject of FBI counterterrorism investigations. The basis for that suspicion had to be documented in writing and approved by a limited number of designated approving officials identified in the Court’s Order. Moreover, if an identifier was reasonably believed to be used by a United States person, NSA’s Office of General Counsel would also review the determination to ensure that the suspected association was not based solely on First Amendment-protected activities.

The PRTT program was designed to help the intelligence community intercept and analyze "one-end foreign" communication—in other words, people in the US communicating with people outside the US.

EO 12333 strikes again

The newly public document cites two legal authorities that govern foreign data collection: Section 702 of the FISA Amendments Act and the Special Procedures Governing Communications Metadata Analysis (SPCMA), which sits under Executive Order (EO) 12333.

Section 702 largely governs content collection wholly outside the United States (it’s what PRISM falls under). Meanwhile, EO 12333, which ex-government officials (including Snowden himself) have complained about, is a broad Reagan-era authority that allows data collection on Americans even when Americans aren't specifically targeted. Without this executive order, such actions would be forbidden under the Foreign Intelligence Surveillance Act (FISA) of 1978.

EO 12333 specifically allows the intelligence community to "collect, retain, or disseminate information concerning United States persons" if that information is "obtained in the course of a lawful foreign intelligence, counterintelligence, international narcotics, or international terrorism investigation."

According to John Tye, a former State Department official who spoke with Ars in August 2014, EO 12333 has the potential to be abused as it could "incidentally" collect foreign-held data on Americans. "12333 is used to target foreigners abroad, and collection happens outside the US," he told Ars. "My complaint is not that they’re using it to target Americans, my complaint is that the volume of incidental collection on US persons is unconstitutional."

Tye continued:

There are networks of servers all over the world and there have been news stories on Google and Yahoo—the minute the data leaves US soil it can be collected under 12333. That’s true not just for Google and Yahoo, that’s true for Facebook, Apple iMessages, Skype, Dropbox, and Snapchat. Most likely that data is stored at some point outside US or transits outside the US. Pretty much every significant service that Americans use, at some point it transits outside the US.

Hypothetically, under 12333 the NSA could target a single foreigner abroad. And hypothetically if, while targeting that single person, they happened to collect every single Gmail and every single Facebook message on the company servers not just from the one person who is the target, but from everyone—then the NSA could keep and use the data from those three billion other people. That’s called 'incidental collection.' I will not confirm or deny that that is happening, but there is nothing in 12333 to prevent that from happening.

UPDATE Saturday 12:55pm ET

Tye also e-mailed Friday evening, adding:

Yes, this is consistent with what I've been saying. One of the key points is that section 215 provides only a small part of the data that the NSA collects on US persons; most such data is collected outside the borders of the US under EO 12333.

There is a lot more than even the Savage article explains. We're beginning to scratch the surface.

GCHQ hauled before Investigatory Powers Tribunal over intrusive hacking powers - Ars Technica 20151201


"GCHQ could get a warrant to hack everyone in Birmingham with little oversight."

The Investigatory Powers Tribunal hearing is being held in the Rolls Building, London, this week.

Today, the UK government has been forced to appear before the Investigatory Powers Tribunal (IPT) to justify the use of intrusive hacking powers, known as Computer Network Exploitation (CNE), by GCHQ. This is the result of legal action last year brought by Privacy International and seven Internet and communications service providers from around the world. The IPT is the only body that can rule on complaints about GCHQ and the UK's other intelligence agencies. The hearing is expected to last four days.

Privacy International contends that "the infection of devices with malicious software, which enables covert intrusion into the devices and lives of ordinary people, is so invasive that it is incompatible with democratic principles and human rights standards." The claimants also assert that "GCHQ’s attacks on [communications] providers are not only illegal, but are destructive, undermine the goodwill the organisations rely on, and damage the trust in security and privacy that makes the internet such a crucial tool of communication and empowerment."

The legal challenge has already forced the UK government to reveal more details of its hitherto secret hacking programmes. Witness statements by Ciaran Martin, director general for cyber security at GCHQ, show that the secretary of state does not individually sign off on most CNE operations abroad; he does so only when "additional sensitivity" or "political risk" is involved. Moreover, Martin also said that the commissioner of the intelligence services did not begin formally reviewing the individual targets of GCHQ's overseas hacks until April 2015.

From government documents released as a result of the case, it also emerged that the commissioner of the intelligence services was concerned about the legality of using very broad "thematic warrants" to justify the hacking of people in the UK. He was worried that current law "does not expressly allow for a class of authorisation," and therefore the warrants were too broad. As Privacy International explains: "This means that GCHQ could get a warrant in the UK to hack the computer of everyone in Birmingham with little meaningful oversight."

When the legal challenges were filed last year, GCHQ had no lawful authority to break into computer systems. To remedy that situation, the UK government quietly amended the Computer Misuse Act to provide legal cover for the hacking. It also issued a draft Equipment Interference Code of Practice, which sought to formalise these activities. Most recently, the draft Investigatory Powers Bill tries to put CNE on a stronger statutory footing.

Privacy International and the other claimants have updated their legal challenges to reflect those developments, and have also made available expert reports by Ross Anderson, professor of security engineering at Cambridge University, and Peter Sommer, visiting professor at De Montfort University Cyber Security Centre.

The claimants seek the following orders from the IPT: a declaration that GCHQ’s intrusion into computers and mobile devices is unlawful and contrary to Articles 8 and 10 of the European Convention on Human Rights; an order requiring the destruction of any unlawfully obtained material; an injunction restraining further unlawful conduct.

GCHQ - Schedule Of Public Statements CNE Final - 20151119x