Compliance Management

Hand in hand: Privacy and transparency


If privacy and transparency were a couple, they'd likely change their Facebook relationship status to “it's complicated” and then publicly play out what can only be described as a rollercoaster of harmony and discord as they ride through life together. 

The cybersecurity equivalent of Dr. Phil might occasionally call them co-dependent and dysfunctional because their goals can be at odds, yet at other times they are supportive and in lockstep. Transparency, in some cases, is a necessary protector of privacy.

“There's a trade-off, a national debate about what we're willing to give up to be secure,” says Jack Huffard, president, COO and co-founder of Tenable.

Government surveillance of the caliber revealed by Edward Snowden, which gained an uneasy acceptance of sorts in a post-9/11 world more willing to make that trade-off, has severely tested the bounds of privacy, with the FBI and other law enforcement agencies often arguing that they need visibility into email, phone records and mobile devices of suspected terrorists and other criminals to make their case.

With their reputations tarnished by Snowden's revelations and facing a steady onslaught of data requests from government, with warrants obtained quietly and without the benefit of public scrutiny, some of the biggest technology and internet corporations – including Google, Microsoft, Yahoo, LinkedIn and Facebook – waged a battle against the U.S. Department of Justice (DOJ) in an effort to release more complete information on government demands. After a significant win in early 2014, those companies began releasing updated transparency reports.

Under the new rules, companies could choose between two options when reporting on government requests – one that lumps all national security requests together into a single aggregate figure reported in bands of 250, and another that breaks requests out by type but reports each category in bands of a thousand.

Microsoft chose the latter option and announced that, between Jan. 1, 2013 and June 30, 2013, it had received between zero and 999 Foreign Intelligence Surveillance Act (FISA) orders seeking disclosure of content. The requests impacted between 15,000 and 15,999 accounts.

For the same period, the company received between zero and 999 non-content FISA requests that impacted the same number of accounts, which mirrored non-content requests based on National Security Letters (NSL) during the same timeframe.

“While there remain some constraints on what we can publish, we are now able to present a comprehensive picture of the types of requests that we receive from the U.S. government pursuant to national security authorities,” Brad Smith, general counsel and executive vice president with Microsoft, posted at the time.

LinkedIn chose the other option and announced that, between Jan. 1, 2013 and June 30, 2013, it had received between 0 and 249 national security requests that impacted between 0 and 249 accounts. National security requests under this option comprise both FISA orders and NSLs.

“We did so because we believe that this option gives our members and the public a more accurate picture of the number of national security-related requests we receive and the number of accounts impacted, even though this option requires us to aggregate national security-related requests,” Erika Rottenberg, general counsel at LinkedIn, posted at the time.
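For illustration only – this is a hypothetical sketch, not any company's actual reporting code – here is how a raw request count maps onto the two permitted disclosure formats described above: aggregated national security requests reported in bands of 250, or per-category requests reported in bands of 1,000. Either way, the exact figure is never revealed; readers learn only which band the true number falls in.

    # Hypothetical illustration of band reporting; the band widths (250 and
    # 1,000) come from the options described above, everything else is an
    # assumption made for the example.
    def reporting_band(count, band_width):
        """Map an exact request count to the disclosure band it falls in."""
        low = (count // band_width) * band_width
        high = low + band_width - 1
        return low, high

    # 412 aggregated national security requests -> reported as "250 to 499"
    print(reporting_band(412, 250))

    # 17 content FISA orders under the per-category option -> "0 to 999"
    print(reporting_band(17, 1000))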

The companies have continued to broaden the information included in their transparency reports, pushing government through the courts to gain that right. They've also waged a war of sorts with the government over encryption and access to their customers' private data, often locked in mobile devices. The most notable example is Apple's stand against the FBI over unlocking an iPhone 5c used by one of the San Bernardino shooters. 

A confluence of events and undercurrents brought Apple and the FBI to an inevitable confrontation. A terrorist attack on U.S. soil by husband-and-wife team Syed Rizwan Farook and Tashfeen Malik, following close on the heels of the horrifying attacks in Paris, lent a sense of urgency to pending investigations. Meanwhile, an FBI worried about the rise of homegrown terrorists adept at using technology to communicate and at “going dark” to evade detection leaned on a law more than 225 years old, the All Writs Act, as broad authority to demand that tech companies provide access to data locked in iPhones and other smart devices.

The case sent FBI Director James Comey back and forth to Capitol Hill arguing the agency's case and appearing before the Senate Select Committee on Intelligence (SSCI) to contend that the FBI's investigation had been hampered by the inability to crack the iPhone. The Justice Department dropped the case, however, after the FBI used a third party to crack the phone.

The Apple case, while far from settled, will likely influence policy in favor of privacy. A broad consensus emerged that a backdoor in an encrypted system is a bad idea because it creates a key for access, says J. Trevor Hughes, president and CEO of the International Association of Privacy Professionals (IAPP). “Any backdoor creates a security risk.”

The public, activists and some lawmakers rightly assessed that leaving a way in for even the most upright of democracies would open the door to the national security and intelligence initiatives of more nefarious governments and organizations.

“Where does the law have the right to look?” says Sam Curry, chief product officer at Cybereason. “If we had just one sovereignty, the answer would be easy. But we don't.”

Whether the industry will continue to stand tall against government entreaties remains to be seen, though it seems likely. Its task might have gotten tougher after expanded surveillance powers were granted to the NSA and the FBI in the waning days of the Obama administration.

An order stipulating that communications intercepted by the NSA can be shared before privacy protections are applied sounded the alarm among privacy advocates, prompting Nate Cardozo, senior staff attorney at the Electronic Frontier Foundation (EFF), to tell SC Media that “raw signals intelligence information” would most certainly threaten privacy rights. 

The NSA had been restricted in what it could do with the data collected as part of its surveillance activities. But the altered rules mean that more government personnel will have access to the intercepted raw data – which includes communications from satellite transmissions, phone calls and emails, both in the U.S. and abroad.

“This change represents a significant and substantive expansion of the number of people and agencies permitted to access raw, unfiltered, warrantless surveillance data,” Cardozo says.

Through changes to Rule 41 approved by the Supreme Court that recently took effect, the FBI, too, has been granted broader authority to spy – the amended rule lets U.S. judges sign off on warrants outside their jurisdiction.

The new rule would allow judges to accommodate a wider dragnet, even across countries. Privacy advocates fear that the FBI will be able to expand its surveillance capabilities. An agent would need only to get a judge's signature on a search warrant to put into play the agency's network investigative techniques (NITs), which allow the agency to hack into and monitor any computer or device on the globe.

While law enforcement has gained expanded powers that could compromise privacy, privacy has simultaneously gotten a boost through other court proceedings. The courts have often ruled in favor of privacy protections, though the overall record has been a mixed bag.

In the Apple case, Sheri Pym, a U.S. Magistrate Judge for the Central District of California, ordered the tech company to provide “reasonable technical assistance” to help law enforcement access encrypted data on the iPhone 5c – using its exclusive expertise to bypass the phone's auto-erase function so that FBI investigators could try an unlimited number of passcodes as they attempted to unlock the killers' iPhone. Other judges, however, took the government to task, among them James Orenstein, known as a Fourth Amendment advocate. In a 50-page ruling, he knocked the government for assigning itself broad authority under the All Writs Act (AWA).

“Under the circumstances of this case, the government has failed to establish either that the AWA permits the relief it seeks or that, even if such an order is authorized, the discretionary factors I must consider weigh in favor of granting the motion,” Orenstein wrote at the time. 

It was Orenstein who first raised questions over prosecutors' request that the court order Apple to unlock an iPhone 5s that the Drug Enforcement Agency (DEA) had seized in a drug investigation. The judge took aim at the government's expansive use of the AWA and asked Apple to respond. That case was also resolved when a third party came forward and offered a password for the phone.
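The technical stakes behind the auto-erase order are easiest to see with some back-of-the-envelope arithmetic. The guess rate below is an assumption for illustration, not a figure from the San Bernardino investigation: a four-digit passcode offers only 10,000 possibilities, so once the wipe-after-ten-failures limit and the escalating delays are out of the way, exhausting the entire space takes well under a day even at a modest rate.

    # Illustrative worst-case brute-force math (the 12.5 guesses/second
    # rate is an assumption, not a detail from the actual case).
    def worst_case_hours(keyspace, guesses_per_second):
        """Hours needed to try every possible passcode at a fixed rate."""
        return keyspace / guesses_per_second / 3600

    print(worst_case_hours(10 ** 4, 12.5))   # 4-digit passcode: ~0.2 hours
    print(worst_case_hours(10 ** 6, 12.5))   # 6-digit passcode: ~22 hours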

More recently, Congress has made some noises toward bolstering privacy protection. In a voice vote in early February in favor of the Email Privacy Act, the House moved law enforcement one step closer to having to obtain warrants to search information, including email, that has been stored with third parties for more than six months.

“We applaud the House for passing the Email Privacy Act,” Adam Brandon, CEO of FreedomWorks, said of the bill. “As we continue to rely more and more on electronic communication, we need an upgrade to the Fourth Amendment. … Current law suggests that you don't have a reasonable right to privacy on communication over 180 days old.”

Brandon noted in a statement sent to SC Media that “private communication is private, regardless of whether a third party is used to convey information or how long it has been.”

Less than two weeks earlier, the U.S. government's petition for a rehearing of the case involving access to Microsoft customer data stored on a server in Ireland was slapped down by the Second Circuit Court of Appeals.

“The opposite ruling could have resulted in chaos and a privacy disaster,” Greg Nojeim, director of the Center for Democracy & Technology's Freedom, Security & Technology Project, said in a release. “Providers would have been subject to conflicting obligations to an even greater extent than is the case today, and users' communications privacy could become, over time, subject to the whims of not just the U.S. government, but also other countries seeking their data.” 

A three-judge panel had earlier ruled that warrants from the U.S. government couldn't be used to force Microsoft to hand over emails stored in the Irish server. The government, peeved that data stored overseas would be beyond its reach, then petitioned the court for a rehearing before the full panel of judges.

But the January split vote, 4-4, on the government's entreaty meant that the earlier ruling in Microsoft's favor would stand.

The ruling “is definitely not the end of the story,” Nojeim said, contending that “the government will appeal this decision to the Supreme Court or it will ask Congress to rewrite the law, or it will do both.” 

Indeed, Second Circuit Court Judge Susan L. Carney, acknowledging that the Stored Communications Act (SCA) “has been left behind by technology,” wrote that “it is overdue for a congressional revision that would continue to protect privacy but would more effectively balance concerns of international comity with law enforcement needs and service provider obligations in the global context in which this case arose.”

And the Supreme Court has been definitive that smart devices are the “footlockers” of their owners' personal lives – and therefore law enforcement can't delve into their contents without warrants.

Tethered to technology

But transparency is not all about law enforcement action, government requests or surveillance. The proliferation of interconnected, smart devices, or the Internet of Things (IoT), in almost constant communication with each other, has pushed greater demand on companies to be more transparent about the way they gather and use data on their customers.

Privacy is a puzzle that has become more complicated as an avalanche of data – generated by everything from corporate servers to wearable devices, like Fitbits and smart watches – gains both mass and momentum. There is a certain opaqueness to the way that information is shared, in part because people, companies and organizations are more interconnected through technology than ever before.

When data leaves a computer or personal device, its path is not straightforward and linear. Rather, a web of interconnectedness among people, companies and machines means information is shared seamlessly (most of the time) among entities, with the implicit, if not explicit, permission of consumers and businesses that want – and have become accustomed to – the benefits of that fluid information flow. And it is fair to expect that data volume and the speed at which it travels will pick up as the IoT takes hold.

But if digitization has advanced at seemingly the speed of light, our legal system has not kept pace. Digitization and tech advances have outpaced our ability to manage and protect data. Privacy officers, law enforcement and government are working within a set of laws that by the most generous definition are antiquated, and until recently progress had been hampered by a chronically stalled Congress.

Still, companies are facing increasing pressure to build privacy into their products and services and to offer explicit, detailed information on the terms of how they share data – changing their policies to only allow data-sharing if customers specifically opt in.

The American Civil Liberties Union (ACLU), last October, claimed a victory for privacy after the Federal Communications Commission (FCC) voted to require internet service providers to obtain opt-in permission from customers to use or share their personal data.

“Today's vote is a historic win for privacy and free expression and for the vitality of the internet,” Jay Stanley, a senior policy analyst at the ACLU, said in a statement at the time. “Just as telephone companies are not allowed to listen in to our calls or sell information about who we talk to, our internet providers shouldn't be allowed to monitor our internet usage for profit.”

The order applies the privacy requirements of Section 222 of the Communications Act to broadband ISPs. The commission called the privacy rules a “framework” that give customers more control over their information. The order defined categories of information that fall under the umbrella of “sensitive” – precise geo-location data, financial information, health data, information on children and Social Security numbers, as well as web browsing history, the history of app usage and communications content.

The order allows ISPs to share non-sensitive information, like email addresses, unless a customer specifically opts out. The FCC also provided exceptions to the consent strictures: “Customer consent is inferred for certain purposes specified in the statute, including the provision of broadband service or billing and collection,” according to the FCC's release, and for those purposes “no additional customer consent is required beyond the creation of the customer-ISP relationship.”
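As a rough sketch of how an ISP might implement the two tiers, the hypothetical model below distinguishes sensitive categories, which require explicit opt-in, from non-sensitive ones, which may be shared unless the customer opts out. The category names and helper function are illustrative assumptions, not taken from the text of the FCC order.

    # Hypothetical two-tier consent check mirroring the FCC order's
    # sensitive / non-sensitive split; names are illustrative only.
    SENSITIVE = {
        "precise_geo_location", "financial", "health", "childrens_data",
        "social_security_number", "web_browsing_history",
        "app_usage_history", "communications_content",
    }

    def may_share(category, opted_in=False, opted_out=False):
        """Return True if the ISP may share data in this category."""
        if category in SENSITIVE:
            return opted_in            # sensitive data needs explicit opt-in
        return not opted_out           # non-sensitive shareable unless opt-out

    print(may_share("web_browsing_history"))   # False (no opt-in given)
    print(may_share("email_address"))          # True  (no opt-out recorded)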

ISPs must also adhere to transparency requirements that include giving customers clear notice of what information is being collected and how it will be shared. They must also comply with reasonable data security practices.

A step in the right direction, yes. But by no means is the FCC action the coda to the privacy/transparency issue.

“We can expect the industry to try to exploit every crack in these protections, and hope that the spirit of vigorous oversight and consumer protection that has animated this proceeding will continue,” says the ACLU's Stanley.

If not, privacy will likely take a hit. 
