Monday, November 26, 2012

Should you cover your tracks from government snooping?

Most of us store a lot of stuff in the cloud.  For example, most of us keep lots of old emails in the cloud, since storage is free, they're easily searchable, and it's always possible that those old emails could come in handy some day.  In fact, there are a lot of practical reasons to keep stuff like old emails forever.  Yet it's worth taking a moment to consider the risk that governments can access data that you choose to keep. 


Governments are in a unique category, since they can simply pass laws to give themselves the right to access data.  Some of these laws are wildly out of date and no longer fit for purpose, most notably the US Electronic Communications Privacy Act of 1986.  For some years now, there have been many calls for Congress to update these laws.  Perhaps the Petraeus scandal will give this movement new impetus, since the privacy debate usually advances only when abstract privacy concepts are given a human face and a story that people can empathize with.

As a normal user of email, I think it's fair to ask whether there's any reasonable risk that a government would be interested in accessing my emails.  After all, most of us are neither Directors of the CIA nor cybercriminals.  As a matter of civil liberties, it's important for everyone to have some sense of the balance between privacy and surveillance that the government has chosen.  As a user, I want to know which governments are accessing data, and how often.  I know that published metrics will be imperfect, but I want more transparency, so that I can make my own decisions, as a user and as a citizen.


Seen from a global perspective, it's important to realize that most governments around the world are accessing user data.  It's not just one or two governments.  I can't count the number of times privacy advocates in Europe have warned users that the US government could potentially access their data in the cloud, without mentioning the risks that their own governments could do the same thing.  In fact, to take the French example, the French government is trying to launch a "French cloud", explicitly to try to evade US government surveillance, even though this taxpayer-funded initiative is based on "bad assumptions about cloud computing and the Patriot Act", and even though France's own anti-terrorism law has been said to make the Patriot Act look "namby-pamby" by comparison, as reported on ZDNet.  I think it's fair to assume that most people would be far more uncomfortable with foreign governments, rather than their own governments, accessing their data.  That points to one of the hardest issues in the cloud, namely, that multiple governments can (and do) have the power to demand access to user data, if they follow appropriate legal procedures.


In light of all this, I believe it's an ethical imperative for companies that are entrusted with user data to publish statistics on governments' requests for access to that data.  Google started this trend of reporting on government requests for user data, and a number of other web companies have since followed, including Dropbox, LinkedIn, Sonic.net, and Twitter.  I strongly encourage you to take a look at those statistics, which may challenge some of your long-held intuitions about which governments are most active in trying to access user data.  But most companies are still not publishing any such statistics.


A lot of companies are failing their users now.  The Electronic Frontier Foundation ranked companies on the question "When the government comes knocking, who has your back?"  There are a lot of big names on that list doing very little to give their users transparency.


In the meantime, as users, we all have to decide if we want to keep thousands of old emails in our inboxes in the cloud.  It's free and convenient to keep them.  Statistics published by some companies seem to confirm that the risks of governments seeking access to our data are extremely remote for "normal people".  But the laws, like ECPA, that are meant to protect the privacy of our old emails are obsolete and full of holes.  The choice is yours:  keep or delete.  I'm a pragmatist, and I'm not paranoid, but personally, I've gotten in the habit of deleting almost all my daily emails, except for those that I'd want to keep for the future.  Like the rule at my tennis club:  sweep the clay after you play. 



Wednesday, November 14, 2012

Book Burning, updated for the Digital Age


We're so much more enlightened than prior Book Burning Generations, aren't we?  Book burning has a long and inglorious history.  History also teaches us that the book burners usually end up getting burned themselves.  

Think of Savonarola in 1497, in the famous Bonfire of the Vanities, burning books and objects that were deemed temptations to sin.  A year later, Savonarola was himself burned at the stake.

Think of the Nazis in 1933, burning "un-German" books.  Twelve years later, they left Germany burning, along with much of Europe.  

Book burning has been with us in every age.  Books were burned to protect the faith, or to protect the nation, or to protect the regime.  Now, in order to protect "privacy", Europe is creating a poorly-defined, poorly-conceived "Right to be Forgotten", on which I've blogged before.  Are we re-igniting the long tradition of book burning?   

In the digital age, we don't burn physical books.  Instead, we delete data.  

The Right to be Forgotten is more pernicious than book burning.  It attempts to give individuals the legal right to obliterate unpalatable elements of their personal data, published in third-party sources, whether social networking sites, newspapers, books, or online archives.  In the real world, these can be things like a report on a politician taking a bribe.  Or a doctor put on trial for medical malpractice.  Or a person filing for bankruptcy.  You can easily see how the person concerned could have an interest in obliterating any reference to these embarrassing facts, while other people might have a very legitimate interest in knowing about them.

Historically, book burning was usually a symbolic, political protest act.  No one burning books was under the illusion of destroying the text of a book being burned.  Only the physical copy of the text was being burned.  The text would survive elsewhere.  But the Right to be Forgotten is attempting to obliterate the text, the source, the facts themselves, and not merely some copy of those facts circulating in a physical book or newspaper or online site.  

Deleting data in the name of the "right to be forgotten" is only the tip of the privacy-ideology iceberg.  One of the core tenets of this ideology is that all personal data should be deleted as soon as it is "no longer necessary".  This ideology is based on the fear that any personal data could be misused to invade someone's privacy, and that the risk of an invasion of privacy should automatically outweigh any potential future benefits of retaining the data.  This is a deeply pessimistic ideology: it concludes that retaining data can give rise to both future risks and future benefits, but since we don't yet know what they are, we should default to deleting the data to prevent the risks, rather than retaining it to enable the benefits.

As Savonarola might say, in an outburst of data deletion demagoguery, let's burn all those "vanities", those databases of personal data, which are nothing but temptations to sin against someone's privacy.  But the opposite may prove true: these vanities may be databases of great value and beauty, and we may someday learn it would be a sin to obliterate them.  Botticelli is believed to have burned some of his paintings when he was caught up in Savonarola fever.  A few years later, Botticelli renounced Savonarola's worldview.

I can understand that databases should be protected, secured, and analyzed responsibly, yes...but obliterated, just because something could go wrong?  If we took that approach in the rest of our lives, what would be left?  How bizarre that this destructive, pessimistic philosophy of data deletion has become conventional wisdom, at least in Europe.  Well, for now.  In the long run, book burning has never been a winning strategy.  If you think our age is more enlightened than prior ages of book burners, why do you think burning books in the name of privacy is more legitimate than burning books in the name of race, religion, or regime?



Monday, November 5, 2012

The Marketplace of Privacy Compliance Programs

The data protection establishment, worldwide, has been inventing a lot of new privacy compliance programs.  All these different, well-intentioned initiatives are meant to serve the same purpose:  improve privacy protections.  All of them are, or likely will soon be, mandatory for most big companies.  I can hardly keep track of all the different initiatives, but here are the ones I have struggled to understand:

  • Accountability
  • Privacy by Design
  • Privacy Impact Assessments
  • Consent Decrees
  • Audits (internal and external)
  • Regulatory reviews
  • Data Processing Documentation
  • Database notifications/registrations
  • Binding Corporate Rules
  • Safe Harbor Compliance programs
Lots of my acquaintances in the privacy field have asked me what I think about all this:   Are these programs meant to run independently, even if they overlap and cover the same ground?  Does anyone have a clue how much all this will cost?   Where do you turn for help to implement these programs?  Can one solid privacy compliance program be implemented to meet all of these goals?  Clearly, all of us privacy professionals are struggling to understand this. 

I'm sure we all believe that privacy programs need a solid compliance-program foundation to be effective.  Most of us also probably believe that different actors should have the freedom to develop programs that fit their cultures.  Nimble Internet companies have very different cultures than government bureaucracies, so naturally, these different cultural worlds must have the freedom to design programs that work in their respective contexts.  Clearly, one size does not fit all.  Programs have to be customized for the size and sensitivity of the processing.  A government database of child-abuse records is more sensitive than a database of some web site's analytics logs, so it's wrong to try to run the same compliance programs for both.

On cost:  despite all the good intentions motivating these compliance initiatives, no one has even begun to figure out what all these compliance programs are going to cost.  Take Europe as an example:  I've read statements from politicians that future EU privacy laws will reduce businesses' compliance costs.  That is simply not credible.  On the one hand, under the new rules, businesses in Europe will save a little money once they no longer have to fill out national database notification forms across Europe.  In the scheme of things, that is peanuts.  On the other hand, imposing new compliance obligations (mandatory privacy impact assessments, mandatory data protection officers, mandatory security breach notifications, mandatory data processing documentation) will cost a lot.  The problem is that nobody knows how much.  I'm working on the educated guess that the current EU privacy compliance proposals will increase the privacy compliance costs on businesses in Europe ten-fold, starting around 2015.  Yes, ten-fold.  That excludes the costs of fines and sanctions for non-compliance, now proposed to run up to some percentage(s) of a company's worldwide turnover.  This massive increase in compliance costs is largely the result of the proposed EU sanctions for failing to adequately document compliance programs.  I'm still hopeful that more realistic compliance obligations will be created for Small and Medium sized Enterprises, but the big trend is clearly towards costly new compliance obligations in Europe.

I get the feeling that the many people debating privacy laws have no idea (and perhaps don't care) how much all this ends up costing.  I also haven't read any classic regulatory cost/benefit analysis on these new obligations.  As a lawyer trained at Harvard in the cost/benefit analysis of government regulations, I am surprised to see that there's been essentially zero academic or economic analysis to decide which privacy compliance rules are effective and which are pointless red tape.    

As I write this, I really don't know how all the compliance initiatives above are supposed to fit together.  I don't know which are superfluous.  All this has yet to be worked out.  While each of the programs above overlaps with the others in some ways, each is also slightly different.  We've got to figure out how to minimize duplication among these programs, or we're all going to waste our time and money re-inventing the wheel.

Privacy compliance initiatives today remind me of the early days of the railroad, when each railroad company laid track with its own gauge, so a train could travel only on its own company's lines.  Eventually, all this will get sorted out, just as track gauges were eventually standardized, but in the meantime, I fear we're all going to be running around in circles.  Like the early days of the railroad, we're still in the early, experimental, inefficient, non-standardized, frontier age of duplicative privacy compliance programs.

Friday, November 2, 2012

Greece: protecting freedom of expression

You may have read about the widely-reported case of the Greek journalist who published the list of 2,000 Greeks with Swiss bank accounts.  The journalist was put on trial for criminal breach of data protection rules.  Thankfully, the courts recognized that this journalist published the names in the public interest.  Indeed, the case confirmed the world's strong suspicions that the Greek political and financial elites were protecting themselves from investigations into tax evasion.  Rather than ask why the Greek tax authorities had failed to investigate this list of 2,000 names, after having been given the list two years ago by the IMF, the authorities put the journalist on trial.  This was a transparent attempt to use the criminal justice system, and "data protection", as a way to chill this (and other) journalists' attempts to expose tax evasion and political connivance.


Thankfully, the Greek court dismissed the charges of data protection crimes against the journalist.  
As a privacy lawyer, I note a few things.  Data protection laws in Europe explicitly foresee exemptions from normal privacy rules for journalistic purposes, where "necessary to reconcile the right to privacy with the rules governing freedom of expression" (Article 9 of the Data Protection Directive), and for reasons of "substantial public interest" (Article 8).  Surely this Greek example meets both tests, and the court was quick to reach that result.

Nonetheless, I'm very worried about the increasing criminalization of privacy laws, especially across Southern Europe.  Once privacy laws are inscribed into penal codes, they open the door to prosecutors and criminal judges pursuing such cases with the blunt machinery of criminal justice, backed up with threats of jail.  Many such cases, like this Greek example, are nuanced cases balancing fundamental human rights, like privacy and freedom of expression.  Nothing is more dangerous to freedom of expression than using vague notions of "privacy" to threaten journalists, or newspapers, or Internet platforms, or employees of Internet platforms, with jail time, when they are exercising their rights to freedom of expression or operating a platform for others to do so.  There are now hundreds of such cases around the world.


Luckily, the Greek justice system was quick, and resolved this case in days.  But many criminal justice systems are notoriously slow.  As reported in The Economist, to take the example of Italy:  "Italian justice has a reputation for moving very slowly."  My own Italian privacy criminal trial has been dragging on for years, and is expected to begin the appeals phase on December 4, almost five years after I was first "detained" by Italian police in Milan.  Five years is a long time to put someone through criminal justice hell, in a landmark case trying to make me vicariously liable for user-generated content uploaded to an Internet video platform.


Congratulations, Costas Vaxevanis: I respect your courage.  Powerful forces are trying to use criminal privacy statutes to restrict freedom of expression.  Thank you for standing up to them.