The White House recently unveiled the framework of the president’s first big data privacy plan, part of a series that seeks to shield individual liberty.
The administration is working with bipartisan sponsors on a bill to protect data collected from students through educational apps. Lawmakers say they have worked alongside privacy advocates and more than 100 companies, including Microsoft, Google and Amplify, to develop a pledge that will curtail the misuse of data gathered in the classroom.
At the heart of the matter is the growing realization that we live in an age where everyone has their own unique data trail — whether that’s from our smartphones, store cards or posts on social media. This information has become so valuable that it is now collated and analyzed before being sold to other companies. And the likes of Facebook, Google and Twitter have built billion-dollar valuations on this type of data.
But with Edward Snowden’s revelations still fresh in the collective memory, public concern over digital footprints has intensified in recent months — not least due to high-profile data hacks experienced by the likes of Target and Home Depot — and there is a growing unease about technology’s ability to both safeguard and bypass our civil liberties.
What’s certain is that we leave ourselves exposed to data catastrophes without the sort of measures proposed by Obama. Equally apparent is the urgent need for our lawmakers to play catch-up with the technological realities of today, because if the tech world moves in dog years, then politicians move at a snail’s pace.
While I reject the Orwellian prophecies of many privacy advocates, it’s unrealistic to expect the private sector to regulate itself. But we must not let fear stand in the way of progress or let Luddism win out. Any legislation that’s too heavy-handed in its wording could do more harm than good in the long run and cause collateral damage to innovative thinking — akin to using a frying pan to squash a fly.
Often when people talk about big data, they refer to what it will make possible in the future. A data-driven world is a technological promised land, where many diseases have been eradicated, one that’s run on clean energy and where disasters are met with speedy and effective responses. Where the standard dosage is a thing of the past, replaced by individually tailored health care. But the reality is that we are not going to be able to make this happen without some sort of negotiation on data use.
It’s a hugely complicated and delicate matter, one that concerns every individual on the planet, but it’s a debate worth having and getting right if we are to preserve both privacy and technological innovation.
Instead of scaremongering, we need government and private sector alike to start educating the public about how companies actually use data. From our posts on Facebook and Twitter through to the taxis we order on Uber and what we buy at Safeway, huge amounts of information are already in the public domain. We need to ensure that the public is aware of what is being disclosed and articulate why they stand to benefit by having personal information out in the open — and of course we need to include an opt-out clause as standard for those who remain unconvinced.
In this regard, we can take a lesson from my native UK, where the National Health Service was forced to suspend plans for a central stash of patient records that researchers said could transform British health care. A poor communications strategy proved to be Care.data’s undoing, with widespread public pressure successfully leading to the initiative being shelved.
Britons struggled to reconcile the trade-off between individual privacy and the collective benefits of medical research.
A much more effective approach would have been to listen to the public fears and explain the life-enhancing implications of data-driven health care. How many millions could be saved by new treatments? How many lives currently cut short could be extended? And how many terminal diseases could be eradicated? Again, of paramount concern should have been to reassure the public that they could opt out if they remained unconvinced.
But government must also carry out wider consultations with the private sector before passing any big data law. Of the hundreds of stakeholders engaged by the Obama administration’s team, not nearly enough were at the forefront of big data. Indeed, I was amazed that there was no room at the table for the likes of Hortonworks, Cloudera or my own company, WANdisco.
These are the organizations that have the broadest overview of what’s taking place in the big data space, companies that are helping everyone from banks and utility providers to government agencies and hospitals implement big data strategies.
Not only would engaging with big data vendors provide lawmakers with a richer understanding of how and where the technology is being used, it would also shed light on where technology already has the ability to deal with public concerns.
One of the most acute data fears is that we have little say over what remains confidential — that what was once private is fast becoming publicly known, including voting habits, health records, bank balances and sexual history. But data need not be indiscriminately copied from one location to another — the technology exists to ensure records remain sufficiently anonymous, with only certain information used for an alternative purpose.
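The kind of selective, anonymized copying described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s actual implementation: the record fields, the salt handling and the choice of which fields count as “research” data are all assumptions made for the example.

```python
import hashlib

# Hypothetical patient record; field names are illustrative only.
record = {
    "name": "Jane Doe",
    "patient_id": "943-476-5919",
    "postcode": "S1 2HE",
    "diagnosis": "type 2 diabetes",
    "year_of_birth": 1972,
}

# Only the fields the secondary (research) purpose actually needs.
RESEARCH_FIELDS = {"diagnosis", "year_of_birth"}


def pseudonymize(rec, salt):
    """Strip direct identifiers, keep only the research fields, and
    replace the patient ID with a salted one-way pseudonym so records
    can still be linked without revealing who they belong to."""
    pseudonym = hashlib.sha256(
        (salt + rec["patient_id"]).encode("utf-8")
    ).hexdigest()[:16]
    cleaned = {k: v for k, v in rec.items() if k in RESEARCH_FIELDS}
    cleaned["pseudonym"] = pseudonym
    return cleaned


safe = pseudonymize(record, salt="per-project-secret")
# The copy handed to researchers contains no name or postcode.
```

The design point is that the transformation happens before the data leaves its original location, so the downstream copy never holds the identifiers in the first place; in practice the salt would be kept per project so pseudonyms from different datasets cannot be cross-linked.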
Public fears are to be expected, with new tech often feared before becoming accepted. When the U.S. government launched the Human Genome Project in 1990, its initial $1 billion investment was met with public outcry. However, it created a $150 billion industry and one of the most important medical breakthroughs of our generation.
Big data promises to be the next great technological leap. And while we may need legislation to ensure it stays on the right path, we cannot let it clip innovation’s wings at the same time.
Image credit: CC by r2hox