Is privacy a thing of the past?

Do we marvel at our ever-more tailored, safer world, or sleepwalk into a trust crisis to rival the financial crash?

It’s almost soothing. The sound of people discussing privacy issues, a hum of familiar debate. Yes, nowadays we might think a bit harder before ticking the box that signs us up for spam. And yes, it used to be ticked for you, but now that’s not allowed (something to do with public opinion and legislation, an inconvenience in the battle with the third-quarter sales target).

But under the surface, tectonic plates are in motion. We’re in transit from a world where our memory was made of paper to one made of electrical charge, ones and zeros, digital. From one where there’s a capacity limit to one that’s trending towards the infinite. From one where search was manual to one where it’s effortless, instant and capable of finding a needle in an Albert Hall of hay.

It’s not just memory that’s increasing. The information available is multiplying too: every phone call, every Google search, every e-mail, every photo, every trip in a car (your own or your Uber), every border crossing, every stroll in front of someone’s CCTV, plus cameras in cars, on policemen, on cyclists.

This makes life easier in so many ways.

We can be orders of magnitude more productive in how we buy things, share things, move around and get work done. It also helps us be more effective: we can solve crimes with new clues, spot terrorist plots before they are carried out, and generally keep ourselves freer from harm. And it’s happening to us all, so Generation Z – the 15-21 year olds who have never known an internet-free world – might have some embarrassing pictures on Facebook when they’re 50, but so will everyone else. Maybe no one will mind? Are we just being so last century?

But perhaps the consequences of what’s happening haven’t had time to emerge. Who has your data? The car mechanic, the bank, the estate agent, government – the list goes on, but it doesn’t strike you as one full of people likely to look after you or me before looking after themselves. And some on the list don’t have much in the way of security either. Everywhere I drive and stop, when I do it, my speed, braking and steering – all of it could be popped onto a USB stick and posted on a website for sale to the highest bidder. Is that really OK? If phone hacking showed what could be done with voicemail, and Snowden showed what was being done with more of our communications, what could be done with a complete record of somebody’s life?

This is something worth exploring, on both sides of the argument, so we enlisted three views that should have it surrounded:

  • A leader from the business that the world uses to share their lives, Claire Valoti, Director of Agency Partnerships, Facebook UK
  • Muz Janoowalla, Head of Digital Policing at Accenture and so an expert with first-hand experience of how data can make us safe
  • And Carly Nyst, a pioneering human rights lawyer who has fought unlawful surveillance and promotes the right to privacy around the world

The potential for a spirited debate was high and we weren’t disappointed.

Privacy and personal identity together form the biggest single issue on the internet. Biggest as in life, death and the security of nations, at least since Edward Snowden’s revelations. Biggest also as in one trillion euros, the Boston Consulting Group’s estimate of the value that could be created by digital identity in Europe by 2020.

That’s equivalent to 8 per cent of the combined EU GDP. Personal data has been described as the new oil; but nearer the mark may be a new currency, a liquidity equivalent for the digital marketplace.

This was suggested by Facebook UK’s Director of Agency Partnerships, Claire Valoti, in the course of the Foundation’s November Forum, assembled to debate the topic ‘Is privacy a thing of the past?’

That question was not answered directly, although others have been forthright: ‘Get over it’ (Sun Microsystems’ Scott McNealy); ‘We know more about you than you would care for us to know’ (the CIO of a large US credit-rating agency); and Gmail users should have ‘no reasonable expectation’ that their communication will be private (Google).

There has always been a shifting border between what’s private and what isn’t, and the question of where the line should be drawn ran right through the forum debate. It is a live issue, now central to arguments over the government’s new Investigatory Powers Bill, for example.

Thus Valoti spoke of the ‘value exchange’ that leads 29m UK Facebook users to visit the site 14 times a day, with at least some awareness of the information they are supplying in return. ‘There’s a lot of good that comes from living in a very open and connected world’, she insisted. Two-thirds of the value identified by BCG would accrue to individuals, mostly in the form of ‘free’ internet services, and Facebook research shows that going online helps move 10 per cent of the newly connected out of poverty, according to Valoti.

Another undisputed public ‘good’ of sharing personal data is a greater ability to combat nuisance, crime or terrorism – something that has shifted into high-res focus since the terrible events in Paris. Muz Janoowalla, Head of Digital Policing at Accenture, highlighted the legal difficulties the police face in using data diffused on social media to track gang activity, for example, and described how police access to location data on smartphones or CCTV images could prevent crime from happening and enlist public help as the eyes and ears of the law when it did.

Yet data is a two-edged sword. Like any currency, it can be deployed for good or ill, and the potential for harm is as great as the potential for good. Those who have lived under the Stasi or the like have good reason to fear official data-gathering abilities – which is why the right to privacy was enshrined in international law in the 1948 Universal Declaration of Human Rights, pointed out human rights consultant Carly Nyst, who advises Amnesty International and the UN on privacy issues. ‘There was no argument about that, because everyone knew that the right to privacy was a fundamental barrier between people and those who have power.’

Understanding of privacy and its limits has evolved rapidly over the last few years as the implications of the ‘value exchange’ trade-off have begun to sink in. On the corporate side, Valoti described ‘building a business that really understands that privacy and choice is important to people. What we do is make sure that people have that choice of how much they share, and, to be honest, that’s why we have a successful business’. On security, Nyst declared herself ‘all in favour of using technological advances to promote safety, and I believe that surveillance is an entirely legitimate act of the state. However, it’s clear that the technological advances have far outpaced regulation in this field.’ In its absence individuals have struggled to come to terms with both companies’ head start and massive state exploitation of the loopholes, as illuminated by the Snowden leaks.

More sophistication will be needed in future as the volume of digital traces, capability of search and information sources all multiply, bringing new dilemmas with them. Thus the more the ‘quantified self’ movement grows, the easier it will become to devise (for instance) cheaper insurance policies for those with superior health and more active lifestyles – but that undermines the solidarity principle underpinning the whole idea of insurance.
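To make the arithmetic behind that concern concrete, here is a minimal sketch in Python with entirely invented numbers (the premiums, group sizes and the split into ‘low risk’ and ‘everyone else’ are all hypothetical): once the demonstrably low-risk members are priced on their own data, the break-even premium for those left in the pool has to rise.

    # A hypothetical sketch of the risk-pooling arithmetic; all figures are invented.
    def average_premium(expected_claims):
        """Break-even premium when risks are pooled: everyone pays the mean expected claim."""
        return sum(expected_claims) / len(expected_claims)

    # Ten policyholders and their expected annual claim cost, in pounds.
    # The first five are 'low risk' - a Fitbit or telematics box says so.
    expected_claims = [100, 100, 100, 100, 100, 400, 400, 400, 400, 400]

    print(average_premium(expected_claims))       # 250.0 - one pool, everyone pays the same
    print(average_premium(expected_claims[:5]))   # 100.0 - low-risk group priced on its own data
    print(average_premium(expected_claims[5:]))   # 400.0 - everyone left behind pays more

The market prices each person more ‘accurately’, but the cross-subsidy that made insurance insurance has gone.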

Foundation Partner Charlie Dawson noted that the on-board electronics on today’s cars record data on location, speed, driving habits, number of passengers and other parameters, all downloadable by an enterprising, computer-literate car mechanic with an eye for its possible value. Where might that lead? (For one answer, read Dave Eggers’ uncomfortable satire ‘The Circle’, featuring a not-too-hard-to-recognise Silicon Valley internet company.)

Already unintended consequences are emerging. As Nyst pointed out, responding to customer concerns, internet companies are increasingly resorting to end-to-end encryption – meaning that even the carrier can’t see the message content – and more generally privacy-enhancing technology is a hot category in Silicon Valley venture capital circles. This clearly benefits more than just ordinary individuals who would prefer to keep their browsing history private. But it is also recognition, Nyst explained, ‘that the internet is a very unsafe and unstable means to transmit data... When she introduced the [surveillance] bill, the Home Secretary said that 90 per cent of major organisations in this country last year experienced a cyber-attack’. While privacy remains important, online safety in all its forms is rising rapidly up the agenda, sharpening the dilemma.

An encryption arms race between legitimate users and legitimate state agencies would be absurd – but it is also a very real possibility. How do we get to a position where, as Janoowalla put it, people – knowing the information is out there in any case – would freely make it available to those who would use it for good? In the end it comes down to what all participants agreed was the ability to make an ‘informed choice’.

Informed choice is a proxy for trust – trust that people know what data is being collected, what it is used for and that it is secure: that in the end it is theirs. ‘I don’t think it should have to be a trade-off for consumers,’ said Nyst. ‘You should be able to make the most of those benefits but also know that your data’s safe and it’s only going to be used for the purposes you handed it over for.’ At present, protested one audience member, ‘we have no way of weighing the benefits against the costs – it feels like signing a blank cheque.’

The irony is that this is surely a phoney war. Companies stand to gain from treating data as a positive-sum rather than zero-sum game, just as individuals do from engaging with their own online responsibilities, as Janoowalla suggests. The payback for open data freely given would be massive gains in accuracy, completeness and therefore efficiency that would benefit individuals, private companies, public services and the law alike. On the other hand, without trust of this kind, the value of the currency is severely debased: according to Pew Internet, nearly 90 per cent of web users hide or lie online to avoid being observed, with hackers and criminals viewed as the biggest threat – but advertisers a close second. That’s clearly bad news for a business model based on advertising, as is the reported 50 per cent growth in downloads of ad-blocking apps over the past year.

As the internet of things evolves, whole cities could become giant surveillance devices, certainly favouring public order, but at what cost to individual liberty? ‘Why is it we’re comfortable sharing information with private-sector companies which want to use it to make money from us by selling us products or services, while we’re very uncomfortable sharing that very same information with government agencies that want to use it to keep us safe?’ asked Janoowalla.

Much of today’s volatile picture can be explained by the interplay of different agendas at different levels, conducted against a political and technological background that is changing at warp speed. It’s not just you – everyone is playing catch-up. So now is not the time to go away. Everything is still to play for. 

But in any case, are better targeted ads the best the internet can offer as a dividend on the trove of personal data? Whether for consumers, citizens, states or corporate executives, ambitions should surely be much higher. Of course, as the conversation underlined, this is a new reality where the rules are being made up as we go along. By any past standards, Web 2.0 and the networks it has enabled are still in their infancy – hard though it is to believe, Facebook and Twitter are less than 10 years old.

The Foundation’s View
This discussion was fascinating, swinging opinion one way then the other as the strong arguments made by each side took root. So much good in sharing our information, so much potential for it to go wrong.

Here are three of the big points that felt important to us:

  1. We willingly share our information because we get something valuable in return. We often don’t think too much about it, but even so in the vast majority of cases the organisation we share it with has good intentions. They give us something we value, and they realise value themselves, but they also understand that in the longer term they need to act in ways that retain their customers’ trust. What we often don’t think through is the potential for the data, or for an approach to gathering it, to fall into the hands of someone less benign. What the police do in London might be quite different to the police in Cairo, and in Cairo the police a few years ago might be quite different to the police now. So sharing innocently with a mobile phone company one day can be a matter of deep regret on another. The integrity of a business’s management and owners one day is no guarantee of their integrity many years hence, but your data lasts forever.
  2. We can often see how using newly available data makes a market work better, with insurance being a great and increasingly familiar example. Fitter people are a lower health risk, with evidence from a Fitbit to show you’re active. More careful drivers are a lower risk, with evidence from telematics to show you’re still taking it easy on the roads. But insurance works by averaging risks together and giving us a deal that protects us all. If we take out all the low-risk people, then the people left will have to pay more. Fair enough, you cry. But what about when the situation can’t be avoided – driving at night because you work on a night shift, more likely to get ill because your DNA turned out that way? The market works better, but does the society that it’s part of?
  3. We are getting a bit more familiar with these kinds of issues. We have what was described as a greater maturity around data, privacy and trust, and ‘we’ means as customers and also as organisations. As we see things going wrong we learn, and we become a bit more streetwise about sharing and more skilful at being shared with. The point of agreement that emerged from the debate was the need for informed consent, from customers and actually from the organisations asking for it too. The challenge is how to achieve true, widespread understanding of what lies behind the click of ‘yes’ or ‘no’. Dozens of pages of terms and conditions definitely don’t work. So what does, or could? The organisations that get the hang of this challenge will be well set up for customer-led growth and success that lasts.
