Tag Archives: data privacy

Analysis: The pros and cons of privacy and data protection laws

The starting point for most privacy and data protection laws is creating a safer environment for all of us and our personal data – but the inevitable overreach often has far-reaching consequences

Most privacy and data protection laws have the noble aims of making us and our personal information safer – but overreach in the detail is a common side effect of attempts to do the right thing.

The consequences of this legal overreach can themselves be far-reaching – not just for personal privacy, but for technological innovation as a whole, if creators and those with grand ideas feel stifled by the competing demands of overlapping legislation.

The worst case scenario? Potential stagnation for technological innovation.

The broad scope of privacy and data protection laws is generally to ensure the free flow of personal data between EU member states, while their ultimate purpose is to regulate how such data should be processed in order to maintain a balance between the various interests of the personal data ecosystem.

Of course, constant fluctuations in both technological and socio-economic contexts make achieving these grand aims a challenge. Regulation inevitably lags behind new technological and market developments, however hard it tries to keep up.

As Maria Macocinschi, who is studying for a doctorate in law at the University of Turku in Finland, notes: “The rigidity in revising and adapting the laws to the fast technological and economic developments is creating frustrations not only for consumers but also for companies.”

She also cites the much-praised General Data Protection Regulation (GDPR), which comes into force in May next year, as a well-intentioned law that may have adverse side effects.

She said: “GDPR, for example displays two contradictory trends. While it ensures a simplification of the regulatory environment and harmonisation of the standards, it also poses additional burdens and costs for companies. Therefore, the free flow of information might be quite affected by these overwhelming obligations.”

Regulation is inevitably deeply complicated, balancing as it must the conflicting interests of the various parties involved (public and private institutions, and consumers) as well as translating more traditional human values in a constantly changing digital environment.

Laws around surveillance are a good example of clashing interests and values: while surveillance such as CCTV is employed primarily to protect citizens and keep them secure, the same technologies are now being used in ways that undermine the very values they were meant to protect.

Countries such as China, for example, are trialling technology intended to predict crimes before they happen – the very stuff of sci-fi films.

The potential for horrifying consequences for those caught up in it makes it increasingly important that surveillance, and the emerging dataveillance phenomenon, should be carefully regulated to ensure a balance between the public interest, the economic rights of companies and the individuals’ privacy and data protection.

In terms of increasing the efficiency and effectiveness of current data protection laws, Maria says there are three broad areas that should be considered:

  • We need to look at how traditional legal concepts should be revised, taking into account the current state of information innovations
  • We need to look at how we regulate the emerging actors in this burgeoning ecosystem, as well as the new methods of collecting and processing data.
  • We also need to reframe the importance of the legal requirements for consent in the intensified and opaque dataveillance systems.

So how do we balance the values and rights necessary for the democratic functioning of society with preserving personal privacy? This, of course, raises the question of how much privacy is desirable, legally and otherwise.

As with so many other things, regulation initially and superficially seems to be the natural answer here – providing guidelines for the protection of individual interest and public good. However, the law by itself cannot achieve this goal.

Furthermore, the extent to which we all, as consumers, promote and open up our own private lives through social media poses its own problems. The internet is a growing force in all our increasingly transparent lives. With the big data crunching capabilities of all the information we have willingly or unknowingly put out there, the ability for public and private actors to know far more about us than we are comfortable with has never been more real. Our identities, behaviours, transactions and other preferences and vulnerabilities are all gathered and exploited for various obscure purposes.

Again, legislation such as the GDPR is trying to address this, by putting more power over personal information back in the hands of consumers – but here too, law-making inevitably runs behind real life, meaning we are always struggling to keep up.

A new right to data portability (Art. 20 GDPR) and a revised right to be forgotten (Art. 17 GDPR) are aiming to build a stronger protection for the data subject and redress consumer sovereignty. However, such powers for individuals are not absolute. The interest in the protection of information privacy will always be balanced against other public interests as necessary in a democratic society (Recital 73 GDPR).  

So how should we try and find this balance moving forwards?  Maria has three key suggestions.

She said: “Balancing conflicting interests is difficult but not impossible. A first step would be educating individuals about what informational privacy is and the real benefits and consequences of sharing personal information. In a democratic society, a person should not isolate herself from the rest of the community, but rather participate and contribute to the decision making.

“Therefore, data protection regulations should not be perceived as tools facilitating the invisibility of individuals to the rest of the world. Rather, they provide the necessary measures to ensure their safe participation in society. Disclosing personal information is a requisite for identification in a digital environment of disappearing bodies, and for effectively communicating consumer preferences to companies.

“Secondly, each participant in the personal information ecosystem should acknowledge the importance of privacy intermediaries. For controlling and managing their personal data, individuals need the technical architectures (such as digi.me) and supportive guidelines (privacy guardians).

“Technological development should not be perceived by consumers and legislators as a big threat to privacy and personal data. While technology might pose some risks, it can also provide useful solutions for the protection of individuals and their fundamental rights. Therefore privacy and sharing are not foes, but complementary to each other.”

This blog is a joint venture between digi.me and Maria Macocinschi

MEF Global Consumer Trust study 2017: all hail the rise of the savvy user

Driving interoperability adoption with the Kantara Initiative

Here at digi.me, we have three driving principles that inform and influence every step we take.

Two of them, you won’t be surprised to hear, are privacy and security – but the third is slightly less obvious. What is it? It’s interoperability, and it’s absolutely vital in the field of personal data ownership and control.

The ability for open data exchange, and for data from various platforms and businesses to be brought together in a reusable and useful format demands interoperability, which in itself requires common standards and ontologies.

If we are to bring (as we must to regain full control over our personal data) massively disparate sources of data together, and then require them to function together as a whole, at least for processing purposes, interoperability is the only way forward.

And so the work on advancing that becomes hugely important – which was a key reason behind digi.me joining the Kantara Initiative, as they are doing a great deal of pioneering work in this area.

Julian Ranger, Founder and Executive Chairman of digi.me, said: “It is important that we are leading the work to promote cross-business and cross-platform interoperability to allow individuals to maximise the use of their personal data whilst having full control.

“To this end, we have joined the Board of Kantara and are active within their Working Groups promoting development and adoption of standards for the Personal Data Ecosystem.”

Find out more about Kantara in this short leaflet summarising their most notable activities.

What Hillary Clinton as US president would mean for the future of tech

As the eyes of the world look to the US ahead of the presidential elections in November, it’s clearer than ever that where Barack Obama’s successor leads, other countries will follow.

Which, regardless of which way you lean politically, makes Hillary Clinton publishing her tech agenda very interesting.

Technology is at the forefront of many of the big questions facing our world today, and critically important to how people in that world are able to live, work, be their best selves and contribute to the world of tomorrow.

Of course we’re not naive – her agenda is designed to tick all the boxes for all the people, and most politicians worldwide over promise and under deliver – but it’s still a glance into what a Clinton presidency would mean for some key tech issues.

As anyone familiar with us will know, our driving ambition is all about putting data back into the hands of the people who have most at stake in it – ie you – and giving them more control over what they can do with that.

As part of that, what countries say at a top tier level about what they want data to be and do is, in many ways, critical to our business and how we expand across the globe. So Hillary saying that she wants to “harness the power of technology and innovation” as well as fight for privacy and net neutrality speaks to what we believe in, and is very encouraging to hear.

Our Internet of Me vision sees each of us at the centre of our connected lives, gaining greater insight by gathering information formerly scattered all over the web in one place, and then allowing consumers to exchange it for rewards, while businesses allowed access to it can innovate using 100 per cent accurate and rich data sets.

The open flow of data worldwide is a critical part of this, so commitments to “fighting for Internet Freedom and insisting on the responsibility of all nations to respect free speech and human rights online, as well as the open flow of data across borders and access to digital markets” also get the thumbs up.

Again with the caveats that having an agenda, even if elected, is of course no guarantee of action, plans to open up US data sets on health, education and criminal justice and strong protection of consumer values all sound broadly good too.

It’s got to be put in the mix with other policies as well, of course, and America will decide its next President on more than what they will do in the technology sphere.

But it would be interesting to see Donald Trump set out a similar agenda, so we could see where he stands on these critical issues of our times.

You, yes YOU, should be the most important thing in the Internet of Things

The second part of the extensive interview with our founder and chairman Julian Ranger has been published, focusing on why the IoT needs an Internet of Me to make it real and fix obvious and glaring privacy and security issues.

It’s a cracking read, and covers a lot of ground including the war on privacy, why data collection needs to move from behind us to in front of us, and why the way apps and devices handle data needs to undergo a fundamental shift.

I can’t recommend it highly enough if you have any interest at all in the field of personal data privacy and the personal data economy (which we all, of course, should have) – so here are a couple of choice quotes to whet your appetite:

“The Internet of Things at the moment is being built on data that is collected behind us. It needs to move in front of us. We need a path towards that, from the dark side to the light side. The IoT can’t work behind us, which is the way it’s been built today. That’s just loading more rubbish on a rubbish framework. There needs to be a new framework.”

“A large part of the cost of IoT is that all this data is being racked up into offline storage that companies are having to do. It is uneconomic to keep supporting old stuff. But what would happen if the data came to me first — not necessarily just into digi.me but wherever I choose to keep it — and I then decided where it went? If a business no longer wants to support it I can still keep this piece of kit and it keeps talking to my system for ever if I want, but more importantly I get to control where the data goes.”
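The “data comes to me first” model in that quote can be sketched in miniature: device readings land in a store the individual controls, and only destinations they have explicitly permitted ever see the data. Everything below is illustrative – the class, names and flow are assumptions for the sake of the sketch, not digi.me’s actual architecture.

```python
class PersonalDataStore:
    """A toy personal data store: devices talk to me first, not a vendor's cloud."""

    def __init__(self):
        self.readings = []        # held wherever the individual chooses
        self.permissions = set()  # destinations the individual has approved

    def ingest(self, reading: dict):
        """A device reports to my store; no third party is involved yet."""
        self.readings.append(reading)

    def grant(self, destination: str):
        """The individual decides who may receive their data."""
        self.permissions.add(destination)

    def share_with(self, destination: str) -> list:
        """Data leaves the store only where the individual has said it may."""
        if destination not in self.permissions:
            return []
        return list(self.readings)

store = PersonalDataStore()
store.ingest({"device": "thermostat", "temp_c": 21})
store.grant("energy-supplier")
```

Even if the device vendor stops supporting the kit, the store keeps receiving and holding the readings, and the individual keeps deciding where they go.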

Tempted? Excellent – it’s well worth your time.

And if you missed the first interview, which took place over on the Internet of Me forum which we sponsor and support, you can find that here.

The whys and why-nots of using wearables at work

Wearables have been in the news a lot lately – notably where the line should be drawn between employers keeping an eye on the health and wellbeing of those they employ, and where that crosses into breach of privacy.

Company fitness trackers, incentives and even competitions are all in common use to promote a leaner, stronger workforce – and inevitably this produces data about our habits outside, as well as inside, work.

As this article points out, it is in no-one’s interests to have overworked, stressed and anxious employees – particularly if they don’t spot this in themselves. So, carried out in a climate of trust and accountability, tracking health, for example, can be helpful for both parties.

But however well-intentioned, does that give companies the right to track health data generally in a bid to optimise performance, to see how employees react to stressful situations before choosing who makes important presentations, for example, or to detect some health conditions before they become serious?

The advent of the web and messaging platforms has already led to an ‘always on’ culture where many find it trickier to maintain a good work/life balance. In the same vein, should employers really be able to track their employees 24/7, including out of office hours, even if it’s supposedly for their own good? Most would say not.

There are concerns over privacy: employees who are pregnant, for example, may not want to share this before the traditional 12-week scan confirms all is well, but wearables may pick up on changes well before then. And there are concerns over data accuracy – nobody wants to be judged on data that is ultimately wrong, yet this could be hard to prove or overturn.

As the WSJ article points out: “There are serious questions about the accuracy of many of the apps that power these devices. Making decisions about people based on their private information is bad enough. Worse would be making decisions based on private health data that was wrong.”

Dr Deborah Peel, of Patient Privacy Rights, writing ahead of their annual summit on June 7th which digi.me is sponsoring, is clear that consent is crucial.

She said: “As technology moves faster and faster, we need to stop and ask questions.

“Before you give your data away, take a look at the device or app’s privacy policy. Do you understand how information you volunteer will be used? Is it clear? Most importantly, how can we minimize the potential misuse of our data, and maximize the benefits of wearable technology so that they serve the individual before they serve your employer’s bottom line?”

A requirement to use, and share information from, data trackers is also likely to undermine employee morale, if employees feel they are not trusted to be truthful about their fitness for work.

Data-sharing agreements can be put in place, but the employee may still feel under pressure to reveal more than they are comfortable with to justify their continued employment, so there would likely be a need for even more policies around coercion.

All in all, while they can clearly have benefits for both parties, employers need to tread carefully when considering a wearables policy – and get consent at every step of the way.

EU GDPR: full details of what it means for personal data and your business

Data is the currency of today’s digital economy – and the new GDPR will not only protect this valuable resource for both individuals and companies when it becomes law in 2018 but increase innovation and cut costs as well.

According to estimates, the value of European citizens’ personal data has the potential to grow to nearly €1 trillion annually by 2020 – and business opportunities will only be increased by strengthening and unifying Europe’s already high standard of data protection.

Jan Philipp Albrecht (Greens, DE), who steered the GDPR legislation through Parliament, said: “The regulation will also create clarity for businesses by establishing a single law across the EU. The new law creates confidence, legal certainty and fairer competition.” But what are the key things businesses need to know?

  • One law for the whole continent – one of the biggest attractions is that Europe will now be covered by one law, applied in the same way everywhere, instead of a patchwork of national ones. Eliminating the need to consult local lawyers in each country where a business has dealings or premises will deliver direct cost savings as well as legal certainty. Savings from dealing with one pan-European law rather than 28 are estimated at €2.3bn per year.
  • Regulatory one-stop shop – businesses will only have to deal with one regulatory body rather than 28, making it simpler and cheaper for companies to do business in the EU. They will also profit from faster decisions, one single contact point and less red tape as well as consistency of decisions where the same processing activity takes place in several member states.
  • The same rules for all companies – all companies, whether or not they are based in the EU, will have to adhere to the same rules when doing business with its citizens, creating a level playing field that does not exist at the moment, where European companies are governed by stricter standards.
  • Technological neutrality – innovation will continue to thrive under the new rules.

There are also new rights aimed primarily at giving individuals more control over their personal data that will additionally benefit business. For example, the new right to data portability, which allows individuals to move their personal data between service providers without losing, for example, contacts and emails, will remove the disincentives to switching that often mean building up again from scratch, meaning start-ups and small companies can compete on equal terms in markets previously dominated by industry giants. This will make the European economy more competitive. New privacy-friendly solutions are also likely to fare well in this climate.

SMEs will also benefit from a data protection reform aimed at stimulating economic growth and allowing them to access new markets by cutting costs and red tape for European business. As well as the measures outlined above, such as one law instead of 28, the obligations on data controllers and processors are adjusted based on the size of the business and/or the nature of the data being processed, so as to avoid creating unnecessary red tape and a disproportionate regulatory burden for smaller firms. So, for example:

  • SMEs need not appoint a data protection officer, unlike larger companies, unless their core activities require regular, systematic and large-scale monitoring of data subjects, or they process sensitive categories of personal data such as that revealing racial or ethnic origin or religious beliefs.
  • They also do not need to keep records of any processing activities that are occasional or are unlikely to result in a risk to the rights of the data subject
  • They will also not be obliged to report all data breaches to individuals, unless these represent a “high risk for their rights and freedoms.”

An essential principle of the new system will be that data is protected both by design and by default, which will incentivise businesses to innovate and “develop new ideas, methods, and technologies for security and protection of personal data.”

The new rules promote techniques such as anonymisation (removing personally identifiable information where it is not needed), pseudonymisation (replacing personally identifiable material with artificial identifiers), and encryption (encoding messages so only those authorised can read it) to protect personal data.
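As a rough illustration of the difference between two of these techniques, here is a minimal Python sketch of pseudonymisation (replacing an identifier with a stable artificial one via a keyed hash, so only the key holder can re-link it) and anonymisation (dropping identifying fields where they are not needed). The field names and key handling are purely illustrative, and real-world anonymisation must also guard against re-identification from the remaining fields.

```python
import hashlib
import hmac

# Hypothetical secret held by the data controller; pseudonyms can only be
# re-linked to real identities by whoever holds this key.
SECRET_KEY = b"controller-held-secret"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable artificial one (keyed hash)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def anonymise(record: dict, identifying_fields: set) -> dict:
    """Drop personally identifiable fields entirely where they are not needed."""
    return {k: v for k, v in record.items() if k not in identifying_fields}

record = {"email": "jane@example.com", "age_band": "30-39", "postcode_area": "SW1"}

# Pseudonymised: same record, but the email is now an artificial identifier.
pseudonymised = {**record, "email": pseudonymise(record["email"])}

# Anonymised: the identifying field is removed altogether.
anonymised = anonymise(record, {"email"})
```

The pseudonym is deterministic, so the same person can be tracked consistently across records without their real identifier ever being stored alongside the data.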

The use of “big data” analytics – by driverless cars, for example – which can be done using anonymised or pseudonymised data, will be actively encouraged under the new regulation, showing it goes hand in hand with innovative and progressive solutions.

Overall, the new data protection rules give businesses the opportunity to address the lack of trust that can limit people’s engagement with innovative uses of personal data.

Giving individuals clear, effective information about what their data is being used for will help build trust in analytics and innovation for the benefit of all.

Why privacy is a right… not a luxury

Privacy should be a right we can all take for granted – but the problem is it is being taken away from us without our consent.

Will new trading models that put data back in the hands of the individual, who can then share or exchange it at will, make privacy the preserve only of those with sufficient income to make a choice, rather than those whose circumstances compel them to take advantage of what’s on offer, whether or not that’s what they would choose?

That’s the premise in this otherwise sensible article on the growth of the sharing economy, which ponders: “But paying consumers to give up their privacy may not be particularly freeing for lower-income tech users. The practice essentially puts a premium on privacy: If you want to keep your data, and stay anonymous, you have to give up cash and deals. If this model plays out, a private smart home will be more expensive than one that reports back on its users.”

The article, which also mentions an AT&T deal in the US, where not having your search and browsing history recorded costs more each month, makes some good points, and that is one interpretation of the facts.

But the biggest fact being overlooked here is that nothing is free in this world – users of so-called ‘free’ newspaper or gaming sites just don’t realise they are paying for them with their data.

Ultimately, privacy is a right that we should all automatically have and which nobody should have to pay for.

Yet while we make choices all the time in every area of our lives, at the moment we have no choice about what happens to our data – information about us that is taken multiple times a day without permission and then used as a crude (and often irritating) targeting tool.

Sometimes we do actively sign this right of ours away, but through ‘I agree’ buttons and long-winded privacy policies designed to confuse and bore us and which companies know the vast majority of us will just quickly scan, at most, to get to the service they offer.

Putting the individual back in control, at the centre of their connected world in what we are calling the Internet of Me, will automatically enable a more private world.

Getting a discount or deal simply for sharing that data, on your terms, with who you choose, will be a benefit that will appeal to all, and simply another life choice to make.

So, actually, if privacy could be seen as a luxury, it’s a luxury around actually having a choice, as opposed to being forced to give up data as we all are at the moment, whether we know it or not. Or even like it or not.

But the bottom line is that privacy is a right for all – and here at digi.me we’re doing everything we can to enable a world in which that’s known and accepted universally.

Ad-blocking hits the mainstream in the UK

Nearly 15 million people in the UK will be using ad-blocking technology by 2017 according to the first ever estimate from eMarketer.

By the end of next year, it expects 27% of internet users, or 14.7 million people, will be choosing to block digital ads on at least one of their devices, largely in response to ever more intrusive tracking ads that take personal data without permission and create slow, heavy pages that cannibalise bandwidth and add to page load times.

The report estimates that of the 10.9 million people who currently block ads, the vast majority (90.2%) do so on a desktop or laptop PC, with about 28% blocking ads on smartphones, though there is overlap as some block on multiple devices. Mobile ad blocking lags behind because the tech is still catching up and ad blocking doesn’t fully work within apps, where most mobile users spend their time.

eMarketer senior analyst Bill Fisher said: “There’s no doubting that ad blocking is now a very real issue for advertisers. Next year, over a quarter of the people they’re trying to reach will be wilfully making themselves unreachable.

“The good news is that numbers like this have forced those within the industry to think long and hard about what it is that they need to do better in order that this practice doesn’t become an epidemic.”

The “greatest consumer boycott in history”, as it has been dubbed, already sees more than 250 million users of ad-blocking tech worldwide, with numbers increasing fast.

And while a failure by advertisers to understand what their readers do – or don’t – want has contributed heavily, the effects are far-reaching.

Industries and models that rely on advertising to fund their content and make it available for free – including video games and newspapers – find their very funding model under threat as part of this anger aimed at advertising that tracks us and sells on our data.

So what’s the way forward? Ad blocking has always been with us, in the sense that if we didn’t want to read an ad in a paper we simply skipped over it and on to the next news item we were interested in. Ad-blocking isn’t designed to punish the publishers, but the ads that don’t respect our privacy – so how can we apply this analogue blocking to the digital age, and pack a punch where it needs to go without hurting those who are largely innocent parties?

Handily, work is underway at the Internet Identity Workshop in Mountain View on just that, at sessions run by internet legend and our advisor, Doc Searls, and attended by our founder and chairman Julian Ranger and our EVP North America Jim Pasquale.

As Doc said ahead of the session: “What we need is a solution that scales for readers and is friendly to publishers and the kind of advertising readers can welcome—or at least tolerate, in appreciation of how ads sponsor the content they want. This is what we have always had with newspapers, magazines, radio and TV in the offline world, none of which ever tracked anybody anywhere.

“So now we offer a solution. It’s a simple preference, which readers can express in code, that says this: Just show me ads that aren’t based on tracking me. Equally simple code can sit on the publishers’ side. Digital handshakes can also happen between the two…”

The work at IIW will be to reach agreement on that term, its wording, and the code that expresses and agrees to it.
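As a loose illustration of the kind of machine-readable preference and publisher-side handshake Doc describes above, here is a hypothetical Python sketch. The header name, the ad fields and the matching logic are all invented for illustration – the actual term and code are exactly what the IIW sessions set out to agree.

```python
# Hypothetical header a reader's client could attach to each request,
# saying: "just show me ads that aren't based on tracking me".
READER_PREFERENCE_HEADER = "No-Tracking-Ads"  # assumed name, not an agreed standard

def reader_request_headers() -> dict:
    """Headers a reader's browser or extension might send to express the preference."""
    return {READER_PREFERENCE_HEADER: "1"}

def publisher_select_ads(request_headers: dict, ads: list) -> list:
    """Publisher-side code honouring the preference by filtering its ad inventory."""
    if request_headers.get(READER_PREFERENCE_HEADER) == "1":
        return [ad for ad in ads if not ad["uses_tracking"]]
    return ads

ads = [
    {"name": "contextual banner", "uses_tracking": False},
    {"name": "behavioural retargeting", "uses_tracking": True},
]

# A reader who has expressed the preference only ever sees non-tracking ads.
served = publisher_select_ads(reader_request_headers(), ads)
```

The point of the sketch is that both sides stay in business: the publisher still serves ads and earns revenue, but only ads that do not depend on tracking the reader.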

As Doc said: “…this one term is a first step. There will be many more before we customers get the full respect we deserve from ad-funded businesses online. Each step needs to prove to one business category or another that customers aren’t just followers. Sometimes they need to take the lead.

“This is one of those times.  So let’s make it happen.”

Privacy attitudes harden as consumers wake up

All the signs are that mass consumer awareness of the issues surrounding data privacy is finally reaching a tipping point that will force an economic consensus for action.

Already this year we have seen the re-writing of the Safe Harbour agreement, agreement for the GDPR in Europe (also affecting those who trade with us), the stand-off between Apple and the FBI over encryption and the proposed Investigatory Powers Bill in the UK.

Governments and businesses know more about us than ever – and consumers are fighting back. Over 200 million people now employ ad-blocking software, fed up with intrusive trackers that steal their data without their consent and affect their page loading speeds as well as taking up excessive bandwidth.

So it’s of little surprise that OpenXchange’s Consumer Openness Index 2016 shows a hardening of attitudes in the past 12 months.

The headline statistic is that people care about privacy more than ever before, with 80% of the 3,000 questioned believing everyone has a fundamental right to privacy.

The Internet-savvy populations questioned in the US, UK and Germany also said they are more likely to stop using many types of companies if news of a privacy scandal emerged, while those who believe that companies such as Facebook, Twitter and Google should never have the right to share personal data is now up to 57%.

“Governments and corporations are gathering unfathomable amounts of information about the online lives of every individual,” said Rafael Laguna, CEO of Open-Xchange. “As a result, it’s no surprise that across the world, people increasingly fear their personal data is exposed. Worse than that, recent studies have shown that people feel powerless to protect their data. But there is hope: there are signs that citizens believe that compromising their right to privacy can no longer be tolerated. They are asking for greater transparency in the services they use and the politicians they elect, and searching for solutions to protect themselves.”

Consumers are also demanding the ability to protect their data, as the majority (88%) would be interested in at least one encryption-related service, such as a one-click button that encrypts outgoing email or encryption as a standard feature of applications they use.

All of which is good news for the good ship privacy and all who believe in and sail in her.

This survey, and many others before it, reinforces the belief that underpins everything we do here at digi.me, that the current system is broken beyond repair and needs radical transformation. Neither users nor businesses get what they need from the current advertising model, and both sides are trading salvos in a war that shows every sign of escalating without the prospect of resolution.

We believe the only solution is a new way forward: a connected world centred on the individual in control of their own data, whom businesses can then approach directly for access to rich data in return for something the user wants, in the form of an offer or personalised service.

This is what digi.me will be when our Permissioned Access model comes in later this year, and this move to the Internet of Me is so important that we are sponsoring an industry-wide forum to find the best solutions for all – you and the businesses you deal with; whoever, whatever and wherever they may be.

The future is bright, the future is you – and we look forward to helping and guiding you on that journey.