The starting point for most privacy and data protection laws is creating a safer environment for all of us and our personal data – but the inevitable overreach often has far-reaching consequences
Most privacy and data protection laws have the noble aim of making us and our personal information safer – but overreach in the detail is a common side effect of attempts to do the right thing.
The consequences of this legal overreach can themselves be far-reaching – damaging not just personal privacy but technological innovation as a whole, if creators and those with grand ideas feel stifled by the competing demands of overlapping legislation.
The worst-case scenario? Stagnation of technological innovation.
The broad aim of privacy and data protection laws is generally to ensure the free flow of personal data between EU member states, while their ultimate purpose is to regulate how such data is processed, maintaining a balance between the various interests in the personal data ecosystem.
Of course, constant fluctuations in both technological and socio-economic contexts make achieving these grand aims a challenge. Regulation always lags behind new technological and market developments, however hard it struggles to keep up.
As Maria Macocinschi, who is studying for a doctorate in law at the University of Turku in Finland, notes: “The rigidity in revising and adapting the laws to the fast technological and economic developments is creating frustrations not only for consumers but also for companies.”
She also cites the much-praised General Data Protection Regulation (GDPR), which comes into force in May next year, as a well-intentioned law that may have adverse side effects.
She said: “GDPR, for example, displays two contradictory trends. While it ensures a simplification of the regulatory environment and harmonisation of the standards, it also poses additional burdens and costs for companies. Therefore, the free flow of information might be quite affected by these overwhelming obligations.”
Regulation is inevitably deeply complicated, balancing as it must the conflicting interests of the various parties involved (public and private institutions, and consumers), as well as translating traditional human values into a constantly changing digital environment.
Laws around surveillance are a good example of clashing interests and values: while technologies such as CCTV are deployed primarily to protect citizens and keep them secure, the same technologies are now being used in ways that undermine the very values they were meant to protect.
Countries like China, for example, are trying to use technology to predict crimes before they happen – the very stuff of sci-fi films.
The potential for horrifying consequences for those caught up in it makes it increasingly important that surveillance – and the emerging dataveillance phenomenon – is carefully regulated, to balance the public interest, the economic rights of companies, and individuals’ privacy and data protection.
To increase the efficiency and effectiveness of current data protection laws, Maria says three broad areas should be considered:
- We need to look at how traditional legal concepts should be revised, taking into account the current state of information innovations.
- We need to look at how we regulate the emerging actors in this burgeoning ecosystem, as well as the new methods of collecting and processing data.
- We also need to reframe the importance of the legal requirements for consent in increasingly intensive and opaque dataveillance systems.
So how do we balance the values and rights necessary for the democratic functioning of society with the preservation of personal privacy? This, of course, raises the question of how much privacy is desirable, legally and otherwise.
As with so many other things, regulation initially and superficially seems the natural answer here – providing guidelines for the protection of individual interests and the public good. However, the law by itself cannot achieve this goal.
Furthermore, the extent to which we all, as consumers, promote and expose our own private lives through social media poses its own problems. The internet is a growing force in all our increasingly transparent lives. With big data techniques crunching all the information we have willingly or unknowingly put out there, the ability of public and private actors to know far more about us than we are comfortable with has never been more real. Our identities, behaviours, transactions, preferences and vulnerabilities are all gathered and exploited for various obscure purposes.
Again, legislation such as the GDPR is trying to address this, by putting more power over personal information back in the hands of consumers – but here too, law-making inevitably runs behind real life, meaning we are always struggling to keep up.
A new right to data portability (Art. 20 GDPR) and a revised right to be forgotten (Art. 17 GDPR) aim to build stronger protection for the data subject and restore consumer sovereignty. However, such powers for individuals are not absolute: the interest in protecting information privacy will always be balanced against other public interests, as necessary in a democratic society (Recital 73 GDPR).
So how should we try and find this balance moving forwards? Maria has three key suggestions.
She said: “Balancing conflicting interests is difficult but not impossible. A first step would be educating individuals about what informational privacy is and the real benefits and consequences of sharing personal information. In a democratic society, a person should not isolate herself from the rest of the community, but rather participate and contribute to the decision making.
“Therefore, data protection regulations should not be perceived as tools facilitating the invisibility of the individuals to the rest of the world. Rather, they provide the necessary measures to ensure their safe participation in the society. Disclosing personal information is a requisite for identification in a digital environment of disappearing bodies, and for effectively communicating their consumer preferences to the companies.
“Secondly, each participant in the personal information ecosystem should acknowledge the importance of privacy intermediaries. For controlling and managing their personal data, individuals need the technical architectures (such as digi.me) and supportive guidelines (privacy guardians).
“The technological development should not be perceived by consumers and legislators as a big threat to privacy and personal data. While technology might pose some risks, it can also provide useful solutions for the protection of individuals and their fundamental rights. Therefore, privacy and sharing are not foes, but complementary to each other.”
This blog is a joint venture between digi.me and Maria Macocinschi