Tag Archives: data protection

Analysis: The pros and cons of privacy and data protection laws

The starting point for most privacy and data protection laws is creating a safer environment for all of us and our personal data – but the inevitable overreach often has far-reaching consequences

Most privacy and data protection laws have the noble aims of making us and our personal information safer – but overreach in the detail is a common side effect of attempts to do the right thing.

The consequences of this legal overreach can themselves be far-reaching – not just to personal privacy, but to technological innovation as a whole, if creators and those with grand ideas feel stifled by the competing needs of overlapping legislation.

The worst case scenario? Potential stagnation for technological innovation.

The broad scope of privacy and data protection laws is generally to ensure the free flow of personal data between EU member states, while their ultimate purpose is to regulate how such data should be processed in order to balance the various interests of the personal data ecosystem.

Of course, constant fluctuations in both technological and socio-economic contexts make achieving these grand aims a challenge. Regulation inevitably lags behind new technological and market developments, however hard it tries to keep up.

As Maria Macocinschi, who is studying for a doctorate in law at the University of Turku in Finland, notes: “The rigidity in revising and adapting the laws to the fast technological and economic developments is creating frustrations not only for consumers but also for companies.”

She also cites the much-praised General Data Protection Regulation (GDPR), which comes into force in May 2018, as a well-intentioned law that may have adverse side effects.

She said: “GDPR, for example, displays two contradictory trends. While it ensures a simplification of the regulatory environment and harmonisation of the standards, it also poses additional burdens and costs for companies. Therefore, the free flow of information might be quite affected by these overwhelming obligations.”

Regulation is inevitably deeply complicated, balancing as it must the conflicting interests of the various parties involved (public and private institutions, and consumers) as well as translating more traditional human values in a constantly changing digital environment.

Laws around surveillance are a good example of clashing interests and values: while surveillance technologies such as CCTV were deployed primarily to protect citizens, the same technologies are now being used in ways that undermine the very values they were meant to protect.

Countries like China, for example, are trying to use technology to predict when a crime will take place before it happens – the very stuff of sci-fi films.

The potential for horrifying consequences for those caught up in it makes it increasingly important that surveillance, and the emerging phenomenon of dataveillance, is carefully regulated to balance the public interest, the economic rights of companies and individuals’ privacy and data protection.

In terms of increasing the efficiency and effectiveness of current data protection laws, Maria says there are three broad areas that should be considered:

  • How traditional legal concepts should be revised, taking into account the current state of information innovation
  • How we regulate the emerging actors in this burgeoning ecosystem, as well as the new methods of collecting and processing data
  • How we reframe the importance of legal requirements for consent in intensified and opaque dataveillance systems

So how do we balance the values and rights necessary for the democratic functioning of society with preserving personal privacy? This, of course, raises the question of how much privacy is desirable, legally and otherwise.

As with so many other things, regulation initially and superficially seems to be the natural answer here – providing guidelines for the protection of individual interest and public good. However, the law by itself cannot achieve this goal.

Furthermore, the extent to which we all, as consumers, promote and open up our own private lives through social media poses its own problems. The internet is a growing force in all our increasingly transparent lives. With the big data crunching capabilities of all the information we have willingly or unknowingly put out there, the ability for public and private actors to know far more about us than we are comfortable with has never been more real. Our identities, behaviours, transactions and other preferences and vulnerabilities are all gathered and exploited for various obscure purposes.

Again, legislation such as the GDPR is trying to address this, by putting more power over personal information back in the hands of consumers – but here too, law-making inevitably runs behind real life, meaning we are always struggling to keep up.

A new right to data portability (Art. 20 GDPR) and a revised right to be forgotten (Art. 17 GDPR) aim to build stronger protection for the data subject and restore consumer sovereignty. However, such powers for individuals are not absolute. The interest in the protection of information privacy will always be balanced against other public interests as necessary in a democratic society (Recital 73 GDPR).

So how should we try to find this balance moving forwards? Maria has three key suggestions.

She said: “Balancing conflicting interests is difficult but not impossible. A first step would be educating individuals about what informational privacy is and the real benefits and consequences of sharing personal information. In a democratic society, a person should not isolate herself from the rest of the community, but rather participate and contribute to the decision making.

“Therefore, data protection regulations should not be perceived as tools facilitating the invisibility of individuals to the rest of the world. Rather, they provide the necessary measures to ensure their safe participation in society. Disclosing personal information is a requisite for identification in a digital environment of disappearing bodies, and for effectively communicating consumer preferences to companies.

“Secondly, each participant in the personal information ecosystem should acknowledge the importance of privacy intermediaries. To control and manage their personal data, individuals need technical architectures (such as digi.me) and supportive guidelines (privacy guardians).

“Technological development should not be perceived by consumers and legislators as a big threat to privacy and personal data. While technology might pose some risks, it can also provide useful solutions for the protection of individuals and their fundamental rights. Therefore privacy and sharing are not foes, but complementary to each other.”

This blog is a joint venture between digi.me and Maria Macocinschi

Were YOU hacked? The top data breaches of 2016

2016 was the year that records kept being broken for topping the league that no-one wants to be part of – biggest data breaches of all time.

MySpace, Yahoo and FriendFinder were the big three (although there were countless ‘smaller’ ones) – all revealed in the last few months, despite being historic breaches.

MySpace seemed – and was – a huge deal when it was announced. It emerged that approximately 360m records from the now-defunct platform had been compromised, with millions of usernames and passwords made available online.

That seemed a ridiculous number until FriendFinder came along to top it, with 400 million stolen records, spanning 20 years of data. That it operated a lot of 18+ sites including Penthouse merely made it more tantalising (and garnered more press coverage).

But the big daddy of breaches was still to come – although the hack happened back in August 2013, Yahoo only became aware in November that data from ONE BILLION accounts had been stolen. They told the world in December, at the same time becoming the not-so-proud holders – for now – of the biggest data breach to date.

So how do you find out if you were caught up in any of these hacks? Yahoo, for example, has notified users it thinks were affected, asking them to change their password – and that’s probably best practice if you have an account with any service that has announced a hack. LeakedSource also maintains a database of over two billion records, which you can search to see if your accounts or email have been compromised.
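Breach-check services like these generally work by letting you search a hashed copy of the leaked records. One privacy-preserving pattern some such services use (a hypothetical sketch here, not LeakedSource’s actual API) is k-anonymity range search: you hash the secret locally and send only the first few characters of the hash, so the full value never leaves your machine:

```python
import hashlib

def hash_prefix_query(secret: str, prefix_len: int = 5) -> tuple[str, str]:
    """Split a SHA-1 hash into the short prefix you would send to a
    breach-check service and the suffix you keep locally."""
    digest = hashlib.sha1(secret.encode("utf-8")).hexdigest().upper()
    return digest[:prefix_len], digest[prefix_len:]

# The service returns every leaked suffix matching the 5-character prefix;
# you then check locally whether your own suffix is among them, so neither
# the secret nor its full hash is ever transmitted.
prefix, suffix = hash_prefix_query("password")  # a notoriously weak password
```

The anonymity comes from the prefix matching many unrelated hashes, so the service cannot tell which one you were actually checking.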

And if you want to be amazed/freaked out at the same time – check out this excellent site, which visualises data breaches of over 30,000 records since 2004. There are a lot…

EU GDPR: full details of what it means for personal data and your business

Data is the currency of today’s digital economy – and when it takes effect in 2018, the new GDPR will not only protect this valuable resource for both individuals and companies but increase innovation and cut costs as well.

According to estimates, the value of European citizens’ personal data has the potential to grow to nearly €1 trillion annually by 2020 – and business opportunities will only be increased by strengthening and unifying Europe’s already high standard of data protection.

Jan Philipp Albrecht (Greens, DE), who steered the GDPR legislation through Parliament, said: “The regulation will also create clarity for businesses by establishing a single law across the EU. The new law creates confidence, legal certainty and fairer competition.” But what are the key things businesses need to know?

  • One law for the whole continent – one of the biggest attractions is that Europe will now be covered by one law, applied in the same way everywhere, instead of a patchwork of national ones. Eliminating the need to consult local lawyers in each country where a business has dealings or premises will deliver direct cost savings as well as legal certainty. Savings from dealing with one pan-European law rather than 28 are estimated at €2.3bn per year.
  • Regulatory one-stop shop – businesses will only have to deal with one regulatory body rather than 28, making it simpler and cheaper for companies to do business in the EU. They will also profit from faster decisions, one single contact point and less red tape as well as consistency of decisions where the same processing activity takes place in several member states.
  • The same rules for all companies – all companies, whether or not they are based in the EU, will have to adhere to the same rules when doing business with EU citizens, creating a level playing field that does not exist at the moment, where European companies are governed by stricter standards.
  • Technological neutrality – innovation will continue to thrive under the new rules.

There are also new rights aimed primarily at giving individuals more control over their personal data that will additionally benefit business. For example, the new right to data portability allows individuals to move their personal data between service providers without losing, for example, contacts and emails. This removes a disincentive to switch – having to build everything up again from scratch – meaning start-ups and small companies can compete on equal terms in markets previously dominated by industry giants. This will make the European economy more competitive, and new privacy-friendly solutions are also likely to fare well in this climate.

SMEs will also benefit from a data protection reform aimed at stimulating economic growth and allowing them to access new markets by cutting costs and red tape for European business. As well as the measures outlined above, such as one law instead of 28, the obligations on data controllers and processors are adjusted based on the size of the business and/or the nature of the data being processed, so as to avoid creating unnecessary red tape and a disproportionate regulatory burden for smaller firms. So, for example:

  • SMEs need not appoint a data protection officer, unlike larger companies, unless their core activities require regular, systematic and large-scale monitoring of data subjects, or they process sensitive categories of personal data such as that revealing racial or ethnic origin or religious beliefs.
  • They also do not need to keep records of any processing activities that are occasional or are unlikely to result in a risk to the rights of the data subject
  • They will also not be obliged to report all data breaches to individuals, unless these represent a “high risk for their rights and freedoms.”

An essential principle of the new system will be data protection by design and by default, which will incentivise businesses to innovate and “develop new ideas, methods, and technologies for security and protection of personal data.”

The new rules promote techniques such as anonymisation (removing personally identifiable information where it is not needed), pseudonymisation (replacing personally identifiable material with artificial identifiers), and encryption (encoding messages so that only those authorised can read them) to protect personal data.
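As a minimal sketch of the pseudonymisation technique described above (the key and record fields are illustrative assumptions, not anything prescribed by the regulation), a keyed hash can replace an identifier with a stable artificial token:

```python
import hashlib
import hmac

def pseudonymise(value: str, key: bytes) -> str:
    """Replace an identifier with a stable artificial token.

    The same input and key always yield the same token, so records can
    still be linked for analysis; without the key, the original value
    cannot be recovered from the token alone.
    """
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

key = b"hypothetical-secret-key"  # held separately from the data
record = {"email": "jane@example.com", "plan": "broadband"}
safe_record = {"email": pseudonymise(record["email"], key), "plan": record["plan"]}
```

Unlike anonymisation, this is reversible in principle (via the key or a lookup table), which is exactly why the GDPR still treats pseudonymised data as personal data – while rewarding its use with lighter obligations.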

The use of “big data” analytics for applications such as driverless cars, which can be done using anonymised or pseudonymised data, will be actively encouraged under the new regulation, showing it goes hand in hand with innovative and progressive solutions.

Overall, the new data protection rules give businesses the opportunity to overcome the lack of trust that can limit people’s engagement with innovative uses of personal data.

Giving individuals clear, effective information about what their data is being used for will help build trust in analytics and innovation for the benefit of all.

The Internet of Me: something we can (and do) all agree on

Unity on identifying key emerging IT trends is something of a rarity, but everyone is singing from the ‘People First’ hymnsheet at the moment – and we’re at the very forefront of the revolution with our Internet of Me forum.

We’ll get to the who in a minute, but the obvious question is why, right? The start of a year often heralds reflection on the months past, and the chance to refine a company vision or personal perspective for the year ahead. But this feeling is so widespread, so pervasive, that it looks like a change already in progress, rather than an ideological wistfulness that will achieve nothing but some personal angst among the industry’s high ranks before moving on, never to be touched on again.

This converging of influential people and organisations – all saying that people need to come first, guiding growth and innovation rather than the other way around – is the start of a new normal. It is a chance to right the wrongs of the past and build a better, more connected future for all, with every one of us at the centre of, and in control of, our own digital lives.

It’s hard to put it better than Paul Daugherty, chief technology officer at Accenture, who said of their newly-launched Technology Vision for 2016, which this year promotes the importance of people over technology development: “Our Tech Vision theme of People First is really resonating – it’s touched on a raw nerve that many leaders have been feeling for a while now.  In short, their investments in technology have outpaced their investment in people – and it’s time to mind that gap.”

One of the five pillars of their annual respected Vision report, which seeks to pinpoint the emerging IT developments that will have the greatest impact on companies, government agencies, and other organisations in the next three to five years, is one for which Paul’s shorthand formula is Digital Trust = Digital Ethics + Cyber Security.

In other words, trust is crucial at many levels: in new technologies, in the businesses that want to share or use data, and in those such as digi.me, which has a deeper and more personal focus, enabling others to do this by unlocking the power of personal information for each and every user. As he points out: “That’s why organizations need a new focus on cultivating trust among customers, employees, and partners – and this will differentiate those who get it right.”

Here at digi.me, trust in us and our product is crucial to our business vision – and we know we’re getting it right. We’ve been banging the Internet of Me drum for a while now, sponsoring and supporting an independent forum on what we firmly believe is a technological revolution that will transform the personal data economy (catch up with the most recent articles, including an interview with the legendary Doc Searls, here) for our business as well as many, many others. But the beauty of the movement is that we are far from alone. In fact, when it comes to trust and the need to return power over data to the person who created it – building trust deeper and sooner for the benefit of all – we’re preaching to the choir.

Here is just a selection of comments that influential people in the tech world have made in the last few weeks on the Internet of Me theme, which puts the user at the heart of their own connected world:

Tim Cook, Apple CEO: “Over the arc of time, customers will move to people they trust with their data.”

Marissa Mayer, Yahoo CEO: “We need to afford the individual control. Users need to own their data, which they can examine, take it with them to other sites and vendors…”

Edith Ramirez, FTC head: “Consumers are going to be slow to take up these products if issues of privacy and security are of a concern.”

Marc Benioff, Salesforce CEO: “Trust is a serious problem. The reality is that we all have to step up and get to another level of transparency and openness…The digital revolution needs a trust revolution.”

And it’s so obvious, so beautifully clear, that it barely needs explaining. But in short: the days of businesses telling users what they need and how they can have it are numbered. Rising in their place is a new power to the people, giving each and every one of us a potent choice – the right to choose who sees our data and who we trust with it, whatever form it may take.

Defining ownership and control in a digital world

Used online, terms such as ownership and control have slightly more fluid connotations than their physical counterparts – but we can still define their context and meaning.

At digi.me, we unlock the power of personal data for users by enabling them to gather and collate information from multiple services, platforms and places in one single library that they own and control.

This library is the only place all this information exists together, allowing instant greater personal insight even before users begin to exchange or share slices of data, on their terms, for convenience, service or reward.

So our users own this data, this library, this collection – but does the data still exist in the original places it was found? Yes it does, but each individual now has a vastly more useful, insightful and comprehensive body of data than ever before, gathered together in a unique form, that they can access at will and exchange or share as wanted, thus controlling as well as owning it.

We’ve occasionally been asked how digi.me can really be returning data ownership to the individual if another copy, over which they have no control, exists – but this question shows a limited understanding of the realities of the online world.

More importantly, it would imply ownership could only ever mean that just one copy of something existed, over which you had 100% control that could not be subverted – and this simply does not apply digitally in the same way it does physically.

Let’s give some examples. In the physical world, if I own a car it is mine completely (ignoring any financing), I control it completely, and it is clear and undeniable to others that this is the case.

So far so simple, yes? But even offline, things can be cloudier than they first appear, as simply having something tangible I can hold in my hand does not necessarily confer complete ownership or control.

If I am sent a bank statement, for example, then I own that copy of my financial data and control what happens to it – but of course the bank still has all that same data too.

When we move online, it soon becomes clear that concepts of ownership and control have, by necessity and by evolution, become even more fluid.

So if I take a picture of something or someone, I own that image. It’s physically mine, stored on my camera and phone, and I can, crime or loss aside, control who sees it or accesses it.

But if I post it online, to Facebook or Twitter, for example, then things become more complicated. I still own the original photo, but lose control of what happens to the copy that I have shared, in terms of what people can say about it and what they do with it.

Yet my original ownership of the master document, if you will, remains unchanged even despite the presence of potentially multiple other versions.

Applying this principle to our app and the data it enables users to connect to it, it is clear that when you have your data in digi.me, then you own that data.

Other copies of the slices of data that make up the whole still exist, but you own your unique, extended and enhanced version – and, where possible and where T&Cs allow, you can seek to delete these other versions or, once the EU-wide General Data Protection Regulation (GDPR) comes into force in 2018, ask companies to forget them.

So what of the original copies of these many slices of data? What happens to them? Well, nothing is the answer. They remain where they were, of limited or little use either to the individual who created them or to businesses hoping to gain insight into their users.

The bottom line is that our app, uniquely and appealingly, allows users to create and compile an increasingly all-encompassing picture of the data from across their lives. One that will continue to evolve, develop and deepen the more they add to it, and one which they will always own and control.

TalkTalk hack: is stolen data really unencrypted?

The news that up to four million TalkTalk customers have had personal details stolen in a massive hack is serious enough – but suggestions that this crucial personal data may not have been encrypted seriously ups the ante.

The telecoms firm has revealed that information such as customers’ names, addresses, phone numbers, dates of birth, and partial bank details could now be in the hands of hackers. And we now know it may not have benefited from an extra layer of security known as encryption.

So what does this mean? Basically, unencrypted data is plain text – it can be read easily by anyone, without the need for special keys or passwords. But encrypted data is just that – encrypted. While hackers are able to steal it, they’re not necessarily able to read it or sell it on in any way – unless they have the key or code needed to unlock it, it is largely useless to them.
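A toy sketch makes the difference concrete. Here a simple XOR one-time pad stands in purely for illustration (real systems use vetted ciphers such as AES); the point is that the stolen bytes are useless without the key:

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte with a random single-use key (a one-time pad).

    Illustration only, not production crypto - but it shows the core
    property: without the key, the ciphertext is unreadable noise.
    """
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"name, address, date of birth"
key = secrets.token_bytes(len(message))   # the key stays with the company
stolen = encrypt(message, key)            # what a hacker might exfiltrate

assert stolen != message                  # unreadable without the key
assert decrypt(stolen, key) == message    # readable with it
```

Unencrypted storage is the equivalent of `stolen = message`: the attacker gets the plain text directly, with no key standing between them and the data.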

Encrypting data obviously has many uses, ranging from the obvious security benefits to companies holding personal data through to reassuring customers that hacks will not automatically see their personal information disseminated on the web.

It’s not a legal requirement, as TalkTalk’s CEO has been at pains to point out – but there’s a huge argument that it just makes sense to use it.

Hacking and cyber crime in general is on the increase, so no company is able to completely guarantee they will never be a victim, despite their best efforts. With this in mind, taking the best possible care with customer data, particularly sensitive information of exactly the type that can be used to scam people or clone online identities, just seems to make sense.

But that doesn’t seem to have been the case at TalkTalk, with CEO Dido Harding unable to guarantee all the data stolen was encrypted, although the company claimed that it had been kept securely (which is a very different thing).

But what does all this talk of how secure the data was mean to us, the average user? Well, for starters, it’s a good lesson in finding out as much as we can about what each company that holds our personal data does with it, and how securely they treat it.

It’s also a good lesson, particularly if you may be one of those unfortunate TalkTalk victims, to keep an eye on your credit report, so you can see if anyone attempts to open new accounts in your name. If you do see any that you don’t recognise, contact your bank or financial services provider immediately, and also report any fraudulent activity to Action Fraud on 0300 123 2040 or http://www.actionfraud.police.uk.

Looking to the future, the obvious next step in the personal data revolution is moving to a place where we each control our own data – keeping our most important details safe and secure ourselves, and sharing them only with the people or companies we choose to trust.

While companies such as digi.me are working on making just this happen, across multiple industries, for now you can keep your social media content safe and backed up with our free app – click here to get your copy now.

Data privacy breach complaints leap by a third

New figures show that the Information Commissioner’s Office has received a record number of complaints from individuals concerned that their personal data is not being kept sufficiently secure by organisations holding it.

Reports to the ICO relating to personal information security jumped 30 per cent from 886 in 2013 to 1150 in 2014 – or more than two complaints a day on average.

Taken over a five year period, complaints to the ICO about the same issue have increased by 64 per cent.

International law firm Pinsent Masons, which obtained the information through a Freedom of Information request, says that the increase in consumer complaints highlights increasing levels of public unease over how big business and other organisations store personal information.

High profile attacks on trusted corporations like Sony and Target, and the recent damaging attack on infidelity site Ashley Madison, have raised public awareness about how personal data is treated, the firm says.

Luke Scanlon, technology lawyer at Pinsent Masons, said: “Information security isn’t a new issue; businesses have always had a responsibility to protect customer data. But as consumers are increasingly finding themselves left exposed as a result of cyber attacks, concern is clearly growing. The chances are that they wouldn’t be making these complaints without having been directly impacted in some way.”

Under the Data Protection Act, businesses can be fined up to £500,000 by the ICO if the regulator finds that the company has failed to take appropriate measures to protect customer information, and the financial penalties can be far higher if the individuals compromised opt to take legal action against the business.

He added: “There is increasing recognition that how an organisation responds to the compromise of customer data can impact its long term prospects as deeply as the incident itself.

“Many of the businesses and other organisations we are working with are working hard not just to implement good procedures and controls, but also to develop cross-disciplinary teams who understand the legal and reputational issues in the event of a crisis. Chief Executives, CIOs, General Counsel and Communications Directors are getting around the table to say: how do we respond if this happens to us?”

Around 90 per cent of large organisations and 74 per cent of small businesses experienced information security breaches in the past year, according to a UK Government-commissioned survey published in June 2015. However, it is not currently mandatory to report data breaches.

Facebook data transfers under threat after Safe Harbour ruled invalid

Facebook’s right to transfer personal data from the EU to the US has been dealt a blow after the pact it was being done through was declared invalid by the European Court of Justice.

The Safe Harbour agreement (Safe Harbor stateside) was a voluntary pact set up 15 years ago to get around the fact that US data protection laws are significantly less rigorous than their EU counterparts.

Under the scheme, US companies self-certified that they were taking adequate data security precautions in order to be able to access and use European data.

More than 5,000 US companies take advantage of it, including global tech giants such as Facebook, which registers users outside the US and Canada through its Irish subsidiary, Facebook Ireland Ltd. The subsidiary is estimated to account for 83.1% of all worldwide Facebook users, but moves data from Dublin to the US to be processed.

But after whistleblower Edward Snowden revealed the mass surveillance activities of America’s National Security Agency, which were alleged to include European data, in 2013, Austrian privacy campaigner Max Schrems asked the Irish Data Protection Commission to do an audit of what material Facebook was passing on.

They declined, citing Safe Harbour, so he appealed to the European Court of Justice, which has today ruled in his favour.

Following the judgement, Mr Schrems said: “I very much welcome the judgement of the Court, which will hopefully be a milestone when it comes to online privacy.

“This judgement draws a clear line. It clarifies that mass surveillance violates our fundamental rights. Reasonable legal redress must be possible. The decision also highlights that governments and businesses cannot simply ignore our fundamental right to privacy, but must abide by the law and enforce it.

“This decision is a major blow for US global surveillance that heavily relies on private partners. The judgement makes it clear that US businesses cannot simply aid US espionage efforts in violation of European fundamental rights. At the same time this case law will be a milestone for constitutional challenges against similar surveillance conducted by EU member states.

“There are still a number of alternative options to transfer data from the EU to the US. The judgement makes it clear that national data protection authorities can now review data transfers to the US in each individual case, while ‘safe harbor’ allowed for a blanket allowance.

“Despite some alarmist comments I don’t think that we will see major disruptions in practice.”

Facebook had yet to comment at the time of publication, but it may well be forced to stop EU-US data transfers in the short term, at least until new certified contracts are in place.

Two things are immediately obvious: first, this will have a wider impact not just on data processing operations like Facebook, but on any company that transfers any data overseas for any reason.

And secondly, that you can only have true control of your data when you hold it yourself – although of course you may need to trade it for access to services from external companies.

If data security and privacy concern you – and they should – digi.me is committed to giving you back control of your data, for you to use as you wish. Download a free trial here.

Why human error is the biggest threat to data

If you think shady criminal cartels, blackmail attempts or straight-up hacking geniuses are the biggest danger to any data held about you online, then we have news for you – plain old human error accounts for far and away the most data breaches.

New research has revealed that human error continues to be the leading cause of data loss for organisations in the UK.

The Databarracks report, which was based on a survey of 400 senior IT workers, revealed that 24 per cent of organisations admitted to a data loss caused by employee accidents in the last 12 months, ahead of hardware failure (21 per cent) and data corruption (19 per cent).

This report comes hot on the heels of data released by the Information Commissioner’s Office earlier this year, which showed that 93 per cent of the 459 data breaches reported to the office in Q4 of 2014/15 could be put down to human error in some way.

It also follows a serious data breach by a London health clinic earlier this month, which saw the email addresses of hundreds of patients, many of whom are living with HIV, accidentally sent out publicly to all recipients of a clinic newsletter.

Oscar Arean, technical operations manager at Databarracks, said: “Human error has consistently been the biggest area of concern for organisations when it comes to data loss. People will always be your weakest link, but having said that, there is a lot that businesses could be doing to prevent it, so we’d expect this figure to be lower.”

Interestingly, the Databarracks results weren’t fully consistent across all business sizes, with a breakdown revealing that in large companies, hardware failure caused the most data loss, accounting for 31 per cent of all cases, up from 29 per cent in 2014.

Arean said: “This isn’t surprising as most large organisations will have more stringent user policies in place to limit the amount of damage individuals can cause.”

Arean goes on to suggest that SMEs should adopt more of a big business ethos when it comes to managing human error:

“The figures we’re seeing this year for data loss due to human error are too high (16 per cent of small businesses and 31 per cent of medium businesses), especially considering how avoidable it is with proper management. I think a lot of SMEs fall into the trap of thinking their teams aren’t big enough to warrant proper data security and management policies, but I would disagree with that.

“In large organisations, managers can lock down user permissions to limit the access they have to certain data or the actions they’re able to take – this limits the amount of damage they’re able to cause. In smaller organisations, there isn’t always the available resource to do this and often users are accountable for far more within their roles. That is absolutely fine, but there need to be processes in place to manage the risks that come with that responsibility.

“Of course small organisations don’t need an extensive policy on the same scale that a large enterprise would, but their employees need to be properly educated on best practice for handling data and the consequences of their actions on the business as a whole. There should be clear guidelines for them to follow.”

So what does this mean for us and our data? While in an ideal world the individual would sit at the centre of their own connected life, in full control of their own data, it is unrealistic in our current world to hold all our data close to our chests when so many end users have, or demand, access to it.

So is it safe out there in the big, bad world? Yes, largely speaking, and the benefits to us in areas such as health of having our details instantly available to all medical services, for example, certainly outweigh the chances of being subject to a damaging data breach.

But it is certainly a sobering thought that, no matter how thorough the legislation governing data handling and the individual company policies in place, just one simple, human mistake can be enough to bring all that crashing down.

How to check your Facebook privacy settings

Facebook is a social giant that holds huge amounts of personal information about each of us.

Facebook is also renowned for changing its privacy policies frequently and not necessarily advertising this fact, so it pays to check at regular intervals that you’re only sharing what you post (as well as what you have posted and will post in the future) with the audience you expect.

So, how can you check what your current settings are? Partly in response to criticisms that it wasn’t open enough about what information was being shared, Facebook has a new tool called Privacy Check-up.

Accessed from the padlock dropdown at the top right of the page, the privacy shortcuts panel that opens up gives you options for a quick check of who can see your stuff, who can contact you and what you can do if someone is bothering you.

While these options are helpful, the top option is to open the Privacy Check-up, which then takes you through your privacy basics in three quick and easy sections.

The first looks at your Posts, explaining that this setting controls who can see what you post from the top of your news feed or profile, as well as showing what your current setting is, and giving an obvious drop-down if you want to make changes for future posts.

The next step is Apps, with a list of what you’ve logged in to with Facebook. It explains that you can edit who sees each app you use and any future posts the app creates for you, or delete the apps you no longer use. It also gives you a link to the App Settings with a reminder that you can edit them at any time.

The third page covers your profile and personal information – so who can see the likes of your mobile number, email and date of birth if you have shared them with Facebook. It also reminds you that you may have shared more information about yourself and recommends you check your About page to make sure that is up to date as well.

Then you’re finished, safe in the knowledge that you’re only sharing what you post on Facebook with the people that you want to see it.

And, of course, once you’re done, don’t forget to download digi.me for free to back up your posts and pictures forever, giving you ongoing access to them even if you decide to delete your account in the future.