Tag Archives: Security

Apple’s data debate call after ‘dangerous’ request ‘to hack own users’

Tim Cook has written a public letter criticising the US government for threatening “the security of our customers” while explaining why he is refusing to comply with a federal court order to help the FBI unlock an iPhone used by one of the San Bernardino shooters.

In the letter, posted on the company’s website and headlined A Message to our Customers, Apple’s chief executive said: “This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.”

He said Apple is being asked to take “an unprecedented step…we oppose this order, which has implications far beyond the legal case at hand.”

The tech giant has been ordered to help the FBI, which wants to know more about the motives and background behind the shooting in which 14 people were killed, access the iPhone by building a bespoke operating system with fewer security features. Current security features, in place since 2014, mean agents risk permanently losing any data on the phone if they make more than ten failed attempts to guess the passcode.

The government has, among other demands, asked Apple to allow a passcode to be entered electronically, which would make it easier to unlock an iPhone by “brute force”, especially with modern computers able to try thousands or millions of combinations.
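To see why that matters, here is a minimal, purely illustrative Python sketch of a brute-force search over a four-digit passcode. The check_passcode function is a made-up stand-in for the device’s verification step, not Apple’s actual mechanism; the point is simply that, with no attempt limit or enforced delay, all 10,000 combinations can be exhausted almost instantly.

```python
import itertools
from typing import Optional

# Hypothetical stand-in for the device's passcode check. On a real
# iPhone the check is rate-limited and, with the relevant setting
# enabled, the data is wiped after ten failed attempts.
SECRET_PASSCODE = "7294"

def check_passcode(guess: str) -> bool:
    return guess == SECRET_PASSCODE

def brute_force_four_digit() -> Optional[str]:
    # With no attempt limit or delay, all 10,000 four-digit
    # combinations can be tried in a fraction of a second.
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

print("Recovered passcode:", brute_force_four_digit())
```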

Explaining his stand, Cook said: “The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

“Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

Cook cited grave concerns about personal data and the security of users as the reasons for challenging the order, explaining his stance by saying: “Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

“All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

“Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”

He added: “For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

He was clear that Apple is “shocked and outraged by the deadly act of terrorism in San Bernardino” and has “no sympathy for terrorists”, adding: “The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime”, including complying with requests for “data that’s in our possession”.

He stressed: “We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them.”

But that, he said, has come to an end now that “the government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals.

“The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe. We can find no precedent for an American company being forced to expose its customers to a greater risk of attack.”

Looking to the future, he added: “The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.

“The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

“Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.”

Apple is now likely to file an appeal, triggering a fight that could end up in the Supreme Court – and a battle that will be keenly observed at all stages by those on both sides of the data privacy debate.

In privacy sharing, context is king (and hurrah say all of us)

Major new research from America has confirmed what many in the personal data privacy world have long suspected – whether or not people want to share their information hinges largely on the context of the request.

The Pew Research Center found that there were a variety of circumstances under which many Americans would share personal information or permit surveillance in return for getting something of perceived value.

The study of 461 adults and nine online focus groups involving 80 people found that a clear majority (54% acceptable versus 24% unacceptable) think it would be acceptable for employers to install monitoring cameras after a series of thefts, with almost half (47%) also believing that the basic deal with store loyalty cards, which sees purchases tracked in return for occasional discounts, is acceptable – although another 32% think this is unacceptable.

But interestingly, where the benefit was less clear cut, the intrusion into their lives greater, or the collection of data ongoing, the proportion of people prepared to trust and take part fell dramatically.

So when offered a scenario in which their energy bill could be reduced by installing a “smart thermostat” that would monitor their movements around the home, a majority of adults (55% versus 27%) considered this an unacceptable trade-off. As one survey respondent explained: “There will be no ‘SMART’ anythings in this household. I have enough personal data being stolen by the government and sold [by companies] to spammers now.”

As the report’s authors concluded: “These findings suggest that the phrase that best captures Americans’ views on the choice between privacy vs. disclosure of personal information is, ‘It depends’.

“People’s views on the key trade-off of the modern, digital economy – namely, that consumers offer information about themselves in exchange for something of value – are shaped by both the conditions of the deal and the circumstances of their lives.

“In extended comments online and through focus groups, people indicated that their interest and overall comfort level depends on the company or organization with which they are bargaining and how trustworthy or safe they perceive the firm to be.

“It depends on what happens to their data after they are collected, especially if the data are made available to third parties. And it also depends on how long the data are retained.”

Here at digi.me, this chimes completely with both our business vision, whereby users are put back in control of their personal data to exchange or share for convenience, service or reward as they see fit, and our belief that the current system is broken beyond repair, leaving consumers and businesses at war with neither getting what they want or need.

We are addressing this huge shift through our permissioned access model, due for release later this year, which will allow users to make the best use of their data for their needs and wants, adding health and financial data to their current social media life curation to provide a fuller picture of their lives.

Crucially, as users collect the data themselves with digi.me as the enabler, we never see any personal data and cannot access it under any circumstances, so it will only ever be shared with the businesses users choose to share it with directly.

Additionally, new rules coming into force across the EU in the next couple of years explicitly forbid data collected for one purpose from being sold on and used for another, which will effectively end third-party data selling.

All of which makes for very pleasing personal data developments, for all of us both personally and commercially.

Are you a reluctant personal data sharer?

A new global study looking at attitudes to privacy and security amongst mobile users has identified lack of trust as the single biggest barrier to growth.

The third annual Global Consumer Trust Report spoke to 5,000 users in both developed and developing markets.

Of these, 36% said the main reason they don’t download or use more mobile apps or services is because they either don’t want to give up their personal information (14%), don’t trust the security (13%), or have had a bad experience or heard negative news stories (4% each).

However, some better news was that the number of people who had been completely put off using apps by privacy and security concerns has more than halved from 33% to 14%.

The study concludes that this is in part down to the rise of the ‘reluctant sharer’ – with the number of people who do not want to share personal data but accept that they must if they want to use an app rising eight percentage points to 41%.

These reluctant sharers accounted for at least a third of all respondents across the eight countries surveyed, rising to around half of all US and German mobile users (53% and 47% respectively) – a rise of a quarter in the US and a third in Germany.

In terms of the importance of privacy, almost half of those surveyed would pay extra for an app that didn’t share their personal data. Of these, 30% would pay a premium of between 5% and 10%, with 5% of all consumers even willing to pay more than 50% extra.

Other key findings were that just 6% of people said they were always happy to share personal data from an app – a fall of 15% from 2013.

Social networks are the least trusted app category, with health and medical apps overwhelmingly trusted (86%) despite the sensitivity of the data involved.

Financial information is seen as the most sensitive data (26%), above photos (18%) and contacts (15%).


TalkTalk hack: is stolen data really unencrypted?

The news that up to four million TalkTalk customers have had personal details stolen in a massive hack is serious enough – but suggestions that this crucial personal data may not have been encrypted seriously ups the ante.

The telecoms firm has revealed that information such as customers’ names, addresses, phone numbers, dates of birth, and partial bank details could now be in the hands of hackers. And we now know it may not have benefited from an extra layer of security known as encryption.

So what does this mean? Basically, unencrypted data is plain text – it can be read easily by anyone, without the need for special keys or passwords. But encrypted data is just that – encrypted. While hackers are able to steal it, they’re not necessarily able to read it or sell it on in any way – unless they have the key or code needed to unlock it, it is largely useless to them.
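As a rough illustration of the difference, here is a short Python sketch using the Fernet recipe from the widely used cryptography package; the sample record and key handling are invented for the example, and nothing here reflects TalkTalk’s actual systems. The ciphertext is meaningless to anyone who steals it without the key.

```python
from cryptography.fernet import Fernet

# Generate a secret key; in practice this would be stored securely
# and never kept alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

# An invented customer record of the kind discussed above.
record = b"Jane Doe, 1 Acacia Avenue, 07700 900123, sort code 00-00-00"

# Encrypt: this is all a hacker would see if they stole the database.
ciphertext = cipher.encrypt(record)
print(ciphertext)  # unreadable bytes, useless without the key

# Decrypt: only possible for someone holding the key.
print(cipher.decrypt(ciphertext).decode())
```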

Encrypting data obviously has many uses, ranging from the obvious security benefits to companies holding personal data through to reassuring customers that hacks will not automatically see their personal information disseminated on the web.

It’s not a legal requirement, as TalkTalk’s CEO has been at pains to point out – but there’s a strong argument that it simply makes sense to use it.

Hacking and cyber crime in general are on the increase, so no company can completely guarantee it will never be a victim, despite its best efforts. With this in mind, taking the best possible care with customer data, particularly sensitive information of exactly the type that can be used to scam people or clone online identities, just seems to make sense.

But that doesn’t seem to have been the case at TalkTalk, with CEO Dido Harding unable to guarantee all the data stolen was encrypted, although the company claimed that it had been kept securely (which is a very different thing).

But what does all this talk of how secure the data was mean to us, the average user? Well, for starters, it’s a good lesson in finding out as much as we can about what each company that holds our personal data does with it, and how securely they treat it.

It’s also a good lesson, particularly if you may be one of those unfortunate TalkTalk victims, to keep an eye on your credit report, so you can see if anyone attempts to open new accounts in your name. If you do see any that you don’t recognise, contact your bank or financial services provider immediately, and also report any fraudulent activity to Action Fraud on 0300 123 2040 or http://www.actionfraud.police.uk.

Looking to the future, moving to a place where we each control our own data, keeping our most important details safe and secure ourselves and sharing them only with the people or companies we trust, is an obvious next step in the personal data revolution.

While companies such as digi.me are working on making just this happen, across multiple industries, for now you can keep your social media content safe and backed up with our free app – click here to get your copy now.

Digital dependence is ‘eroding our memories’

Excessive reliance on the internet and search engines for fact finding is damaging our long-term memories as well as compromising IT security, a new study has found.

Fuelled by an increasingly connected world that is always online, we no longer hold in our minds information we can store and retrieve from a digital device or the Internet, causing what the report has termed Digital Amnesia.

Crucially, it found that one of the far-reaching consequences of a failure to make use of our existing stored memories – for example by preferring to search online – can ultimately result in their dilution or disappearance.

The study, which involved 6,000 consumers aged 16 and up from across Europe, found that when faced with a question, over a third will head straight to the internet for an answer, rising to 40 per cent of those aged 45 and over.

Almost a quarter (24 per cent) of respondents admit they would forget the online answer as soon as they had used it, rising to 27 per cent of those aged 45 and over, with 12 per cent assuming the information will always be out there somewhere.

Dr Maria Wimber, a psychology lecturer at the University of Birmingham, said that the trend of looking up information “prevents the build-up of long-term memories”.

She added: “Our brain appears to strengthen a memory each time we recall it, and at the same time forget irrelevant memories that are distracting us.

“Past research has repeatedly demonstrated that actively recalling information is a very efficient way to create a permanent memory.”

The report’s finding that many people rely on computers instead of memorising information was highlighted by the fact that many of those questioned could still recall their own phone numbers from childhood, but did not know the current numbers of family members or their place of work.

The report also found that IT security can be an early casualty of our impatience to access information online. Kaspersky Lab, the cybersecurity firm which carried out this study, has found that just under a fifth (18 per cent) of consumers – 22 per cent of those aged up to 24 – will opt for speed over protection when downloading files.

This leaves the door wide open for malicious software intent on stealing personal data and compromising the device and any other devices connected to it.

If consumers haven’t protected their data, their online accounts and devices with strong passwords and data back-ups, the memories and information these hold could be lost or damaged forever.

Of course, digi.me users can protect their data (if not their actual memories!) as regular back-ups will ensure that all their social media history remains in their digi.me app on their desktop, safe, secure and always available.

What is the Internet of Things?

As the latest estimates claim the number of devices connected to the Internet of Things (IoT) will jump from 15 billion now to 50 billion in 2020, we look at what a connected world actually means.

What is the IoT? Well, at its most basic level, it is a network of devices fitted with data-capturing sensors that can connect to the internet, talking wirelessly to each other, to applications – and indeed to us. And these devices? They’re things in your home, things you wear such as a Fitbit, and the car you drive.

The phrase IoT has been in circulation in technology circles for more than a decade, but only now, with smart, connected devices such as thermostats and refrigerators, as well as driverless cars, becoming a reality, is it something relevant to the majority of the population.

What would a truly connected world look like? More straightforward is one answer, as all these intelligent little machines that between them know so much about us and our lives start to co-ordinate.

In classic examples, your alarm clock wakes you up and then tells your coffee machine to start boiling ready for a morning cuppa, while on the drive to work your car knows the quickest route for where and when you need to be, and can even text whoever you’re meeting if you’re running late.

Lots of smart devices, collecting and streaming huge amounts of user data and providing real-time information on, well, just about anything. Performing nominated tasks on demand and combining to make life as frictionless as possible. After all, how much easier would life be if your house’s heating could tell it was about to break and was able to summon an engineer itself before it actually did so?

And these devices could bring real benefits, not least in cost as well as convenience, to all our lives. The heating that knows to turn itself off or down on a sunny day will save individual users money, as potentially could smart cars that send data about how they are being driven to insurance companies to feed into premiums.
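As a toy illustration of the kind of logic such a device might run, here is a short, self-contained Python sketch of a “smart” heating rule; every name and threshold below is invented for the example rather than taken from any real product.

```python
from dataclasses import dataclass

@dataclass
class WeatherReading:
    outdoor_temp_c: float  # reported by an outdoor temperature sensor
    solar_wm2: float       # sunlight intensity in watts per square metre

@dataclass
class Thermostat:
    target_c: float = 21.0
    heating_on: bool = True

    def apply_rule(self, reading: WeatherReading) -> None:
        # Invented rule: if it is warm and sunny outside, turn the
        # heating off; otherwise keep heating towards the target.
        if reading.outdoor_temp_c >= 18.0 and reading.solar_wm2 >= 400.0:
            self.heating_on = False
        else:
            self.heating_on = True

thermostat = Thermostat()
thermostat.apply_rule(WeatherReading(outdoor_temp_c=20.5, solar_wm2=650.0))
print(f"Heating on: {thermostat.heating_on}")  # Heating on: False
```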

The decreasing cost of computing power means there is no cost barrier to putting data-generating sensors into even the most mundane items, and there is clearly no shortage of opportunities for smart machines that can do something in addition to their primary, practical purpose.

With so much data zipping around, questions about privacy and security are at the forefront of concerns and there are clearly many debates to be had around the IoT, its limitations and indeed its strengths.

But one thing is not in doubt – a huge amount of data is going to be generated, and how that is analysed and interpreted is going to be key to how successful the IoT is, for individuals and businesses alike.

Of course, at digi.me, we believe in returning the power of data to the owner, for them to use and permission as they wish, in both their personal and public lives.

The Internet of Things, and its natural successor the Internet of Me, where the individual is at the centre of their connected life, is a natural fit for us, as control returns to the user. Businesses need accurate rich data, which an individual is best placed to provide – but only if they want to and only if it is worth their while.

Leveraging the IoT is the dream for many companies, but here at digi.me we’ve already got a head start – and you can try it out for yourself with a free download of our amazing app.


Ashley Madison and Spotify: lessons about personal data privacy

It’s been an interesting week for observers and chroniclers of data issues, especially around privacy and what we can reasonably expect to happen to information we trust to the web and individual websites.

First there was the Ashley Madison leak, following an earlier hack, where millions of email addresses and account details of users, including sexual preferences and credit card information, were dumped online and made visible to anyone who had the time and inclination to go through them (and plenty did).

The extramarital affairs website offered a full delete service, where users could pay an extra fee to erase any trace of their usage, but this appears to have been all but useless. It was also interesting to see reports of how many company, government and military email addresses had been used, when plenty of services offer free and therefore anonymous accounts, implying a clear trust that because Ashley Madison said it was discreet, this must be true.

Then, as the ramifications of this hack/leak were still becoming clear, Spotify hit its own technological bump in the road, when it was forced to withdraw a wide-ranging new privacy policy that expanded the data it collected from users and who this was shared with.

As the backlash intensified, with angry users wondering why a music streaming service needed access to their phone contacts and photos, Spotify’s CEO Daniel Ek apologised for how it had been implemented, promising an “update” to the new policy and better communication in future (although, interestingly, not backtracking on the content of the policies themselves).

He also said that Spotify would not access or import people’s photos, contacts, sensor or GPS data without their permission.

So, what do both of these sagas tell us about the state of and awareness of data privacy online? I would argue quite a bit – and much of it positive.

While the fallout from the Ashley Madison data will have wide-ranging implications for anyone unmasked, the huge amount of coverage around the hack, subsequent leak and celebrity or well-known users will also undoubtedly raise the profile of the state of data privacy online. Namely, it has been made crystal clear that users need to take full responsibility for their own data and who they trust it with, as even sites claiming to be uber-secure are just not able to ensure that is always true, particularly in the wake of a concerted hacking attack.

While not many sites are likely to suffer the fate of Ashley Madison, which was targeted by a hacking group, The Impact Team, that had an issue with the content of the site, every site holding personal data has the potential for a breach, and users often have no more than the company’s word that all standard protocols have been followed before handing over what can be sensitive information. Indeed, companies themselves may believe they are protecting data adequately but simply not have the technological know-how for that to be correct.

Equally, the Spotify backlash, while primarily among the internet-savvy Twitter user group, also shows a promising groundswell against over-broad privacy policies, proving that users won’t accept absolutely anything in return for free use of a service, and increasingly have enough awareness to check exactly what they are signing up to.

Awareness of what we give away with many online transactions (excluding the likes of digi.me, which never sees your data) is the first step in making sure that anyone we hand our data to will treat it with respect, moving on to holding those who don’t to public rebuke and account.

And thus the vastly greater awareness around data privacy issues following recent events can only be a good thing as more and more of our lives are lived online.

What Does Your Phone Know About You?

These days we really do rely on our mobile phones and it is quite scary to think how much your phone knows about you, where you have been and who you have seen.  It even knows some of your favourite hobbies, interests and activities. It is in essence your digital brain!  What would you do without it?

Mobile phones have moved on an incredible amount over the past 30 years, from clunky, cumbersome devices to small, light, incredibly fast computers that fit in our pockets and handbags. We connect other devices to them, such as our fitness trackers, smartwatches, children’s toys and much more. They are the central hub of our daily lives, and as such they collect a massive amount of data about us. Some of that data is passed on to the applications we use and some just sits idle on the phone; some goes back to the carrier, and some is collected by the sites we browse. They are complicated little devices, and often we forget just how valuable that data is to us until we lose or break our phone.

A couple of weeks back I wrote a piece on how you can find your phone using the data stored online about you that relates to your phone and its location. This week I thought we would look more at just what data there is on these devices and why it is important to secure and back up your phone and its content.

Most mobile phones these days give you the option to store a copy of your photos and contacts in the cloud. This means that every new contact and photo is saved both on your phone and somewhere on the internet, so the chance of losing this data is low – unless, of course, you haven’t set your phone up to do that. It is one of the first things I set up whenever I get a new phone, and if you haven’t done it already I would recommend doing so: it is a lifesaver when your phone is damaged or lost, as you still have all your contacts and those precious pictures of friends and family.

The next thing I always set up is a way to secure my phone, so that if I lose it someone else can’t just use it, run up a massive bill and cause all sorts of trouble. I have heard of too many friends losing their phones abroad and, because the charges were made abroad, still being liable for them. Put a PIN on your phone and it is at least a deterrent. You can also turn on phone tracking and remote wipe, which take that process one step further. The only issue with these is that you need to have GPS turned on, and this can be a bit of a battery drain. You can still find your phone’s last known location through other means, so to me this is not essential. Android phones can locate you using a process called triangulation, which uses WiFi and cellular data to identify where you were last, so I tend to use that as my fallback.

The apps you have installed on your phone and paid for are all recorded by the app store you bought them from, so these too are recoverable, and the data within those apps is usually stored remotely by the app creators. As long as you have stored your contacts, pictures and videos remotely, you should be able to pretty much recreate your device time and time again. This is the beauty of distributed data.

Looking at this another way, though, all that distributed data is accessible from a single point – your phone. Once someone has that, they have access to, and potentially control of, everything. Just bear that in mind the next time you turn your phone on without any security enabled: you are putting your online identity at risk. That digital footprint we have talked about here on the blog a few times could become compromised if you don’t protect it properly.

This article was brought to you by digi.me, which puts you in control of your social media content. Download the app now to protect your digital memories.

Who uses your social media data and how?

Have you ever wondered who is using your social media data and for what purpose they are using it?  How do you keep up with the latest privacy policy changes and do they matter to you? This article will cover a few different approaches to managing and auditing your social media footprint.

We all use various social media platforms, such as Twitter, Facebook and LinkedIn to name just a few. When we initially signed up all those years ago, we did so with a purpose and an expectation about what information we were happy to share with these companies and how that data was likely to be used. We may not have read all the pages and pages of terms and conditions, but all in all we got what we wanted: the ability to share content with our friends, family and colleagues in a quick, simple and easy way on any device we choose.

The platforms we use have changed over the past five years, and so has the direction of many of these social media companies. They quickly realised that the data about you was, and still is, one of their most valuable assets – so much so that they are now starting to close the doors on third parties accessing this data without paying for it. Companies now have to pay to advertise to you, whereas in the early days anyone with a Facebook page or Twitter account could do that. But what does this all mean for you and your data?

Put simply, your data is now a commodity: something that can be bought, exchanged and sold. But that is my digital life, I hear you say! Indeed it is, and you should be able to control who has access to your data and how. All the social networks have privacy settings, and almost every time the terms and conditions change there is a change to the privacy settings and how they apply to your data.

When did you last check your privacy settings? It is an important question that many of us don’t necessarily know the answer to. Quickly log into your social networks and check your privacy settings: tighten up the security and review which apps and services have access to your data, along with which people can view it.

You may also want to do an audit of who you have on your contacts lists at the same time. There is no point having contacts accessing your data that aren’t of any interest to you. They flood your timelines with irrelevant updates and cut the usefulness of the services.

Auditing your social media platforms isn’t just something that business users of social networks should do; it is something we all need to do to make sure we aren’t vulnerable to identity theft at worst, or naïvely giving away too much of our personal information at best.

If you would like to share your top tips and social media insights on our blog get in touch.

Friday Fun: Password Management

If you are anything like me then you have passwords for everything from banking to logging into your social media accounts. So many passwords to remember or, in my case, forget! But what happens when you find an issue or are alerted to one?

Have you ever seen Facebook come up with an alert stating that there has been a suspicious login on your account and that you need to change your password with immediate effect? I certainly have, and it rattled me somewhat. It made me realise how much I value my social media content. Until then I hadn’t enabled two-step authentication on Facebook, but now I have. And you know what, it’s well worth enabling.

I don’t know about you, but these days, as a result of that little scare on Facebook, I treat my social media data and services differently – more like banking services. I also make sure that anything I put on social media is something I would be happy for people to see in public. After all, you never know what data breaches may occur, and it is worth being wary of third parties holding and storing your data.

Whilst you have probably already got all your New Year’s resolutions sorted for this year, you may want to consider adding one more to your list: take security more seriously, especially when it comes to your personal social media data. Consider using unique, strong passwords and two-factor authentication on your social media accounts, and remember to back up your social data regularly.
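If you want a starting point for those unique, strong passwords, here is a minimal sketch using Python’s standard secrets module; the length and character set below are our own choices rather than any official recommendation, and a good password manager will do this (and the remembering) for you.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    # Draw each character with a cryptographically secure random choice.
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account, never reused.
for account in ("banking", "facebook", "twitter"):
    print(account, generate_password())
```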