Personal data: trust, power and innovation

Digital technology has opened up countless opportunities for collecting, sharing and using data: “a fundamental paradigm shift in our world,” according to Digital Catapult’s Lucie Burgess.

“What we’re seeing now is new models of companies being able to engage with users through their personal data in a way that builds trust,” she explained. “That can build all sorts of commercial opportunities and is really good for the user because it gives them access to new products and services, ways of living their life that they might not have thought about before, but you’ve got to do that in a way which really builds trust.”

Lucie is the Head of Personal Data and Trust at Digital Catapult, which works with a wide variety of organisations to support the development of new technologies – to remove the “barriers of innovation which mean that that technology hasn’t reached its full maturity, capability or adoption”. The Personal Data and Trust Network works with industry, research and the public sector to explore innovation with personal data.

Data sharing “both good and bad”

Some GP surgeries are starting to give patients access to their medical records online, but this progress is slower than Lucie would like. She envisions a world where people participate more directly in their health, hoping that “we move away from this patriarchal model that we used to have towards one where people are actively involved in managing their own health and wellbeing and fitness.”

She continued: “there’s lots of evidence to show that you have better health outcomes if that’s the case. So personal data is really important.”

Thirty years ago, mobile phones let people make calls on the move; now they have become another source of personal data, with location data collected by default on mobile devices. Lucie highlighted the benefits of this – “it means that you can access all sorts of services like online maps and content that takes into account your locality” – but noted that it has its downsides. “It also means that people potentially know where you are, and that’s both good and bad,” she suggested.

So while on the one hand we now have technology to send people notifications if they pass a shop that sold something they were looking for, we also see ads based on keywords we have written in our notes.

“I was writing some notes recently in a business meeting about blockchain, and then five minutes later there were all sorts of books and services being advertised to me about blockchain,” Lucie recalled. “In actual fact it was quite useful in that particular context, but it might not be, and there are obviously massive issues for security.” She pointed out the potentially life-threatening danger of a hacked autonomous vehicle.

Decisions based on algorithms

Concerns about data are about more than just hacking, though. Trust and perception play a big part, and part of Digital Catapult’s research is into public attitudes towards data. The Investigatory Powers Bill has attracted controversy as it passes into law, and Lucie expressed concern at the level of surveillance and the quantity of data that telecoms providers will now be required to keep. More generally, though, she believes the issue is a lack of understanding about the types of data that companies collect. Another important piece of upcoming legislation is the General Data Protection Regulation, which comes into force in 2018.

The GDPR will give people more power to question decisions made by algorithms based on their personal data. “On the most anodyne side of things, they’ll predict our shopping behaviours – it’s not really that much of a problem if you get offered a red pair of shoes instead of a black pair of shoes – but if you’re prevented from access to credit, or your health insurance is going to be more expensive because of data that you’ve provided… those are important decisions that are made about us and we’re going to have a right under the new legislation to challenge those decisions.”

Food labels and data receipts

One of the projects Digital Catapult is working on is a system of “food labels for privacy”. Like nutrition labels on food packaging, these small symbols will let people see at a glance what kind of data is being collected about them and what it will be used for. In the spirit of participatory design and empowering people when it comes to their data, the public will be asked to choose the five things they feel most warrant these labels, and even what the icons should look like.

“There’ll still be a link to the privacy policy,” Lucie assured. “But we’re trying to help organisations convey privacy policies in a much simpler way so that people don’t have to read something the length of Hamlet to understand what’s being done with their data.”

The other project which really excites Lucie is the idea of personal data receipts. These are currently being trialled at the Digital Catapult Centre, but Lucie hopes they will be adopted more widely.

Rather than signing a visitors’ book with a pen, visitors to the Digital Catapult Centre sign in electronically and provide basic information such as their name, company and email address. They are then automatically sent a personal data receipt telling them the type of data they provided – “we don’t say back to them what the data is, obviously, because that in itself would be a security breach” – and how Digital Catapult will use that data. Crucially, the receipt informs people how to have their data removed from Digital Catapult’s databases.
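The mechanism described above can be sketched in a few lines of code. This is a minimal illustration, not Digital Catapult’s actual implementation: the field names, stated purposes and contact address are all assumptions made for the example. The key design point from the article is preserved, though – the receipt names only the *types* of data provided, never the values, since echoing the values back could itself leak data.

```python
def data_receipt(submission: dict) -> str:
    """Build a receipt listing only the types of data a visitor provided.

    The values themselves are deliberately never repeated back, because
    sending them (e.g. to a mistyped email address) would be a breach.
    """
    # Hypothetical purposes for each field; a real system would draw
    # these from its privacy policy.
    purposes = {
        "name": "to identify you as a visitor",
        "company": "to understand who visits the centre",
        "email": "to send you this receipt",
    }
    lines = ["You provided the following types of data:"]
    for field in submission:
        note = purposes.get(field, "purpose available on request")
        lines.append(f"  - {field} ({note})")
    # Crucially, the receipt tells people how to have the data removed.
    lines.append("To have this data removed, contact privacy@example.org.")
    return "\n".join(lines)

receipt = data_receipt({"name": "A. Visitor",
                        "company": "Example Ltd",
                        "email": "a.visitor@example.org"})
print(receipt)
```

Note that the receipt mentions the `email` field but never the address `a.visitor@example.org` itself, mirroring the trial’s rule of reporting data types rather than data.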

Several people have taken up this offer to have their data wiped, and others have provided false email addresses because “they don’t want to give us the data in the first place!” Interestingly, one person gave a false email address which happened to be a real email address for somebody else, demonstrating why the receipts do not repeat the data itself, only the type of data provided.

Lucie stressed the advantages of sharing data, and believes that by making data collection and usage more transparent, and by giving people more control over their own data, data can benefit everybody. She concluded: “Choose what data you share. Do it in the knowledge of how it’s going to be used. Recognise that it can have some benefits and some risks.”