The Not So Private Personal Informatics

The language used around HCI research into digital health and wellbeing monitors, trackers and coaches reinforces the idea of information ownership by the data subjects themselves. We read about personal informatics, quantified self, life logging, self-tracking, and personal enhancement.

These labels suggest an intimacy and a sense of possession. Are we to believe that personal informatics exemplifies Suchman’s complementary machines and humans, where the outcome is relational, situational and changes over time? People’s capacity to act is reconfigured as they interact, but is their agency being thwarted by a lack of awareness?

Neither an individual view nor a peer-to-peer view seems to capture the richness of relationships, the variety of motivations, or the range of use. As individuals interact with their technology and social clusters, and as they experience, curate and share their lived data, we need a broader perspective (Kuutti, 1996) that examines how the other parties in the data universe interact. The situation’s boundaries are wider than the person and their apps.

Personal informatics extends beyond the individuals themselves and their own social clusters. The data have value as assets to the organisation providing the services; they also have value to other organisations that might want to use them for legitimate or improper purposes; and finally they have societal value as interpreted by local and national government, regulators and other bodies (Watson and Leach, 2010).

Individuals have little knowledge of the ways their data might be used, or of how and when this might affect them negatively as the data flow through these different parties. It is not so much ambiguity in explanation as complete hopelessness in understanding and control, despite changing legislation and regulation and the existence of privacy notices. The situation they believe they are in is far removed from reality: the plans exist, and the individuals are not in control.

If citizens cannot achieve understanding and insight into what is happening in personal informatics and the trade-offs being made (Pirolli and Russell, 2011), what hope is there that they can take intelligent action? Personal informatics is a long way from individual sensemaking; currently it is more akin to data collection sensors for exploitation by other parties – individuals as instrumentation.

Fortunately, the individual-centric research view is changing, with research into the social motivations for these technologies, such as understanding social contexts and practices. Our models, methods and techniques need to take in the wider picture to understand what is happening and what the side-effects are, at both individual and societal levels. In the meantime, change the language from Personal Informatics – it is Exposure Informatics.


Chris Elsden, David S. Kirk, Abigail C. Durrant. 2016. A Quantified Past: Toward Design for Remembering With Personal Informatics. Human-Computer Interaction.

Kari Kuutti. 1996. Activity Theory as a Potential Framework for Human-Computer Interaction Research. Context and Consciousness: Activity Theory and Human Computer Interaction, MIT, Massachusetts, USA.

Peter Pirolli, Daniel M Russell. 2011. Introduction to this Special Issue on Sensemaking. Human–Computer Interaction 26, 1–2: 1–8.

Colin Watson, John Leach. 2010. The Privacy Dividend: The Business Case for Investing in Proactive Privacy Protection. UK Information Commissioner’s Office.


Author’s own. Cyclists participating in Sky Ride London 2010.

From Yesterday’s Ubiquitous Computing to the ethics and security of the Internet of Things

This week, I read “Yesterday’s tomorrows: Notes on ubiquitous computing’s dominant vision” [2]. The authors explored the contemporary practice of ubiquitous computing in three different countries through cross-cultural investigation, and proposed three main arguments about ubiquitous computing “in the wild”.


The first is that Ubicomp research should focus more on the present and on everyday life, rather than on prototyping tomorrow’s technology from a purely pervasive-computing perspective [2]. We should not treat Weiser’s vision as the single driving factor in research and practice. If we look at the electronic products around us, ubiquitous computing already exists in our everyday life. It has simply not taken the form we initially thought it would and continued to conjure up in our visions of tomorrow.


The second argument is that Ubicomp technology and practice “in the wild” are deeply shaped by society, culture and religion, because technology is a site of social and cultural production [2]. Bell and Dourish argued that Western culture significantly influenced Ubicomp’s initial conception; contemporary practices such as Disney’s Tomorrowland theme park are one example. Meanwhile, the “Ubicomp of the present” has developed a variety of applications and purposes that differ according to individual cultures and countries. The authors presented case studies from Singapore and South Korea. For instance, the smart card ticketing system provides a platform for residents to travel seamlessly between different public transportation systems [2]. All transportation data is aggregated into a centralised data centre, and the system can dynamically adjust the traffic congestion fee depending on the real-time traffic situation.

However, most Asian countries have deeply collectivist cultures and customs [1]. People use ubiquitous technology not only to meet their own personal needs and desires, but also to adapt to the cultural environment [2]. The government of Singapore uses censorship to regulate access to political, religious and pornographic online content. This practice is commonly seen as opposed to the widely accepted free-speech values of most Western countries. In this scenario, however, the use of ubiquitous information technology has to adapt to the cultural and political context.
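The dynamic congestion pricing mentioned above can be sketched in a few lines. This is a minimal illustration only; the thresholds and multipliers are hypothetical and do not reflect Singapore’s actual road-pricing tariff:

```python
def congestion_fee(base_fee: float, occupancy: float) -> float:
    """Scale a base road-pricing fee by real-time congestion.

    occupancy: fraction of road capacity currently in use (0.0-1.0).
    The bands below are illustrative assumptions, not a real tariff.
    """
    if occupancy < 0.5:    # free-flowing traffic: no surcharge
        return base_fee
    elif occupancy < 0.8:  # moderate congestion: 50% surcharge
        return base_fee * 1.5
    else:                  # heavy congestion: double the fee
        return base_fee * 2.0

print(congestion_fee(2.0, 0.3))  # 2.0
print(congestion_fee(2.0, 0.9))  # 4.0
```

The centralised data centre makes this possible: because all journeys feed one system, the fee can respond to the city-wide traffic picture rather than a single road.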

The third argument is the availability of infrastructure. The authors noted that infrastructure influences the form ubiquitous computing practice takes in different countries. Such factors include mobile network availability, electric power supply, and so on.

Of late, the Internet of Things (IoT) has become an indispensable part of the new era of ubiquitous computing. In addition to the practical challenges of everyday life, social culture and infrastructure, I also want to discuss a few problems and risks that arise from rapid, globalised product iteration and development and the worldwide delivery of services and goods. Globalised e-commerce provides a platform for companies to develop products and services in one country and distribute them worldwide. This raises concerns about the privacy, safety and ethics of the data collection, storage and use of IoT products, as I will now discuss:

Privacy and Ethics

Many IoT products collect personal data from our daily lives. Some of this information, such as sexual activity or medical history, is extremely sensitive. How do we know the company will manage those data properly? Will they conduct research on the data without our informed consent? When companies supply these products and services from a different country, there are fewer opportunities for the consumer to query or investigate such concerns. Another ethical issue is pervasive data collection. Ubicomp products are becoming ever more invisible, and when we walk into a Ubicomp environment, our biometric data (face, gait, heartbeat, and so on) might be collected without our consent.

Image Source: [4]

Data storage and use

If the collected information is stored in another country, have the companies encrypted the privacy-related data? How do they manage backups of customer data? In a country with weaker security and privacy awareness and regulation, the data might be at high risk of leakage or theft.
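As a minimal sketch of one protective measure, identifiers can be pseudonymised with a keyed hash before storage, so a leaked database does not directly expose who the records belong to. The key name and record fields here are hypothetical, and this is illustrative only, not a complete data-protection scheme:

```python
import hmac
import hashlib

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a personal identifier with a keyed hash before storage.

    Without the secret key, the stored value cannot be linked back to the
    person; with it, the same identifier always maps to the same pseudonym,
    so records can still be joined. Real deployments would also need
    encryption at rest, key management, and access controls.
    """
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()

key = b"server-side-secret"  # hypothetical key, kept out of the database
record = {"user": pseudonymise("alice@example.com", key), "steps": 9214}
```

Whether a vendor in another jurisdiction applies even this much is exactly the question the consumer cannot easily answer.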

Safety and Security

Wifi-connected household appliances are becoming increasingly popular; turning a kettle on or off remotely is no longer a difficult task. But if these products are part of an insecure system, they run the risk of being compromised by a hacker, who could for instance turn on a kettle with no water in it.
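One basic defence is to require that remote commands be authenticated, for example with a message authentication code over a shared secret, so a forged “switch on” message is rejected. The device name, command format and key in this sketch are hypothetical:

```python
import hmac
import hashlib

SHARED_SECRET = b"paired-at-setup"  # hypothetical key exchanged when the kettle is paired

def sign_command(command: str) -> str:
    """The controlling app attaches a MAC to every command it sends."""
    return hmac.new(SHARED_SECRET, command.encode(), hashlib.sha256).hexdigest()

def kettle_accepts(command: str, signature: str) -> bool:
    """The appliance only acts on commands carrying a valid MAC."""
    expected = sign_command(command)
    return hmac.compare_digest(expected, signature)

legit = sign_command("kettle:on")
assert kettle_accepts("kettle:on", legit)         # paired app: accepted
assert not kettle_accepts("kettle:on", "f" * 64)  # forged request: rejected
```

On its own this does not stop replay attacks or protect the secret on a compromised device; the point is simply that an unauthenticated command channel is the kind of insecurity that invites the kettle scenario above.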

Image Source:  [3]

As mentioned earlier, I agree that the application of ubiquitous technology should focus on the everyday life context, taking into consideration the cultures, laws and customs of different countries. We should also consider privacy, ethics and safety issues when we carry out research into product design and development.

[1]         CUI, Y., CHIPCHASE, J., and ICHIKAWA, F., 2007. A cross culture study on phone carrying and physical personalization. Usability and Internationalization. HCI and Culture, 483-492.

[2]         BELL, G. and DOURISH, P., 2007. Yesterday’s tomorrows: notes on ubiquitous computing’s dominant vision. Personal and Ubiquitous Computing 11, 2, 133-143.

[3]        CAVE, H., 2017. HardSploit: A Framework To Audit IoT Devices Security.

[4]        SCHEDULE, S., 2017. Rating the IoT: How Do We Test Consumer Privacy?


Building a city of ethical conundrums

Doing research is difficult, and no amount of training will prepare us for every potential ethical question or incident in the field. While applying for ethics approval from the university is supposed to help you think about potential issues that may arise in your research, the process doesn’t always make you think about all the little details, the small things that can happen when doing fieldwork.

Looking at ethics as a constant conversation you are having with yourself, your supervisors, your colleagues, and maybe even the ethics board helps you address the conundrums that come up through the process; the invisible questions.

These conversations are hard to have with colleagues or supervisors, let alone the official ethics board of the institution, as many of the issues that come up can be very personal and complex. On top of this, safe, judgement-free spaces to talk about these types of issues openly also seem to be lacking.

A difficult topic to talk about

To address this issue, three PhD researchers from the HighWire Centre for Doctoral Training at Lancaster University and the Centre for Doctoral Training in Digital Civics at Newcastle University put together a workshop for other PhD researchers studying in Centres for Doctoral Training (CDTs) within the Digital Economy Network (DEN) at the annual DEN summer school, hosted this year by Open Lab at Newcastle University in July 2016.

This workshop was an opportunity for researchers to have a safe space to address any ethical questions, conundrums, or concerns that they may have come across in their work so far, or are worried they may come across in any of their future work.

Since we knew this was going to be a difficult topic to talk about, we addressed it in a serious but fun and creative way: we built a city of ethical conundrums.

Using ideas from anarchist and critical pedagogies where embodiment, creativity, reflexivity, communication, and collaboration are important, we came up with the idea of creating a common language among participants to talk about these personal concerns.

The workshop started with a short activity to get to know one another, and a longer conversation on the importance of safe spaces including how we were going to make sure our workshop was a safe space. After this, there was a short period of individual reflection where participants created unique pieces of art to represent their own ethical concerns in silence. After sharing these with the rest of the group, the building of the city began.

Starting with simple language and concerns, we used black building blocks and markers to document the conversations that came out of the individual presentations. To address invisible issues that arise throughout the research process we added little clay ghosts, and to further complicate the conversations we ended up building in some Lego figurines to populate our city of ethical conundrums. After all these conversations, we tied balloons to the ghosts and came up with strategies of addressing these invisible issues.

Learning from one another’s concerns


At the end of the day, we ceremoniously popped these balloons to let the glitter inside them fall onto the hostile-looking city we had built throughout the day. This made the city prettier and shinier, adding to the metaphor: while the kind of research we do in sensitive settings is difficult and at times may look hostile, when we talk about and address these concerns, the research becomes less hostile and more beautiful.

This workshop helped us learn from one another’s concerns and allowed us to address many difficult issues in a safe environment. It was a great opportunity to exchange experiences, reflect on ethical implications of previous, current, and future projects, and to engage in discussions around these concerns. We were able to map commonalities between the different research projects we discussed in the workshop, and to see that while all participants were working in very different environments, many of the ethical concerns appeared in all of our work.

The safe space we created through the workshop allowed us to address both very detailed and unique concerns, but also broader ethical issues we see in academic research as a whole.

This article was originally published on 3 August 2016. For more information please contact Angelika Strohmayer.