Citizen Tagger is a mobile platform that facilitates the tagging of audio-based chat-show content. Users can listen to a pre-recorded show or a live show (for example, one run using Citizen Radio), and add unstructured text or audio tags as they listen to the show.
This social tagging process can then be used to categorize the content of a chat show, helping future listeners find the information they are looking for more easily.
This project was inspired by previous work which has explored the role of audio-based approaches to annotate information that could be useful for low-literate users in resource-constrained settings. Social tagging (a community of users applying free-form tags to digital objects) was investigated as a way to give listeners of chat shows an additional role in knowledge production.
Citizen Tagger is Android-based and is supported by a Python Flask backend. The application prompts users to create tags at regular intervals while they listen to shows listed within the application (or while they take part in a community-run radio show hosted by a Citizen Radio user). Users can configure their tagging experience by changing the frequency of the tagging prompts and their preferred audio-tag length.
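The configurable prompting described above can be illustrated with a minimal sketch. The field names (`prompt_interval_s`, `audio_tag_length_s`) and default values are assumptions for illustration, not the application's actual configuration:

```python
from dataclasses import dataclass

@dataclass
class TaggingConfig:
    """User-configurable tagging preferences (field names are illustrative)."""
    prompt_interval_s: int = 120   # seconds between tagging prompts
    audio_tag_length_s: int = 15   # preferred maximum audio-tag length

def prompt_schedule(show_length_s: int, config: TaggingConfig) -> list:
    """Return the timestamps (in seconds) at which the app would prompt
    the listener to tag, spaced by the configured interval."""
    return list(range(config.prompt_interval_s, show_length_s + 1,
                      config.prompt_interval_s))

# A 10-minute show with the default 2-minute interval yields 5 prompts.
print(prompt_schedule(600, TaggingConfig()))
```

Changing `prompt_interval_s` directly trades prompt density against listening interruption, which is the trade-off the configuration exposes to users.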
As part of the prompts, users were encouraged to summarise the section of the show they had just listened to. Because the application allowed free-form tags, criteria for assessing tag quality were necessary. These included word frequency (analysing which words are used across the tag dataset), conciseness (tags that convey their message in fewer words are rated higher), and objectivity (how much of the user's own interpretation is present in the tag content).
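The three criteria can be sketched as simple scoring functions. This is an assumed implementation for illustration only: the subjective-word list is a hypothetical stand-in for a real subjectivity lexicon, and the scoring formulas are not taken from the project:

```python
from collections import Counter

# Illustrative stand-in for a proper subjectivity lexicon (assumption).
SUBJECTIVE_WORDS = {"i", "think", "feel", "love", "hate", "boring", "great"}

def word_frequencies(tags):
    """Word frequency across the whole tag dataset."""
    return Counter(word for tag in tags for word in tag.lower().split())

def conciseness(tag, max_words=20):
    """Shorter tags score higher: 1.0 for one word, falling linearly to 0."""
    n_words = len(tag.split())
    return max(0.0, 1.0 - (n_words - 1) / max_words)

def objectivity(tag):
    """Fraction of words NOT drawn from the subjective-word list."""
    words = tag.lower().split()
    if not words:
        return 0.0
    return sum(w not in SUBJECTIVE_WORDS for w in words) / len(words)
```

For example, the tag "i think maize prices rose" would score lower on objectivity than "maize prices rose", since two of its five words are in the subjective list.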
Through an iterative design process, Citizen Tagger underwent a rapid prototyping and testing phase over a four-week period. Users listened to and tagged chat shows and were interviewed afterwards. The application was improved using the observational notes, logs, and bug reports that emerged in this phase.
After the prototyping process, Citizen Tagger was further refined and deployed with 16 individuals, recruited through opportunistic sampling, who tagged a panel discussion. The experience of tag creation, via both manual tagging and tagging prompts, was assessed using usage statistics, the created tags, and other qualitative data. Questions of how to configure tagging-related parameters, and how to motivate users to create effective tags, were also investigated.
The findings indicate that users understood the tagging experience in subjective ways and expressed a desire to configure it to their own needs. In the tag quality analysis, audio tags proved more popular and allowed users greater expression.
The study also highlighted the balance between the effort needed to tag and the enjoyment of listening to the show: for many, tagging was a cognitively demanding task. The project explored these themes and suggested design implications for future work in this area.