PRiMMA: Privacy Rights Management for Mobile Applications
EPSRC Funded, Starts April 2008
The Open University: Bashar Nuseibeh, Arosha Bandara, Blaine Price, Adam Joinson, and Yvonne Rogers
Imperial College London: Morris Sloman, Emil Lupu, Naranker Dulay, Alessandra Russo
As part of IBM Open Collaborative Research Initiative:
IBM TJ Watson Research Laboratory: Seraphin Calo, Jorge Lobo
Purdue University: Elisa Bertino
Carnegie Mellon University: Lorrie Cranor
Objectives and Research Issues
The age of Ubiquitous Computing is fast approaching: most people in the UK over the age of 8 carry mobile phones, which are becoming increasingly sophisticated interactive computing devices. Location-based services are also growing in popularity and sophistication, and many tracking and monitoring devices are being developed with a range of potential applications, from supporting mobile learning to remote health monitoring of the elderly and chronically ill. But do users actually understand how much of their personal information is being shared with others? A recently released report from the UK Information Commissioner warned that the UK in particular is 'sleepwalking into a surveillance society', as ordinary members of the public give up vast amounts of personal information with no significant personal or societal advantage gained. In general, there is a trade-off between the usefulness of disclosing private information and the risk of its misuse. This project will investigate techniques for protecting the private information typically generated by ubiquitous computing applications from malicious or accidental misuse.
The project will investigate privacy requirements across the general population for a specific set of ubiquitous computing technologies. These requirements will be used to produce a Privacy Rights Management (PRM) framework that enables users to specify and visualize their privacy preferences, learns their likely preferences from their behaviour, and enforces the resulting privacy policies. We will make use of a large cohort of over 1000 OU students with a broad range of ages and backgrounds, both for identifying requirements and for evaluating tools for privacy management.
The overall objective of the proposed project is to determine how users perceive privacy issues related to the information they will generate in pervasive systems, and to develop a Privacy Rights Management (PRM) system that enables them to specify privacy controls which the system will enforce. Evaluation of novel interfaces typically involves small numbers of users (usually computer science or psychology undergraduates) who have been trained on the experimenter's equipment and perform a lab-based or other brief evaluation. Our work will move the evaluation of novel interfaces to the next level by allowing users to use their own equipment (mobile phones) to perform real-world tasks over a period of weeks. Unlike the narrow demographics of most HCI studies, our work will involve larger numbers of users across a demographic that is representative of society. Although we intend to produce a Digital Study Assistant as a demonstrator, our aim is to determine users' perceptions of privacy and to develop a PRM system for a broader range of applications, drawing additional requirements and capabilities for PRM from our US academic and industrial collaborators. Our more specific objectives are:
- To use requirements elicitation and analysis techniques to determine how potential users perceive the privacy of the information they will generate, what they would like to specify about how the information is used, what reassurance the system should offer with respect to privacy, and how they expect the system to manage their privacy.
- To evaluate techniques such as data encapsulation within PRM code, encapsulating policy statements with data, context anonymisation, and pseudonymity mechanisms, to determine their suitability for specific types of applications and data.
- To develop a generic PRM toolkit to allow people to specify and visualize their required privacy related to the information they generate, and to transform this into policies and programs that control the usage of the information.
- To develop tools to monitor how people actually manage privacy in various applications, in order to determine whether the privacy settings they use correspond with their stated perceptions about privacy.
- To develop tools and techniques for automatically learning privacy policies based on context information and people’s specified privacy requirements. As part of this effort we will also develop techniques for analysing privacy policies so that people can be made aware of the consequences of, and potential inconsistencies in, their privacy requirements.
- To perform large-scale evaluations across a wide demographic from OU students to validate the usability and performance of the PRM toolkit.
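One of the techniques listed above, encapsulating policy statements with data (often called a "sticky policy" approach), can be sketched as follows. This is a minimal illustration only; the class names, policy fields, and enforcement check are our assumptions for exposition, not part of the project's actual PRM design.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class StickyPolicy:
    # Hypothetical policy fields; a real PRM policy would be far richer.
    allowed_recipients: set
    allowed_purposes: set
    expires: datetime

@dataclass
class ProtectedDatum:
    payload: object       # e.g. a location fix from the user's phone
    policy: StickyPolicy  # the policy travels with the data wherever it goes

    def release(self, recipient: str, purpose: str):
        """Release the payload only if the attached policy permits it."""
        p = self.policy
        if datetime.now() > p.expires:
            raise PermissionError("policy expired")
        if recipient not in p.allowed_recipients:
            raise PermissionError(f"recipient {recipient!r} not authorised")
        if purpose not in p.allowed_purposes:
            raise PermissionError(f"purpose {purpose!r} not permitted")
        return self.payload

# Example: a location reading that only a tutor may see, for study support.
datum = ProtectedDatum(
    payload={"lat": 52.03, "lon": -0.71},
    policy=StickyPolicy(
        allowed_recipients={"tutor"},
        allowed_purposes={"study-support"},
        expires=datetime.now() + timedelta(hours=1),
    ),
)
```

Because the policy is bound to the datum itself rather than held in a central database, any component that receives the datum can (and must) re-check the policy before disclosure.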
We will develop the context and location-aware Digital Study Assistant application described in the introduction for use by a large cohort of OU students. This application will integrate our PRM framework and will allow monitoring and evaluation of our approach.
The research issues and questions related to the above objectives include:
- Determining how users perceive privacy of information they generate. Who will they share it with? What sort of controls do they want over the information? Do they understand the consequences of sharing a particular piece of information? How does privacy relate to the demographic of the user?
- What is the granularity of context information that users are willing to divulge in the different contexts of work, learning, and play? How do privacy requirements change between individual and group contexts?
- How can a PRM system be designed so that it is easy to use on devices with simple interfaces, such as mobile phones, as well as on large-screen devices?
- What mechanisms are needed to automate the control of privacy and how should these be distributed between mobile devices and the infrastructure?
- Can we predict the privacy requirements over a range of users from monitored information and how do these change over time?
- Can we detect and resolve inconsistencies in users’ privacy requirements? In the cases where automated resolution of an inconsistency is not possible, how do we present this information to the user in a useful manner?
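The kind of inconsistency check raised in the last question above can be illustrated with a deliberately simple model: two rules conflict when one permits and the other denies the same disclosure (recipient and data type) in at least one shared context. The rule structure here is an illustrative assumption, not the project's policy language.

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass(frozen=True)
class Rule:
    effect: str          # "permit" or "deny"
    recipient: str       # who receives the data, e.g. "tutor"
    data_type: str       # what is shared, e.g. "location"
    contexts: frozenset  # contexts in which the rule applies

def conflicts(rules):
    """Return pairs of rules that permit and deny the same disclosure
    in at least one common context."""
    found = []
    for a, b in combinations(rules, 2):
        if (a.effect != b.effect
                and a.recipient == b.recipient
                and a.data_type == b.data_type
                and a.contexts & b.contexts):  # overlapping contexts
            found.append((a, b))
    return found

rules = [
    Rule("permit", "tutor", "location", frozenset({"study"})),
    Rule("deny",   "tutor", "location", frozenset({"study", "home"})),
    Rule("permit", "peer",  "location", frozenset({"study"})),
]
```

Here the first two rules conflict in the "study" context. In practice such a detector would be only the first step: the harder problems, as noted above, are resolving the conflict automatically where possible and presenting unresolvable conflicts to the user in an intelligible way.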
The Imperial College work will focus on privacy policy specification, learning and analysis as well as techniques for enforcing the policies in ubiquitous systems within a PRM framework.
The project is associated with the IBM Open Collaborative Research Initiative.

