Maintaining a user model in a user-adaptive system (UAS) may conflict with privacy laws and regulations and, more importantly, with user concerns about privacy. Since privacy laws consider data personal when it can be linked to an identifiable person, the solution often proposed is to enable users to remain anonymous to Web sites. The main contribution of this paper is to demonstrate that, contrary to what many UAS designers believe, personalization and anonymity are not antagonistic. Anonymity can even allow for a more frank interaction and more extensive information about the user, and hence a better basis for personalization.
The proposed reference architecture for pseudonymous and secure user modeling offers three main services: (1) encryption, provided by two function libraries, the Skunkworks application programming interface (SKAPI) and the secure knowledge query manipulation language (SKQML); (2) application-independent anonymization, provided by the KQMLmix framework, which implements the mix technique through a new "mix-it" performative added to the knowledge query manipulation language (KQML); and (3) selective access to the user model, provided by a hierarchical, role-based access control model.
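To make the anonymization service more concrete, the following is a minimal sketch of the Chaum-style mix technique that KQMLmix builds on: a mix node collects messages into a batch, strips sender identities, and forwards the batch in randomized order so that an observer cannot link incoming to outgoing messages. The class and message format here are illustrative assumptions, not the paper's actual KQMLmix API, and real layered public-key encryption is replaced by a toy tuple unwrapping.

```python
import random

class MixNode:
    """Illustrative mix node (assumed names, not the KQMLmix API).

    Collects messages until a full batch is available, then shuffles and
    forwards them all at once, breaking the linkage between arrival order
    and departure order. A real mix would also decrypt one public-key
    layer per message here.
    """

    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.pool = []        # messages waiting for the batch to fill
        self.delivered = []   # (next_hop, payload) pairs sent onward

    def accept(self, next_hop, payload):
        # Note: the sender's identity is never stored or forwarded.
        self.pool.append((next_hop, payload))
        if len(self.pool) >= self.batch_size:
            self.flush()

    def flush(self):
        random.shuffle(self.pool)  # randomize order to defeat traffic analysis
        self.delivered.extend(self.pool)
        self.pool.clear()

mix = MixNode(batch_size=3)
for sender, query in [("alice", "q1"), ("bob", "q2"), ("carol", "q3")]:
    # The user modeling server receives the queries but not who sent them.
    mix.accept("user-model-server", query)
```

After the third message arrives, the batch is flushed: all three queries reach the server, in randomized order and without sender identities, which is the essential property the "mix-it" performative exposes to KQML-based user modeling clients.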
This architecture is generic and flexible enough to allow a specific UAS to select the components it needs according to its privacy requirements. The only demand on the internal design of a UAS is that communication between the user modeling clients and servers be carried out using KQML.
The first, introductory part of this paper clearly presents the concepts of secrecy, anonymity, and pseudonymity that a non-expert in security needs in order to understand the proposed solution. A final analysis of potential privacy threats to the architecture and possible countermeasures, promising extensions, and obstacles to its deployment in practice provides a better understanding of the architecture's scope and limitations.