The Data Protection Design & Privacy-Enhancing Design Manifesto
and how data protection law needs to evolve to tackle new challenges
In today's edition of the newsletter, I will propose and explain a new discipline: Data Protection Design, which bridges the worlds of Data Protection Law and UX Design and makes Privacy-Enhancing Design the gold standard of UX design. As I explain below, this is a much-needed next step for data protection legislation and data protection practices around the world. I hope to involve other privacy and data protection specialists, UX designers, product managers, lawmakers, and anyone interested in building better privacy and data protection practices. I invite you to read the proposal below and join the conversation.
A) WHY DATA PROTECTION DESIGN?
The ubiquity of dark patterns (deceptive design) in data protection and the inadequacy of privacy policies and written privacy notices as transparency tools in the online environment make it clear that the current legal data protection framework is incomplete. By that I mean that unless we embrace UX design as an essential component of data protection, a very meaningful part of the interaction between users and organizations will remain out of reach. As a result:
Deceptive design practices that impact personal data and fundamental values will continue flourishing;
Users will continue to be uninformed about their rights and about the data practices and privacy risks involved in their online activities (as they do not read privacy policies and should not be expected to);
Privacy principles and goals will continue not to be an essential part of an organization's business model, value proposition, and UX design strategy. (But organizations will keep marketing that they care for privacy because privacy sells.)
B) DATA PROTECTION DESIGN - A NEW DISCIPLINE
I am proposing here that this must change. The worlds of Data Protection Law and UX Design must be bridged, and Data Protection Design must become a discipline in itself, focused on:
Following privacy-enhancing design practices;
Translating legal data protection principles and rules into privacy-enhancing UX design practices;
Developing, through an iterative process, the principles, goals, rules, and tools that UX designers and product managers should follow to implement privacy-enhancing design;
Developing best transparency practices that help organizations publicize their UX design practices (and be held accountable when they do not follow privacy-enhancing design principles and practices);
Applying design thinking and design methods to the implementation of privacy-enhancing design;
Involving users and users' perspectives in the implementation of privacy-enhancing design;
Establishing the role of Data Protection Designers and Data Protection Design Officers (DPDOs) within an organization, and defining the protocols and best practices they should follow.
C) PRIVACY-ENHANCING DESIGN - THE 7 PRINCIPLES
The 7 principles (or heuristics, as UX designers prefer) of Privacy-Enhancing Design are:
Autonomy and Human Dignity are Central. User autonomy and human dignity are fundamental rights and must be respected throughout the UX design. The design must allow users to exercise their choices and preferences freely, autonomously, and in an informed way; users should not be pushed or forced to take a certain action, and they should be able to easily retract any choice or preference.
Transparency. UX design practices should foster transparency and accessibility so that users are aware of ongoing data transactions. Every new data transaction (collection or processing) should be clearly signaled in an accessible way, so that users can see that data is being exchanged and are made aware that their personal data is being collected or processed. Symbols, colors, and other design features may be used to convey this information.
No Previous Data Protection Knowledge. UX design should presuppose that users have no background knowledge of data protection. Interfaces that involve data collection and processing should be clear and accessible, with simple, user-friendly indications of the scope and extent of the data transaction, including possible risks (even when these seem obvious to the designer).
Acknowledgment of Cognitive Biases. Cognitive biases must be broadly recognized and acknowledged. The exploitation of cognitive biases to collect more - or more sensitive - personal data (i.e., through dark patterns in data protection) must be rooted out throughout the UX design process. Users should be treated as vulnerable and manipulable, and it is the organization's responsibility to shield them from manipulation.
Burden on Organizations. Organizations are responsible for designing UX interfaces that do not exploit users' cognitive biases, and they should be able to prove - at any time - that their UX design practices are privacy-enhancing (and not privacy-harming). If users are making errors, it is the organization's responsibility to detect and correct the design practice fostering those errors.
Holistic Implementation. The principles above must be implemented throughout the UX design and be present in every interaction between users and organizations (i.e., not restricted to privacy settings). Privacy and data protection should be made an integral part of the interaction between the organization and the user.
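To make the principles above more concrete for designers and developers, here is a minimal sketch (all names are hypothetical, not part of any existing framework or library) of how the autonomy and transparency principles could shape application code: every data transaction must be explicitly signaled to the user before any consent takes effect, nothing is collected without an opt-in, and revoking a choice is exactly as easy as granting it.

```typescript
// Hypothetical sketch of a privacy-enhancing consent registry.
// Principle 1 (autonomy): consent is opt-in and revocable with one call.
// Principle 2 (transparency): every grant triggers a visible notice.

type DataTransaction = { purpose: string; data: string[] };

class ConsentRegistry {
  private consents = new Map<string, DataTransaction>();

  // The signal callback is supplied by the UI layer and is expected
  // to render an accessible, clearly visible notice to the user.
  constructor(private signal: (tx: DataTransaction) => void) {}

  // Nothing is collected until the user agrees; the transaction is
  // signaled at the moment consent is granted.
  grant(id: string, tx: DataTransaction): void {
    this.signal(tx);
    this.consents.set(id, tx);
  }

  // Retracting a choice is a single call - no harder than granting it.
  revoke(id: string): void {
    this.consents.delete(id);
  }

  isAllowed(id: string): boolean {
    return this.consents.has(id);
  }
}

// Usage: the callback stands in for an on-screen indicator.
const notices: string[] = [];
const registry = new ConsentRegistry(tx =>
  notices.push(`Collecting ${tx.data.join(", ")} for ${tx.purpose}`)
);
registry.grant("analytics", { purpose: "usage analytics", data: ["page views"] });
registry.revoke("analytics");
```

The design choice to route every grant through a mandatory notice callback means transparency cannot be skipped by individual features; it is structural, which is the point of the holistic implementation principle.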
D) NEXT STEPS
At this point, it would be very important to set practical standards, rules, and best practices for Privacy-Enhancing Design.
In this post, I ran an exercise to show what Privacy-Enhancing Design would look like in practice: I imagined a hypothetical Facebook "user post" interface that follows strongly privacy-enhancing premises. My goal with that exercise was to show that change towards more privacy is possible and can be achieved through UX design, and that some of the premises embedded in Facebook's current "user post" interface do not reflect a privacy-enhancing framework. In any case, how the version with improved premises would be implemented in practice is up to the data protection designer in charge (and aspects such as usability and the fluidity of the experience should also be considered).
I am a lawyer, not a designer; insights from UX designers on the best tools and design practices for implementing privacy-enhancing design will therefore be extremely welcome and helpful. If you are a designer and have suggestions, I invite you to read item "E" below (Join the Conversation) and get in touch.
The practical implementation of privacy-enhancing design should be a task performed by a multidisciplinary group involving data protection lawyers, data protection designers, and product managers. Knowledge of both the legal and the UX design aspects of Data Protection Design and Privacy-Enhancing Design (as described supra) is required.
As you might have noticed (especially in case you have a legal background), I am purposefully refraining from using terms such as "data subject" and "controller" or "data processor" and using "user" and "organization" instead. This is to facilitate joint communication between legal specialists and designers.
I will continue writing about Data Protection Design and Privacy-Enhancing Design in the next editions of the newsletter, so stay tuned.
E) JOIN THE CONVERSATION
Building a new discipline and revolutionizing data protection law is not a mission for a single person. I invite everyone interested to join the conversation and this "manifesto." You can start by reading the previous posts of this newsletter, where I explained dark patterns in data protection, the cognitive biases that dark patterns exploit, transparency-by-design as one of the theoretical goals of privacy-enhancing design, and how to implement meaningful choices online. These posts will give you a foundation in the most relevant issues and challenges that motivated me to propose Data Protection Design as a discipline and Privacy-Enhancing Design as a UX design framework.
If you have a little more time available, you can read my academic articles on Dark Patterns in Data Protection and on Transparency-by-Design, which are part of my ongoing Ph.D. at Tel Aviv University (supervised by Prof. Michael Birnhack and Prof. Eran Toch, to whom I am extremely grateful). They are publicly available on SSRN, where you can find the full theoretical background of the topics I am discussing here.
There is a lot of new research coming out on dark patterns / deceptive design, especially from the Human-Computer Interaction perspective, but also from legal scholars. You can find it online in public repositories and on SSRN.
The European Union recently published the "Behavioural study on unfair commercial practices in the digital environment. Dark patterns and manipulative personalisation: final report." It discusses dark patterns extensively (on pages 32-33, it quotes and uses the taxonomy I proposed for dark patterns in personal data collection), signaling that lawmakers, and not only data protection researchers, recognize the need to tackle unfair design.
See you next week. All the best, Luiza Jarovsky