Continuing last week's discussion about Data Protection Design as a new discipline, this week I would like to raise the topic of Privacy-Enhancing Design and its technical role: a framework of heuristics and practical UX design guidelines that aims to translate data protection law principles and rules into UX practices.
A) WHAT IS PRIVACY-ENHANCING DESIGN? WHAT IS IT FOR?
Privacy-Enhancing Design - or PED - is a framework I am proposing to offer legally grounded, practical guidelines that help UX designers implement data protection law principles and rules in their interfaces.
Privacy-Enhancing Design aims to tackle unfair and deceptive design practices in the context of data protection (such as dark patterns) by empowering UX designers and educating organizations about the positive and transformative impact UX design can have on privacy protection.
The main goals of Privacy-Enhancing Design are:
To serve as a shared language and body of knowledge for data protection lawyers, product managers, and data protection designers (UX designers who decide to specialize in privacy-enhancing design) within an organization regarding the implementation of data protection law principles and rules through UX design;
To offer practical guidance to data protection designers on how to implement data protection law principles and rules in any online interface;
To help the data protection design officer (DPDO) and their team of data protection designers produce an organization's Privacy Design Policy (explained below);
To serve as an accountability parameter for privacy advocates or anyone interested in questioning an organization's privacy-enhancing UX design practices.
B) WHAT ABOUT PET, TET, AND PRIVACY BY DESIGN? DON'T THEY COVER THAT ALREADY?
Privacy-Enhancing Design can be seen as the cousin of Privacy-Enhancing Technologies (PETs) and Transparency-Enhancing Technologies (TETs). PETs are "technologies that are designed for supporting privacy and data protection", and TETs "aim at reducing (...) information asymmetry by providing users with information regarding providers' data collection, analysis, and usage." Privacy-Enhancing Design, on the other hand, is a set of heuristics and practical design guidelines directed at UX design professionals, aiming to translate data protection law principles and rules into UX practices. Therefore, PETs and TETs are groups of technologies, while Privacy-Enhancing Design is a set of heuristics and design practices.
Regarding the relationship between Privacy-Enhancing Design and Privacy by Design (PbD): despite the similar names, they are not the same (and not as similar as they might seem). Privacy-Enhancing Design can be seen as a continuation and an advancement of PbD. First, Privacy-Enhancing Design is directed specifically at UX design practice, aiming to tackle the unfair and deceptive design practices that are ubiquitous in the data protection context today. PbD, first published in 2009, is broader and more general and, in the view of this author, insufficient to solve some of the issues data protection law is facing. Second, the utmost goal of Privacy-Enhancing Design is the development of extensive practical UX design guidelines and sets of practices that help implement data protection law principles and rules (and protect the user) in real online interfaces. PbD, on the other hand, targets the implementation of its seven foundational principles in a more general sense; it does not aim to become specific to any discipline or to be itemized into practical guidelines.
C) WHERE DOES IT COME FROM? WHAT ARE ITS MAIN PRINCIPLES?
An important idea behind Privacy-Enhancing Design is that users are vulnerable, manipulable, and easily influenced by cognitive biases. UX designers can maliciously exploit cognitive biases through deceptive design (i.e., dark patterns), negatively affecting user privacy. Privacy-Enhancing Design proposes that UX designers must acknowledge the existence of cognitive biases and human errors, create interfaces that respect user autonomy, and prioritize choices that preserve user privacy.
A privacy-enhancing UX design practice is a UX practice that acknowledges cognitive biases and human errors, respects user autonomy, and prioritizes choices that preserve user privacy.
To correctly implement Privacy-Enhancing Design, UX designers (and product managers) must have some understanding of privacy and data protection law. In my view, Data Protection Design and the implementation of Privacy-Enhancing Design form a new discipline that has the potential to radically transform data protection law: it is the bridge between data protection law and UX design. From my point of view, it should be taught at design and business schools (and also in law schools, as part of data protection law).
The 7 principles (or heuristics, as UX designers prefer) of Privacy-Enhancing Design are:
Autonomy and Human Dignity are Central. User autonomy and human dignity are fundamental rights and must be respected throughout the UX design. The UX design must allow users to exercise their choices and preferences freely, autonomously, and in an informed way. Users should not be pushed or forced to take a particular action, and they should be able to easily retract any choice or preference.
Transparency. UX design practices should foster transparency and accessibility so that users are aware of ongoing data transactions. Every new data transaction (collection or processing) should be clearly signaled in an accessible way, so that users realize their personal data is being collected or processed. Symbols, colors, and other design features can be used to convey this information (see the sketch after this list of principles).
No Previous Data Protection Knowledge. UX design should presuppose that users have no background knowledge of data protection. Interfaces that involve data collection and processing should be clear and accessible, with simple and user-friendly indications of the scope and extent of the data transaction, including possible risks (even if these seem obvious to the designer).
Acknowledgment of Cognitive Biases. Cognitive biases must be broadly recognized and acknowledged. The exploitation of cognitive biases to collect more - or more sensitive - personal data (i.e., through dark patterns in data protection) must be stopped throughout the UX design process. Users should be seen as vulnerable and manipulable, and it is the organization's responsibility to shield them from manipulation.
Burden on Organizations. Organizations are responsible for designing UX interfaces that do not exploit users' cognitive biases. Organizations should be able to prove - at any time - that their UX design practices are privacy-enhancing (and not privacy-harming). If users are committing errors, it is the organization's responsibility to detect and correct the design practice fostering those errors.
Design Accountability. Organizations should be held accountable for their design practices. Organizations should publish their privacy design practices (perhaps through a Privacy Design Policy, similar to a Privacy Policy but focused on UX design practices). It should be possible to legally question an organization's UX design practices.
Holistic Implementation. The principles above must be implemented throughout the UX design and be present in every interaction between users and organizations (i.e., not restricted to privacy settings). Privacy and data protection should be made an integral part of the interaction between the organization and the user.
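To make the Transparency principle more tangible for the design and development team, below is a minimal sketch, in TypeScript, of how an interface could signal every data transaction to the user. All names here (DataTransaction, signalDataTransaction, and so on) are hypothetical illustrations of the heuristic, not an established API or a prescribed implementation:

```typescript
// Minimal sketch of a "data transaction signal" supporting the Transparency
// principle. All names are hypothetical illustrations, not a real library.

type DataTransaction = {
  kind: "collection" | "processing";
  dataCategory: string; // e.g., "location", "contacts"
  purpose: string;      // plain-language purpose shown to the user
};

type Listener = (t: DataTransaction) => void;

const listeners: Listener[] = [];

// UI components subscribe so they can surface a visible indicator
// (icon, banner, color change) whenever data is exchanged.
function onDataTransaction(listener: Listener): void {
  listeners.push(listener);
}

// Every feature that collects or processes personal data calls this,
// so that no data transaction happens silently.
function signalDataTransaction(t: DataTransaction): void {
  listeners.forEach((notify) => notify(t));
}

// Example: a location feature announces its collection before it runs.
onDataTransaction((t) =>
  console.log(`Indicator shown: ${t.kind} of ${t.dataCategory} ("${t.purpose}")`)
);
signalDataTransaction({
  kind: "collection",
  dataCategory: "location",
  purpose: "showing nearby stores",
});
```

The design choice worth noticing is that the signal is emitted at the moment of collection, instead of being buried in a policy page, which is exactly what the principle asks for.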
D) EXAMPLES, PLEASE? HOW WOULD PRIVACY-ENHANCING DESIGN BE APPLIED IN PRACTICE?
Below is a non-exhaustive list of practices that can be considered aligned with Privacy-Enhancing Design:
defaulting to zero data sharing wherever possible;
building default settings that favor the most privacy-protective option (see the sketch after this list);
using colors, fonts, sizes, or contrasts to prioritize the most "privacy-fostering" option in a menu;
building an interface that does not force or pressure users to constantly share more data;
transmitting any privacy-related information in a concise, usable, user-friendly, and user-centered manner;
communicating a product or service's privacy features (and possible risks) in a proactive and straightforward way;
making available a more restricted version of a product or service (i.e., with fewer features) that is also 100% privacy-oriented;
not using pressuring language or terminology to induce users to share more - or more sensitive - data;
making it easier for users to choose a privacy-protective option;
making the privacy-protective option faster or more prominent;
offering prompt help (i.e., online chat, 24/7 customer service, email with quick answers by a human) to support users in navigating privacy settings and choices;
doing user experience research to check, in practice, whether users understand and can properly navigate the available privacy options and settings;
when building features that are less privacy-preserving (but desired by users), helping users understand the risks and possible weak points;
constantly conducting user research to check for privacy weaknesses in the UX design or additional privacy risks that users might be experiencing.
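To illustrate the first two items above in code, here is a minimal sketch, in TypeScript, of privacy-protective defaults: every setting starts at the most protective value, and any sharing requires an explicit, easily retractable opt-in. The setting names are hypothetical, chosen only for illustration:

```typescript
// Minimal sketch of privacy-protective defaults. Setting names are
// hypothetical; the point is the pattern, not the specific options.

interface PrivacySettings {
  shareUsageAnalytics: boolean;
  personalizedAds: boolean;
  locationAccess: "never" | "while-using" | "always";
  postVisibility: "only-me" | "friends" | "public";
}

// The most privacy-protective value is the default: the user never has
// to act in order to be protected.
const PROTECTIVE_DEFAULTS: PrivacySettings = {
  shareUsageAnalytics: false,
  personalizedAds: false,
  locationAccess: "never",
  postVisibility: "only-me",
};

// Any move away from the defaults must come from an explicit user action...
function optIn<K extends keyof PrivacySettings>(
  settings: PrivacySettings,
  key: K,
  value: PrivacySettings[K]
): PrivacySettings {
  const updated = { ...settings };
  updated[key] = value;
  return updated;
}

// ...and retracting every choice is always one step away.
function resetToProtectiveDefaults(): PrivacySettings {
  return { ...PROTECTIVE_DEFAULTS };
}

// Usage: the user explicitly opts in to one feature; everything else
// stays at the protective default.
let settings = resetToProtectiveDefaults();
settings = optIn(settings, "locationAccess", "while-using");
console.log(settings);
```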
Additionally, in this previous post, I did an exercise to show what Privacy-Enhancing Design would look like in practice. There, I imagined a hypothetical Facebook "user post" interface that would follow extremely privacy-enhancing premises. My goal with that exercise was to show that change toward more privacy is possible and can be achieved through UX design. In that exercise, I showed that some of the premises embedded into Facebook's current "user post" interface do not reflect a privacy-enhancing framework. In any case, how the version with improved premises would be implemented in practice is up to the data protection designer in charge (and aspects such as usability and the fluidity of the experience should also be considered).
What is not Privacy-Enhancing Design:
dark patterns (or unfair design) in data protection. You can read my previous articles in this newsletter on the topic, as well as my full academic article, where I discuss a definition, a taxonomy, and the lawfulness of dark patterns. In fact, dark patterns in data protection are the exact opposite of Privacy-Enhancing Design.
ignoring privacy and data protection concerns when planning and executing the UX design strategy. My whole point when presenting Privacy-Enhancing Design is to show that users are vulnerable, have cognitive biases, and need the support of UX design to navigate privacy and data protection. Writing a concise privacy policy is not enough (and, in my view, users should not be expected to read privacy policies anyway). So when organizations make no effort, through UX design, to help users choose wisely regarding their privacy preferences, they are part of the problem (even if they are complying with data protection law).
E) WHAT IS A DATA PROTECTION DESIGNER? WHAT IS A DATA PROTECTION DESIGN OFFICER?
A data protection designer is a UX designer who decides to specialize in the implementation of Privacy-Enhancing Design. A data protection design officer (DPDO) is the leader of the data protection design team in an organization (analogous to the role of the data protection officer - DPO - regarding the data protection legal team). The DPDO is responsible for planning and implementing the privacy design policy (or data protection design policy) and for handling any external claims regarding that policy or the implementation of Privacy-Enhancing Design.
In my opinion, similarly to what happens with the DPO, the data protection design officer should be mandatory in all organizations that systematically collect or process personal data from users.
F) WHAT IS A PRIVACY DESIGN POLICY (OR DATA PROTECTION DESIGN POLICY)?
The Privacy Design Policy (or Data Protection Design Policy) is a document I am proposing in which an organization summarizes how its UX design practices reflect Privacy-Enhancing Design. It should contain images and graphs detailing the user's privacy experience on the organization's website or app. It is analogous to a Privacy Policy in its accountability role, but focused on UX design.
I propose that every organization that systematically collects or processes personal data should have a privacy design policy.
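While I imagine the Privacy Design Policy primarily as a human-readable document, one could also picture a machine-readable summary alongside it, making the policy easier to audit. Below is a hypothetical sketch, in TypeScript, of what such a summary could contain; the structure and field names are my own illustration, not a proposed standard:

```typescript
// Hypothetical sketch of a machine-readable companion to a Privacy Design
// Policy. Structure and field names are illustrative only.

interface DesignPracticeEntry {
  interfaceName: string; // e.g., "signup form", "cookie banner"
  principle: string;     // which Privacy-Enhancing Design principle it implements
  practice: string;      // plain-language description of the design practice
  evidence: string[];    // links to screenshots, graphs, or user research
}

interface PrivacyDesignPolicy {
  organization: string;
  lastUpdated: string;   // ISO date
  dpdoContact: string;   // who answers external claims about the policy
  practices: DesignPracticeEntry[];
}

const examplePolicy: PrivacyDesignPolicy = {
  organization: "Example Corp",
  lastUpdated: "2022-01-01",
  dpdoContact: "design-privacy@example.com",
  practices: [
    {
      interfaceName: "account settings",
      principle: "Autonomy and Human Dignity are Central",
      practice: "All sharing settings default to the most protective option.",
      evidence: ["https://example.com/privacy-design/settings-screenshot.png"],
    },
  ],
};

console.log(JSON.stringify(examplePolicy, null, 2));
```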
G) I LIKE THAT. HOW CAN I LEARN MORE?
First, you can read the previous post of this newsletter, in which I presented Data Protection Design, the new discipline I am proposing. Privacy-Enhancing Design is the technical aspect of this discipline, offering practical guidelines to implement data protection law rules and principles through UX design.
This is just the beginning, and there is so much more to talk about.
-
See you next week. All the best, Luiza Jarovsky