Privacy Settings Are Too Complicated. Here Are Some Ideas On How To Change Them
This week I would like to start a discussion about privacy settings: are they purposefully complicated in order to trigger cognitive biases and act as an obstacle to privacy-preserving decisions? Or is privacy such a complicated topic that it would be impossible to design an easy-to-navigate menu? Or both? My opinion is at the end of the post.
Take a look at the screenshots below. I took them from the Facebook mobile app today.
Inside every category, after you tap on it, there are additional choices to be made, sub-menus, and sometimes external links. This is just a "map" of the macro-categories; within them, there are dozens of choices to be made. You can try exploring each of these categories yourself.
The settings above, in my opinion, are bad. I have been studying privacy for years, I consider myself a tech-savvy person, I have used this app since it was launched, I am a millennial, and... I have difficulty navigating these settings. I do not know where to click to get what I need. I get lost in the number of choices I must make, which seem confusing and misplaced. I frequently need to go to Google and search "how to (...) on Facebook" to figure out how to apply a privacy setting.
I cannot even imagine the difficulty that people who are not tech natives or heavy users of tech must have navigating them. I guess they give up, or they do not even know that they have the option to choose. Also, when they are in doubt, I am not sure they know they can turn to Google for quick answers. This is intimidating, unfair, and simply wrong. It should not be like that.
A few weeks ago, in this newsletter, I proposed Privacy-Enhancing Design so that tech experts - especially UX designers and product managers - can see how they can be a positive force in transforming the web and making privacy more accessible to all. If Facebook's designers, product managers, developers, lawyers, and everyone else involved in the design of their privacy settings had learned Privacy-Enhancing Design, I very much doubt that the screenshots I pasted above would be the final version shown to users.
To recap, I defined Privacy-Enhancing Design as:
a framework of heuristics and practical UX design guidelines aiming at translating data protection law principles and rules to UX practices
The main goals of Privacy-Enhancing Design are:
To serve as a common language and shared knowledge base among data protection lawyers, product managers, and data protection designers (UX designers who wish to specialize in privacy-enhancing design) within an organization regarding the implementation of data protection law principles and rules through UX design;
To offer practical guidance to data protection designers regarding how to implement data protection law principles and rules through any online interface;
To help the data protection design officer (DPDO) and their team of data protection designers produce an organization's Privacy Design Policy (see the explanation below);
To serve as an accountability parameter for privacy advocates or anyone interested in questioning an organization's privacy-enhancing UX design practices.
The 7 principles (or heuristics, as UX designers prefer) of Privacy-Enhancing Design are:
Autonomy and Human Dignity are Central. User autonomy and human dignity are fundamental rights and must be respected throughout the UX design. The UX design must allow users to exercise their choices and preferences freely, autonomously, and in an informed way. Users should not be pushed or forced to take a certain action. Users should be able to easily retract a certain choice or preference.
Transparency. UX design practices should foster transparency and accessibility so that users are aware of ongoing data transactions. Every new data transaction (collection or processing) should be clearly signaled in an accessible way, so that users can realize that data is being exchanged. Users should be made aware that their personal data is being collected or processed. Symbols, colors, and other design features may be used to convey this information.
No Previous Data Protection Knowledge. UX design should assume that users have no prior knowledge of data protection. Interfaces that involve data collection and processing should be clear and accessible, with simple and user-friendly indications of the scope and extent of the data transaction, including possible risks (even if these seem obvious to the designer).
Acknowledgment of Cognitive Biases. Cognitive biases must be broadly recognized and acknowledged. Exploiting cognitive biases to collect more - or more sensitive - personal data (i.e., through dark patterns in data protection) must be avoided throughout the UX design process. Users should be seen as vulnerable and manipulable, and it is the responsibility of the organization to shield them from manipulation.
The Burden Is on Organizations. Organizations are responsible for designing UX interfaces that do not exploit users' cognitive biases. Organizations should be able to prove - at any time - that their UX design practices are privacy-enhancing (and not privacy-harming). If users are making errors, it is the organization's responsibility to detect and correct the design practices that foster them.
Design Accountability. Organizations should be held accountable for their design practices. Organizations should publish their privacy-design practices (perhaps through a Privacy Design Policy, similar to a Privacy Policy but focused on UX design practices). It should be possible to legally question an organization's UX design practices.
Holistic Implementation. The principles above must be implemented throughout the UX design and be present in every interaction between users and organizations (i.e., not restricted to privacy settings). Privacy and data protection should be made an integral part of the interaction between the organization and the user.
Below is a non-exhaustive list of practices that can be considered aligned with Privacy-Enhancing Design:
setting defaults that favor zero data sharing;
building default settings that favor the most privacy-protective option (see the sketch after this list);
using colors, fonts, sizes, or contrasts to prioritize the most "privacy-fostering" option in a menu;
building an interface that does not force or pressure users to constantly share more data;
transmitting any privacy-related information in a concise, usable, user-friendly, and user-centered manner;
communicating a product or service's privacy features (and possible risks) in a proactive and straightforward way;
making available a more restricted version of a product or service (i.e., with fewer features) that is also 100% privacy-oriented;
not using pressuring language or terminology to induce users to share more or more sensitive data;
making it easier for users to choose a privacy-protective option;
making the privacy-protective option faster or more prominent;
offering prompt help (i.e., online chat, 24/7 customer service, email with quick answers by a human) to support users in navigating privacy settings and choices;
doing user experience research to check, in practice, whether the user understands and can properly navigate the available privacy options and settings;
when building features that are less privacy-preserving (but are desired by users), helping users understand the risks and possible weak points;
constantly conducting user research to check for privacy weaknesses in the UX design or additional privacy risks that the user might be experiencing.
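For readers on the product side, here is a minimal, hypothetical sketch (in TypeScript) of what "privacy-protective defaults" from the list above could look like in practice. All names (PrivacySettings, defaultPrivacySettings, applyUserChoice) are invented for illustration and do not correspond to any real product's code or API; the point is simply that doing nothing should never result in extra data sharing, and any change away from the default should require an explicit, informed opt-in.

```typescript
// Hypothetical illustration of privacy-protective defaults.
// None of these names refer to a real product's settings or API.

interface PrivacySettings {
  postVisibility: "only_me" | "friends" | "public";
  allowAdPersonalization: boolean;
  shareActivityWithPartners: boolean;
  locationHistoryEnabled: boolean;
}

// Every field defaults to the most privacy-protective value,
// so a user who never opens the settings menu shares nothing extra.
const defaultPrivacySettings: PrivacySettings = {
  postVisibility: "only_me",
  allowAdPersonalization: false,
  shareActivityWithPartners: false,
  locationHistoryEnabled: false,
};

// Any change away from the defaults must be an explicit, informed opt-in:
// without the user's confirmation, the protective values are kept.
function applyUserChoice(
  settings: PrivacySettings,
  change: Partial<PrivacySettings>,
  userConfirmed: boolean
): PrivacySettings {
  if (!userConfirmed) {
    return settings; // no silent toggling on the user's behalf
  }
  return { ...settings, ...change };
}

// Example: the user explicitly chooses to share posts with friends.
const updated = applyUserChoice(defaultPrivacySettings, { postVisibility: "friends" }, true);
console.log(updated.postVisibility); // "friends"
```

The design choice this sketch tries to capture is that the burden stays on the organization: the protective state is the starting point, and the interface has to ask for (and record) a deliberate user decision before anything changes.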
What is not Privacy-Enhancing Design:
dark patterns (or unfair design) in data protection. You can read my previous posts in this newsletter on the topic, as well as my full academic article, where I discuss a definition, a taxonomy, and the lawfulness of dark patterns. In fact, dark patterns in data protection are the exact opposite of Privacy-Enhancing Design.
ignoring privacy and data protection concerns when planning and executing the UX design strategy. My whole point when presenting Privacy-Enhancing Design is to show that users are vulnerable. They have cognitive biases and need the support of UX design to help them navigate privacy and data protection. Writing a concise privacy policy is not enough (and, in my view, users should not be expected to read privacy policies anyway). So when organizations make no effort, through UX design, to help users choose wisely regarding their privacy preferences, they are part of the problem (even if they are complying with data protection law).
Regarding the question I raised in the first paragraph, in my opinion, the answer is that privacy settings are purposefully complicated: companies, especially those that offer services for free (in exchange for data collection), have no incentive to support users in being more private or in properly managing their privacy choices. And you, what do you think?
-
See you next week. All the best, Luiza Jarovsky