Privacy-Enhancing Design vs. Deceptive Design: A Cookie Banner Case Study
Continuing last week's conversation about User Vulnerabilities in the Data Cycle and How to Mitigate Them, today I would like to talk more about Privacy-Enhancing Design, specifically by analyzing a very good and a very bad example of a cookie banner on the web (and why one is a Privacy-Enhancing Design and the other is a deceptive design, or dark pattern).
As a recap, if you read the post from two weeks ago, you know that Privacy-Enhancing Design is a framework of heuristics and practical UX design guidelines aimed at translating data protection law principles and rules into UX practices. It tackles unfair and deceptive design practices in the context of data protection (such as dark patterns) by empowering UX designers and educating organizations about the positive and transformative impact UX design can have on privacy protection.
An important idea behind Privacy-Enhancing Design is that users are vulnerable, manipulable, and easily influenced by cognitive biases. UX designers can maliciously exploit cognitive biases through deceptive design (i.e., dark patterns), negatively affecting user privacy. Privacy-Enhancing Design proposes that UX designers must acknowledge the existence of cognitive biases and human errors and create interfaces that respect user autonomy and prioritize choices that preserve user privacy.
A privacy-enhancing UX design practice is one that acknowledges cognitive biases and human errors, respects user autonomy, and prioritizes choices that preserve user privacy.
The 7 principles (or heuristics, as UX designers prefer) of Privacy-Enhancing Design are:
Autonomy and Human Dignity are Central. User autonomy and human dignity are fundamental rights and must be respected throughout the UX design. The UX design must allow users to exercise their choices and preferences freely, autonomously, and in an informed way. Users should not be pushed or forced to take a certain action. Users should be able to easily retract a certain choice or preference.
Transparency. UX design practices should foster transparency and accessibility so that users are aware of ongoing data transactions. Every new data transaction (collection or processing) should be clearly signaled in an accessible way so that users can realize that data is being exchanged. Users should be made aware that their personal data is being collected or processed. Symbols, colors, and other design features might be used to convey this information.
No Previous Data Protection Knowledge. UX design should presuppose that users have no background data protection knowledge. Interfaces that involve data collection and processing should be clear and accessible, with simple and user-friendly indications of the scope and extent of the data transaction, including possible risks (even if it seems obvious to the designer).
Acknowledgment of Cognitive Biases. Cognitive biases must be broadly recognized and acknowledged. The exploitation of cognitive biases to collect more (or more sensitive) personal data (i.e., through dark patterns in data protection) must be stopped throughout the UX design process. Users should be seen as vulnerable and manipulable, and it is the organization's responsibility to shield them from manipulation.
Burden on Organizations. Organizations should be responsible for designing UX interfaces that do not exploit users' cognitive biases. Organizations should be able to prove, at any time, that their UX design practices are privacy-enhancing (and not privacy-harming). If users are making errors, it is the responsibility of organizations to detect and correct the design practice that is fostering those errors.
Design Accountability. Organizations should be held accountable for their design practices. Organizations should make their privacy-design practices public (perhaps through a Privacy Design Policy, similar to a Privacy Policy but focused on UX design practices). It should be possible to legally challenge an organization's UX design practices.
Holistic implementation. The principles above must be implemented throughout the UX design and present in every interaction between users and organizations (i.e., not restricted to privacy settings). Privacy and Data Protection should be made an integral part of the interaction between the organization and the user.
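To make the autonomy and burden-on-organizations principles concrete in engineering terms, here is a minimal sketch (my own illustration, not any real product's code) of a consent store in which no data is collected by default, accepting and denying are equally easy (one call each), and any choice can be retracted at any time:

```typescript
// Hypothetical consent store illustrating the principles above.
type ConsentState = "unset" | "accepted" | "denied";

class ConsentStore {
  // No data collection before an explicit, informed choice.
  private state: ConsentState = "unset";

  accept(): void { this.state = "accepted"; }
  // Denying must be exactly as easy as accepting: one action, no extra screens.
  deny(): void { this.state = "denied"; }
  // Choices are always reversible, restoring the protective default.
  retract(): void { this.state = "unset"; }

  // Collection is allowed only after an explicit "accept".
  mayCollect(): boolean { return this.state === "accepted"; }
}
```

The key design choice is that the default state forbids collection, so a user who never interacts with the banner is never tracked, and retraction is a first-class operation rather than a buried settings flow.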
As I have said in other posts, data protection design is a new discipline connected to data protection law that needs to be acknowledged by lawmakers, regulatory authorities, commercial and non-profit organizations, privacy advocates, and the general public. There are signs of regulatory change coming (at least in the US, EU, and UK), and a shift in social and cultural norms regarding privacy is already happening. New technologies and data-invasive commercial practices will require stronger and more interdisciplinary data protection frameworks that acknowledge that privacy harm, in a data-fueled world, is dignity harm. This, in practice, means that design, code, and technology will not be able to remain outside of the lawmaker's field of vision or field of action. I believe that sharing relevant content and getting people involved in big conversations like this is key to a transparency- and autonomy-based future, so I hope to keep sharing more about these topics here and on my other channels. Stay tuned and join the conversation.
Now let us get to a real-life example of this new type of design I call Privacy-Enhancing Design. There are definitely more examples out there (I hope to bring more of them in future posts), and, although such examples are currently outliers, I am an optimist and believe that in a few years they will become mainstream practice.
Before I start, I would like to say that I was not paid by, and did not benefit in any way from, Privado.ai, and I have nothing against The Guardian. I am just objectively analyzing their UX design practices and showing how UX design can positively or negatively affect privacy. Additionally, I cannot check their data practices and verify whether they are honoring user choices or doing anything different from what the banner says. I am just analyzing what any user would see through the UX design. Lastly, Privado.ai is a privacy company offering privacy-related services, and The Guardian is a reader-funded news organization offering articles for free, so take that into consideration when reading my analysis below.
1- Privacy-Enhancing cookie banner by Privado.ai:
Why it is a Privacy-Enhancing Design cookie banner:
a) usability & beauty: the cookie banner is discreet and presented at the bottom of the interface, with low impact on the website's usability;
b) autonomy supporting: as the cookie banner is located at the bottom of the page, it allows the user to have a first glimpse of the content the website has to offer and then decide whether it is worth clicking and continuing on the website; it does not foster a sense of urgency by blocking the whole screen and making the user feel compelled to click on anything, without thinking, just to unblock the view;
c) accessibility: the cookie banner's textual description is 34 words long and uses simple language;
d) transparency: the language used is clear and states plainly how the cookies will be used, without any embellishing language. The banner offers a link for further details if the user wishes to know more;
e) acknowledgment of cognitive biases: the colors used in the buttons are coherent with the color patterns of the website. The banner does not favor the "accept cookies" option, as both the "accept" and "deny" buttons have dark and contrasting colors. Studies on cognitive biases have shown that low-contrast items placed side by side with high-contrast items can be perceived as less important or less desirable options (most websites use low contrast for the privacy-protective option);
f) design accountability: there is an option to "deny cookies." Not "learn more about cookies" or "configure your cookies" (as many websites do, requiring you to unselect dozens of toggles), but a straightforward "deny cookies." It is as easy to accept as it is to deny cookies. This is how it should be so that autonomy is respected. The organization is probably losing a high percentage of user data here, but privacy was the priority, and user trust and autonomy were favored. Again, this is a privacy company and is naturally more inclined, and expected, to design cookie banners properly. Still, many other companies and government websites that deal with privacy do not do it well.
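The contrast argument in point (e) is measurable, not just aesthetic. The sketch below (my own illustration, unrelated to Privado.ai's code) computes the contrast ratio between two sRGB colors following the WCAG 2.x definition of relative luminance; a "deny" button whose contrast ratio against its background is far below the "accept" button's is a quantifiable sign of steering:

```typescript
// WCAG 2.x relative luminance of an sRGB color given as [r, g, b] in 0-255.
function relativeLuminance([r, g, b]: [number, number, number]): number {
  const lin = (c: number) => {
    const s = c / 255;
    // Piecewise sRGB linearization per the WCAG 2.x definition.
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors: (L_lighter + 0.05) / (L_darker + 0.05),
// ranging from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(
  a: [number, number, number],
  b: [number, number, number],
): number {
  const [l1, l2] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (l1 + 0.05) / (l2 + 0.05);
}
```

For example, black text on white yields 21:1, while pale gray (#aaa) on white yields roughly 2.3:1, well below the 4.5:1 minimum WCAG requires for body text; that gap is exactly the low-contrast trick many banners apply to the privacy-protective button.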
If it is unclear to you why I used the criteria above to classify this banner as a Privacy-Enhancing one, please check the previous post of this newsletter explaining Privacy-Enhancing Design.
Now, for you to understand why I liked Privado.ai's cookie banner so much, please see below an example of a very different one.
2- The Guardian cookie banner - deceptive design / dark pattern:
Why it is NOT a Privacy-Enhancing Design cookie banner (and is actually a dark pattern):
a) no usability: the cookie banner covers the interface, and you cannot see anything besides it. It makes the UX look heavy; it is not usable;
b) not autonomy supporting: the cookie banner does not support autonomy. It covers the whole page and does not allow the user to take a look at the content the website has to offer and then decide whether it is worth clicking and continuing on the website; it fosters a sense of urgency by blocking the whole screen and making the user feel compelled to click on anything, without thinking, just to unblock the view;
c) accessibility: the cookie banner's textual description is 202 words long (though, yes, it uses simple language). If users, on average, stay on webpages for 10-20 seconds, they do not have time to read it and will click "Yes, I'm happy" just to get rid of this big banner and see the content (and then decide in 10-20 seconds whether they like the content);
d) transparency: the language used is clear and transparent, and there are links to learn more. However, because the text is too long, people will not read it, so transparency is unfortunately neutralized, and the user remains uninformed;
e) acknowledgment of cognitive biases: the whole banner exploits cognitive biases so that users will click "Yes, I'm happy," which makes this banner an example of a dark pattern (which I defined elsewhere as "user interface design choices that manipulate the data subject's decision-making process in a way detrimental to his or her privacy and beneficial to the service provider"):
Large banner: covers the screen and annoys the user into clicking "Yes, I'm happy" to proceed to the experience;
Long text: users will not read it and will not engage in any rational choice;
"Manage my cookies" in lower contrast;
"Yes, I'm happy" (a vaguely positive affirmation) instead of "accept cookies," which is what is actually happening: disguised language;
Absence of a "deny cookies" option: it is difficult to deny the cookies.
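The accessibility point can be quantified. Assuming an average reading speed of around 240 words per minute (a common estimate; the exact figure varies by study), a small helper shows why a 202-word banner cannot realistically be read during a 10-20 second visit, while a 34-word banner can:

```typescript
// Rough reading time in seconds for a text of `words` words, assuming
// ~240 words per minute (an illustrative average, not a fixed constant).
function readingTimeSeconds(words: number, wordsPerMinute: number = 240): number {
  return (words / wordsPerMinute) * 60;
}

// readingTimeSeconds(202) ≈ 50 seconds: several times the typical page visit.
// readingTimeSeconds(34)  ≈ 8 seconds: readable within a typical visit.
```

Under these assumptions, the 202-word banner demands roughly 50 seconds of attention against a 10-20 second average stay, so most users will click through without reading, which is precisely the outcome the design relies on.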
After you click on "manage my cookies," the screen below appears, which is another cluster of dark patterns. I will not analyze each bad practice of these additional screens (which you need to go through in order to have more privacy); otherwise, this newsletter would become too long. But you can get a better idea by reading my previous posts about cognitive biases and dark patterns:
If you click on "purposes," you will get this other screen (another cluster of dark patterns):
If you click on "features," you will get this other screen (another cluster of dark patterns):
If you click on "site vendors", you will get this other screen (another cluster of dark patterns):
Can you imagine requiring a user to go through all these steps to have a more private online experience? Privacy-Enhancing Design proposes a different user experience, one where privacy is a priority and where organizations recognize that users are vulnerable and need to be supported through UX design as well.
I am sure The Guardian is managing to collect massive amounts of user data with this banner, as probably only a low percentage of users will go through the necessary steps to reject the cookies and choose more private navigation. However, I am not sure users feel this is a practice that respects their autonomy.
-
See you next week. All the best, Luiza Jarovsky