Dark Patterns in Code: Ignoring User Choice
I have discussed privacy UX and dark patterns in privacy extensively in this newsletter and in my paper. When discussing the topic, we usually refer to deceptive design practices that happen in the interface of websites and apps.
Researchers, however, have noticed that cookie banners do not always respect user choice and have identified "dark patterns in code." These are situations in which a privacy dark pattern involves both UX and code. How does that happen in practice?
When a user rejects all cookies through a cookie banner and the website continues storing cookies;
When a user has not yet provided a choice through the cookie banner but cookies are already being stored;
When the cookie banner declares that only necessary cookies will be stored, but cookies for additional purposes are stored as well;
When the consent management platform (CMP) is in a position to restrict the ability of the publisher to (de)select vendors.
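The first two failure modes above can be checked mechanically: after a user rejects the banner (or before any choice is made), the browser should contain only strictly necessary cookies. The following is a minimal, simplified audit sketch, assuming a hypothetical allowlist of necessary cookie names; it treats any non-necessary cookie as a violation whenever the user gave no consent at all:

```python
# Minimal consent-audit sketch (illustrative; cookie names are hypothetical).
# Given the cookies observed in the browser and the user's consent state,
# flag any cookie that should not be there.

NECESSARY_COOKIES = {"session_id", "csrf_token"}  # hypothetical allowlist

def audit_cookies(observed_cookies, consent_given):
    """Return cookies stored without a valid legal basis.

    observed_cookies: iterable of cookie names found in the browser.
    consent_given: set of purposes the user accepted, e.g. {"analytics"};
                   an empty set means the user rejected or never answered.
    """
    violations = []
    for name in observed_cookies:
        if name in NECESSARY_COOKIES:
            continue  # strictly necessary cookies need no consent
        if not consent_given:
            violations.append(name)  # stored despite rejection / no choice
    return violations

# Example: the user rejected everything, yet a tracking cookie appears.
print(audit_cookies(["session_id", "_ga_tracker"], consent_given=set()))
# → ['_ga_tracker']
```

A real audit would, of course, map each cookie to its declared purpose and run against a live browser session, but even this skeleton captures the point that such violations are detectable only by inspection, not by the user.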
Zengrui Liu et al., in their Feb/2023 paper, went so far as to conclude that:
"Our results indicate that in many cases user data is unfortunately still being collected, processed, and shared even when users opt-out. Our findings suggest that several prominent advertisers might be in potential violation of GDPR and CCPA. Overall, our work casts a doubt if regulations are effective at protecting users’ online privacy."
One of the problematic aspects here is that, unlike UX-only dark patterns, code-based dark patterns cannot be perceived by the user; they must be audited to be discovered.
In a recent article for the IAPP, Dan Frechtling discussed why these mismatches between consent preferences and data practices happen. According to him, there are three main reasons:
Improper consent management platform (CMP) integration with the publisher's website;
The site is using methods that aren’t universally adopted throughout the digital ad ecosystem, so signals are dropped during the ad request;
Code patches and other updates by vendors accidentally break or cause downstream issues.
In terms of the legal consequences of dark patterns in code, the first is that, at least under the GDPR, the data processing will be unlawful due to the absence of valid consent. If consent was the legal basis for collecting and processing data, but in practice the user's choice was not respected, then there was no valid consent, and the processing was unlawful.
Another legal issue that arises is how to allocate responsibility. Who is responsible when the user's choice is not honored?
Before we get to the legal part, Dan Frechtling helps us understand the technical elements of the data exchanges happening here:
"Data exchanged throughout this complex process contains user consent information that allows advertisers and publishers to determine whether they have the appropriate user consent needed to run IBA. This information is passed in the form of cookies or signals (in the case of the GPC) that help inform all CMPs and servers involved in the supply chain. This effectively creates hundreds of opportunities for signal failure, where data can be mishandled or leaked during any stage of the chain."
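Frechtling's point about signal failure can be illustrated with a toy simulation: a consent signal (here, a GPC-style opt-out flag) is passed hop by hop through the ad supply chain, and any intermediary that fails to forward it silently breaks the chain. The hop names and behavior below are hypothetical, a sketch rather than a depiction of any real system:

```python
# Toy simulation of consent-signal propagation through an ad supply chain.
# Each hop receives a request dict and should forward the "gpc" opt-out flag.
# Hop names are hypothetical; the "buggy" hop drops the flag.

def forwarding_hop(request):
    # Well-behaved intermediary: passes the signal along untouched.
    return dict(request)

def buggy_hop(request):
    # Misbehaving intermediary: rebuilds the request and loses the flag.
    return {"bid_request": request.get("bid_request")}

def run_chain(request, hops):
    """Pass the request through each hop; report where the signal is lost."""
    for i, hop in enumerate(hops):
        request = hop(request)
        if "gpc" not in request:
            return f"consent signal lost at hop {i}"
    return "consent signal survived the chain"

request = {"bid_request": "...", "gpc": 1}  # user opted out via GPC
print(run_chain(request, [forwarding_hop, buggy_hop, forwarding_hop]))
# → consent signal lost at hop 1
```

With hundreds of real intermediaries instead of three toy functions, it only takes one such hop for the user's opt-out to stop being honored downstream.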
In this complex chain of events, inaccessible to the end user, we must understand which entities are in control and are capable of making sure that the user's choice is transmitted accordingly.
Cristiana Santos et al., for example, have argued in favor of considering consent management platforms as controllers in four scenarios:
When including additional processing activities in their tool;
When they perform scanning and pre-sorting of tracking technologies;
When they include third-party vendors by default; and
When they deploy interface manipulative design strategies.
Most authors seem to agree that, in any case, constant auditing will be necessary, along with additional safeguards to make sure that the user's choice is honored throughout the chain of transmission.
As I discussed two weeks ago, dark patterns in privacy are an issue of autonomy harm. When we are talking about dark patterns in UX, they happen through the exploitation of cognitive biases. When we are talking about dark patterns in code, they happen due to the black-box nature of data operations: the absence of transparency and accountability that leaves us unaware of what happens at the code level.
💡 If you want to bring a competitive advantage to your company and dive deeper into privacy-enhancing design and avoiding dark patterns, join our live course about the topic in April (4 weeks, 1 live session per week + additional material). Check out the program and register now using the coupon TPW-10-OFF to get 10% off. To learn more about our courses, visit: implementprivacy.com/courses
📅 Upcoming privacy event
On 16/Mar, in the 2nd edition of Women Advancing Privacy, I will discuss with Dr. Ann Cavoukian, the inventor of Privacy by Design:
The origins of Privacy by Design
How it is essential for businesses, especially today
Her new Privacy by Design ISO certification
How we should think of Privacy by Design in the Age of AI
🔁 Trending on social media
Interact with this tweet here.
📌 Privacy & data protection careers
We have gathered relevant links from large job search platforms and additional privacy jobs-related info on our Privacy Careers page. Bookmark it and check it periodically for new openings. Wishing you the best of luck!
✅ Before you go:
Did you enjoy this article? Share it with your network so they can subscribe to Luiza's Newsletter.
At Implement Privacy, I offer specialized privacy courses to help you advance your career. I invite you to check them out and get in touch if you have any questions.
See you next week. All the best, Luiza Jarovsky