👋 Hi, Luiza Jarovsky here. Read about my work, invite me to speak, tell me what you've been working on, or just say hi here.
This week's edition of The Privacy Whisperer is sponsored by The State of US Privacy & AI Regulation:
Want to hear directly from the people shaping the US Privacy & AI Regulation at the federal and state levels? Then join this LinkedIn Live on August 28 at 11am PST (2pm EST), with speakers Rep. Ro Khanna (member of Congress representing Silicon Valley), Alastair Mactaggart (co-author of CCPA & CPRA, and board member of the California Privacy Protection Agency), and moderator Tom Kemp (co-author of the California Delete Act, and author of the new book Containing Big Tech). Free registration here.
🔥 How privacy UX reflects the privacy culture
Because I teach privacy UX to tech and privacy professionals and frequently discuss the topic in this newsletter (see the archive and read my academic article), I cannot stop noticing privacy dark patterns when I am online.
The topic is fascinating: it is an interdisciplinary area where legal data protection concerns intersect with user experience (UX), marketing, and computer science.
Another reason analyzing dark patterns and privacy UX is interesting is that they offer us a glimpse into companies’ data practices and how they treat their customers, beyond what the legal department does. Privacy UX can be a window into a company's privacy culture.
To exemplify, see the screenshots below, taken in the last few days somewhere in the EU:
This cookie banner uses three tactics to foster unaware acceptance: first, the title claims the person is in control of their cookies, yet the first layer of interaction offers no button or choice to reject all cookies; second, the expression “fine by me” deviates from the actual decision at hand (accept or reject); and third, a bright blue button invites an inattentive reader to click without reading. Even setting compliance aside, the UX design is built to divert the reader's attention from the data-intensive business model behind it.
This cookie banner has no title to alert the reader to what it is about. It is a block of small-font text in legalese describing the technical aspects of cookie usage. The banner blocks the website, giving the reader the impression that the inevitable action is to press the blue button. There is a large blue button labeled “Accept,” while the option to reject is much more subtle, at the top right. The banner does not let the reader realize that there is a choice.
Above, you can see two screenshots from Facebook: the “Privacy Checkup” and “Privacy Center” sections. There are many issues to comment on regarding Meta's practices, and I have discussed some of them in previous editions of this newsletter.
Today, I would like to comment on the fact that the way Facebook organizes privacy choices is confusing and labyrinthine: every time I wanted to configure my privacy settings (when I was an active user), I had to use a search engine to find a “support” page with the information I needed. I doubt most users would go that far; most would probably give up on trying to improve their privacy on the platform.
Additionally, for each item in the Privacy Checkup and Privacy Center, a user needs to navigate four or more screens of additional information to get to what they need. This is wrong and should not be allowed. The user does not want a crash course on each Facebook privacy setting. The user just wants transparency and choices regarding their personal data, which is the bare minimum required by law. Facebook/Meta knows what the critical privacy issues are and could put them one click away from its users. But it chooses not to do that. And there is nothing users can do.
*
Coming back to privacy culture: data protection law does not cover privacy UX in detail (register for my September Masterclass to dive deeper into the topic). It offers data protection principles, data subject rights, and a few articles that help set a direction and general guidelines, but the details of the privacy UX interface are left for companies to decide. Most companies have no idea what privacy UX best practices are or how to start implementing them.
For a trained eye, the way a company structures its privacy communication and interactions reveals which companies let the legal department pare privacy efforts and investments down to the minimum needed to avoid fines and brand harm, and which companies see privacy as part of their culture (and make additional efforts to spread it beyond the legal department).
Lastly, most tech companies do not see privacy as part of their core business, but it should not be this way, especially in the context of today's data-intensive business models.
With AI-based functionalities spreading to every consumer-facing product, and with privacy law still lagging behind the challenges at the intersection of privacy and AI, it is sad to see that companies do not invest in transparency, trust, and fairness.
However, as I wrote two weeks ago in the case study analyzing Zoom's practices (available to paid subscribers only): although the law is slow, customers are fast, and they notice unfair behavior. Brands that do not realize that privacy compliance is changing will fall behind.
🔥 Case study: Airbnb vs. Booking.com privacy UX practices
Continuing the discussion above on the relationship between privacy UX and privacy culture, in this week's case study I compare some of Airbnb's and Booking.com's privacy UX practices and how they impact the users of these services.