Is Behavioral Advertising Dying?
Tracking and behavioral advertising have come under closer regulation in the last few years, especially after the General Data Protection Regulation (GDPR), which brought, among other provisions, stricter requirements for user consent. What I argue in today's newsletter is that a series of legislative, regulatory, market, and social trends suggests that behavioral advertising - at least as we know it - is on a not-so-slow path to death. And the upcoming changes will make the online environment more transparent and fair.
Read time: 10 minutes.
I would like to start today's newsletter with an excerpt from James Clear's bestseller "Atomic Habits" (2018). On page 208, he says:
"Laws and regulations are an example of how governments can change our habits by creating a social contract. As a society, we collectively agree to abide by certain rules and then enforce them as a group. Whenever a new piece of legislation impacts behavior - seat belt laws, banning smoking inside restaurants, mandatory recycling - it is an example of a social contract shaping our habits. The group agrees to act in a certain way, and if you don't follow along, you'll be punished."
I begin with this quote to break with the old dogma that law is always behind tech and that meaningful changes are almost always a result of economic and market incentives. What is happening to behavioral advertising today, as I will describe here, is a result of various factors, but the most powerful one is a legislative and regulatory movement - led by the European Union and more recently joined by the United States - toward the protection of privacy and fairness online.
It all began in 2011 when, due to the ePrivacy Directive, cookie banners started to pop up. Despite their obvious flaws, cookie banners brought the idea that our personal data is part of our identity, and if there are going to be trackers following us around and collecting our personal information - e.g., for advertising - there should be transparency and consent.
A few years later, in 2018, came the GDPR, which brought a massive wave of changes in regard to data protection principles, user rights, and privacy compliance in the business world. Privacy as a career was officially blooming: companies had to appoint data protection officers and build privacy management systems, fines began to be issued, and privacy was (and still is) in the news almost daily.
An important advancement of the GDPR was its strict "informed consent" requirement. In order to collect, process, and use personal data - such as for behavioral advertising - consent was necessary, and only opt-in consent was valid. In the GDPR's words (Art. 4(11)):
"any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her"
The idea behind this definition is that users should have a say in what happens to their personal data. To give some context, remember that before the GDPR, the common practice was to use opt-out mechanisms, such as pre-selected boxes, to bypass conscious awareness and get easy access to personal data - e.g., to use in behavioral advertising. So the GDPR had to use 40 words just to define consent, to make sure that organizations would not try to bypass it.
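To make the opt-in/opt-out distinction concrete, here is a minimal sketch in Python of what a GDPR-style consent check might look like. All names (ConsentRecord, is_valid_opt_in, the field names) are illustrative assumptions of mine, not taken from the GDPR or any real consent-management tool:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical consent record; field names are illustrative only.
@dataclass
class ConsentRecord:
    purpose: str                   # e.g. "behavioral_advertising"
    pre_checked_default: bool      # box was already ticked before the user acted
    user_action: Optional[str]     # "accepted", "declined", or None (no action)

def is_valid_opt_in(record: ConsentRecord) -> bool:
    """GDPR-style consent: only an explicit affirmative action counts.
    Silence, inactivity, or a pre-ticked box is not valid consent."""
    return record.user_action == "accepted" and not record.pre_checked_default

# Pre-GDPR opt-out pattern: a pre-ticked box, user never acted
legacy = ConsentRecord("behavioral_advertising",
                       pre_checked_default=True, user_action=None)
# GDPR opt-in pattern: unticked by default, user actively accepts
opt_in = ConsentRecord("behavioral_advertising",
                       pre_checked_default=False, user_action="accepted")

print(is_valid_opt_in(legacy))  # False
print(is_valid_opt_in(opt_in))  # True
```

The key design point mirrors the legal one: the default outcome is "no consent," and only a clear affirmative action flips it.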
But yes, organizations tried to bypass it.
Meta/Facebook, for example, for many years argued that behavioral advertising was part of the contractual agreement they had with users, and therefore there was no need to request user consent for that purpose. On this topic, yesterday, we had perhaps one of the most important privacy decisions since the beginning of the GDPR era. Max Schrems & noyb filed two complaints on May 25th, 2018 (the day the GDPR became applicable). It was decided that Meta (Facebook & Instagram) cannot use a clause in its terms and conditions as the lawful basis to collect user data. Meta was fined 390 million euros.
According to noyb's website: "The European Data Protection Board (EDPB) has rejected the Irish DPC and Meta's bypass of the GDPR based on noyb complaints against Facebook and Instagram. Meta is now prohibited to bypass the GDPR via a clause in the terms and conditions. Meta has to get 'opt-in' consent for personalized advertisement and must provide users with a 'yes/no' option."
The decision can still be appealed, and privacy experts worldwide are waiting to see what will happen next. However, what is clear is that privacy advocates and authorities are tireless and will not stop here, whatever the outcome. People deserve privacy.
One possible consequence is that Facebook will have to ask for consent from all its users in order to track them and show behavioral advertising. And not just any consent: it must be informed consent, ensuring users understand the complex machinery that processes their personal data on the social network. This is the spirit of the law.
The likely consequence of a consent mandate is that a high percentage of users would refuse tracking and behavioral advertising, and revenue would plummet: 97.6% of Facebook's total revenue in 2022 came from ads.
Mark Zuckerberg once approached entrepreneurship and innovation as "move fast and break things." I hope that, by now, he has understood that his main business model perhaps does not fit current legislative and regulatory trends and that they should be moving fast in another direction.
What is behind this Meta/Facebook decision is one more push in the direction of more autonomy, transparency, and fairness for users. As European Commission President Ursula von der Leyen said, in the context of the upcoming Digital Services Act (DSA): "what is illegal offline should be illegal online." I couldn't agree more. People deserve to be treated fairly and with respect - both online and offline. The Universal Declaration of Human Rights is also valid online. This is a straightforward idea, but for a strange reason, many organizations behave as if the Web were different. They forget that each individual must have their rights and dignity respected at all times, and instead treat people as data points in a growth graph.
Speaking of the DSA: it complements the GDPR and brings additional provisions that affect behavioral advertising by certain service providers. It bans targeted advertising to minors based on profiling, as well as targeted advertising based on profiling that uses special categories of personal data, such as sexual orientation or religious beliefs.
For very large online platforms, the DSA establishes that "they will have to maintain and provide access to ad repositories, allowing researchers, civil society and authorities to inspect how ads were displayed and how they were targeted. They will also need to assess whether and how their advertising systems are manipulated or otherwise contribute to societal risks, and take measures to mitigate these risks."
The DSA also says (art. 26) that for each specific ad, the person seeing the ad should be able to identify the following:
"(a) that the information is an advertisement, including through prominent markings, which might follow standards pursuant to Article 44;
(b) the natural or legal person on whose behalf the advertisement is presented;
(c) the natural or legal person who paid for the advertisement if that person is different from the natural or legal person referred to in point (b);
(d) meaningful information directly and easily accessible from the advertisement about the main parameters used to determine the recipient to whom the advertisement is presented and, where applicable, about how to change those parameters."
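To see what these four disclosure items might look like in practice, here is a small sketch of an ad-metadata record and a completeness check. The structure and all names (AdDisclosure, missing_disclosures, the field names) are my own illustrative assumptions; the DSA prescribes the disclosures, not any particular data model:

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record mirroring the four items in DSA Article 26.
@dataclass
class AdDisclosure:
    marked_as_ad: bool                 # (a) prominently labeled as advertising
    on_behalf_of: str                  # (b) on whose behalf the ad is presented
    paid_by: Optional[str] = None      # (c) payer, only if different from (b)
    targeting_parameters: dict = field(default_factory=dict)  # (d) main parameters

def missing_disclosures(ad: AdDisclosure) -> list:
    """Return the Article 26 items this record fails to provide."""
    gaps = []
    if not ad.marked_as_ad:
        gaps.append("a")
    if not ad.on_behalf_of:
        gaps.append("b")
    # (c) is only required when the payer differs from (b),
    # so an empty paid_by field is acceptable here.
    if not ad.targeting_parameters:
        gaps.append("d")
    return gaps

ad = AdDisclosure(marked_as_ad=True,
                  on_behalf_of="ACME GmbH",
                  targeting_parameters={"age_range": "25-34",
                                        "interests": ["running"]})
print(missing_disclosures(ad))  # []
```

Note how item (d) forces the targeting parameters themselves into the record: the "why am I seeing this ad" explanation stops being optional.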
These are game-changing rules, and, in my view, online advertising will never be the same again. For the better.
The online advertising transformation has also crossed the Atlantic and reached the United States, especially California. If the GDPR needed 40 words to define informed consent, the California Privacy Rights Act (CPRA) uses 121: an even longer and more detailed definition, crafted to ensure that organizations will not try to bypass it to collect more data - for purposes such as behavioral advertising.
And it also prohibits dark patterns. The readers who follow me here, on Twitter, and on LinkedIn know that I have a lot to say about this, as my academic article and my various newsletter articles on the topic show. The issues involved with the CPRA's approach to dark patterns will be a topic for another edition of the newsletter; stay tuned.
Still in California, the California Consumer Privacy Act (CCPA) - and the CPRA that we just mentioned - also bring new obligations to organizations when user information is "sold" or "shared," such as in the context of behavioral advertising.
There are still discussions going on regarding which situations are encompassed by each of these two terms. However, as Alastair Mactaggart, privacy advocate and now California Privacy Protection Agency board member, said:
"I think that the language in the CCPA is clear, and I think the intent is clear. I was really surprised to see a thread developing among some attorneys saying, “don’t worry about ‘sell,’ because that means exchange for valuable consideration,” and essentially, “we can ‘share,’ and it’ll all be OK.” Even though I don’t think the CCPA is ambiguous, if some people are saying it is ambiguous, let’s make sure we close that out. It is now crystal-clear, when it comes to sharing consumer information for cross context behavioral advertising, that the law gives consumers the right to opt out."
On the topic, it has also been said that "CCBA [cross-context behavioral advertising] is 'sale.' It is time to get over it." It's great to see this type of discussion in the United States, as it shows that the legislative/regulatory privacy trend is not restricted to the European Union.
The market has taken notice too. Google announced in 2021 that it will phase out third-party cookies in Chrome, stating that "once third-party cookies are phased out, we will not build alternate identifiers to track individuals as they browse across the web, nor will we use them in our products."
In another strategic move, Apple launched its App Tracking Transparency (ATT) feature with iOS 14.5, which heavily impacted the mobile advertising industry. It was certainly not without its controversies, as Apple is using this strategic advantage to grow its own ad network. It was also fined for not obtaining user consent for its own data collection.
Both Google's and Apple's moves toward more privacy were strategically motivated: strong legislation is coming, and users are looking for privacy-supporting alternatives. Privacy sells. For any tech company, the smarter move at this point is toward more privacy.
Lastly, turning to cultural changes: more people are using ad blockers and other technologies to avoid being tracked online. Those ads that follow you wherever you go are not funny anymore; they are creepy, and people want to get rid of them. People are talking more openly about privacy and ways to keep our autonomy, identity, and dignity online. The growth of this newsletter is one more example of how privacy is becoming more and more mainstream. People want to be respected and empowered online, not exploited by platforms and intermediaries.
Going back to the quote that opens this article: there is a privacy revolution going on, which is already transforming the advertising industry. Behavioral advertising is dying, and online advertising will never be the same again. And do you know who is leading this revolution? Legislation, regulation, enforcement, and all privacy advocates spread around the globe. The market is following - and not the other way around.
Do you agree that behavioral advertising is dying? If yes, what would be other symptoms/reasons behind this trend? If you don't, what is your view on the topic? Privacy needs critical thinkers like you: share this article and start a conversation about the topic.
See you next week. All the best, Luiza Jarovsky