TikTok's €345M fine might not be enough
Plus: case study on privacy UX fines
This week's edition is sponsored by Containing Big Tech:
From our sponsor: The five largest tech companies - Meta, Apple, Amazon, Microsoft, and Google - have built innovative products that improve many aspects of our lives. But their intrusiveness and our dependence on them have created pressing threats, including the overcollection and weaponization of our most sensitive data and the problematic ways they use AI to process and act upon our data. In his new book, Tom Kemp eloquently weaves together the threats posed by Big Tech and offers actionable solutions for individuals and policymakers to advocate for change. Order Containing Big Tech today.
🔥 TikTok's €345M fine might not be enough
On September 15, the Irish Data Protection Commission (DPC) announced a €345 million fine for TikTok due to privacy violations. As a reminder, in this investigation, the Irish DPC was examining whether TikTok complied with its GDPR obligations from July to December 2020 regarding children's data in the context of:
“Certain TikTok platform settings, including public-by-default settings as well as the settings associated with the ‘Family Pairing’ feature; and
Age verification as part of the registration process.”
The conclusion was that there were infringements of the following GDPR articles:
5(1)(a), 5(1)(c), and 5(1)(f): lawfulness, fairness, and transparency; data minimization; and integrity and confidentiality;
12(1): transparency of information and communication;
13(1)(e): providing data subjects with details about “the recipients or categories of recipients of the personal data, if any”;
24(1): (…) “the controller shall implement appropriate technical and organizational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation”
25(1) & 25(2): data protection by design and by default.
The Irish DPC then exercised the following corrective powers:
An order requiring TikTok to bring its processing into compliance within 3 months;
€345 million in administrative fines.
The European Data Protection Board (EDPB) published a press release with more details about the infringements, which contains important information on the EDPB's perspective on children's privacy and privacy UX. For example:
“In the Video Posting Pop-Up, children were nudged to click on “Post Now”, presented in a bold, darker text located on the right side, rather than on the lighter button to “cancel”. Users who wished to make their post private first needed to select “cancel” and then look for the privacy settings in order to switch to a “private account”. Therefore, users were encouraged to opt for public-by-default settings, with TikTok making it harder for them to make choices that favoured the protection of their personal data. Furthermore, the consequences of the different options were unclear, particularly to child users. The EDPB confirmed that controllers should not make it difficult for data subjects to adjust their privacy settings and limit the processing.”
It is great to see how the EDPB's analysis is aligned with what I have been discussing in this newsletter every week for more than a year regarding children's privacy, fairness, and privacy UX. For those who have not yet realized it: privacy UX matters, and it can lead to significant fines.
Regarding the fine, TikTok's head of privacy wrote:
“The DPC's investigation focused on the period between July and December 2020 only. The DPC did not find that TikTok's age assurance measures violated the GDPR, and most of the decision's criticisms are no longer relevant as a result of measures we introduced at the start of 2021 - several months before the investigation began.”
I have to disagree that the criticism is irrelevant; this fine might not even be enough, as TikTok still has practices that go against its users’ best interests, especially in terms of privacy UX.
I remind you of a recent analysis I did on a TikTok popup:
This popup is a cluster of privacy dark patterns. In the same popup, there are 2 calls to action and 3 explanations/assumptions:
Calls to action:
- It asks me to give access to my Facebook friends list;
- It asks me to give access to my email.
Explanations/assumptions:
- It says that doing the above will lead to experience improvement;
- It says that connecting me with friends (from my email and Facebook) is a form of experience improvement;
- It says that ad personalization is a form of experience improvement.
1. What is the connection between giving access to my Facebook friends list and giving access to email? Why ask these two permissions in the same privacy notice if these are two different contexts and types of data?
2. Why are personalized ads described as experience improvement? Is every additional access now disguised as experience improvement? Do I open TikTok to see ads?
3. Why are the 2 calls to action under one "OK"? Shouldn't I have separate choices over my data?
4. Why are 2 different data processing purposes (experience improvement and personalized ads) under one "OK"? Shouldn't I have separate choices over my data?
5. Why are the buttons asymmetric ("OK" in capital letters and bold and "Don't allow" not capitalized and not in bold)? Why not "allow" and "don't allow" (nothing in bold) to make it clear that there are permissions being asked and that I have a choice?
6. Does email access include all my email content or only the email contact list? It's not clear.
7. Bonus: when I clicked on the Help Center link, it took me to a blank page.
Here's a screenshot of the TikTok analysis I wrote last year - “I Was On TikTok For 30 Days: It Is Manipulative, Addictive, And Harmful To Privacy”:
Read the full article here.
TikTok should urgently improve its privacy UX practices or risk receiving more fines and harming its brand further.
To learn more about the topic, register for my live Dark Patterns and Privacy UX Masterclass (90 minutes, 1.5 pre-approved IAPP credits, limited seats).
🔥 Case study: growing privacy UX fines
Privacy UX (user experience) is one of the most popular topics in this newsletter, and it is great to see that fines in this area are becoming more frequent and larger. Today, I discuss two of the largest fines in this context - one from the Irish Data Protection Commission (DPC) and one from the US Federal Trade Commission (FTC) - and what companies should learn from them.