Discord Was Fined 800k Euros - Why You Should Worry
On November 10th, 2022, the French Data Protection Authority (CNIL) fined Discord 800,000 euros for numerous GDPR infractions. The fine did not receive much media attention - perhaps because it is lower than other recent ones, such as the latest fine imposed on Meta - but the procedure contains important details that every company subject to the GDPR should notice and learn from.
The CNIL (Commission Nationale Informatique & Libertés) identified the following infractions:
Failure to define and respect a data retention period appropriate to the purpose (Article 5.1.e of the GDPR);
Failure to comply with the obligation to provide information (Article 13 of the GDPR);
Failure to ensure data protection by default (Article 25.2 of the GDPR);
Failure to ensure the security of personal data (Article 32 of the GDPR);
Failure to carry out a data protection impact assessment (Article 35 of the GDPR).
The first two infractions concern the data retention policy and the information provided to users about retention periods. According to the GDPR, Art. 5.1.e:
"Personal data shall be (...) kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject (‘storage limitation’);"
According to CNIL's investigation, Discord did not have a written data retention policy. In addition, information about retention periods, or the criteria used to determine them, was not made available to the public.
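A written retention policy ultimately has to be enforced in code. The sketch below illustrates what such an automated check might look like; the 2-year inactivity threshold and the function name are illustrative assumptions, not values from the CNIL decision or Discord's actual practice.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention period: 2 years of inactivity.
# An illustrative assumption, not a value from the decision.
RETENTION_PERIOD = timedelta(days=2 * 365)

def is_past_retention(last_active: datetime, now: Optional[datetime] = None) -> bool:
    """Return True when an account has been inactive for longer than
    the documented retention period and should be reviewed or deleted."""
    now = now or datetime.now(timezone.utc)
    return now - last_active > RETENTION_PERIOD
```

The point is that the period is defined in one documented place and applied consistently, which is exactly what a written retention policy enables.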
The next issue raised was data protection by design and by default. This is one of my favorite topics, and I have extensively discussed dark patterns in privacy and privacy-enhancing design in this newsletter.
The issue raised by the CNIL was very specific, and it is interesting to note how legislation and enforcement are increasingly entering the design field.
In the Microsoft Windows version of the application, clicking the X button did not actually close the program, as the user would expect; the application kept running in the background. From the perspective of the CNIL:
"Discord's behavior is different and may lead to users being heard by other members in the voice room when they thought they had left. The restricted committee considered that Discord should specifically inform users by making them aware that their words are still being transmitted and heard by others."
To solve the issue, Discord set up a pop-up to alert users that the application was still running.
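The fixed behavior can be sketched as a close-button handler that warns the user before the app keeps running; this is a hypothetical illustration of the design pattern, not Discord's actual code, and the class and method names are invented for the example.

```python
class App:
    """Minimal model of an app whose X button minimizes instead of quitting."""

    def __init__(self, in_voice_channel: bool):
        self.in_voice_channel = in_voice_channel
        self.popups: list[str] = []
        self.minimized = False

    def show_popup(self, message: str) -> None:
        self.popups.append(message)

    def on_close_button(self) -> None:
        # Clicking X still keeps the app running in the background,
        # but users in a voice room now get an explicit warning first.
        if self.in_voice_channel:
            self.show_popup(
                "The application is still running; "
                "others in the voice room can still hear you."
            )
        self.minimized = True
```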
Like it or not, this is Law bringing Design to the table (I love this, and I think it is absolutely necessary for data protection law to progress and protect users, as I have argued in previous articles). In my personal view, more and more, we are going to see fines due to a lack of privacy-enhancing design. Take note.
Next, there was a security issue - for all those who think that data protection and security are not correlated. Discord required a password of only six characters, including letters and numbers. The CNIL considered this a weak password management policy. Following the CNIL procedure, Discord now requires:
a password of at least eight characters
with at least three of the four character types (lower case, upper case, numbers and special characters)
a captcha (question and answer, e.g. via a checkbox or an image selection) to be solved after ten unsuccessful login attempts.
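The first two requirements can be sketched as a simple validator - an illustrative example of the stated policy, not Discord's actual implementation:

```python
import re

MIN_LENGTH = 8        # at least eight characters
REQUIRED_CLASSES = 3  # at least three of the four character types

def is_valid_password(password: str) -> bool:
    """Check a password against the policy described above."""
    if len(password) < MIN_LENGTH:
        return False
    character_classes = [
        re.search(r"[a-z]", password),         # lower case
        re.search(r"[A-Z]", password),         # upper case
        re.search(r"[0-9]", password),         # numbers
        re.search(r"[^a-zA-Z0-9]", password),  # special characters
    ]
    return sum(1 for c in character_classes if c) >= REQUIRED_CLASSES
```

For example, `is_valid_password("abcdefgh")` fails (one character type), while `is_valid_password("abcdef12!")` passes (lower case, numbers, and a special character).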
Then, there was the absence of a data protection impact assessment - DPIA. According to the GDPR, Art. 35:
"Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. 2A single assessment may address a set of similar processing operations that present similar high risks"
Discord considered it was not necessary to carry out a DPIA, but the CNIL disagreed.
As a result, Discord then carried out two data protection impact assessments, which concluded that the processing is not likely to result in a high risk to individuals' rights and freedoms. This shows that the problem was the lack of an impact assessment in itself, not necessarily the result of that assessment.
In my view, the Discord procedure and fine can serve as a learning opportunity for the thousands of companies around the globe that are impacted by the GDPR but assume they will not be scrutinized as closely as Facebook or Google.
More and more, regulatory examination in the field of data protection will evaluate design and product-related practices, which the legal department does not usually oversee on a daily basis. Everyone in a tech company, especially a consumer-focused one, must have some level of understanding of privacy and data protection - not only the legal department. If you need help with that, get in touch.
What do you think are additional lessons we can learn from this case? Privacy needs critical thinkers like you: share this article and start a conversation about the topic.
See you next week. All the best, Luiza Jarovsky