👋 Hi, Luiza Jarovsky here. Welcome to the 74th edition of The Privacy Whisperer, and thank you to 80,000+ followers on various platforms. Read about my work, invite me to speak or join your panel, tell me what you've been working on, or just say hi here.
🌎 If you enjoy The Privacy Whisperer and think that others might find it helpful, recommend it broadly and join the leaderboard. Thank you!
✍️ This newsletter is fully written by a human (me), and illustrations are AI-generated.
This week's edition is sponsored by MineOS:
Asia has seen a wave of data protection regulations since 2020, each one unique to its country. There’s a distinct lack of a one-size-fits-most approach like in US state regulations, making Asia the region to watch globally for privacy professionals. Get to know key points about the new regulations in India, Vietnam, Indonesia, China, and Japan. See which data rights they grant individuals, compliance requirements, and other must-know details like data localization clauses. Get a detailed breakdown by MineOS and stay in the know.
🧬 Ethnically targeted data theft
Last week, the genetic testing company 23andMe confirmed that hackers had obtained data from “certain accounts.” According to the company:
“We recently learned that certain 23andMe customer profile information that they opted into sharing through our DNA Relatives feature was compiled from individual 23andMe.com accounts without the account users’ authorization.”
I would like to make four comments on what happened:
The challenges of genetic data
My first comment here is that this shows some of the peculiar challenges of genetic data. Genetic data reveals information not only about you but also about your relatives.
When you take a genetic test and, for example, post the results on social networks, you are choosing to disclose not only your own personal data but also that of your relatives, who might not agree to it.
In this specific case, both relatives had to opt in to share information with each other through the DNA Relatives feature, so it was not a consent issue. However, it shows how interconnected genetic data is: by breaking into one account, a hacker could access the genetic information of all of the victim's DNA relatives.
Genetic data can give rise to anti-semitism and ethnic hate targeting
Genetic data made available to the public can also give rise to new forms of anti-semitism, racism, and ethnic hate targeting.
In this case, although 23andMe has not publicly acknowledged it, there was an anti-semitic and racist incident.
News outlets that had access to the data on the dark web reported that it was a list targeting 1 million people of Ashkenazi Jewish descent. According to NBC News, the data:
“includes their first and last name, sex, and 23andMe’s evaluation of where their ancestors came from. The database is titled “ashkenazi DNA Data of Celebrities,” though most of the people on it aren’t famous, and it appears to have been sorted to only include people with Ashkenazi heritage.”
According to TechCrunch, “Another user on BreachForums claimed to have the 23andMe data of 100,000 Chinese users.”
Making available on the dark web a database with the personal data of 1 million people of Jewish descent is an anti-semitic act: it makes Jewish people vulnerable to third parties that might identify or harm them because they are Jewish. Chinese users were also targeted and made vulnerable to malicious parties that, for any reason, might want to single them out and potentially cause them harm as well.
Genetic data is extremely sensitive for various reasons, and one of them is the possibility of ethnic targeting that might lead to ethnic hate, racism, and anti-semitism.
23andMe's privacy & security practices were not enough
The company is one of the largest in the world in the direct-to-consumer genetic testing business and should be extremely careful about its privacy and security practices.
One of the peculiar aspects of genetic data is that it's immutable: you cannot change it. If your genetic information is leaked, it's leaked forever. Companies dealing with this type of data have to be extremely cautious.
The company did not enable two-factor or multi-factor authentication by default, which would have been an important step in preventing this kind of massive leak and ethnic targeting. Two-factor authentication is broadly available to users today, but 23andMe neither encouraged it nor made it its default practice.
Also, the DNA Relatives feature should receive further protection (perhaps an additional password to access it or other privacy measures). Once inside a user account, bad actors could easily access it, enabling massive targeting, as happened in this case.
23andMe is not being fully transparent with this incident
You can read 23andMe's official statement on this incident here.
They don't acknowledge a security incident or a data breach, despite all the details that have been made available online. They merely say they are investigating.
They do not say that the people targeted were Ashkenazi Jews (1 million users) and Chinese people (100,000 users). This leak involved ethnic targeting, and that was not specified in their official communication.
They do not specify what type of data was leaked, causing insecurity and distress.
The details above were made available through media outlets but not through the company's official channels.
23andMe is being sued due to this incident
Lastly, as I write this, I have learned that 23andMe is being sued over the incident. You can read the lawsuit here. I quote from the lawsuit:
“23andMe’s Notice of Data Breach was woefully deficient, failing to provide basic details concerning the Data Breach, including, but not limited to, how unauthorized parties accessed its employee’s e-mail account, whether the information was encrypted or otherwise protected, how it learned of the Data Breach, whether the breach was a system-wide breach, whether servers storing information were accessed, and how many customers were affected by the Data Breach. Even worse, 23andMe has not offered any identity monitoring to Plaintiffs and other Class Members.” (page 3)
“As a provider of DNA testing services, 23andMe knew, or should have known, the importance of safeguarding its customers’ Private Information entrusted to it and of the foreseeable consequences if its data security systems were breached. This includes the significant costs that would be imposed on 23andMe’s customers as a result of a breach. 23andMe failed, however, to take adequate cybersecurity measures to prevent the Data Breach from occurring.” (page 17)
It's an important lawsuit, and I recommend reading it.
*
Failing to protect genetic privacy can lead to hate and harm, and the highest privacy and security measures should be available to protect this type of data.
Additionally, users should be educated and made aware of possible privacy and security risks before they are prompted to opt in to any feature that might undermine their privacy or make them vulnerable.
📌 Job Opportunities
Looking for a job in privacy? Check out our privacy job board and sign up for the biweekly alert.
🖥️ Privacy & AI in-depth
Every month, I host a live conversation with a global expert. I've spoken with Max Schrems, Dr. Ann Cavoukian, Prof. Daniel Solove, and various others. Access the recordings on my YouTube channel or podcast.
🎓 Last AI & Privacy Masterclass of 2023: October 23
Our 90-minute live Masterclass will help you navigate current challenges in the context of privacy & AI. I'm the facilitator, and I hope to make it an interactive opportunity to discuss risks, unsolved privacy issues, and regulation. After we finish the live session, you'll receive a quiz, additional reading material, and a certificate. Most people get reimbursed by their companies; let me know if you need assistance with that. Read more and sign up here. Looking forward to meeting you on October 23rd at 5pm UK time.
📖 Join our AI Book Club
We are now reading “Atlas of AI” by Kate Crawford, and the next AI book club meeting will be on December 14th, with six book commentators. To participate, register here.
🔎 Case study on October GDPR fines
This week I discuss some of the October GDPR fines and some of the questions they help us raise in the context of data protection. Let's get started: