Apple's strategic privacy approach
Plus: AR/VR headsets are trending, but not without privacy issues
🔥 Instagram, child protection, and the algorithmic wild west
The Wall Street Journal, in collaboration with researchers at Stanford and the University of Massachusetts Amherst, uncovered that the algorithms of Instagram (owned by Meta) are helping to connect a vast network of pedophiles with child exploitation content. According to the Journal: “Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

On the topic, Thierry Breton, the EU Commissioner for Internal Market, tweeted yesterday: “Meta's voluntary code on child protection seems not to work. Mark Zuckerberg must now explain & take immediate action. I will discuss with him at Meta’s HQ in Menlo Park on 23 June. After 25 August, under DSA Meta has to demonstrate measures to us or face heavy sanctions.”

This is one more example of algorithmic externalities, as I discussed in a recent edition of this newsletter. Current data protection regulations still do not tackle the negative consequences that emerge when algorithms are applied to personal data and to online social systems, such as social networks. Companies should be transparent, accountable, and liable for the algorithms they choose to deploy in conducting their businesses. Perhaps the Digital Services Act (DSA) will help. For now, we are still living in the algorithmic wild west.
🔥 AR/VR headsets are trending, but not without privacy issues
🔥 Apple's strategic privacy approach
Speaking of Apple again, let us discuss its strategic privacy approach. Among existing large tech companies, Apple is the one that