AI as a Manipulative Informational Filter
AI adds unsolicited noise, bias, distortion, censorship, and sponsored interests to raw information, altering how people understand the world and exposing society to new risks | Edition #215
Hi, Luiza Jarovsky here. Welcome to our 215th edition, now reaching 65,800+ subscribers in 168 countries. To upskill and advance your career:
AI Governance Training: Apply for a discount here
Learning Center: Receive free AI governance resources
Job Board: Find open roles in AI governance and privacy
AI Book Club: Discover your next read in AI and beyond
Become a Subscriber: Read my full analyses and stay ahead.
Join the 23rd cohort in September
If you are looking to upskill and explore the legal and ethical challenges of AI, as well as the EU AI Act, I invite you to join the 23rd cohort of my 15-hour live online AI Governance Training, starting in September.
Cohorts are limited to 30 people, and over 1,200 professionals have already participated. Many have described the experience as transformative and an important step in their career growth. [Apply for a discounted seat here].
AI as a Manipulative Informational Filter
As the generative AI wave advances and we see more examples of how AI can negatively impact people and society, it becomes clearer that many have vastly underestimated its risks.
In today's edition, I argue that due to the way AI is being integrated into existing systems, platforms, and institutions, it is becoming a manipulative informational filter.
As such, it alters how people understand the world and exposes society to new systemic risks that were initially ignored by policymakers and lawmakers, including in the EU.
-
Two weeks ago, Marc Andreessen said on Jack Altman's podcast:
"AI is going to be the control layer for everything (…) how you interface with the education system, with the healthcare system, with transportation, employment, with the government, with the law, right? It's going to be AI lawyers, AI doctors, AI teachers (…)"
and also
"the analogy is not with cloud or the internet, but with the invention of the microprocessor. This is a new kind of computer; everything that computers do can get rebuilt. Everything is going to be rebuilt"
This is, of course, the hopeful view of a tech entrepreneur and venture capitalist who stands to gain from this cycle of destruction and reconstruction, high valuations, hype, and profit.
At the same time, recent developments in tech show that Andreessen's vision is already unfolding. Moreover, the "control layer" he refers to is also, from a cognitive perspective, a manipulative informational filter whose medium- and long-term consequences have been ignored from ethical and legal perspectives.
Powered by the "AI-first" and "AI fluency" trends, companies everywhere are racing to adopt AI as an end in itself, not a means, and forcing employees to uncritically embrace AI or lose their jobs (well, many will likely be fired anyway).
This creates a new set of incentives and dramatically changes corporate culture: companies in every sector are now focused on profit and AI-powered automation alike, multiplying AI's ubiquity as well as its informational influence on people.
There is also the hardware aspect, which is essential when discussing informational filters. AI is already embedded in the systems that power computers and smartphones, and each new operating system update adds more AI features, ensuring that people will only interact with information through AI lenses.
Meta's glasses and OpenAI's mysterious companion-surveillance gadget are designed to reinforce the informational filter, potentially adding AI intermediation to every single one of a person's daily interactions.
It reminds me of what Shoshana Zuboff once said: "It is no longer enough to automate information flows about us; the goal now is to automate us."
AI is a manipulative informational filter because it adds unsolicited noise, bias, distortion, censorship, and sponsored interests to raw human content, data, and information, significantly altering people's understanding of the world. How?