Why Certain Technologies Are Creepy - And What Engineers Can Do About It
Today’s newsletter is sponsored by Didomi, G2 leader in the Consent Management Platform category.
Read time: 5 minutes.
-
Certain technologies can be creepy and cause harm to people. In today's newsletter, I discuss why this happens, give examples, and propose what engineers can do about it.
In my ongoing Ph.D. research, I discuss unfair data practices across the data cycle, meaning unfair practices that happen during data collection, data processing, and data use. When unfair practices happen in the data use phase, they are associated with a lack of adequate oversight, guidelines, and enforcement, in addition to the absence of tools to protect vulnerable populations. As a consequence, users are left exposed to harm. Below, I discuss three examples.
A first example is the use of AirTags by abusive partners to stalk their current or former partners. An AirTag can be defined as a “shiny, half-dollar-sized coin with a speaker, Bluetooth antenna, and battery inside, which helps users keep track of their missing items.” Its main purpose is to help owners find luggage, wallets, keys, or any personal items that get lost. AirTags became increasingly popular when airports reopened after the coronavirus lockdowns, as overcrowding caused a massive increase in lost luggage.
Although this was never the AirTag's intended use, it started being used by abusive partners, ex-partners, and anyone willing to stalk another individual without their knowledge. After obtaining access to records from eight police departments, Vice reported that:
“Of the 150 total police reports mentioning AirTags, in 50 cases women called the police because they started getting notifications that their whereabouts were being tracked by an AirTag they didn’t own. Of those, 25 could identify a man in their lives—ex-partners, husbands, bosses—who they strongly suspected planted the AirTags on their cars in order to follow and harass them. Those women reported that current and former intimate partners—the most likely people to harm women overall—are using AirTags to stalk and harass them.”
Specifically, in the context of Apple, there is an additional problem of scale: AirTags can leverage the global network of nearly a billion iPhones and Macs to report their location. A massive surveillance system is formed, in which every Apple user becomes a live tracker unless they opt out of the Find My network.
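To see why this scale matters, here is a minimal, hypothetical sketch of how a crowdsourced finder network operates. This is a deliberate simplification, not Apple's actual protocol: the names, data structures, and the placeholder "encryption" are invented for illustration only. The point it shows is that every participating phone silently relays location reports, so whoever planted a tag can reconstruct the victim's movements.

```python
# Simplified, illustrative sketch of a crowdsourced "finder network".
# NOT Apple's actual Find My implementation; all names and the fake
# "encryption" below are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class LocationReport:
    tag_id: str    # rotating identifier the tag broadcasts over Bluetooth
    payload: str   # location, encrypted so only the tag's key holder can read it

SERVER: list[LocationReport] = []  # stands in for the vendor's cloud service

def bystander_device(heard_tag_id: str, own_location: tuple[float, float]) -> None:
    """Any passing phone that hears the tag's broadcast relays an encrypted report."""
    payload = f"encrypted({own_location})"  # placeholder for real public-key encryption
    SERVER.append(LocationReport(heard_tag_id, payload))

def key_holder_lookup(tag_id: str) -> list[str]:
    """Whoever holds the tag's key (owner or stalker who planted it) can fetch the trail."""
    return [r.payload for r in SERVER if r.tag_id == tag_id]

# A planted tag moves through a city; three strangers' phones hear it and report it.
for spot in [(52.52, 13.40), (52.53, 13.41), (52.54, 13.42)]:
    bystander_device("tag-123", spot)

print(key_holder_lookup("tag-123"))  # the tag's movement history
```

The uncomfortable design consequence is that the bystanders relaying reports never know which tags are legitimate and which were planted to stalk someone.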
On the topic of abusive partners, ex-partners, and sexual predators, another technology that has been misused to oppress is deepfake software. Noelle Martin recounts that, when she was 18, she found her face superimposed onto explicit pornographic videos and images, as if she were one of the actresses. These videos and images had been edited by a group of unknown sexual predators, and she discovered the deepfakes by chance while running a reverse Google image search.
Even though deepfake technologies have legitimate uses, such as learning tools, photo editing, image repair, and 3D transformation, their main application today seems to be cyber exploitation. According to a Deeptrace report, 96% of all deepfake videos available online are non-consensual pornography.
Another example of unfair data use can be found in the realm of machine learning and facial recognition. Automated gender recognition (AGR) is a type of facial recognition technology that uses machine learning to automatically classify the person in a picture or video as male or female.
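To make the structural problem concrete, here is a minimal, hypothetical sketch of what such a classifier does. The model and scores are made up for illustration; the point is that, whatever face it is given, it can only ever answer with one of two predefined labels.

```python
# Minimal, hypothetical sketch of an automated gender recognition (AGR) classifier.
# The "model" and its scores are fabricated; the structural point is that the
# output is always forced into exactly two labels.

import random

LABELS = ["female", "male"]  # the only categories the system can ever produce

def agr_classify(face_image: bytes) -> tuple[str, float]:
    """Stand-in for a trained model: returns a label and a confidence score."""
    score_female = random.random()             # placeholder for real model output
    scores = [score_female, 1.0 - score_female]
    best = max(range(len(LABELS)), key=lambda i: scores[i])
    return LABELS[best], scores[best]

# No matter whose face is supplied (including trans and nonbinary people),
# the system always answers "male" or "female", with apparent confidence.
label, confidence = agr_classify(b"<pixels of any face>")
print(f"{label} ({confidence:.0%})")
```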
However, gender is not a binary feature but a spectrum, and understanding one's own gender can be a lifelong quest. How could an algorithm possibly categorize it when sometimes not even the individual has it clear yet? As the Human-Computer Interaction researcher Os Keyes stated:
“This technology tends to reduce gender to a simplistic binary and, as a result, is often harmful to individuals like trans and nonbinary people who might not fit into these narrow categories. When the resulting systems are used for things like gating entry for physical spaces or verifying someone’s identity for an online service, it leads to discrimination.”
It is an algorithm built to fail: no matter how accurate its developers claim it to be, attributing gender should not be the role of automated machines.
In the examples above, due to technological features or a lack of regulatory constraints, the data use was invasive and limited the autonomy of the affected individuals. In some cases, the technology facilitated psychological or physical harm.
What I argue in my research, and will summarize here, is that before making a product available to the public, its developers must ensure that it will not cause adverse consequences for psychological well-being, physical safety, or any other type of harm.
For any product that involves the collection and processing of personal data, in addition to a data protection impact assessment, a thorough evaluation of its potential for abusive use is needed. Engineers should be trained to identify the broad set of impacts that a technology can have on individuals and on society as a whole, paying special attention to children, minorities, protected groups, and vulnerable populations.
Technology is immensely powerful, and it can bring many positive transformations. However, humans must always be the focus. No matter how advanced and innovative a technology is, there should always be adequate constraints and mechanisms to support people and prevent harm.
Of course, it is not only the responsibility of engineers. Regulation should be tougher and more specific on unfair data uses. But this will be a topic for another edition of the newsletter.
-
What are your thoughts on that? Do you have examples of other creepy technologies? What makes some technologies creepy in your view? I would love to read your opinion in the comments below.
Privacy needs critical thinkers like you. Share this article and start a conversation about the topic.
See you next week. All the best, Luiza Jarovsky