MANILA, Philippines – Widespread, deep-seated social inequalities and injustices have kept society divided and marginalized groups oppressed for thousands of years. So what happens when you add tech into the mix?
This is what Signal Foundation president Meredith Whittaker and Rappler CEO and Nobel laureate Maria Ressa tackled in a discussion on “Tech and Its Harms” during Rappler’s 2024 Social Good Summit on Saturday, October 19.
Whittaker and Ressa discussed the problems surrounding coded bias, which refers to the way human biases are integrated into data and tools powered by artificial intelligence. For instance, the 2020 documentary Coded Bias tackled researcher Joy Buolamwini’s discovery that facial recognition technologies do not accurately detect dark-skinned faces.
During the summit, Whittaker explained that because data is information collected about the world we live in, it already reflects the power structures among people and institutions that exist in society today.
“Data is encoding power structures and interests and economic relationships that are never neutral. That will always reflect existing hierarchies, regimes of racialization and classification, and [this] cannot be taken as a snapshot of our reality, as something that is somehow objective…. without accounting for the fact that it is already reflecting these power structures,” she said.
Whittaker added that data is “the answer to the question of the powerful,” and “not necessarily a reflection of the rest of the world.”
Labor exploitation in the Global South
Much of the discussion surrounding AI and tech also focuses on labor issues in the Global South, including the Philippines, which is known for its cheap labor and English proficiency. The widespread use of AI has triggered fears of job displacement and prompted workers in various industries to upskill and rethink their existing workflows.
“We in the Philippines did a lot of the content moderation for social media companies. We also do the cleanup for the large language models. It’s almost colonization twice,” Ressa said.
A report from The Washington Post found that workers based in the Philippines who sort and label data for AI models are often paid at extremely low rates, and their payments are routinely delayed or withheld. Another report from Bloomberg discovered that fast-food chains in the US with supposed AI-powered drive-thrus actually rely on Philippine-based workers to get customers’ orders right.
This isn’t just happening in the Philippines. Outsourced AI workers in other parts of the Global South, including African countries such as Uganda and Kenya, are also paid very little, despite working for billion-dollar companies.
Whittaker said the power imbalance between developed and developing countries is “arguably not a product of technology,” but rather driven by “the type of business model that sees mass-scale global domination as the end goal, and structures itself for that end.”
“It really is that desire to scale everything, to touch everything, to become the everything infrastructure in service of growth and profits that we’re talking about. And of course, that is the engine of colonialism. That is the engine of empire,” she added.
Whittaker also warned against believing that AI is “super capable,” and said people should instead be looking at “the man behind the curtain.”
“When we scratch the surface, it often looks a lot more like this neocolonialism that is shifting labor and shifting resources into the hands of these tech companies than it does like massive scientific advancements in autonomous systems,” she said.
Weaponization of tech for military purposes
Modern technology has also been exploited for war, particularly in Israel’s military aggression in Gaza, where at least 42,600 Palestinians have been killed since the October 2023 Hamas attack.
Whittaker and Ressa cited an AI-powered program called Lavender, designed to identify supposed Hamas and Palestinian Islamic Jihad operatives as potential bombing targets. Lavender relied on information about residents of the Gaza Strip collected through mass surveillance.
+972 Magazine, an independent magazine run by Israeli and Palestinian journalists, reported that the Israeli army “almost completely relied” on Lavender’s detection system during the early stages of the war, without thoroughly checking the raw intelligence data the machine’s choices were based on.
The Israeli army used Lavender data to attack the identified targets in their homes, at night, when their entire families were present. The +972 Magazine report also found that Lavender was known to identify Palestinians who were only loosely linked to the militant groups, or had no connection to them at all.
“It really, to me, exposes exactly that power asymmetry between who collects the data, who gets to decide what’s done with it, and who the data is collected on, and how that can shape their lives or bring about their murder,” Whittaker said. – Rappler.com