Kate Crawford (2021), Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence

Rina Chen’s living notebook on digital craft and design.


Artificial intelligence is both embodied and material, made from natural resources, fuel, human labor, infrastructures, logistics, histories, and classifications.

Artificial intelligence as we know it depends entirely on a much wider set of political and social structures. And due to the capital required to build AI at scale, and the ways of seeing that it optimizes for, AI systems are ultimately designed to serve existing dominant interests. In this sense, artificial intelligence is a registry of power.

We need to go beyond neural nets and statistical pattern recognition to instead ask what is being optimized, and for whom, and who gets to decide. Then we can trace the implications of those choices.

The AI industry is making and normalizing its own proprietary maps, as a centralized God’s-eye view of human movement, communication, and labor.

Rather than debating whether humans will be replaced by robots, in this chapter I focus on how the experience of work is shifting in relation to increased surveillance, algorithmic assessment, and the modulation of time.

Crawford, Kate. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press, 2021. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/oculocad-ebooks/detail.action?docID=6478659.

Clever Hans and the Question of Intelligence

  • The Hans story
    • Hans (the horse) performed feats that seemed intelligent.
    • In reality, Hans was trained to produce what his owner wanted.
    • This was not the “extraordinary intelligence” that audiences, academics, and businesses expected.
    • Yet, one might argue: isn’t this a greater intelligence, since it shows interspecies communication and patience?

“We anthropomorphize the nonhuman.”

  • Use in AI discourse
    • Hans is now a cautionary tale in machine learning: systems that look impressive in training can fail in real-world conditions.
    • Neural nets are organized in opaque ways → a “black box.”
    • Raises the question: How is intelligence “made,” and what traps does that create?
  • Social construction of intelligence
    • Financial, cultural, and scientific interests shaped the framing of Hans’s “intelligence.”
    • Parallel with today’s AI hype.
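The Clever Hans trap described above has a direct machine-learning analogue: a model that latches onto a spurious cue in its training data can look flawless in training and collapse in deployment. A toy sketch (the data and the "watermark" cue are invented for illustration):

```python
# Toy illustration of the "Clever Hans" trap in machine learning:
# a classifier that keys on a spurious cue (an accidental watermark)
# rather than the real signal. All data here is invented.

train = [
    # (real_signal, watermark_present, label) -- in the training set the
    # watermark happens to co-occur perfectly with the positive class.
    (0.9, True, 1), (0.8, True, 1), (0.2, False, 0), (0.1, False, 0),
]
test = [
    # In deployment the watermark is absent, so the shortcut fails.
    (0.9, False, 1), (0.1, False, 0),
]

def hans_classifier(signal, watermark):
    """'Learns' the shortcut: predict positive iff the watermark is present."""
    return 1 if watermark else 0

train_acc = sum(hans_classifier(s, w) == y for s, w, y in train) / len(train)
test_acc = sum(hans_classifier(s, w) == y for s, w, y in test) / len(test)
# train_acc is 1.0 (looks "intelligent"); test_acc is 0.5 (chance level)
```

Like Hans reading his owner's posture, the model answers correctly for the wrong reason, and the failure only shows once the cue is removed.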

Competing Mythologies of AI

  • Myth 1: Nonhuman systems (horses, computers) are analogues for human minds.
  • Myth 2: Intelligence exists independently → AI as disembodied, removed from material/social world.
  • Related arguments:
    • Human intelligence can be formalized and reproduced (mid-20C assumption).
    • “Humans are meat machines” (Minsky).
    • “Human nervous system is digital” (von Neumann).

Joseph Weizenbaum and ELIZA

![[Pasted image 20251002151829.png]] ![[Pasted image 20251002152555.png]]

“I was dismayed to see how quickly and intensely people… established an emotional relationship with the computer and how they attributed clearly human characteristics to it.”

Joseph Weizenbaum, Power of Computers, 19

“The most important fundamental insight … is that we currently know of no way of making computers intelligent, and that we should therefore not at present assign to computers tasks that require intelligence.”

— Joseph Weizenbaum, The Power of Computers and the Impotence of Reason (1977 [1976]), 300.

“However intelligent machines can be made, I remain of the opinion that certain acts of thought should be reserved exclusively for humans.”

Joseph Weizenbaum, Power of Computers, 28

“The belief that science and technology will save the Earth from the consequences of climate change is misleading. Nothing will save our children and grandchildren from an earthly hell unless we organize resistance against the greed of global capitalism.”

Joseph Weizenbaum, Us Against Greed

“If only global society were reasonable, the knowledge already achieved by humanity could turn this earth into a paradise.”

Joseph Weizenbaum, What I Believe at the End of My Life

  • ELIZA (1960s chatbot)
    • Users confided in ELIZA, attributing empathy → the “ELIZA effect.”
    • Even therapists thought it could become automated psychotherapy.
  • Weizenbaum’s critique
    • Shocked by how quickly people formed emotional bonds with machines.
    • Warned: we know no way of making computers truly intelligent → don’t assign them tasks requiring intelligence.
    • Certain acts of thought should be reserved for humans.
    • Criticized technological solutionism.
    • Argued only humans can solve problems in human ways (social, cultural, biological, emotional).
  • Later positions
    • “Nothing will save our children … unless we resist greed of global capitalism.”
    • “If only global society were reasonable … paradise.”
    • Legacy: Weizenbaum Award for critical computer science research.
  • Overlap with Dreyfus
    • Dreyfus: Human intelligence relies on unconscious processes, while computers require everything explicit.
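The ELIZA effect noted above rests on a strikingly simple mechanism: keyword pattern matching plus pronoun "reflection," with no model of meaning at all. A minimal sketch of that technique (the rules and phrasings here are hypothetical, not Weizenbaum's original DOCTOR script):

```python
import re

# Minimal ELIZA-style responder: match a keyword pattern, "reflect"
# pronouns so the reply mirrors the user, and fill a canned template.
# The rules below are invented for illustration.

REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "am": "are", "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),  # fallback
]

def reflect(fragment):
    """Swap first- and second-person words: 'my notebook' -> 'your notebook'."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text):
    for pattern, template in RULES:
        m = pattern.match(text)
        if m:
            return template.format(*(reflect(g) for g in m.groups()))

print(respond("I need my notebook"))  # -> Why do you need your notebook?
```

That users confided in a program this shallow, and attributed empathy to it, is exactly what dismayed Weizenbaum.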

AI as Material, Political, and Extractive

![[Pasted image 20251002162034.png]] AI professor Fei-Fei Li describes her ImageNet project as aiming to “map out the entire world of objects.”

  • Embodiment of AI
    • AI is embodied in natural resources, fuel, human labor, infrastructures, logistics.
    • Built with and dependent on wider political/social structures.
  • Registry of power
    • AI systems serve dominant interests due to capital requirements.
    • They are designed to optimize for particular values → need to ask: what’s being optimized, for whom, and who decides?
  • AI’s colonizing impulse
    • Mapping and classification as centralizing power.
    • Example: Fei-Fei Li’s ImageNet project (“map out the entire world of objects”).
    • Creates a God’s-eye view of human activity while denying its political nature.
  • Expanded view of AI as an extractive industry
    1. Resource extraction: lithium, latex, toxic waste lakes.
    2. Digital piecework: low-paid microtasks.
    3. Data treated as infrastructure, stripped of personal meaning.
    4. Classifications reinforce hierarchies and inequities.
    5. Surveillance applications (motion detection, policing, military).
    6. AI as tool/structure of state power.
  • Result: AI expands asymmetries of power.

Broader Reflections on Technology & Power

  • Alondra Nelson, Thuy Linh Tu, and Alicia Headlam Hines: contests over technology are always tied to economic mobility, politics, and community.

  • Ursula Franklin: technology’s viability depends on justice and limits to power.

“The viability of technology, like democracy, depends in the end on the practice of justice and on the enforcement of limits to power.”


Other Resources

  • https://jw.weizenbaum-institut.de/wp04
    • Since 2010, the Forum of Computer Scientists for Peace and Social Responsibility (FIfF) has sponsored the Weizenbaum Scholarship Award. In memory of Weizenbaum’s contributions to promoting a critical perspective on computer science, the award is given for outstanding achievements by young researchers who critically examine the social, ethical, and political impacts of computer and information technology.

![[The_Atlas_of_AI_Power,Politics,_and_the_Planetary…—-_(Introduction).pdf]]