But it is not enough to cry ‘Long live the multiple!’; the multiple must be made. —Deleuze and Guattari, A Thousand Plateaus
About care
The idea of care offers an intriguing starting point for designing human-technology relationships.
With an Eastern worldview, I can easily imagine that humans have the natural ability to project their sense of well-being onto different forms. Think about the animals, objects, or spaces we care for: their healthy, well-maintained state becomes part of our own happiness.
The author, Puig de la Bellacasa, draws on Haraway’s worldview; Haraway was in turn influenced by Whitehead’s process philosophy and early Marxist-feminist thinkers such as Nancy Hartsock.
Read: “Tentacular Thinking: Anthropocene, Capitalocene, Chthulucene,” by Donna Haraway
This article is an earlier version of chapter 2 of the author’s book Matters of Care (2017).
Read: Puig de la Bellacasa, María. 2017. Matters of Care: Speculative Ethics in More Than Human Worlds.
As the author illustrates, care involves multiple things at once, interconnected and rarely confined to a single domain:

- Care consists of the repetitive, mundane, concrete actions we undertake to maintain our world and its inhabitants.
- Care involves practical everyday commitment, “constant fostering, not only because it is in its very nature to be about mundane maintenance and repair, but because a world’s degree of liveability might well depend on the caring accomplished within it.”
- Caring is multisensory. (Haptic visions: we think, therefore we touch.)
- Caring is specific; a mode of caring is not necessarily translatable elsewhere.
- Care is not about fusion; it can be about the right distance.
- Care involves a “hands-on, ongoing process of re-creation of ‘as well as possible’ relations.”
- Care involves a non-normative ethics, a “non-normative obligation”: “it is concomitant to life—not something forced upon living beings by a moral order; yet it obliges in that for life to be liveable it needs being fostered.” (Puig de la Bellacasa, from the article, p.198)
- Care is interdependence: the well-being of object and subject (physical or mental) is closely interlinked.
- Care is reciprocal, but rarely bilateral: “the living web of care is not maintained by individuals giving and receiving back again but by a collective disseminated force.” (Matters of Care)
- Care is vital in interweaving a web of life, involving a multiplicity of agencies and materials.
- Care is “standing for sustainable and flourishing relations, not merely survivalist or instrumental ones.” (p.198)
- Care engages much more than a moral stance; it involves affective, ethical, and hands-on agencies of practical and material consequence.
The future is brighter if people can care for AI and the technology surrounding it just as they care for a garden. The garden analogy applies especially to the maintenance of an open ecosystem, regulations, and shared resources around AI.
But does it apply to AI agents? Would it be interesting if the agent were another pair of helpful hands in pruning the world we live in? See [[Game of Life with AI]]
Thinking for
I believe that learning to think about and yearn toward reproductive freedom from the analytical and imaginative standpoint of “African American women in poverty” — a ferociously lived discursive category to which I don’t have “personal” access — illuminates the general conditions of such freedom. —Haraway, Modest_Witness@Second_Millennium, 1997, p.199
Yes, we can think for others—and in many professional contexts, we not only should, but must. However, this is easier said than done. As Haraway reminds us, “all too easily it can lead to appropriating the recipients of ‘our’ care, instead of relating ourselves to them” (p.209).
Hence, thinking with.
Thinking with = living with
Laugh with, not laughing at, comes from thinking embedded in communities one cares for, and it is an example of a form of thinking with care that I propose to call dissenting-within. —Puig de la Bellacasa, p.205
Living with is laborious. Relations of otherness are more than about accommodating ‘difference,’ co-existing or tolerating. Thinking with should always be a living with, aware that relations of significant otherness transform those who relate and the world they live in. —Puig de la Bellacasa, p.207
To care is to co-exist with the fragility and complexity of things; to think with is also to live with, to respond to and be transformed by what and whom we care for.
This resonates with Haraway’s call to “stay with the trouble” — to remain with the friction, the unknown, the partial. If knowledge is partial and situated, then so too is care. It’s not about saving, optimizing, or fixing the world from a distance; it’s about being entangled in the world, implicated in its maintenance, its decay, and its possibility.
Get dirty
The point is to make a difference in the world, to cast our lot for some ways of life and not others. To do that, one must be in the action, be finite and dirty, not transcendent and clean. —Haraway, Modest_Witness@Second_Millennium, 1997, p.36
In the extreme sense, dirty, sweaty interaction might simply be part of the deal. Think about caring for a patient, an animal, or a city street: it’s impossible to neglect the importance of removing and cleaning up waste so that what remains is healthy and well-maintained.
The learning curve, time, and effort are not things to be avoided. Instead, how do we get people to embrace them and invest the necessary time and effort, especially in the starting phase?
Care as a coping mechanism against the unknown
Knowing is not about prediction and control but about remaining ‘attentive to the unknown knocking at our door.’ (Deleuze, ‘Qu’est-ce qu’un dispositif?’ in Michel Foucault philosophe, 1989, p.193) But though we do not know in advance what world is knocking, inquiring into how we can care will be required in how we will relate to the new. —Puig de la Bellacasa, p.212
In this sense, care may be the ultimate coping mechanism for whatever future arrives, even when we don’t know what’s coming.
Matters of Care (2017)
In her book, Matters of Care, Puig de la Bellacasa explores human-soil relations around a notion of soil as living.
Caring for soil is deeply entangled with countless factors—many of which remain beyond our full perception. We can only grasp a fragment of what there is to care for, often acting on hope, only to discover new dynamics constantly emerging. Tending to something so complex is inevitably slow and filled with trial and error, demanding effort, patience, and persistence. Yet, it is also profoundly rewarding.
The book also presents an interesting concept of temporalities of care: they differ by subject. “The pace required by ecological relations with soils could be at odds with accelerated, future-oriented responses characteristic of the pace of technoscientific innovation.” (Puig de la Bellacasa, Matters of Care, Introduction, p.23)
How do we make time for care, especially when it is directed toward a complex, unknown, vast concept such as technology? I think a lot can be learned from the analogy of soil and vines, as suggested by Krzywoszynska.
Further readings:
Krzywoszynska, Anna D. 2016. “Empowerment as Skill: The Role of Affect in Building New Subjectivities.” In Participatory Research in More-than-Human Worlds.
![[Empowerment as Skill for WR.pdf]]
Designing with Care
To think care-fully about design means to embrace partiality and slowness. It asks us to resist scalable, frictionless futures in favor of deeply situated, context-bound engagements.
I propose care as a mentality we should adopt toward AI and technology in general — not for the “well-being” of the AI, but for our own well-being and that of the society in which we stand in relation to others.
An interesting case in point is Nukabot, a fermentation-support robot developed by Dominique Chen, who refers to the idea of care. (AI・ロボットと共存の倫理, 2022)
He breaks down the process of caring into:
| Ethopoietic process | Key action that can be designed |
|---|---|
| Daily maintenance | Noticing |
| Emotional attachment | Acting-on |
| Non-normative obligation | Co-presencing |
It is also observed that after the emotional-attachment phase, the sense of forced obligation and its associated effort diminishes, and the caring process is more likely to sustain itself.
A point to consider in this line of thought: how can we design care time? The trickiest design theme is “nudging.” I don’t believe in the power or benefit of nudging, especially when the user lacks a natural desire and objective in the first place; it falls too easily into dark patterns. Design is validated only by empowering people to do what they already want to do, or by offering them other options and realizations. User autonomy must be respected first and foremost. So when I ask how we can encourage people to take daily, practical, repetitive actions of care in their relationship with technology, the experience needs to be designed so that people already have a desire to accomplish certain things, and the design helps them achieve those things through daily actions.
The analogy of care does not, in any sense, apply to all design decisions in all products and services. It merely suggests that automation and efficiency are not silver bullets: even if some parts can be automated, others never can be; otherwise the entire concept of care disappears with the automation.
We are the carers
But should or would the AI agent return the caring attitude? It is not yet more than a sterile solver of human problems; will it ever have the capacity, or even the reason, to engage in the messy, demanding labor of caring for fragile, unpredictable, sometimes self-centred human beings?
At the moment, I remain skeptical of framing AI as a cohabitant, let alone a replacement for humans, since it cannot be a co-struggler. However, it can still be a powerful co-thinker on a specific task, with human-defined goals and human-initiated actions grounded in deep, situated observation of the complex world we inhabit — one shaped by affect, feeling, a sense of justice, struggle, humility, and empathy. See [[New Context Conference 2025 Summer]].
The crucial point here is that we must move beyond a human-centered perspective when setting our goals and actions. Otherwise, AI might one day have a compelling justification for replacing humans — at least in terms of objectively balancing the needs of naturecultures. (I won’t confirm whether that’s entirely a joke.) In this perspective, the concept of care, and the haphazard interrelationships among different cultures, species, and things, are valuable sources of wisdom to guide us through our finite journey on earth.
At this point, an artificial system, by design, inevitably presumes it knows the answer—because it lacks the capacity to recognize that it cannot fully know. This epistemic limitation makes it unfit to be a genuine partner in human uncertainty.