Is “caring AI” an oxymoron?

Reflections on the workshop at DDW 2023

News - 15 January 2024

“AI really is an uncaring technology”, says one of the participants of our workshop at DRIVE, organised by CLICKNL for the Dutch Design Week. Together with Meike Hardt, a fellow PhD researcher from AI DeMoS Lab, we were set to imagine a world where AI helped to create and maintain a more caring society. However, the proposed task turned out to be harder than it first appeared.

Care-full AI Futures workshop
Photo by Kas van Fliet

Building worlds where caring AI can exist

During the Care-full AI Futures workshop, we asked the participants to design a fictional device or technological system for a near future in which society had abandoned its old ways and re-oriented itself towards care as an ultimate “value”. To stir everyone’s imagination, we first provided short worldbuilding texts that served as a setting for those new designs to populate. Deliberately vague and verbose (and unintentionally post-apocalyptic), one of them started off like this:

Gone are the times when infrastructures were seen as unwieldy colossuses—remorseless and iron-hearted. Or as forever-flawed labyrinths, punishing your every misstep and threatening to crush you under the weight of their precarious structures. Systems that were created to support and sustain have abandoned their purpose, as people kept losing their bearings in attempts to navigate the forbidding halls. Or else, people were left outside their walls entirely. In response to the crisis, our infrastructural imagination flourished.

These texts suggested a particular framing that presented care as a capstone holding the “new society” together—both in the common-sense understanding of the word and as an analytical and political concept central to care ethics. Participants also received a quick rundown of care theory.
 
In this world we outlined, AI was co-opted to become an active collaborator in providing care, reproducing care practices, and manifesting care imaginaries.
 
The projects that participants made were complex, detailed, and even funny. And yet, perhaps the most important insight was that care is extremely difficult to think with. Non-human care is even more so.

Unimaginable caring machines

Three challenges steered the participants' imaginations away from the direction we expected them to take.
 
The first was the task itself—to create an AI that would provide good care. While we hoped an open call for speculation and a more sci-fi-themed text would allow people’s imaginations to run wild, it was often not enough.
 
On the one hand, participants were more willing to imagine AI as a conscious, autonomous being, casting it as a helper to humans or as somewhat equal to them. Some projects re-created technologies that already exist, simply adding more “intelligence” to them. On the other hand, participants were at times more inclined to “care” for the technologies and infrastructures themselves, rather than to suggest new ways that humans (and living non-humans) could be cared for by—or care with—AI.
 
To echo the quote from the beginning, it was easier to imagine a machine that is conscious than one that cares.

Dark side of care

Secondly, participants kept returning to an understanding of care as something nice, wholesome, and comfortable. In our very brief theoretical introduction, we emphasised that care rests on sustained commitment, assumed responsibilities, and concrete actions. Yet, it seemed, the participants continued to treat care as something abstract, intangible, and emotional.

In the same manner, some designs proposed an AI that would “take over” burdens and responsibilities—whether physical, emotional, or moral. As another fellow PhD at IDE astutely noted during the workshop test round, caring AI should not alienate people from their environments, routines, and other object interactions. Even more so, we argue, it needs not only to maintain existing care relations and create new, generative ones, but also to open up possibilities for burdening oneself. To be sustaining and healing, care needs work and skilful dedication.

At times, as counter-intuitive as it is, acts of caring are something you wish you could avoid, because they are laborious and demanding, requiring you to share all your available resources, material or immaterial. The tasks might be unpleasant but essential if we are to receive care back and if we are to sustain and nurture our worlds—together with technologies.

Cultivating a sensitivity to good algorithmic care

While grasping all the intricacies of a theory in a couple of minutes is indeed a demanding request, the last challenge was the most unexpected. Our participants struggled to recollect how they are cared for and how they provide care in their everyday lives—in the common, warm and fuzzy sense of the word. However essential care is in our lives, it often goes unnoticed and unreflected upon. Worse still, there is a danger of failing to acknowledge the lack of care, or of mistaking neglect for care.

If we are still struggling to understand what good care means to us and how we should provide it, how can we then teach a machine to do just that? In my research, I explore the possibilities of engaging aesthetics—not in terms of beauty or fine art, but of all things senseable and perceivable—alongside ethics.

Very fitting, then, is the aesthetics of care, a perspective proposed by Yuriko Saito. She writes that to be performed “well”, acts of care require particular skills: aesthetic sensibility and aesthetic expression (acknowledging that what “well” means is itself a matter of investigation). In this sense, we—as designers and as everyday technology users—have yet to explore how AI can truly care well for us and how we can recognise when (and if) it does. And how we can make sure it really does.

While that will take us some time, we can already pay more attention to what good care means for us in our daily lives, and to whether we ourselves care well for the things and beings we deem important to us.