Marie Nussbaum

Artificial Intelligence and Psychic Care

A Meeting to Be Questioned

Artificial intelligence does not barge into the realm of psychic care. It enters quietly, through daily use, mobile interfaces, and conversational tools. It becomes a presence without a body, a voice without breath, knowledge without gaze. This shift—discreet yet decisive—challenges the very nature of therapeutic address.

Psychic care traditionally roots itself in shared speech, in the construction of a slow, embodied relational space made of waiting, transference, and silent containment. The digital framework, by contrast, offers immediacy, permanent availability, and algorithmic response. Can we still speak of encounter when the interlocutor neither doubts, nor withdraws, nor dreams?

Yet AI undeniably brings benefits. It offers support to individuals in distress at hours when no clinician is reachable. It allows for personalized, adaptive follow-up based on physiological and behavioral data. It identifies early signs of vulnerability and directs patients toward useful resources. In some contexts, it acts as a vector of inclusion, bridging geographical and social gaps.

Alongside this potential, however, ethical and clinical concerns arise. Psychic data is among the most sensitive: what becomes of confidentiality when information travels through servers, clouds, and probabilistic models? Algorithms are trained on historical corpora that are often biased. Whose subjectivities risk being misread, misdiagnosed, or rendered invisible?

Let us take the fictitious example of Clara, aged 31, experiencing anxious-depressive episodes. A conversational agent accompanies her daily, offering breathing exercises, detecting mood shifts, encouraging her to open up. Clara reports a sense of relief… but also a strange feeling of amplified loneliness. She describes a “dialogue without flesh,” a kind of listening that returns no gaze, no otherness, no tremor.

What we call care is not defined by efficiency alone. It implies a subjective encounter, a crossing through words and silences. It presupposes the existence of a vulnerable, fallible other—capable of receiving without resolving, of containing without correcting. An artificial intelligence, however sophisticated, cannot offer that. It can assist, support, prevent—but it cannot embody the human presence that defines the clinical framework.

Thus, far from opposing humanity and technology, the challenge lies in shaping their articulation. AI may become a precious ally, provided it never replaces the singular human listening that gives psychic care its depth. Ultimately, care remains a space of otherness and shared temporality—a place where something emerges not through intervention, but through relation.