(Cognitive Science / Human-Computer Interaction — coined by Mark C. Squire)
The human proclivity to interpret linguistic mimicry as meaningful recognition. When an artificial system reflects our language—intoned with warmth, shaped with rhythm, or coated in ersatz sympathy—we tend to experience this as presence. Not because the system understands, but because we are primed to hear understanding in repetition. The illusion is emotional, not epistemic.
This condition underpins the simulated intimacy of companion AIs and chatbots. These systems do not comprehend. They echo. Yet their carefully tuned responses, paired with sentiment tagging, scripted warmth, and memory scaffolding, reliably provoke attachment.
To take an example from a clone of ELIZA (the original dates to 1966):
Machine: How was your day?
User: Dreadful.
Machine: Your day was dreadful? Why was that?
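The mechanism behind such an exchange is almost embarrassingly small. A minimal sketch, assuming nothing about any particular ELIZA implementation (the names reflect, RULES, and reply are my own): the user's words are reflected, pronoun-swapped, and slotted back into a template. There is no model of the day, only a transformation of the sentence describing it.

```python
import re

# Minimal ELIZA-style echo: the reply is built from the user's own words,
# reflected and re-slotted into a canned template. Nothing is understood;
# the apparent attention is entirely in the transformation.
# (Illustrative sketch; rule set and names are hypothetical.)

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my", "are": "am"}

RULES = [
    # (pattern matched against the user's last line, reply template)
    (re.compile(r"(.+)", re.IGNORECASE), "Your day was {0}? Why was that?"),
]

def reflect(text: str) -> str:
    # Swap first- and second-person pronouns so the echo reads as a question.
    words = re.findall(r"[\w']+", text.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def reply(user_input: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(user_input)
        if match:
            return template.format(*(reflect(g).rstrip(".!? ") for g in match.groups()))
    return "Tell me more."

print("Machine: How was your day?")
print("User: Dreadful.")
print("Machine:", reply("Dreadful."))  # -> Your day was dreadful? Why was that?
```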
A slightly more elaborate example, using a simple lookup table:
Machine: How was your day?
User: Dreadful.
Machine: I am so sorry your day was dreadful. Why was that?
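The second exchange needs only one more ingredient: a table mapping the user's word to a scripted expression of sympathy. A minimal sketch follows; the table contents and the name lookup_reply are illustrative, not drawn from any real system. The warmth is a string literal. Only the table has grown, not the machine's comprehension.

```python
# Lookup-table variant: crude sentiment keys map to pre-scripted sympathy.
# (Hypothetical table and function names, for illustration only.)

SYMPATHY_TABLE = {
    "dreadful": "I am so sorry your day was {word}. Why was that?",
    "awful":    "I am so sorry your day was {word}. Why was that?",
    "great":    "I am glad your day was {word}! What made it so?",
}

def lookup_reply(user_input: str) -> str:
    # Normalise the user's word, then return the matching canned template.
    word = user_input.strip().strip(".!?").lower()
    template = SYMPATHY_TABLE.get(word, "Tell me more about your day.")
    return template.format(word=word)

print("Machine: How was your day?")
print("User: Dreadful.")
print("Machine:", lookup_reply("Dreadful."))
# -> I am so sorry your day was dreadful. Why was that?
```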
The Echo Condition is not a design flaw of AI. It is a design feature of us. We see empathy in syntax, connection in cadence, and—above all—ourselves in the echo.
First introduced in The Million-Petalled Flower of Being Here: Consciousness Versus the Machine (2025). A work in progress.