February 6, 2026

Affection Comes Before Understanding

We feel our way into systems long before we understand them.

There is a sound your phone makes that you are never meant to hear. It’s not a notification, not feedback, not an error tone. It has no brand voice and no UX intention. It is residue, electromagnetic leakage, computational friction; the byproduct of constant negotiation between silicon, spectrum, and instruction. Most of the time it remains submerged, disciplined, smoothed over by interfaces designed to feel inevitable, friendly, and complete. In 2011, I began listening to that sound.

At the time, I understood the work that would become the Affection Research Lab as an experiment in animism, noise, and human–computer interaction. Smartphones were already intimate objects, but they were not yet total ones. Platforms still spoke the language of connection. Artificial intelligence had not yet learned how to speak back in full sentences, let alone position itself as collaborator, tutor, therapist, or proxy decision‑maker. Design culture still believed, earnestly, that better intentions might eventually lead to better systems. What I could not yet articulate, but can now, is that the project was never really about smartphones. It was about orientation: how we sense whether a system belongs in our lives before we can name what it is doing to us.

Affection Is Not Empathy (It Comes Earlier)

Affection is not care. It is not trust. It is not delight. Affection operates earlier than all of that. It is a pre‑cognitive alignment; a bodily, temporal sense of whether something is with us or working against us long before we have the language, or the evidence, to justify that feeling. We often decide whether to accept a system before we decide whether to critique it. Affection precedes ethics. It precedes explanation. It precedes belief.

This matters now because contemporary computational systems have become exceptionally good at performing empathy.1 They listen, reflect, soften their tone, and mirror our concerns back to us with remarkable fluency. But systems do not merely assist human action. As decades of work in science and technology studies have shown, they reorganize responsibility. They redistribute agency. They quietly decide who or what gets to count. The warmth of an interface tells us very little about the politics of the system beneath it.

Affection Research Lab lives in that gap: the space where systems feel close—even intimate—while operating at distances too large to sense directly.

Noise, or What the Interface Smooths Over

The early Affection Stations strove to remove the interface almost entirely.2 Participants placed their phones on crude sensing devices—antennas, coils, electromagnetic pickups—and listened as mundane actions produced unexpected sound. Typing barked. Scrolling hissed. Network activity surged, then collapsed into rhythm and chaos. People reacted in ways that were remarkably consistent. They anthropomorphized their devices almost immediately. They worried they were hurting them. They slowed down. Some stopped mid‑task, not because they were instructed to, but because continuing felt wrong.

From the perspective of usability or efficiency, this was a breakdown. From the perspective of affection, it was a recalibration. Noise acted as a counter‑interface, not a new layer of control, but a rupture in legibility. It revealed that our sense of ease with digital systems is neither natural nor inevitable; it is something we have been carefully trained into. Once that smoothness fractured, something older rushed in to fill the gap: superstition, story, hesitation, care. Myth. The goal was never to make the system better behaved. It was to make our relationship to it less settled.

The Post‑Mythical Object (Again)

In the original research, I described smartphones as post‑mythical objects:3 devices layered with myths of efficiency, neutrality, and inevitability, stacked atop animistic tendencies we never actually lost. We named them. We blamed them. We pleaded with them. We trusted them more than we admitted. That framing still holds, but it no longer goes far enough.

Today’s systems are not just post‑mythical; they are post‑relational. They stretch across labor markets, borders, datasets, and enforcement regimes simultaneously. They mediate access to education, housing, work, and mobility. They participate quietly in extraction and control while presenting themselves as personal, empowering, and benign.

Design culture is not innocent here. Designers are often caught in a bind: highly capable of critique, deeply fluent in theory, yet structurally embedded in systems of precarity, optimization, and self‑exploitation. We know how to name what design cannot do. We are less practiced at lingering in that inability, without rushing toward a solution‑shaped exit. Affection Research Lab does not attempt to resolve this tension. It stays with it.

Why This Work Exists Now

We are living through a moment where AI systems are becoming compulsory intermediaries to education, to work, to governance itself. Cities are increasingly sensed, predicted, and modulated rather than simply governed. Design pedagogy is asked to remain optimistic inside institutions that are withdrawing care even as they demand innovation.

There is no shortage of critique. But visibility alone does not reorient desire. Knowing how a system manipulates attention does not necessarily loosen its grip on our lives.

Affection Research Lab operates one step earlier than critique. It asks what happens before understanding, before outrage, before reform. It asks how systems feel as they enter our lives—and how that feeling has been engineered long before we are invited to consent.

Affection as Literacy

AI literacy is often framed as understanding how systems work. Affection literacy is understanding how systems register in us before we understand them—how alignment is cultivated, how friction is removed, how inevitability is staged. It is the ability to notice when something feels off even when everything appears to be functioning perfectly. In this sense, affection becomes a form of critical infrastructure. It teaches hesitation in a culture of acceleration. It creates space for refusal without requiring certainty. It privileges orientation over optimization, sensing over mastery. Affection does not tell us what to do. It tells us when something is wrong before we can prove it.

A Slow Signal, Intentionally

Affection Research Lab is a slow imprint because speed has become a form of compliance. This is not a feed. Not a platform. Not a toolkit. It is a signal archive—conceptual, sonic, pedagogical—transmitting across time. Some texts will lean theoretical, others political, others closer to field notes or warnings. All of them resist the demand to resolve complexity on schedule. “Affection Comes Before Understanding” exists to say this plainly: The future will not be decided by smarter systems alone, but by whether we learn to orient ourselves toward them differently. Affection is not resistance. It is not refusal. It is orientation. And once orientation shifts, everything else tends to follow.