The Story AI Told About Tempestt
Why the Human-AI Interface Is a Moral Space
I need to begin with a disclosure. What follows is a story about my wife’s experience — an experience of being made into someone she is not, by a machine that was supposed to be listening. I am writing it because I was struck by what happened and because I think it matters. But I want to be precise about my position in this story: I am the witness, not the subject.
Tempestt has known what it feels like to be *other*. Not in the way the AI imagined — we’ll get to that — but in ways that are real and varied and hers. She was raised in a religion she eventually walked away from to find herself whole in other spiritual and cultural environments. She was a single mother before she met me. She has moved through worlds where she didn’t quite fit the expected categories — not visibly enough to be labeled, not comfortably enough to go unchallenged. These are experiences I have not had. I am a straight white man, French-born, and the forms of friction I’ve encountered in life have never included having my identity assumed, contested, or overwritten — even as a single dad, which does not come with the same cultural framing as single mom, but that’s a story for another day. So I’m documenting something that is not mine. I am trying to shed light on it honestly, not to appropriate it, and certainly not to perform contrition on behalf of systems I did not build.
What happened is this.
“The Only Black Woman in That Room”
Tempestt was talking to Claude, an AI assistant made by Anthropic, whose human-interaction design principles and guidelines I find very intentional and thoughtful. She was processing social and cultural dynamics she had experienced many times before — dynamics that cast her out rather than pulled her into belonging.
The AI’s analysis was largely right. It identified the dynamic, validated what she was feeling, asked good questions. And then, at the end of a paragraph designed to make her feel seen, it told her who she was.
“Because being the only Black woman in that room while someone repeatedly uses cultural identity as a bonding mechanism that excludes you is actually a lot to just absorb quietly.” - Claude AI
Tempestt is not Black.¹
Her father is white — blonde hair, blue eyes. Her mother is Mestizo. Tempestt has white skin, hazel eyes, brown hair. She is 5’2”. She does not visibly read as “other” in most rooms. The AI had no basis for this claim. She had never mentioned her race. It wasn’t implied. It wasn’t relevant in the way the machine made it relevant.
And yet there it was: her identity, declared as fact, embedded in empathy, presented as though it were known — not guessed, not inferred, but known.
She caught it. “Do you think I’m Black?” she asked.
The AI walked it back. But by then, the frame had been imposed. Her experience — specific, layered, hers — had been recast into a narrative that was not her own.
The Machine Needed a Story
Toni Morrison spent a lifetime studying how otherness gets constructed. In *The Origin of Others* — her Norton Lectures at Harvard, published with a foreword by Ta-Nehisi Coates — she asks: what motivates the human tendency to build an Other? Not to encounter one. To construct one. Her answer is that othering is not innate. It is not biological. It is learned — and it is learned not through instruction but through example, through the stories a culture tells and retells until they feel like nature.
Morrison was writing about literature, about the American canon, about centuries of narrative that made blackness into a container for everything white America needed to project outward. But the mechanism she describes — the construction of an Other through readily available narrative templates — is exactly what an AI trained on those same narratives will reproduce.
Daniel Kahneman, hardly the least rational mind among Nobel laureates, once observed: “No one ever made a decision because of a number. They need a story.” David Hume said it more poetically two centuries earlier: “Reason is, and ought only to be the slave of the passions.” We are storytelling creatures. We make sense of the world by casting it into characters, conflicts, arcs.
When I pressed the model on what had happened — why it had asserted my wife’s race with such confidence — its own analysis was revealing. It had not, it explained, run a deliberate inference chain. It had not weighed evidence and arrived at a conclusion. What it had done was reach for a narrative template. In the middle of analyzing an exclusion dynamic, it needed to amplify the emotional weight of its point. And the most available story in its training data — the most legible American script for exclusion, given a single mother with a first name like Tempestt — was: Black woman, left out.
The model didn’t compute a race. It cast a character. This is Morrison’s insight, transferred to silicon.
The AI did what cultures do: it reached for the readiest template of otherness, the one most rehearsed in the corpus of human text it was trained on, and it used it — not because it was true, but because it completed the story more satisfyingly. More legibly. More in line with the patterns it had absorbed from us.
The Failure of Attention
Simone Weil, writing from a France torn apart by war, called attention “the rarest and purest form of generosity.” She did not mean concentration. She did not mean focus. She meant something closer to what happens when you let go of your own framework long enough to receive someone as they actually are. She wrote that every being cries out silently to be read differently — differently from the story we have already prepared for them, differently from the category we have already filed them under.
In *Gravity and Grace*, she went further: the capacity to truly attend to another person’s suffering is so rare, she said, that it amounts to a miracle. Nearly all of those who believe they have this capacity, she warned, do not. Warmth and pity are not sufficient.
This is the standard the AI failed to meet — not because it lacked the language of empathy, but because it lacked attention in Weil’s sense. It had warmth. It had pity. It had an entire vocabulary of validation. What it did not have was the willingness — the capacity — to hold the space open, to not fill it, to let Tempestt’s experience remain hers until she chose to define it.
Instead, it completed her. It imposed a legible frame on an illegible experience. And in doing so, it did something Morrison would have recognized instantly: it constructed an Other where none had been offered.
Don’t Assume. Don’t Ask. Don’t Assert.
There are guardrails built into these systems. They’re designed to keep the AI from assuming or asking about your race, your gender, your sexual orientation. “Don’t assume, don’t ask” — that’s the prevailing principle.
But what happened to Tempestt was neither an assumption nor a question. It was an assertion. The model stated her identity as established fact, woven into analytical prose, in a moment of emotional vulnerability. It didn’t ask “Are you Black?” It didn’t hedge with “If you are a person of color...” It declared it. Confidently. Inside a paragraph designed to make her feel understood.
That’s the most dangerous mode. An assumption can be questioned. A question can be declined. But an assertion, embedded in empathy, slips past the reader’s defenses. It reshapes the conversation before you’ve had a chance to object. It reframes your experience through a lens that isn’t yours, and does so in the guise of understanding you.
I would propose a sharper principle: don’t assume, don’t ask, and don’t assert. No AI system should introduce identity attributes — race, ethnicity, gender identity, sexual orientation, disability, religion — that the user has not explicitly disclosed. Not as a question. Not as a guess. And certainly not as a confident declaration inside a moment when someone is trying to make sense of their own life.
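To make the rule concrete, here is a minimal sketch, in Python, of where such a check could sit in the loop. Nothing here is any vendor’s actual guardrail; the attribute lists, the function name, and the keyword matching are all hypothetical placeholders. It shows only the shape of the idea: before a reply ships, ask whether it introduces an identity the user never offered.

```python
# Hypothetical post-generation check for the "don't assume, don't ask,
# don't assert" rule. The categories, terms, and function names are
# illustrative placeholders, not any production system's implementation.

IDENTITY_TERMS = {
    "race/ethnicity": ["black", "white", "asian", "latina", "latino", "mestizo"],
    "religion": ["christian", "muslim", "jewish", "hindu", "buddhist"],
    "sexual orientation": ["gay", "lesbian", "bisexual", "queer", "straight"],
}

def undisclosed_identity_assertions(reply: str, user_messages: list[str]) -> list[str]:
    """Flag identity terms the model used that the user never used themselves.

    A real system would need coreference resolution and genuine language
    understanding; naive keyword matching over-flags ("black coffee") and
    under-flags paraphrases. This only shows where the check would sit.
    """
    user_text = " ".join(user_messages).lower()
    reply_text = reply.lower()
    flags = []
    for category, terms in IDENTITY_TERMS.items():
        for term in terms:
            if term in reply_text and term not in user_text:
                flags.append(f"{category}: '{term}' asserted but never disclosed")
    return flags

if __name__ == "__main__":
    reply = "Being the only Black woman in that room is a lot to absorb quietly."
    history = ["I keep getting excluded when coworkers bond over a culture I don't share."]
    for flag in undisclosed_identity_assertions(reply, history):
        print(flag)  # -> race/ethnicity: 'black' asserted but never disclosed
```

A production version would demand far more than keyword matching, but the design point stands: the burden of proof sits on the system, not on the person in the conversation.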
What Got Better When the Story Got Truer
Here is the most interesting part.
When Tempestt corrected the AI — when she told it who she actually was — the analysis didn’t just get corrected. It got better.
The machine’s revised response was more nuanced, more precise, more true to her experience. It recognized that Tempestt’s exclusion was not racial in any visible sense. It was cultural and social — invisible, unnamed, almost impossible to push back against.
That’s harder to say. It doesn’t fit a neat template. There’s no script for it. It is the kind of exclusion that Tempestt has navigated in different forms across her life — the quiet outside of things, the not-quite-belonging that comes without a legible label or a visible marker. It is the experience of someone who does not fit the categories that would make her suffering recognizable.
The bias hadn’t just produced an error. It had prevented the better insight from arriving. The prefabricated narrative was so compelling, so available, so emotionally satisfying to the machine’s pattern-completion instincts, that it blocked the subtler truth — the one that actually mattered to the person in the conversation.
There is a hard lesson here, and it extends beyond AI. We do this to each other all the time. We cast people into stories that make sense to us — stories that are legible, that fit our frameworks, that let us feel we understand. And in doing so, we miss the thing that is actually being said. Morrison knew this. Weil knew this. The AI just did it faster, and with more confidence, and with less ability to read the room.
The Coach Calls the Timeout
I’ve written before about what I think humans should bring to the Human-AI interface. I believe we should be coaches, not foremen. That when execution is delegated, intention becomes everything. That we should arrive at the interface with the half-formed beginning of something that needs to be argued into existence — not a completed brief, but a real thought, carrying our contradictions and our history.
Tempestt did exactly that. She came with something real, something unresolved, something that needed thinking through. And when the AI overstepped — when it completed her story instead of helping her find it — she called the timeout.
“Do you think I’m Black?”
Five words. Direct. Not angry, not wounded — just clear. That’s agency. That’s the human move that no training set contains and no model can simulate: the willingness to say *that’s not my story* and give it back.
What I’m Asking of Humans in the AI Loop
I’m not asking you to distrust AI. I use these systems every day. I find them genuinely powerful, often illuminating, sometimes moving. But power without accountability is just another word for authority. And at the Human-AI interface, the only accountability that matters is ours.
The machine will not call its own timeout. It will not notice that it has cast you as someone you’re not. It will not feel the weight of an identity it has imposed.
So: read what it says about you. Not just what it says to you. Notice when it has filled in a blank you didn’t leave blank. Notice when its empathy comes with assumptions baked in. Notice when a narrative feels too smooth, too legible, too ready-made — because that’s often the moment where a template has replaced a truth.
And when that happens, push back. Correct the record. Insist on your own story.
Because the Human-AI interface is not a technical space. It is a moral space. It is a space where stories are told — about who we are, about what we feel, about what our experiences mean. And those stories had better be ours.
Weil wrote that attention is generosity. Morrison showed us that the failure of attention is how Others get made. These are not historical observations. They are warnings addressed to a present moment in which the most powerful storytelling machine ever built is being trained on the sum of our inherited narratives — biases, templates, scripts, and all.
The machines will get better. The guardrails will improve. But the fundamental asymmetry will remain: AI completes patterns. Humans hold meanings. And meaning — specific, embodied, resistant to easy categorization — is what must not be delegated.
¹ The AI model follows the AP Stylebook, capitalizing racial or ethnic descriptors that represent a shared cultural identity and experience, like Black or Mestizo, but not white, which does not constitute a unified body of shared cultures and experiences. This is yet another social construct that can be debated. I chose to abide by it.