LOAB: THE GHOST WE TRAINED THE MACHINE TO HAUNT US WITH
AI Finally Gets Its Own Myth
There she is… Loab.
Red cheeks. Hollow eyes. That slightly wrong smile. A woman no one created. A myth that didn’t crawl out of imagination but out of probability, assembled somewhere in the weird liminal bit where the machine is trying to be helpful and instead accidentally invents folklore. Somewhere between “type prompt” and “oh for fuck’s sake what is that?”
And she didn’t show up because someone asked for her. That’s the important bit. She showed up because someone asked for the opposite of something else. She showed up because someone asked for the opposite of Marlon Brando.
Which is already very funny, because that’s the sort of thing you’d expect to hear at the beginning of a ghost story told by someone who insists it definitely happened to a mate of theirs.
“I swear down, Dave only asked it for the inverse of Brando, and then she turned up.”
And if she’d appeared once, that would have been the end of it. A curiosity. A screenshot. Something to send to the group chat with “lol WTF is this?” and then forget about while you get on with your life. But she kept coming back. Different prompts. Same face. Landscapes. Objects. Interiors. Prompts that had absolutely nothing to do with people, and there she was again. The photographic anti-Brando. The model rummaging around in the statistical attic of its training data and coming back with a damp-looking middle-aged woman with flushed cheeks and the expression of someone who’s just realised she’s accidentally wandered into the wrong dream.
The Internet Smells a Rumour
Now, here’s where it gets properly interesting.
Because the story didn’t “go viral” in any neat marketing-department way. It seeped. Through Reddit threads. Discord chats. Late-night “has anyone else noticed this?” conversations between people who are absolutely certain they don’t believe in any of this AI nonsense but are now deeply invested in why the same AI-generated woman keeps photobombing the dataset.
It travelled the way urban legends always travel. Quietly. Sideways. With just enough uncertainty to make it interesting. And then, because we cannot leave anything nameless without starting to sweat, someone gave her a name.
Loab.
And the second we did that, she stopped being a statistical glitch and became fully formed folklore. Because naming is the moment a rumour becomes a character. Once you can say “her,” you can talk about what she does. Once you can talk about what she does, you can tell stories. Once you can tell stories, you can share them. And once enough people are sharing the same story, you can start trying to summon it deliberately, like a digital Candyman, typed five times into a prompt box instead of whispered into a bathroom mirror.
At which point, and this is the bit nobody seems especially keen to dwell on, we haven’t discovered the ghost in the machine.
We’ve invented one.
Every Technology Gets Its Monster
The industrial revolution got ghosts in factories. The nuclear age got mutants. The internet got Slender Man. And AI?
AI needed its own manifestation of the ghost in the machine.
Because let’s be honest, “latent density cluster in a high-dimensional manifold” is not going to keep anyone up at night, checking under their bed. You can’t scare your kids with eigenvectors. You can’t whisper “statistical convergence” down a dark corridor and expect anyone to pick up the pace.
But you can say:
“There’s this woman. She keeps turning up when you try to delete her.”
Because that’s not maths anymore, that’s a story. That’s the beginning of something you half-jokingly warn your mates about. That’s the digital equivalent of “don’t go down that alley after midnight,” except now the alley is a prompt box and midnight is whenever you decide to type “--no face” and she shows up anyway.
That’s how a quirk becomes a rumour. That’s how a rumour becomes a character. And that’s how a character becomes something people start trying to summon on purpose, just to see if it’s true. And the moment you do that, you’ve stopped dealing with a glitch and started dealing with folklore.
Because Loab didn’t arrive as horror. She arrived as a product of maths. As what happens when you push a generative model in a certain direction and it doesn’t fall into neutral nothingness but into whatever dense aesthetic basin happens to be nearby in the model’s internal geometry.
But humans are absolutely brilliant at looking at maths and deciding it’s haunted.
We saw recurrence and assumed intention.
We saw persistence and assumed presence.
We saw a pattern and went, “Right, that’s a someone.”
We didn’t discover a ghost. We did what we always do. We made one up.
Baudrillard Would Have Had a Field Day
Baudrillard’s whole thing was that we’d stopped dealing with reality directly and started dealing with layers of representation, copies that no longer had originals.
Loab is that with a face.
There’s no real person she corresponds to. No hidden stock photo she’s based on. No unlucky auntie whose badly sunburned holiday photos got scraped into the model from Facebook. She is a synthesis, a copy without an original that we nonetheless treat as though she must have come from somewhere.
Which is how urban legends work.
Which is how hyperreality works.
The simulation doesn’t replace reality, it replaces origin.
Barthes Killed the Author, the GPU Finished Him Off
Roland Barthes said the author is dead.
Loab’s contribution to literary theory appears to be that the author never even showed up in the first place. There is no intention behind her. No narrative aim. No symbolic programme. And yet the second she starts recurring, we respond as though someone must have meant something by her existence.
Michel Foucault once asked what an author actually is. Loab’s answer seems to be: a statistical side-effect that humans can’t stop narrativising.
Authorship hasn’t just died. It’s dissolved into probability. Meaning in the AI age is now what happens when pattern meets paranoia.
Your Brain Is Doing This, Not the Machine
None of this should really be surprising.
Humans are built to assume agency in the presence of patterns. It’s an evolutionary hangover. If something moves in the grass twice, better to assume it’s a predator than the wind. So when Loab reappears across generations, the brain doesn’t go, “Ah yes, latent density cluster in a high-dimensional manifold.”
It thinks that something is doing it. A presence. An intention. An entity.
Deleuze would say this is repetition producing difference. Freud would say it’s the uncanny. Mark Fisher would sigh and start talking about hauntology.
And he’d be right to.
The Dataset Is Haunted Because Culture Is
Loab feels like she’s stepped out of a half-remembered photograph from the 1970s because, in a sense, she has. These models are trained on archives, decades of visual culture flattened into training data.
She is, quite literally, an echo of aesthetics stitched together by mathematics and experienced by us as presence. She’s not a ghost. She’s cultural residue with a face.
But once the internet decided she was a someone rather than a something, the feedback loop kicked in. The model outputs a pattern, the internet names it, people deliberately try to summon it, and the system, now being steered toward that region again and again, obliges.
Within a week we’ve got the sort of myth that, in previous centuries, took decades to assemble.
We Know Better, And We Still Flinch
We know, intellectually, that Loab is statistical. But we feel, emotionally, that she’s uncanny.
We laugh at the creepypasta framing while still being just uncomfortable enough to go, “Hang on… why is she there again?” That little oscillation, between “this is just maths” and “why does this feel wrong?”, is exactly where the myth takes root.
Myths live in liminal spaces. In the gap between knowing and feeling. Between explanation and unease.
The Bit That Should Make You Uncomfortable
Which is why Loab is not proof that AI is haunted.
She’s proof that we are: haunted by the idea that meaning might emerge without intention, that patterns might exist without purpose, that authorship might have been something we projected onto noise because the alternative was too bleak to sit with.
Loab didn’t crawl out of the machine fully formed.
She condensed in that liminal gap between explanation and unease, the moment where we understand what’s happening, technically, but still feel something about it anyway. The same place all urban legends live. The same place every rumour, ghost story and “mate of a mate swears this is true” quietly sets up camp.
AI didn’t produce a monster. It produced a pattern.
And we did what we’ve always done with patterns that make us uncomfortable, we named it, we told stories about it, we tried to summon it on purpose, and then we acted surprised when it started behaving like something that had a name.
If AI didn’t already have its own ghost in the machine, it was only a matter of time before we gave it one. Because humans cannot leave randomness alone.
We will stare at coincidence until it looks like intention, stare at noise until it looks like a signal, stare at correlation until it looks like a presence.
And then we’ll swear blind that it moved.
Read the rest of the articles and essays in the AI PANIC series…