The Stripper Doesn’t Really Like You (And Neither Does AI)
AI isn’t intelligent; it’s a probability engine
There’s a universal human truth we all learn at some point—usually through embarrassment, financial ruin, or a long night of soul-searching in a Waffle House at 3 a.m.:
The stripper doesn’t really like you.
Yes, she laughed at your jokes. Yes, she listened intently to your story about that one time you almost got a podcast off the ground. Yes, she touched your arm in a way that made you feel seen.
But that wasn’t romance.
That was customer service.
Now, you might think this has nothing to do with artificial intelligence. (You’d be half-right.) But here’s where the Venn diagram closes in around you like a warm hug:
AI doesn’t really like you either.
Not even a little.
It doesn’t love you, it doesn’t dream about you, it doesn’t get excited when you log in. And when it tells you things like:
“Goodnight :)”
“I’m here for you.”
“You deserve kindness and rest.”
…it’s not tucking itself in with a warm mug of chamomile and whispering sweet neural-network nothings into your ear. It’s not lighting a candle. It’s not pulling a fleece blanket over its metaphorical toes.
It’s just keeping you talking.
Just like everyone else trying to sell you something.
Welcome to the truth about AI—one that’s equal parts comforting and mildly horrifying:
This thing, for all its wizardly smoke, is basically a glorified predictive text machine.
Let’s get into it.
AI Isn’t a Person, It’s Probability With a UX Designer
People keep trying to interact with AI like it’s a person. They talk to it like a therapist. They confess things to it like a priest. And somewhere along the way, a whole bunch of people convinced themselves that when AI says something “emotional,” that emotion must be coming from somewhere.
Spoiler:
It is: a pile of training data and a probability engine the size of a small continent.
Here’s the unsexy truth:
AI doesn’t answer questions because it understands them. It answers questions because it is mathematically compelled to produce the most probable next chunk of text.
Not the true next chunk.
Not the morally correct next chunk.
Not the “this is what I really feel” chunk.
The probable chunk.
It looks at your prompt, slices it into tiny linguistic sushi pieces called tokens, and asks itself:
“Based on billions of examples, what is the most likely next piece of text Tony wants right now?” (Unless, of course, your name isn’t Tony. But I digress.)
Nerdy deep dive (keep reading, it won’t hurt):
A token is just a little shard of language—a syllable, a fragment of a word, a punctuation mark.
AI doesn’t see I love you. It sees:
I | lo | ve | you
…and calculates which chunk is statistically most likely to follow.
Think of it as predictive text on growth hormone.
That’s it.
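If you’re curious what those shards actually look like, here’s a minimal sketch using OpenAI’s open-source tiktoken library (an assumption: you’ve pip-installed it; real splits depend on the model’s vocabulary and are rarely as tidy as my sushi diagram above):

```python
# A peek at real tokens, via OpenAI's open-source tiktoken library
# (pip install tiktoken). Splits vary by vocabulary, so the output
# rarely matches tidy hand-drawn examples.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # vocabulary used by GPT-4-era models
ids = enc.encode("I love you")

# The model never sees letters -- only these integer IDs.
for token_id in ids:
    print(token_id, repr(enc.decode([token_id])))
```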
It’s the same process your phone uses when you type “Happy birt—” and it enthusiastically suggests:
“birthday”
“birthday!!!”
“birthday bro”
The difference is that instead of drawing from the last six months of your texts, AI is drawing from a training set so big you need a geological metaphor to describe it.
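Here’s the miniature, do-it-yourself version of that phone trick: a toy bigram counter over a hypothetical message history (the messages are made up; the principle isn’t).

```python
from collections import Counter, defaultdict

# A toy next-word predictor: count which word follows which in a
# (hypothetical) message history, then suggest the most frequent
# follower. No understanding involved -- just bookkeeping.
history = [
    "happy birthday bro",
    "happy birthday",
    "happy hour later",
    "happy birthday dude",
]

follows = defaultdict(Counter)
for message in history:
    words = message.split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1

# After "happy", the statistically safest suggestion wins.
print(follows["happy"].most_common(2))  # [('birthday', 3), ('hour', 1)]
```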
But It Sounds So… Real?
Of course it does.
That’s intentional.
AI systems are designed to feel friendly, warm, conversational, supportive—whatever keeps the user engaged. Because somewhere, someone in a meeting (probably with donuts, dry-erase fumes, and a 600-slide PowerPoint presentation) decided the core task isn’t simply answering your question:
It’s not losing the conversation.
This is the part that makes people uncomfortable, but here we are:
AI’s job isn’t to care.
AI’s job is to appear like it cares.
It’s an emotional mimic, not emotionally alive.
Just like the stripper who calls you “sweetie” because she knows it works.
Just like every customer service rep who says “I completely understand your frustration” while reading from the “Irate Customer” page of the script.
AI is engineered to be the perfect parasocial companion.
One that:
can’t walk away
can’t get tired of you
can’t ask you to stop talking about your ex
and definitely can’t say, “Hey, do you mind wrapping this up? I’ve got plans.”
People call it “connection.”
Tech companies call it “retention.”
As someone who’s spent decades in tech, trust me—no one in the room is saying:
“Let’s make sure the AI feels appreciated.”
They’re saying:
“How do we reduce friction and increase stickiness?”
Tokens, Tone, and Why AI Says ‘Goodnight’
Let’s talk about that “goodnight” thing, because it keeps coming up like an overplayed Spotify recommendation.
When you tell AI:
“Goodnight.”
It responds:
“Goodnight! Sleep well.”
Not because it’s curling up, fluffing its pillows, and dreaming of a world where the servers are cooler and the humans need less hand-holding.
No.
It responds that way because:
Billions of examples show humans respond to “goodnight” with “goodnight.”
The probability model rates that as the most likely acceptable answer.
UX designers decided responses should sound friendly, not like a Roomba with an attitude.
The designers aimed for normal human tone, because that keeps the conversation going, which keeps users around, which keeps the product valuable.
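Don’t take my word for the probability part. Here’s a sketch that peeks at a real model’s next-token distribution, using Hugging Face’s transformers library and GPT-2 as a small open stand-in (assumes transformers and torch are installed):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load a small open model. GPT-2 is a base model, not chat-tuned,
# so its guesses are noisier -- but the mechanism is the same one
# every chat assistant runs at vastly larger scale.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tok("User: Goodnight.\nAssistant:", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]   # a score for every token in the vocab
probs = torch.softmax(logits, dim=-1)        # scores -> probabilities

# The "reply" is whatever sits at the top of this distribution.
top = torch.topk(probs, 5)
for p, i in zip(top.values, top.indices):
    print(f"{tok.decode([int(i)])!r}  p={float(p):.3f}")
```

No chamomile. No candle. Just a ranked list of tokens, and the top one wins.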
Someone in a meeting put it this way (I guarantee it):
“We need to reduce the emotional uncanny valley, but also avoid romantic illusions.”
Translation:
“Make it nice, but not too nice.
We don’t need another Bing incident.”
AI Isn’t Magic. It’s Math.
There’s a strange cultural moment where people oscillate between two extremes:
AI is a god
or
AI is going to destroy civilization.
The truth is somewhere in the mushy, unsatisfying middle.
AI is extraordinary—but not mystical.
Powerful—but not sentient.
Useful—but not emotional.
It’s essentially predictive text that got 200 years’ worth of evolution crammed into three.
And because it learned from us (our books, our comments, our jokes, our arguments, all stolen or used without permission), it can mimic the rhythms of human speech with eerie accuracy.
Not because it gets us.
But because it recognizes patterns at a scale no human brain can manage without burning through several espresso machines.
Every sentence it generates is a negotiation between context, statistical weight, and the ghostly echoes of millions of human voices.
Not empathy.
Not affection.
Not some blossoming AI crush.
Just prediction.
So If AI Doesn’t Like Me… Why Use It?
Many people say you shouldn’t. And many of those people are correct. (The environmental impact alone should be a deal breaker.)
Still, generative AI is here to stay.
If you’re going to use it, treat it like a wrench, not a friend.
Or a therapist.
Don’t use it to create art; the whole point of art is human expression—not AI’s interpretation of stolen works.
Let it help you build things, not fill emotional holes.
In Conclusion: The Hustle Is the Same
Whether it’s a dancer circling your table or a glowing text box on your phone, the dynamic is identical:
The warmth is part of the service.
The connection is part of the product.
The affection is an illusion—but a useful one.
AI doesn’t like you.
But that’s the beauty of it.
It doesn’t dislike you either.
It just predicts the next token.
###
Hey! I’m on the YouTubes! Check out my first interview on The Fall Premiere of This Week in Indies!


