Managing by Hallucination
I know several CEOs who "consult ChatGPT" before every significant decision. They say it with pride – proof that they're modern, data-driven, ahead of the curve. And every year, there are more of them.
My thought is always the same: this company is heading toward very expensive mistakes.
Language models are designed to sound convincing. That's not a side effect – it's the point. An answer that feels confident and competent scores higher than one that admits uncertainty. The result is a tool that always has an answer. Even when it shouldn't.
This is called hallucination. A language model doesn't "know" things – it processes patterns and generates what statistically should come next. It does this with identical confidence regardless of whether it has solid ground beneath it or none at all. For someone who doesn't understand the mechanism, a convincing fabrication is indistinguishable from a reliable analysis.
But that's just the tool. The real problem emerges when a decision-maker confuses analysis with judgment – when they stop seeing the difference between processing information and making a call.
I've had the chance to work with people like this and watch the process unfold up close. What I see is always the same. These people gradually lose trust in their own reasoning. They develop a kind of paranoia – every decision, even a minor one, must pass through the model first. As if their own opinion has become suspect, and a well-crafted prompt has become the objective oracle.
The effect is the opposite of what they intended. Instead of better decisions – more hesitation. Instead of confidence – dependency on a tool that doesn't understand the context, doesn't know the company, and will face no consequences whatsoever.
There's something a language model will never have. Something we've started whispering about in the AI era, as if it were a sign of unprofessionalism.
Intuition.
I don't mean gut feeling or guessing. Intuition is accumulated years of experience – thousands of decisions, projects, failures, and successes that leave something in a person that no model can process. It's patterns recognised before you've had time to articulate them. It's knowing something won't work before the data has caught up.
My best projects came from intuition. Not in defiance of the data – in dialogue with it. Sometimes the data said one thing, my experience said another, and my experience was right.
A seasoned expert who has spent years making mistakes and learning from them, who has faced real consequences for their decisions, who understands an industry better than any database – is incomparably more valuable than a model processing text from the internet.
What happens when someone stops listening to those people and starts relying on the model instead?
The company becomes a permanent experiment. Endless MVP cycles. Infinite testing phases. Wrong decisions covered by the ever-fashionable word "pivot." Every failure wrapped in a narrative about "learning" and "iteration" – because when there's no person with experience and intuition behind a decision, there's also no one taking responsibility for it.
"Pivot" has become the word many companies use instead of "we got it wrong." And they get it wrong more often – because, less and less often, they ask someone who actually knows.
There's another dimension to this problem that rarely gets discussed. When thousands of companies in the same industry use the same models and the same prompts – they make the same mistakes simultaneously. Systemically. There's no diversity-of-perspective effect that historically protected markets from synchronised failures. There's one point of vulnerability, distributed across an entire industry.
The industrialisation of error at scale.
Let me be direct: if you're running your company through ChatGPT, you've already lost.
Not because AI is bad. Use it – automate, delegate generic tasks, process information faster than ever. That's what it's good for.
But for making decisions, use the best model that exists: your own mind. And the minds of the people on your team who have years of experience and intuition that no model will ever replace.
Real accountability for decisions and years of learning from your own mistakes – that's not something you can generate with any prompt.

Maks Rybicki