Natural hallucinations
AI gets hammered for hallucinating.
Fair enough.
But it’s strange how rarely we admit the obvious: humans do it constantly.
The brain fills gaps. It confabulates.
It invents patterns where none exist.
It misremembers with confidence. It builds stories to protect identity, tribe and belief.
Those stories can be harmless, helpful or deeply destructive.
Neuroscience has known this for decades.
Memory is a reconstruction.
Perception is an interpretation.
Belief is a social technology (thanks to Krista Tippett for this idea) that often survives long after its evidence is gone.
Humans are the original hallucination engines.
So when we criticise AI for drifting from the facts, the wiser move is to look at what this exposes in us.
The risk is not that machines are untrustworthy. It is that they mirror a species that has always preferred a tidy story over a messy truth.
For leaders, that matters.
Future literacy means learning to spot where our own minds bend reality before we judge the machines for doing the same.
#scottspeaks #futureliteracy #leadership #AI #cognition #decisionmaking