Your AI project is not failing because of the technology.
AI amplifies what it finds.
Clarity and capable teams: it accelerates. Organizational disorder: it exposes.
Ten words. The whole problem.
You read that and you thought: we are probably in the first category. Good people. Experienced management. Processes that work, mostly.
You are not wrong. But look closer.
In every company I have gone inside, the same thing happens around week three.
The CEO tells me: the processes are documented. We have SOPs. We are more organized than most companies our size.
Then we start mapping the actual workflow. Not the documented one. The real one.
The spreadsheet three people maintain in three different formats. The customer records built for one purpose and repurposed six times. The process the senior analyst runs in her head because writing it down takes too long and she knows it by heart.
The CEO goes quiet.
The gap between the documented version and the real version is always bigger than expected. That is what makes him go quiet.
AI runs on the real version.
In practice: data that was never meant to be used at scale. Records that are good enough for humans to navigate because everyone knows to call Claire when the version in the system is wrong. Processes that live entirely in the head of the person who has been there nine years. Decisions routing through one person because nobody ever built a system to handle them.
And then the invisible things. The output of one team feeding three other processes nobody formally connected. The Monday report someone generates manually because the automation broke two years ago and the workaround became permanent.
None of this is unusual. It is how most functioning companies actually work.
The question is what happens when you build AI on top of it.
What happens is amplification. The data problems manageable at human scale become blockers at AI scale. The undocumented process the senior analyst ran perfectly for nine years produces garbage when the AI tries to follow it without her. The informal dependencies everyone understood collapse when a system tries to navigate them formally.
Here is the other side.
The companies that went in honest, that looked at what they actually had before they tried to build, found something useful: the problems were fixable. Not all of them. The ones that mattered for the first use case. They fixed those. Built the foundation. Deployed on top of something solid.
Then the AI did what AI does when it has something real to work with. It accelerated.
That is the version you want.
What does AI find when it enters your company?
Worth knowing now. Before the implementation starts. Before the vendor is already in the room. An honest look at what you actually have is not a delay. It is the work that determines whether the project succeeds.