
Vendors bake hidden assumptions into their AI ROI numbers. Here's how to spot the flaws and rerun the math honestly before you spend a dollar.
The Number on the Slide Looked Great. Then You Bought It.
The demo went well. The vendor showed you a calculator — plug in your headcount, your average salary, the hours your team spends on the task. Out pops a number. "$340,000 in annual savings." Maybe it was less. Maybe it was more. Either way, it felt real because it came with a spreadsheet.
Six months later, you're looking at the actual results and wondering what happened.
You're not alone, and you're not bad at math. The problem is the inputs were wrong before you ever touched the calculator. Vendors build ROI models to close deals, not to survive contact with your actual business. This article shows you exactly where those models break — and how to build a number you can actually defend.
Why This Is Urgent Right Now
Something shifted in the last 12 months that makes this more than an abstract finance problem.
AI tools moved from pilot-project territory into operating budgets. Companies are no longer testing — they're committing. According to McKinsey's 2024 State of AI report, more than 70% of organizations reported using AI in at least one business function, up from about 55% the prior year. That sounds like momentum. What it actually means is that a lot of businesses signed contracts before they had a model for measuring results.
At the same time, vendors got better at selling. The ROI calculators got shinier. The case studies got more specific. The numbers got bigger and more confident. And the pressure you feel — from competitors, from your own team, from every conference session and LinkedIn post — pushes you toward a decision faster than the evidence warrants.
The businesses that are winning right now aren't the ones who moved fastest. They're the ones who asked sharper questions before they signed. The gap between a $40,000 AI investment that pays off and one that quietly drains your budget for two years often comes down to three or four flawed assumptions that nobody flagged at the proposal stage.
You can flag them. Here's how.
The Five Things You Need to Know
1. Vendors measure task time, not work time — and those aren't the same thing.
The concept: When a vendor says AI will save your team 10 hours a week, they mean 10 hours of a specific task — not 10 hours you can actually redirect or cut from payroll.
Why it matters: Work doesn't compress that neatly. If your customer service rep spends 30% of their day writing responses and an AI tool cuts that by half, you haven't freed up 15% of their salary. You've freed up 15% of their attention, which usually fills with other small tasks within a week. Gartner calls this the "productivity absorption" problem, and it's one of the most consistent gaps between projected and realized AI savings.
A regional insurance brokerage adopted an AI drafting tool for policy summaries. The vendor projected 12 hours of weekly savings across their three-person team. Actual measured time savings after 90 days: about 8 hours. Redirected toward revenue-generating work: roughly 3 hours. The rest absorbed into email, meetings, and administrative catch-up.
Rule of thumb this week: Take the vendor's time-savings estimate and apply a 30–40% "absorption discount" before you calculate any dollar value. That's a more honest starting point (estimate based on the pattern seen consistently across SMB AI deployments and documented in Gartner's productivity research).
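To make the discount concrete, here's the arithmetic with made-up inputs. The 10-hour weekly claim and the $45 fully-loaded rate are placeholders, not numbers from any real proposal:

```python
# Apply an absorption discount to a vendor's time-savings claim.
# All inputs below are hypothetical illustrations.

vendor_hours_saved_per_week = 10.0   # the vendor's claim
fully_loaded_hourly_rate = 45.0      # salary + benefits + overhead, per hour
absorption_discount = 0.35           # midpoint of the 30-40% range

usable_hours = vendor_hours_saved_per_week * (1 - absorption_discount)
annual_value = usable_hours * fully_loaded_hourly_rate * 52  # 52 weeks

print(f"Usable hours/week: {usable_hours:.1f}")       # 6.5, not 10
print(f"Honest annual value: ${annual_value:,.0f}")   # $15,210
```

A 10-hour claim becomes 6.5 usable hours, and the dollar figure you take into your budget meeting shrinks accordingly. That is the honest starting point, before oversight labor and implementation costs come off the top.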
2. The "fully-loaded cost" in their model is probably wrong.
The concept: Vendors calculate your savings against salary, but the real cost of any workflow includes the tools, oversight, corrections, and management time that surround it.
Why it matters: When you replace a task with AI, you don't eliminate all the surrounding labor. Someone still has to review outputs, catch errors, update the model's instructions when your business changes, and troubleshoot when it breaks. That oversight cost is almost never included in a vendor's calculator — and in the early months, it can run surprisingly high.
A mid-size e-commerce company implemented an AI tool to generate product descriptions at scale. Vendor ROI model assumed near-zero ongoing labor after setup. Reality: a content coordinator spent about six hours per week reviewing and correcting AI outputs for brand consistency. At her fully-loaded hourly cost, that was roughly $18,000 per year in unmodeled expense against a tool that cost $24,000 annually.
Rule of thumb this week: Add a line item for "AI oversight labor" — budget at minimum two to four hours per week of a real employee's time per AI tool you deploy, especially in the first six months. Adjust from there based on actual error rates.
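Here's that line item as arithmetic, using hours and a rate roughly matching the e-commerce example above (both figures are illustrative, not a quote from any real deployment):

```python
# Oversight labor as an explicit ROI line item.
# Hours and rate loosely mirror the e-commerce example; treat them as placeholders.

oversight_hours_per_week = 6.0
fully_loaded_hourly_rate = 58.0      # approximate fully-loaded cost per hour
annual_tool_cost = 24_000.0

annual_oversight_cost = oversight_hours_per_week * fully_loaded_hourly_rate * 52
true_annual_cost = annual_tool_cost + annual_oversight_cost

print(f"Oversight labor/year: ${annual_oversight_cost:,.0f}")  # ~$18,100
print(f"True annual cost:     ${true_annual_cost:,.0f}")       # ~$42,100
```

The tool's sticker price was $24,000; the tool's real price was closer to $42,000. Any ROI model that ignores the second number is measuring the wrong investment.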
3. The baseline they're comparing against is probably your best day, not your average day.
The concept: ROI projections almost always measure AI performance against an idealized version of your current process, not what actually happens on a normal Tuesday.
Why it matters: Vendors pull benchmark data from structured pilots or controlled demos. Your real operation has exceptions, legacy systems, staff turnover, and bad data. When AI hits those conditions — and it will — performance degrades. If the ROI model assumed clean inputs and consistent workflows, the model is comparing a real tool to a fictional baseline.
A manufacturing distributor was shown an AI demand-forecasting tool benchmarked against a competitor's 94% inventory accuracy rate. Their own baseline, once measured honestly, was closer to 79%. The AI got them to 87% — genuinely valuable — but half the expected ROI evaporated because the starting point in the vendor model was wrong.
Rule of thumb this week: Before you accept any ROI comparison, write down your actual current performance on the metric they're promising to improve. Measure it for two weeks if you don't already track it. That's your real baseline.
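The distributor example above reduces to three numbers, and the damage a wrong baseline does is easy to see once you lay them out:

```python
# How a wrong baseline distorts expected ROI, using the distributor example.
vendor_benchmark = 0.94   # competitor's accuracy rate used in the pitch
honest_baseline = 0.79    # the distributor's actual measured accuracy
achieved = 0.87           # accuracy after deployment

promised_gain = vendor_benchmark - honest_baseline   # what the model implied
actual_gain = achieved - honest_baseline             # what actually happened

print(f"Promised improvement: {promised_gain:.0%}")            # 15%
print(f"Actual improvement:   {actual_gain:.0%}")              # 8%
print(f"Share of promise delivered: {actual_gain / promised_gain:.0%}")
```

An 8-point gain against a 15-point promise: a genuinely useful tool, and roughly half the ROI the business case was built on. The tool didn't fail; the baseline did.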
4. One-time savings are being sold to you as recurring savings.
The concept: Many AI tools deliver a one-time efficiency gain as teams adapt — then performance plateaus, while you keep paying the subscription.
Why it matters: The year-one ROI on an AI tool can look compelling because it captures the initial acceleration effect. Year two and three often look much flatter, especially for tools that automate repetitive tasks. Your team adapts, the easy wins are captured, and the marginal value of the tool decreases — but the license fee doesn't.
A professional services firm adopted an AI meeting-summary tool. First quarter: measurable time savings and strong team adoption. By month eight, the team had developed new habits around meetings — shorter agendas, fewer participants — and the tool's marginal value dropped. The firm was still paying full price for a benefit that was now largely baked into behavior rather than the software.
Rule of thumb this week: Ask the vendor to show you their customer retention data by year — specifically, what percentage of customers renew after year two and at what usage level. Low renewal rates or declining year-two usage are signals that the recurring ROI story doesn't hold up.
5. They're not counting your switching costs, integration costs, or failure costs.
The concept: The ROI model starts at "tool works perfectly" — it doesn't include what happens before that point or what it costs if it doesn't.
Why it matters: Implementation takes time. Integration with your existing systems takes longer. Staff training takes longer than that. And if the tool doesn't deliver, unwinding it has a cost too — in staff time, in reset workflows, in the credibility you spent internally to get people on board. None of this appears in a standard vendor ROI model.
According to a 2023 KPMG survey on enterprise technology adoption, integration and change management costs averaged 30–40% of total project cost for mid-market technology deployments — and AI tools don't escape that pattern. For a $30,000 AI platform, that's potentially $9,000–$12,000 in unmodeled cost before you've seen a dollar of return.
Rule of thumb this week: Add 35% to whatever the vendor quotes for implementation to cover integration, training, and the first 60 days of troubleshooting. If that number changes your ROI calculation materially, the business case was thinner than it looked.
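The buffer is a single multiplication. Here it is with the $30,000 platform from the KPMG example (a placeholder figure, not a real quote):

```python
# Add a 35% implementation buffer (integration, training, first-60-days
# troubleshooting) on top of the vendor's quote. Numbers are illustrative.

vendor_quote = 30_000.0
implementation_buffer = 0.35

total_first_year_cost = vendor_quote * (1 + implementation_buffer)
print(f"Budget-ready first-year cost: ${total_first_year_cost:,.0f}")  # $40,500
```

If a $10,500 bump flips your business case from positive to negative, the case was never robust. Better to learn that in a spreadsheet than in month four.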
How This Connects to Your Business
Here's where you are, stated plainly. Pick the situation that fits.
If you're evaluating an AI tool right now and the vendor has given you an ROI projection: Run it through all five filters above before your next call. You're looking for how they handled time absorption, oversight labor, baseline assumptions, year-two value, and implementation cost. If any of those are missing, ask directly. A vendor who can answer those questions confidently is worth a harder look. One who deflects is showing you something important.
If you've already bought a tool and the ROI isn't materializing: Start with the baseline problem. Measure what your performance actually was before the tool, compare it to what it is now, and be honest about what changed and why. Then look at oversight labor — you may be getting a real return that's being eaten by untracked management time. Fix the tracking before you fix the tool.
If you're being pressured to approve an AI budget in the next 30 days: Slow down by one meeting cycle. Ask your team to fill in the five line items above with real numbers from your actual operation. If someone can't fill in your current baseline on the key metric, you're not ready to evaluate ROI — you're ready to guess. Guessing with $20,000 on the line is avoidable.
If you're in early research mode and nothing is imminent: Build your own ROI template now, before a vendor hands you theirs. A simple spreadsheet with five rows — time savings net of absorption, oversight labor cost, honest baseline, year-two projection, and total implementation cost — will protect you from almost every common error in this process.
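If a spreadsheet feels heavier than you need, four of the five rows fit in a short script (the honest baseline is a measurement you supply, not a formula). Everything below is a placeholder sketch; the `year_two_decay` assumption in particular is mine, not a published figure. Replace every input with your own measured numbers:

```python
# A minimal version of the ROI template described above.
# Every input is a placeholder; the year_two_decay default is an assumption.

def honest_roi(
    vendor_hours_saved_per_week: float,
    fully_loaded_hourly_rate: float,
    oversight_hours_per_week: float,
    annual_tool_cost: float,
    vendor_implementation_quote: float,
    absorption_discount: float = 0.35,   # per the rule of thumb in section 1
    implementation_buffer: float = 0.35, # per the rule of thumb in section 5
    year_two_decay: float = 0.30,        # assumed plateau in recurring value
) -> dict:
    """Return a year-one and year-two view of an AI tool's net value."""
    usable_hours = vendor_hours_saved_per_week * (1 - absorption_discount)
    gross_annual_value = usable_hours * fully_loaded_hourly_rate * 52
    oversight_cost = oversight_hours_per_week * fully_loaded_hourly_rate * 52
    implementation = vendor_implementation_quote * (1 + implementation_buffer)

    year_one_net = (gross_annual_value - oversight_cost
                    - annual_tool_cost - implementation)
    year_two_net = (gross_annual_value * (1 - year_two_decay)
                    - oversight_cost - annual_tool_cost)
    return {"year_one_net": year_one_net, "year_two_net": year_two_net}

# Example with made-up numbers: a tool claiming 10 hours/week of savings.
result = honest_roi(
    vendor_hours_saved_per_week=10,
    fully_loaded_hourly_rate=50,
    oversight_hours_per_week=3,
    annual_tool_cost=12_000,
    vendor_implementation_quote=5_000,
)
print(result)
```

With these placeholder inputs, the net comes out negative in both years. That's exactly the kind of answer this template exists to surface before you sign, not after.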
Common Traps to Avoid
Trusting the calculator because it has your name on it. Vendors will personalize their ROI tools with your company name, your headcount, your industry. It feels like diligence. It isn't. The structural assumptions underneath are still built to produce a favorable number. Personalized inputs into a biased model still produce a biased output.
Measuring ROI against the pilot, not production. Pilots are controlled. The vendor supports them heavily. Your best people work on them. They almost always outperform what happens when the tool goes live across your full team with normal staff and normal conditions. If your business case is built on pilot results, you're starting from a high point that you should discount by at least 20–30% for production reality (estimate based on commonly observed pilot-to-production performance gaps in enterprise software adoption).
Skipping the "what if it fails" math. Before you approve any AI spend, calculate the cost of a failed implementation — including staff time spent, workflow disruption, and the cost of unwinding. If that number would meaningfully hurt your business, the risk profile of the investment changes. Most owners do this instinctively for equipment purchases and skip it entirely for software.
Letting year-one excitement drive year-two commitments. Many AI contracts auto-renew or lock you into multi-year terms. Before you sign, know what the exit looks like. An AI tool that delivers great year-one ROI but declines in year two is still a good deal — unless you're locked into paying full price for the declining half.
Your Next Step This Week
Pick one AI tool you're currently evaluating or already using. Open a blank document and write down five numbers: your actual current baseline on the metric it's supposed to improve, the vendor's projected improvement, that improvement after a 35% absorption discount, your estimated weekly oversight labor in dollars, and your total implementation cost including the 35% buffer.
If the math still works after that, you may have a real business case. If it doesn't, you just saved yourself a significant budget mistake — and that's worth something too.
What's the one metric you'd most want AI to move in your business right now?

