AI Has a Budget Problem
Tripty Arya
March 14, 2025
C-Suite Leaders,
Over the past several years, organizations have invested heavily in artificial intelligence with a clear expectation: that smarter systems would translate into measurable business returns. Pilots became products and experiments became budget lines as AI moved steadily into core operations. According to McKinsey, nearly 90 percent of companies now use AI in at least one business function. Yet fewer than 40 percent of executives say those investments have delivered material impact at the enterprise level. (McKinsey, The State of AI)
What is becoming clearer is that this gap is not a failure of ambition or technology. It reflects a deeper framing issue. AI has largely been funded and evaluated as software rather than as infrastructure, meaning it has been governed and measured at the level of individual tools rather than as a shared enterprise system. That distinction has quietly shaped how budgets are allocated, how success is assessed, and why returns struggle to scale.
When AI Adoption Doesn’t Add Up
AI adoption itself has been remarkably successful. Tools work, models improve, and individual teams often report meaningful productivity gains. At the enterprise level, however, those gains frequently fail to add up to something larger. AI creates its greatest value when intelligence accumulates over time, when learning in one part of the organization improves decisions in another. When AI is deployed primarily as a series of isolated tools, that accumulation rarely occurs and learning tends to reset with each new use case.
In practice, organizations become more automated without becoming meaningfully more intelligent. Returns appear, but they do not compound. This is the paradox many leadership teams now face: AI is widely used, but its impact remains uneven and difficult to scale.
How AI Became a Budget Allocation Problem
For most enterprises, AI entered through familiar budget categories. Customer support funded conversational tools. Marketing invested in personalization engines. Operations deployed forecasting models. Each initiative fit neatly into an existing department, justified by a local business case and measured against local outcomes. This made AI relatively easy to approve and govern within existing planning cycles.
Over time, however, it also produced fragmentation.
Systems learned in isolation. Data was duplicated across tools. Insights tended to stop at organizational boundaries. AI delivered value within departments, but rarely across them. Many organizations are now finding that AI ROI breaks down not because budgets are too small, but because they are too fragmented. Intelligence cannot compound when it is confined to silos.
Some enterprises attempted to address this by creating a new AI category altogether, often in the form of centralized programs or innovation hubs. These initiatives frequently demonstrated technical promise, but struggled to scale. New categories lack economic gravity when they do not align cleanly with operating budgets, portfolio reviews, or core performance metrics. As a result, ROI remains aspirational rather than embedded in how the business runs.
The Platform Category: Where ROI Begins to Compound
A third path is now emerging, one that reframes AI not as a tool and not as a separate initiative, but as a platform that consolidates existing budgets. In this model, AI does not compete with departmental spend. It connects it. Multiple use cases draw from shared systems and shared intelligence, allowing learning in one area to improve outcomes in another. ROI is measured at the enterprise level rather than the use-case level.
One way to see why this shift matters is to compare the three funding models side by side. In the first, AI is funded as a series of departmental line items: customer support buys conversational tools, marketing invests in personalization, and operations deploys forecasting systems. These investments are relatively easy to approve and govern, but each system delivers value only within its function, and learning rarely carries beyond it.
In the second, a separate AI initiative is positioned above the business. These efforts often feel bold and future-oriented, but because their impact is hard to tie to day-to-day operating metrics, value is visible in demonstrations yet difficult to sustain in the core of the business.
In the platform model, existing investments are consolidated so that AI draws on common systems and data, and intelligence developed in one area is reused in others. Returns no longer reset with each new use case. They accumulate. That distinction is not semantic. It is economic. ROI follows structure.
Deloitte’s research reinforces this pattern. While AI experimentation is widespread, only about 20 percent of organizations report achieving high or very high returns. Those leaders consistently embed AI across workflows rather than isolating it in programs or functions. (Deloitte, State of AI in the Enterprise)
For this model to work in practice, shared budgets must be paired with shared access to data and systems. Platform consolidation without interoperability simply recreates fragmentation under a different label. When intelligence can move across use cases, learning accumulates and returns scale.
What Leaders Should Be Doing Now
Redefine AI ROI at the enterprise level
Executives should ensure AI ROI is evaluated in the same forums as other enterprise investments, such as portfolio reviews and strategic planning discussions. The central question is whether intelligence generated in one part of the organization is improving outcomes elsewhere.
Use budget fragmentation as a diagnostic
Overlapping AI spend across departments should be viewed as a signal, not a failure. It often points to opportunities where consolidation can unlock value by allowing existing investments to work together rather than in parallel.
Consolidate around platforms rather than tools
Platform decisions should focus on the ability to aggregate multiple use cases and connect existing categories into a coherent system, rather than adding another layer of technology to manage.
Make interoperability a budget requirement
Shared platforms only deliver returns when data and intelligence can move across teams and workflows. Interoperability should be treated as a financial and strategic requirement during investment decisions, not a downstream technical concern.
Align AI investment with how work actually happens
AI creates value when it is embedded in daily decision-making and collaboration, rather than positioned as a separate initiative that teams must work around.
Looking Ahead
Every major technology shift follows a familiar arc. Early value comes from isolated applications. Enduring value comes from shared systems. AI is now reaching that point. The organizations that succeed will not be those that spend the most on AI, but those that structure their investments so intelligence can accumulate rather than fragment. That is when AI stops being a line item and starts becoming a compounding asset.
Additional Resources:
To support this AI 2.0 series, we’ve created an AI Guide for Business Leaders that outlines the key differences between AI 1.0 and AI 2.0, including how interaction models, platforms, and organizational expectations have evolved. The guide is designed as a practical reference for leaders looking to align strategy, technology, and operating models as AI becomes more conversational and platform-driven.
About This Email Series
This email is part of an ongoing Strategy Saturday series written for C-suite leaders and focused on the strategic shifts required to lead effectively in an AI-driven world. The insights and perspectives shared are intended to support strategic reflection and informed decision-making, rather than prescribe specific actions.
-> Next Month: The Fragility of Pricing in Housing Software