The Real Risk Isn’t an Intelligence Shortage. It’s Misallocation.

If AI capacity tightens, does our growth plan still work?

It’s a question echoing in boardrooms after the release of Citrini Research’s “The 2028 Global Intelligence Crisis” and the subsequent Wall Street Journal coverage of market aftershocks. As WSJ reported, “Investors scrambled to reassess valuations, with tech equities sliding on fears that ‘the era of cheap intelligence may be ending.’” The markets didn’t just wobble—they recalibrated, because AI-driven productivity is now priced into the future of nearly every sector.

But here’s the pivot: The real risk isn’t a looming shortage of intelligence. It’s how companies are allocating the intelligence—and capital—they already have.

What the 2028 Thesis Gets Right

Citrini’s report is clear-eyed about the structural risks. They warn: “A projected intelligence bottleneck is emerging as compute demand outpaces supply, with ‘potential for systemic disruptions in digital productivity by 2028.’” The paper details “macroeconomic risk scenarios where ‘AI capacity constraints could curtail GDP growth by up to 2% annually in advanced economies.’”

And it doesn’t stop at numbers. The authors frame the issue as “a systemic challenge: ‘The concentration of AI infrastructure in a handful of providers creates single points of failure and amplifies market fragility.’”

They’re not wrong. Infrastructure concentration, energy bottlenecks, and the assumption that AI will always be cheap and abundant are real issues. Even if Citrini’s timeline is aggressive, their thesis exposes a deeper vulnerability inside most enterprises.

The Misallocation Problem

Here’s what’s happening in 2026:

·       AI pilots are everywhere, but rarely connected to core business outcomes.

·       Automation is measured by activity (emails sent, hours saved), not by incremental margin.

·       Platform dashboards report performance, but finance can’t trace it to the P&L.

·       AI investments are scattered across marketing, ops, and IT, with no unified ownership.

This is the misallocation problem. When intelligence is cheap, inefficiency hides. But if AI becomes more expensive or less available, waste becomes glaring and costly.

Let’s clarify three terms:

System Design: How intelligence flows into revenue. Are models and tools built to drive sales, retention, or margin?

Organizational Alignment: Who owns the outcome? Is AI a tech initiative, or is it embedded in the business’s core value drivers?

Defensible ROI: Can you measure the incremental financial impact, or are you relying on vendor claims and vanity metrics?

If intelligence costs rise, only companies with discipline in these areas will protect their margins. Scarcity punishes poor governance.

A 2026 Stress Test for AI-Dependent Growth

·       What percentage of next year’s growth assumes AI-driven efficiency gains?

·       If those gains slow by 25–30%, what happens to EBITDA?

·       Which AI use cases are directly tied to gross margin?

·       Where is AI embedded into revenue-generating workflows?

·       Can finance validate incrementality beyond platform-reported metrics?

·       Who owns AI ROI: marketing, technology, or the P&L owner?
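The EBITDA question above can be roughed out in a few lines. A minimal sketch, using purely hypothetical figures (a $50M planned EBITDA with $12M of it attributed to AI-driven efficiency gains; substitute your own numbers):

```python
# Hypothetical EBITDA stress test for slowed AI efficiency gains.
# All figures are illustrative assumptions, not benchmarks.

def stressed_ebitda(base_ebitda: float, ai_gain: float, slowdown: float) -> float:
    """EBITDA if the AI-attributed gain slows by `slowdown` (0.0 to 1.0)."""
    return base_ebitda - ai_gain * slowdown

base_ebitda = 50.0   # $M, planned EBITDA (assumed)
ai_gain = 12.0       # $M of that plan attributed to AI efficiency (assumed)

for slowdown in (0.25, 0.30):
    hit = ai_gain * slowdown
    print(f"{slowdown:.0%} slowdown: EBITDA falls to "
          f"${stressed_ebitda(base_ebitda, ai_gain, slowdown):.1f}M "
          f"({hit / base_ebitda:.1%} below plan)")
```

Even this crude arithmetic makes the dependency explicit: the larger the share of the plan attributed to AI, the more a supply or cost shock translates directly into missed EBITDA.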

Strategic Reframe

The real 2028 crisis won’t be about GPU supply. It will be triggered by slow decision cycles, misaligned KPIs, fragmented data, and capital deployed without governance. As Citrini warns, “Systemic fragility will punish organizations that ‘fail to align intelligence deployment with strategic objectives.’”

The companies that win will have:

Strategic Edge: They know where intelligence moves the needle.

Decision Confidence: They can reallocate capital quickly as conditions shift.

Future-Proofing: Their system design is resilient to supply shocks.

Organizational Alignment: Every leader owns a piece of AI ROI.

Defensible ROI: Financial impact is measured, not assumed.

Final Thoughts

Markets respond to macro narratives. Boards are judged on capital discipline. The winners in the next era won’t be those with the most intelligence capacity; they’ll be those who allocate it best.

Intelligence will scale. Discipline is the real constraint.
