Azeem Azhar’s exponential gap framework highlights a growing reality: Technology is evolving exponentially, but society is struggling to keep up.
This isn’t just about innovation outpacing regulation. It’s about power consolidating in the hands of a few, institutional trust breaking down, and decision-making becoming harder—not easier—in a world drowning in data but starving for meaningful insights.
During our conversation on Navigating Noise, Azeem and I explored how this gap manifests in three key ways:
One of the most striking examples of the exponential gap is AI’s acceleration.
Azeem pointed out that OpenAI’s models went from ranking around the 100,000th-best human software developer to the top 200 in under a year. That’s a leap of nearly three orders of magnitude. But here’s the problem: Our understanding of AI’s capabilities isn’t accelerating at the same rate.
This isn’t a problem of access—anyone can use AI. It’s a problem of context. The tools are advancing so quickly that even power users struggle to distinguish between where AI is reliable and where it isn’t.
Azeem described AI as a highly capable but sometimes erratic graduate research assistant. It can generate incredible insights one moment, then completely miss the mark the next. The problem? AI is getting smarter faster than we’re getting better at using it.
Which leads to the second major divide: who actually controls AI’s trajectory.
The exponential gap isn’t just widening between technology and society. It’s widening between corporations and governments, particularly when it comes to AI.
Azeem was blunt: If OpenAI, Anthropic, and Google disappeared tomorrow, AI wouldn’t stop evolving. Thousands of researchers across the globe are building these systems. But here’s the difference—governments aren’t leading that charge.
Right now, AI’s trajectory is being determined by a handful of tech companies whose incentives are, at best, aligned with profit, and at worst, actively hostile to broader societal interests.
For now, these decisions are being made inside corporate boardrooms, not by democratic institutions. And without a serious shift in governance and accountability, the gap between public interest and corporate power will only widen.
Which brings us to the final—and most important—takeaway from my conversation with Azeem:
If technology is accelerating faster than our institutions can adapt, and corporate actors are outpacing public governance, what’s the solution?
The answer isn’t just more data. It’s better data.
Azeem and I discussed how local, high-fidelity data is the missing piece in this equation. Right now, most organizations—whether governments, financial firms, or media outlets—are drowning in global-scale data but blind to local realities.
AI models train on massive public datasets, but they often miss the local signals that actually shape human behavior.
If AI is going to be a force for decision-making rather than just noise generation, the key is hyper-local, contextual, and verifiable data. This is where organizations will either thrive or fail in the AI age.
The takeaways from this conversation with Azeem couldn’t be clearer:
The exponential gap isn’t going to close itself. It’s a question of who adapts and who gets left behind.
And right now, that decision is being made by those who control the best data—not just the biggest models.