March 19, 2025

How AI’s Acceleration is Reshaping Power, Trust, and Decision-Making, with Azeem Azhar, Founder at Exponential View

Azeem Azhar’s exponential gap framework highlights a growing reality: Technology is evolving exponentially, but society is struggling to keep up.

This isn’t just about innovation outpacing regulation. It’s about power consolidating in the hands of a few, institutional trust breaking down, and decision-making becoming harder—not easier—in a world drowning in data but starving for meaningful insights.

During our conversation on Navigating Noise, Azeem and I explored how this gap manifests in three key ways:

  1. AI’s rapid improvement is creating a knowledge divide, not closing it.
  2. Corporate actors, not governments, are setting the rules of engagement.
  3. Access to high-quality data—not just raw information—will determine who adapts and who gets left behind.

AI’s Knowledge Divide: More Information, Less Understanding

One of the most striking examples of the exponential gap is AI’s acceleration.

Azeem pointed out that OpenAI’s models went from ranking around 100,000th among human software developers to the top 200 in under a year. That’s a shift of several orders of magnitude. But here’s the problem: The understanding of AI’s capabilities isn’t accelerating at the same rate.

  • Regulators don’t know how to govern it.
  • Executives don’t know how to integrate it.
  • The public doesn’t know how to trust it.

This isn’t a problem of access—anyone can use AI. It’s a problem of context. The tools are advancing so quickly that even power users struggle to tell where AI is reliable and where it isn’t.

Azeem described AI as a highly capable but sometimes erratic graduate research assistant. It can generate incredible insights one moment, then completely miss the mark the next. The problem? AI is getting smarter faster than we’re getting better at using it.

Which leads to the second major divide: who actually controls AI’s trajectory.

Corporate Power vs. Public Interest: Who’s in Control?

The exponential gap isn’t just widening between technology and society. It’s widening between corporations and governments, particularly when it comes to AI.

Azeem was blunt: If OpenAI, Anthropic, and Google disappeared tomorrow, AI wouldn’t stop evolving. Thousands of researchers across the globe are building these systems. But here’s the difference—governments aren’t leading that charge.

Right now, AI’s trajectory is being determined by a handful of tech companies whose incentives are, at best, aligned with profit, and at worst, actively hostile to broader societal interests.

  • Who decides what data AI models are trained on?
  • Who determines the values AI embeds in its reasoning?
  • Who benefits when AI systems become too powerful for human oversight?

For now, these decisions are being made inside corporate boardrooms, not by democratic institutions. And without a serious shift in governance and accountability, the gap between public interest and corporate power will only widen.

Which brings us to the final—and most important—takeaway from my conversation with Azeem:

High-Quality Data: The Only Bridge Across the Exponential Gap

If technology is accelerating faster than our institutions can adapt, and corporate actors are outpacing public governance, what’s the solution?

The answer isn’t just more data. It’s better data.

Azeem and I discussed how local, high-fidelity data is the missing piece in this equation. Right now, most organizations—whether governments, financial firms, or media outlets—are drowning in global-scale data but blind to local realities.

AI models train on massive public datasets, but they often miss the signals that actually shape human behavior.

  • What’s happening on the ground in China’s housing markets?
  • How are emerging political movements forming in places with low social media visibility?
  • What are the early warning signs of economic shifts before they show up in the global data?

If AI is going to be a force for decision-making rather than just noise generation, the key is hyper-local, contextual, and verifiable data. This is where organizations will either thrive or fail in the AI age.

The Exponential Gap is Here. The Question is Who Closes It.

The takeaways from this conversation with Azeem couldn’t be clearer:

  • AI isn’t just getting smarter—it’s outpacing our ability to understand it.
  • Corporate power, not public governance, is shaping AI’s trajectory.
  • The future of AI decision-making hinges on high-quality, local data.

The exponential gap isn’t going to close itself. It’s a question of who adapts and who gets left behind.

And right now, that decision is being made by those who control the best data—not just the biggest models.

Copyright © Filter Labs