March 26, 2025

Weaponized AI, Women’s Influence Networks, and Why Defense Tech Keeps Failing, with Maggie Feldman-Piltch

If you want to understand the future of power, don’t look at politicians. Look at who controls the flow of information.

That was the core theme of my conversation with Maggie Feldman-Piltch, a national security expert who spends her time dissecting influence operations, misinformation campaigns, and the overlooked ways power is actually shaped in the modern world.

The conversation left me with three big takeaways:

  1. Generative AI is the new radio, but the playbook for weaponizing information hasn’t changed.
  2. Women’s networks are a major battleground for global influence ops—and most people in national security aren’t paying attention.
  3. The biggest failures in defense tech come from a fundamental misunderstanding of the human element.

AI Is the New Radio—And We’re Already Behind

Maggie made a simple but powerful analogy:

“Generative AI is to 2025 what radio was to 1938.”

We’ve been here before. Every time a new communication technology emerges, governments, corporations, and bad actors race to exploit it—long before the public or regulators catch on.

  • Radio was an open, powerful tool in the 1930s. Governments used it for propaganda, military planning, and influence campaigns that reshaped entire societies.
  • AI today is following the same trajectory—but at hyperspeed. Instead of controlling the airwaves, actors are shaping algorithmic visibility, deploying deepfakes, and using AI to manipulate narratives at scale.

The core problem? We aren’t treating this shift with the urgency it deserves.

Right now, the focus is on whether AI generates misinformation, as if that’s the biggest risk. But Maggie pointed out something deeper: the real threat is how AI accelerates and amplifies influence operations—making them more effective, more personalized, and harder to detect.

Governments are still reacting to AI as a fact-checking problem when they should be treating it as an information warfare problem. If history tells us anything, whoever figures out how to weaponize a new technology first gets to set the rules—and right now, authoritarian regimes are far ahead of democratic institutions in defining AI’s role in global influence.

The Overlooked Battlefield: Women’s Networks and Information Control

Most people don’t think about gender when they think about influence operations. They should.

Maggie made the case that women’s communities are a major target for global disinformation—and national security analysts are largely ignoring it.

Here’s why:

  • Women handle the majority of social coordination in families and communities. They are the ones making healthcare decisions, planning education, and influencing local economies—meaning they control key nodes of trust in society.
  • Adversaries understand this better than we do. Russia, China, and other authoritarian states flood women’s spaces with propaganda, lifestyle content, and “soft influence” strategies that subtly shift opinions without ever mentioning politics directly.
  • National security is still male-dominated. As a result, policymakers and analysts often focus on traditional power structures (governments, media, military) rather than how influence actually spreads in everyday life.

Here’s the reality: If you want to shape public sentiment, you don’t just target institutions—you target where trust already exists.

Ignoring the role of women’s networks in information warfare isn’t just a blind spot—it’s an active failure to understand how influence moves in the modern world.

The Pentagon’s Tech Obsession Is Blinding It to Real Problems

Maggie shared a story that perfectly encapsulates why defense tech keeps failing.

A company developed a new system to help fighter pilots relieve themselves in-flight. They spent years on R&D, secured major funding, and were about to roll it out across military aircraft.

One problem:

It only worked for pilots who had a penis.

The team had completely overlooked the fact that not all pilots are men—a mistake so obvious it’s almost laughable. But it’s also revealing.

This isn’t just about one product. It’s about how defense innovation repeatedly ignores the human element:

  • The Pentagon doesn’t need more tech. It needs better tech that’s actually designed for the people using it.
  • Startups and contractors focus on solving technical challenges but ignore the real-world contexts where those solutions need to work.
  • Bureaucracy isn’t the main reason new defense tech isn’t getting adopted—bad design is.

This mindset problem extends beyond defense. AI is being developed in a similar vacuum—built by technologists who don’t engage with the complexity of real-world social dynamics.

The result? Systems that look great on paper but fail in the environments where they actually need to operate.

Where Do We Go from Here?

This episode left me with one overwhelming realization:

We are repeating the same mistakes—faster.

  • We aren’t learning from the history of information warfare.
  • We’re ignoring major influence channels (like women’s networks) that adversaries are actively exploiting.
  • We keep building tech in isolation from the real-world contexts where it actually matters.

These aren’t theoretical problems. They are shaping how power and influence will work in the next decade.

The future of AI, national security, and information warfare won’t be decided by who has the most powerful models—but by who understands how to integrate technology into real human systems of trust and influence.

Right now, we’re losing that battle.

The question is: Are we willing to change our approach before it’s too late?

Copyright © Filter Labs