
‘They feel true’: political deepfakes are growing in influence – even if people know they aren’t real

TL;DR

AI-generated images of fabricated women in military gear are being monetized and used as effective propaganda tools.

Key Points

  • Researchers warn these avatars help idealize political figures like Donald Trump – even when viewers know the content is fake.
  • Experts call this 'emotionally true': the images feel real despite being synthetic.
  • Creators generate real income from these accounts while simultaneously shaping geopolitical narratives.
  • Deepfakes have moved beyond real public figures – entirely fabricated identities now drive new propaganda formats.

Nauti's Take

'Knowing it's fake' is no longer a shield – that's the uncomfortable core finding here. Next-generation AI propaganda doesn't need deception, it only needs resonance.

Building Trump-idealizing military avatars and monetizing them produces political framing as a side effect by design. Platforms that allow monetization of such content are not neutral infrastructure – they are co-producers of this reality distortion.

Any regulation that only asks 'is it real?' fundamentally misses the point.

Context

The assumption has long been that awareness of fakery neutralizes its effect. This research challenges that directly. Emotional impact and cognitive recognition operate on different tracks – propaganda works even when the brain flags 'fake.' Military framing combined with sexualized imagery targets two of the most potent psychological triggers simultaneously, making this deepfake category more dangerous than conventional misinformation.

Sources