AI may never be as cheap as it is today
TL;DR
AI usage is cheaper today than it has ever been – but that window may be closing.
Key Points
- Writer CEO May Habib told Axios that LLM companies will be forced to raise prices as they approach their IPOs.
- New models from OpenAI, Google, and Anthropic are faster and cheaper, driven by massive efficiency gains in inference.
- Nvidia is expected to unveil a more efficient inference chip at its developer conference next week.
- The pattern mirrors Amazon and Uber: hook users with subsidized prices, then monetize once locked in.
Nauti's Take
This is the oldest playbook in tech: subsidize adoption, then normalize margins once users are locked in. Uber did it with rides, Amazon did it with Prime, and there is no structural reason AI should be different.
Current token prices are sometimes absurdly low – fractions of a cent for GPT-4-class intelligence. That will not last.
Anyone scaling AI-dependent products while assuming today's cost structure holds is building on shaky ground. Use the cheap era while it lasts – but price in the reversal.