Show HN: Reduce LLM token use by ~30% with this MCP/CLI tool (Claude benchmarked)

TL;DR

Smart code reading for humans and AI agents.

Key Points

  • Tilth is what happens when you give ripgrep, tree-sitter, and cat a shared brain.
  • Added adaptive 2nd-hop impact analysis to callers search — when a function has ≤10 unique callers, tilth automatically traces callers-of-callers in a single scan.
  • First full 26-task Opus baseline (previously 5 hard tasks only).
  • Haiku adoption improved from 42% to 78%, flipping Haiku from a cost regression to -38% $/correct.
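The adaptive second-hop rule above can be sketched roughly like this — a minimal illustration, not Tilth's actual implementation, assuming a pre-built reverse call graph (function → set of callers) such as a ripgrep-style scan might produce:

```python
# Hypothetical sketch of adaptive 2nd-hop caller tracing.
# `call_graph` maps a function name to the set of functions that call it.
SECOND_HOP_THRESHOLD = 10  # the ≤10 unique callers rule described above

def callers_with_impact(call_graph: dict[str, set[str]], target: str) -> dict[str, set[str]]:
    """Return direct callers of `target`, expanded one extra hop
    when the direct-caller set is small enough."""
    direct = call_graph.get(target, set())
    result = {target: direct}
    if len(direct) <= SECOND_HOP_THRESHOLD:
        # Small blast radius: trace callers-of-callers in the same pass.
        for fn in direct:
            result[fn] = call_graph.get(fn, set())
    return result

graph = {
    "parse": {"load_config", "reload"},
    "load_config": {"main"},
    "reload": {"watcher"},
}
print(callers_with_impact(graph, "parse"))
```

Because the second hop only fires under the threshold, a widely-called utility function still returns a single flat caller list instead of an explosion of transitive results.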

Nauti's Take

Tilth forces token math instead of AI gut feeling: adaptive second hops and the new 6k token view keep agents from looping through multi-section reads, so Haiku stops being a cost sink. If you build AI agents, track this delta and stop wasting upstream tokens on repeated scans.
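The $/correct metric behind the -38% figure is simple: total spend divided by tasks answered correctly, compared across runs. A quick sketch with hypothetical dollar amounts chosen purely to illustrate a -38% delta (these are not the benchmark's actual figures):

```python
# Cost-per-correct math; the dollar and task counts below are illustrative.
def dollars_per_correct(total_spend: float, correct: int) -> float:
    return total_spend / correct

baseline = dollars_per_correct(10.00, 20)   # e.g. a baseline run: $10 for 20 correct
with_tool = dollars_per_correct(4.96, 16)   # e.g. a Haiku-heavy run: $4.96 for 16 correct
delta = (with_tool - baseline) / baseline   # negative means cheaper per correct answer
print(f"{delta:+.0%}")                      # prints -38%
```

The point of the metric is that raw spend can drop while correctness drops faster; $/correct catches that, which is why higher Haiku adoption only counts as a win once the ratio goes negative.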

Sources