Show HN: Atombot – tiny personal assistant for local models and GPT‑5.4
TL;DR
Atombot is a self-hosted AI assistant built on roughly 500 lines of core code, compared with roughly 400,000 in OpenClaw.
Key Points
- It runs locally via Ollama or LM Studio, and also supports GPT-5.4 through the Codex CLI, with auto-detection at onboarding.
- Telegram acts as the interface with allowlist-based access, persistent memory, and searchable daily history logs.
- One-time and recurring reminders are built in; the skills system is compatible with the OpenClaw SKILL.md format.
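The SKILL.md compatibility is concrete enough to sketch: in that format, a skill is a markdown file with YAML frontmatter naming and describing the skill, followed by free-form instructions the model follows when the skill is invoked. The file below is a hypothetical example for illustration, not taken from Atombot's repository:

```markdown
---
name: weather
description: Answer questions about current weather conditions for a given city.
---

# Weather

When the user asks about the weather:

1. Extract the city name from the message.
2. Fetch current conditions from your configured weather source.
3. Reply with the temperature and a one-line summary.
```

Because the frontmatter carries the skill's name and description, an assistant can decide whether a skill applies without loading the full instructions, which keeps the prompt small.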
Nauti's Take
500 lines versus 400,000: that is not an accident, it is a design philosophy. Atombot demonstrates that a personal AI assistant does not require a sprawling ecosystem to be genuinely useful.
The Telegram integration is pragmatic; if you already use Telegram, setup is immediate. The GPT-5.4 support via Codex CLI is a recent addition and still feels early, so expect it to evolve.
Privacy-conscious users will stick to Ollama. For developers and tinkerers who want an assistant they can actually own and understand, this is a compelling starting point.