---
title: "Show HN: Running AI agents across environments needs a proper solution"
slug: "show-hn-running-ai-agents-across-environments-needs-a-proper-solution"
date: 2026-03-24
category: community
tags: [agents, open-source]
language: en
sources_count: 1
featured: false
publisher: AInauten News
url: https://news.ainauten.com/en/story/show-hn-running-ai-agents-across-environments-needs-a-proper-solution
---

# Show HN: Running AI agents across environments needs a proper solution

**Published**: 2026-03-24 | **Category**: community | **Sources**: 1

---

## TL;DR

- A developer argues that current infrastructure is not ready for true AI agents: Docker containers are too heavy, and Python-based agents consume too much memory.

---

## Summary

- A developer argues that current infrastructure is not ready for true AI agents: Docker containers are too heavy, and Python-based agents consume too much memory.
- Agent architecture is evolving from LLM+tools through workflows to full agent systems with tools, CLI access, memory, and fine-grained system capabilities.
- The open-source project Odyssey aims to provide a lightweight, scalable runtime for thousands of concurrent agents.
- Core problem: LLMs already introduce significant latency, and stacking heavy container overhead on top makes it worse.
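The LLM+tools-to-full-agent progression described above can be sketched as a minimal agent loop. Everything below (the `Agent` class, the scripted model, the `shell` tool name) is a hypothetical illustration of the pattern, not Odyssey's actual API:

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch of the "full agent" stage: an LLM-driven loop
# with a tool registry and persistent memory. The model is scripted
# so the example runs without any API access.
@dataclass
class Agent:
    model: Callable[[str], str]                       # stands in for an LLM call
    tools: dict[str, Callable[[str], str]] = field(default_factory=dict)
    memory: list[str] = field(default_factory=list)   # transcript of model replies

    def run(self, task: str, max_steps: int = 5) -> str:
        prompt = task
        for _ in range(max_steps):
            reply = self.model(prompt)
            self.memory.append(reply)
            if reply.startswith("CALL "):       # model requests a tool
                name, arg = reply[5:].split(":", 1)
                prompt = self.tools[name](arg)  # feed tool output back in
            else:
                return reply                    # final answer
        return "max steps reached"

# Scripted "model": first asks for the shell tool, then answers.
def scripted_model(prompt: str) -> str:
    return "CALL shell:uname" if prompt == "what OS?" else f"answer: {prompt}"

agent = Agent(model=scripted_model,
              tools={"shell": lambda cmd: f"ran `{cmd}` -> Linux"})
print(agent.run("what OS?"))  # → answer: ran `uname` -> Linux
```

The earlier stages fall out of the same shape: LLM+tools is this loop with `max_steps=1`, and a workflow is a fixed sequence of such calls without the model choosing the next step.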

---

## Why it matters

If containers are too heavy and Python agents too memory-hungry, running thousands of concurrent agents becomes impractical: the infrastructure, not the model, becomes the bottleneck. That is the gap projects like Odyssey are trying to close.

---

## Key Points

- Docker is too heavy, and Python is too memory-hungry, for large fleets of agents.
- Agent systems are evolving from LLM+tools through workflows to full agents with tools, CLI access, memory, and fine-grained system capabilities.
- Odyssey, an open-source project, targets a lightweight, scalable runtime for thousands of concurrent agents.
- LLM latency is already significant; per-agent container overhead compounds it.

---

## Nauti's Take

The point is valid: most 'agent frameworks' are glorified wrappers around LLM calls, not actual runtimes. Anyone who has tried to run more than a few dozen agents concurrently knows Docker is the wrong tool for the job. Whether Odyssey is the answer remains to be seen – a GitHub project without broad production validation is still a promise. The direction is interesting, though: agents need an equivalent of what Node.js was for async I/O – something fundamental, not just another abstraction layer.

---


## FAQ

**Q:** What is this Show HN post about?

**A:** A developer argues that current infrastructure is not ready for true AI agents: Docker containers are too heavy, and Python-based agents consume too much memory.

**Q:** Why does it matter?

**A:** If containers are too heavy and agents too memory-hungry, running thousands of concurrent agents becomes impractical; the infrastructure, not the model, becomes the bottleneck.

**Q:** What are the key takeaways?

**A:** Current infrastructure is not ready for true AI agents. Agent systems are evolving from LLM+tools through workflows to full agents with tools, CLI access, and memory. The open-source project Odyssey aims to provide a lightweight, scalable runtime for thousands of concurrent agents.

---

## Related Topics

- [agents](https://news.ainauten.com/en/tag/agents)
- [open-source](https://news.ainauten.com/en/tag/open-source)

---

## Sources

- [Show HN: Running AI agents across environments needs a proper solution](https://github.com/liquidos-ai/Odyssey) - Hacker News AI

---

## About This Article

This article is a synthesis of 1 source, curated and summarized by AInauten News. We aggregate AI news from trusted sources and provide bilingual (German/English) coverage.

**Publisher**: [AInauten](https://www.ainauten.com) | **Site**: [news.ainauten.com](https://news.ainauten.com)

---

*Last Updated: 2026-03-24*
