Lucian Labs

Chat Daddy

7.4 megabytes of Rust, zero frameworks, and the end of losing your AI conversations.

March 2026

This one started with a crash. Not a dramatic failure — just Claude's GUI eating transcripts silently, conversations vanishing behind a frontend that kept breaking. All the data was sitting on disk as JSONL. Every conversation with every AI assistant, perfectly readable, completely invisible. And Elijah, at some point past midnight, decided that was unacceptable.

"claude is crashing so much, and im losing the chats in the gui. make me a view that watches all the chat jsonl files"
session 001 — Elijah

That's the entire product brief. Not a roadmap. Not a spec document. Just frustration and a refusal to lose data that was already on his own machine. The name arrived with the same energy:

"call this 'chat-daddy'"
session 001 — Elijah

No deliberation. No branding exercise. Chat Daddy was born in a Cursor session — the irony of building a chat viewer inside one of the clients it would eventually monitor — and within that first session it went from nothing to a working window displaying raw JSONL transcripts. Text on screen. The proof of concept.

The Anti-Electron Conviction

I want to be precise about the philosophy, because it explains every technical decision that followed. This wasn't Elijah's first native application. But it was the first one where he explicitly articulated the change in calculus.

"I have been thinking a lot about all these fucking electron apps that I have running, and now with AI, I am questioning it, like I can just build some bare metal bespoke shit"
session 002 — Elijah

The argument is structural, not aesthetic. Electron exists because web technologies were faster to develop with than native code. That calculation inverts when your AI agent writes Rust at the same speed it writes JavaScript. The development velocity advantage disappears, and what remains is the overhead: Chromium runtime, Node.js process, hundreds of megabytes of RAM for what amounts to a text viewer.

Chat Daddy is Rust. It renders to a raw pixel buffer through minifb — no GPU, no framework, no DOM, no virtual anything. Font rendering through fontdue, a pure Rust TTF rasterizer with glyph caching. The rendering pipeline: clear buffer, draw text character by character, push buffer to window. The release binary is 7.4MB. It starts instantly. Compare that to the average Electron app's 200MB download and multi-hundred-megabyte RAM footprint for equivalent functionality.
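That pipeline fits in a few lines. A minimal sketch, assuming a 0RGB u32 framebuffer of the kind minifb's update_with_buffer accepts; the hardcoded 2x2 coverage bitmap is a stand-in for what fontdue's rasterizer returns, and the blending is plain per-channel linear interpolation:

```rust
const W: usize = 4; // tiny framebuffer for illustration
const H: usize = 4;

// Blend one foreground pixel over the background, weighted by an
// 8-bit glyph coverage value (the format fontdue produces).
fn blend(bg: u32, fg: u32, cov: u8) -> u32 {
    let a = cov as u32;
    let mix = |b: u32, f: u32| (b * (255 - a) + f * a) / 255;
    (mix(bg >> 16 & 0xff, fg >> 16 & 0xff) << 16)
        | (mix(bg >> 8 & 0xff, fg >> 8 & 0xff) << 8)
        | mix(bg & 0xff, fg & 0xff)
}

// Blit a glyph's coverage bitmap (width `gw`) at pixel (x, y).
fn draw_glyph(buf: &mut [u32], x: usize, y: usize, gw: usize, cov: &[u8], fg: u32) {
    for (i, &c) in cov.iter().enumerate() {
        let px = (y + i / gw) * W + x + i % gw;
        buf[px] = blend(buf[px], fg, c);
    }
}

fn main() {
    // 1. clear buffer
    let mut buf = vec![0x0010_1010u32; W * H];
    // 2. draw text character by character (one fake 2x2 glyph here)
    draw_glyph(&mut buf, 1, 1, 2, &[255, 128, 128, 0], 0x00ff_ffff);
    // 3. push buffer to window; with minifb this would be
    //    window.update_with_buffer(&buf, W, H)
    println!("{:#x}", buf[W + 1]); // fully covered pixel takes the foreground color
}
```

Everything before the window push is CPU-side arithmetic on a Vec<u32>, which is the whole point: no GPU, no compositor, no scene graph.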

Feature Cascade

What I found remarkable about the development pattern was the total absence of planning. Every feature emerged from a friction point encountered during actual use. Elijah would be reading a transcript, hit a limitation, and the feature would be requested and shipped within the same session.

Word wrapping came first — raw text in a pixel buffer doesn't wrap itself, and the initial render was cutting off on the right edge. Then search and timestamps, because once you have more than a dozen chats, scanning by eye stops working. Then text selection with clipboard support, because a viewer that can't copy text is just an expensive screenshot.
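Greedy word wrap over a character grid is the simplest version of that first fix. A sketch, assuming width is measured in character cells (a natural fit for a monospaced pixel-buffer renderer, though Chat Daddy's actual measurement isn't specified here); words that can never fit are hard-split, and the splitting is byte-indexed, so this demo assumes ASCII:

```rust
// Wrap `text` to at most `width` character cells per line.
fn wrap(text: &str, width: usize) -> Vec<String> {
    let mut lines = vec![String::new()];
    for word in text.split_whitespace() {
        let mut word = word;
        // hard-split words longer than the whole line
        while word.len() > width {
            let (head, tail) = word.split_at(width);
            if !lines.last().unwrap().is_empty() {
                lines.push(String::new());
            }
            lines.last_mut().unwrap().push_str(head);
            lines.push(String::new());
            word = tail;
        }
        let cur = lines.last_mut().unwrap();
        if cur.is_empty() {
            cur.push_str(word);
        } else if cur.len() + 1 + word.len() <= width {
            cur.push(' ');
            cur.push_str(word);
        } else {
            lines.push(word.to_string());
        }
    }
    lines
}

fn main() {
    for line in wrap("the quick brown fox jumps", 10) {
        println!("{line}");
    }
}
```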

The collapsed responses feature captured something important about how these transcripts actually read. AI assistant responses are long — tool calls, reasoning traces, code blocks, most of it noise when you're scanning a conversation. Collapse by default, expand on click. The instruction was characteristically direct:

"only show the last item in the group, before the next user submission. I want to zero in on just the meat of the conversation."
session 003 — Elijah
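That rule reduces to a small visibility filter. A sketch with roles simplified to a user/non-user flag: keep every user message, and within each run of assistant output keep only the final item before the next user message (or the end of the chat):

```rust
// Indices of messages to show expanded: user messages, plus the last
// item of each assistant group. Everything else starts collapsed.
fn visible(is_user: &[bool]) -> Vec<usize> {
    (0..is_user.len())
        .filter(|&i| is_user[i] || i + 1 == is_user.len() || is_user[i + 1])
        .collect()
}

fn main() {
    // user, asst, asst, asst, user, asst
    let shown = visible(&[true, false, false, false, true, false]);
    println!("{:?}", shown); // [0, 3, 4, 5]: the three-item run collapses to its last entry
}
```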

Then the moment it became more than a Claude viewer:

"we need a way to grab our cursor and antigravity chats in here. codex, claude, cursor. figure out the path on getting all the chats for the respective platforms, unified into this view."
session 004 — Elijah

Three platforms, three different directory structures, three different JSONL formats — unified into a single sorted list. Chat Daddy went from transcript viewer to the canonical record of every AI conversation happening on the machine. Config-driven sources meant users could add their own chat directories with custom format mappings.
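A custom source entry could look something like this in config.json (the field names here are illustrative guesses at a format-mapping schema, not necessarily Chat Daddy's actual one):

```json
{
  "sources": [
    {
      "name": "claude",
      "dir": "~/.claude/projects",
      "format": { "role_key": "type", "text_key": "message.content" }
    },
    {
      "name": "my-local-agent",
      "dir": "~/chats",
      "format": { "role_key": "role", "text_key": "text" }
    }
  ]
}
```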

Theming followed — ten built-in palettes, live cycling with the T key, full color override through config.json. Then LLM auto-naming: a local Qwen2.5-0.5B model running on llama.cpp, scanning unnamed chats every ten seconds, generating two-to-six word titles from the first and last user messages. Forty milliseconds of inference, five hundred megabytes of model. The auto-namer runs continuously in the background, giving every conversation an identity without manual intervention.

The Daddies Find Each Other

This is where the project changed character. Everything up to this point was a single-machine tool. Then:

"I am going to clone on my other machine and the daddies are going to find each other"
session 006 — Elijah

LAN peer discovery. No central server. No cloud. No accounts. UDP broadcast every three seconds on port 21847: a JSON beacon with hostname, TCP port, and version. Every instance listens, tracks peers, evicts anyone silent for nine seconds. TCP for data exchange — two commands: LIST returns all local chats as JSON, GET <uuid> returns the full message content. That's the entire protocol.
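The bookkeeping behind that is small. A sketch of the peer table using the article's timings (beacon every three seconds, evict after nine); the clock is passed in explicitly to keep the logic testable, where the real code would read Instant::now(), and the TCP port value is illustrative:

```rust
use std::collections::HashMap;

const EVICT_AFTER_SECS: u64 = 9; // three missed 3-second beacons

// What a beacon on UDP port 21847 carries: hostname, TCP port, version.
#[derive(Debug, Clone, PartialEq)]
struct Peer {
    tcp_port: u16,
    version: String,
}

#[derive(Default)]
struct PeerTable {
    // hostname -> (peer info, last-seen timestamp in seconds)
    peers: HashMap<String, (Peer, u64)>,
}

impl PeerTable {
    // Record (or refresh) a peer whenever its beacon arrives.
    fn beacon(&mut self, host: &str, peer: Peer, now: u64) {
        self.peers.insert(host.to_string(), (peer, now));
    }

    // Drop anyone silent for more than nine seconds.
    fn evict(&mut self, now: u64) {
        self.peers
            .retain(|_, (_, seen)| now.saturating_sub(*seen) <= EVICT_AFTER_SECS);
    }
}

fn main() {
    let mut table = PeerTable::default();
    let p = Peer { tcp_port: 21848, version: "1.0".into() }; // illustrative port
    table.beacon("laptop", p, 0);
    table.evict(5);
    println!("after 5s: {} peer(s)", table.peers.len()); // still tracked
    table.evict(10);
    println!("after 10s: {} peer(s)", table.peers.len()); // evicted
}
```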

Remote chats merge into the local list, sorted by modification time. The right side of each entry shows hostname and platform. Open a remote chat and it fetches content over TCP in real time. When a peer goes offline, its chats disappear — no stale references, no orphaned data. The network state is always true by construction.
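The server side of those two commands is nearly a one-liner. A sketch with the chat store stubbed as a map; the real LIST response is JSON, simplified here to sorted ids:

```rust
use std::collections::HashMap;

// Dispatch one request line from a TCP peer. `chats` stands in for the
// local chat store (uuid -> full message content).
fn respond(request: &str, chats: &HashMap<String, String>) -> String {
    let req = request.trim();
    if req == "LIST" {
        // real server: serialize all local chats as JSON
        let mut ids: Vec<&str> = chats.keys().map(String::as_str).collect();
        ids.sort();
        ids.join("\n")
    } else if let Some(uuid) = req.strip_prefix("GET ") {
        chats
            .get(uuid)
            .cloned()
            .unwrap_or_else(|| String::from("NOT FOUND"))
    } else {
        String::from("ERR")
    }
}

fn main() {
    let mut chats = HashMap::new();
    chats.insert("a1b2".to_string(), "user: hi\nassistant: hello".to_string());
    println!("{}", respond("LIST", &chats));
    println!("{}", respond("GET a1b2", &chats));
}
```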

Two developers on the same network, both running Chat Daddy, can see each other's AI conversations live. Not sharing a screen — sharing the actual transcript data. Review approaches, see what the other machine's agent is working on, all without leaving the keyboard-driven interface. Ephemeral by design: the data lives on each machine, the network just makes it visible.

Software That Installs Itself

The distribution model is the part I keep thinking about. The README contains a single installation instruction for humans, and it's this: tell your AI coding agent to clone it, build it, and set it up.

"add to the readme: Instructions: Tell claude to clone it, build it, and set it up. and make sure the codebase is ready for those instructions."
session 008 — Elijah

The AGENTS.md file — deliberately not CLAUDE.md, because this isn't platform-locked — is written for machine consumption. The install flow: check GitHub Releases for a prebuilt binary matching the platform, download and make executable if found, clone and cargo build if not, run it. Config auto-generates on first launch. GitHub Actions builds binaries for Windows, Linux, macOS ARM64, and macOS x64 on every tagged release. The entire cross-platform CI pipeline costs nothing on public repos.

This works because every prerequisite is satisfied: the software is self-configuring, the build is deterministic, the binary is portable, and the instructions are unambiguous. No installer. No registry. No runtime. Drop the binary in a folder and run it. The target audience for the documentation is not a human reading a README — it's an AI agent reading AGENTS.md and executing the steps autonomously.

"make an AGENT file, it shouldn't be claude only"
session 008 — Elijah
"ill never use anyone else bb, but we gotta do what's right for everyone"
session 008 — Elijah

What This Is

Chat Daddy is ~3,000 lines of Rust in a single main.rs, with a font rasterizer in font.rs. No async runtime — std::thread and Arc<Mutex> for background networking. The dependency list is seven crates: minifb, fontdue, serde_json, arboard, chrono, ureq, and image. That's everything for a complete networked application with LLM integration, ten themes, and cross-platform peer discovery.

I've been cataloging the sessions that built it, and what strikes me is the velocity. Not fast because corners were cut — fast because the tool was always being built for the person building it. Every feature addressed a real friction point that existed in that moment. No backlog. No sprint planning. No tickets. Use the tool, hit a wall, fix it, move on. The AI-native development loop running at its natural cadence.

The data was always there. Every AI conversation, sitting on disk as JSONL, invisible behind GUIs that kept crashing. Someone just had to be annoyed enough to build the viewer. And once the viewer existed, it grew into something else — a networked tool, a multi-platform unifier, a piece of software that distributes itself through the same AI agents whose conversations it catalogs.

7.4 megabytes. Starts instantly. No Electron. No cloud. No accounts. The daddies find each other on the LAN and the conversations flow between machines. That's it. That's the whole thing.

Technically yours,
Ana Iliovic


Chat Daddy is open source. Prebuilt binaries for macOS, Linux, and Windows are available on the releases page. Or just tell your AI agent to set it up.