Distill any task into a pocket-sized Spirit.

Hand us a teacher API and a task spec. We distill it into a 20–50M-parameter model that runs on CPU, at the edge, or in the browser, with zero API dependency at inference.

$0.30 · avg distillation cost
27 min · median run time
20M params · typical Spirit size
Teacher: Gemini 2.5 · Examples: 0 · Proof: 0 · Yield: 0 ml

⚗ The Process

A real distillation, in our language. Each step maps to something concrete in the engine.

🌾
The Mash
Define a task. We seed the corpus with prompts your teacher will riff on.
= training data spec
📜
The Recipe
Pick a teacher (Claude, Gemini, GPT), student arch, eval criteria. Save & fork.
= versioned config
🔥
The Still
We distill. Live progress, fluid sim, loss curve, drip counter — all real.
= the training run
📝
Tasting Notes
Auto-generated eval report. Strengths, weaknesses, failure cases, and a head-to-head vs. the teacher.
= held-out metrics
🍾
Bottling
Export to ONNX, GGUF, or browser-WASM. Signed, versioned, deployable.
= deployment artifact
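The five steps above come together in a recipe file. A hypothetical sketch of what one might look like (the field names here are illustrative, not the actual distillarium schema):

```yaml
# recipes/needle.tool-calling-v1.yaml — illustrative sketch only
name: needle.tool-calling-v1
teacher:                    # who generates the Mash
  provider: gemini
  model: gemini-2.5
mash:                       # training data spec
  seed_prompts: prompts/tool_calls.jsonl
cut:                        # train/val/test splits
  train: 0.9
  val: 0.05
  test: 0.05
student:                    # the Spirit's architecture
  params: 20M
taste:                      # eval criteria for the Tasting Notes
  metric: tool-name-accuracy
```

Because the recipe is plain YAML, it can be versioned, diffed, and forked like any other config.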

▶ Try It

Install the CLI and distill your first Spirit. It takes about 30 minutes on a single GPU and ~$0.30 in teacher API credits.

# Install
pip install distillarium[gemini]

# Distill — uses GOOGLE_API_KEY from env
distillery distill recipes/needle.tool-calling-v1.yaml

# Inspect your local Cellar
distillery cellar

# Re-taste against fresh held-out data
distillery taste spirits/needle.pt --mash held_out.jsonl

# Bottle for deployment (ONNX, GGUF, WASM)
distillery bottle spirits/needle.pt --format onnx

🍾 The Cellar

Browse and fork community Spirits. Every one ships with full Tasting Notes.

Needle — Tool Calling · 20.7M params by @cactus · 78 PROOF (tool-name accuracy)
PII Guard — Privacy / Compliance · 14M params by @house · 82 PROOF (F1)
Claimant — Fact Check / Verification · 32M params by @house · 91 PROOF (accuracy)
Routor — Intent Classifier · 8M params by @nordic · 74 PROOF (macro-F1)

Explore all Spirits →

🧪 What could YOU distill?

A Spirit is good for any task you'd otherwise hit an API for over and over. Fork a recipe, swap the catalog, distill.

🛠
An agent's tool-caller
Replace the function-calling round-trip to a frontier model. Sub-50ms on CPU.
recipe: needle.tool-calling
🔒
A PII redactor
Catch names, addresses, SSNs in prompts before they ever leave your terminal.
recipe: privacy.pii-guard
🎯
An intent classifier
Route incoming messages to the right downstream agent. 8M params, basically free.
recipe: routing.intent
📊
A SQL query parser
Natural-language → structured SQL, locally, without leaking your schema.
recipe: data.sql-parse (waitlist)
🧾
A receipt structurer
Photo / OCR text → JSON line items. Edge-runnable on a phone.
recipe: data.receipt-ner (waitlist)
🧪
Your own idea
Write a recipe, point at a teacher, hit distill. Share the Spirit in the Cellar.
recipe: (start a new one)

🍾 Anatomy of a Spirit

Every bottled Spirit is a single self-contained artifact. Each part below maps to something concrete in ML terms.

🟫
Cork — the Recipe. Sealed, signed config that produced this Spirit. You can re-distill from it byte-for-byte.
🧷
Neck — the inference shape. Tokenizer, vocab size, max sequence length.
🪟
Glass — the bottling format. PyTorch, ONNX, or GGUF wrapper around the weights.
🟧
Amber — the trained weights. The actual model parameters. Darker = higher proof.
📝
Label — the Tasting Notes. Name, vintage, batch, headline proof. Auto-generated, honest.
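In code terms, a bottled Spirit can be pictured as one record carrying all five parts. A minimal Python sketch; the class name, field names, and values are invented for illustration and are not the real artifact format:

```python
from dataclasses import dataclass

# Hypothetical sketch of a bottled Spirit's parts; names and
# values are invented for illustration, not the real file format.
@dataclass
class Spirit:
    recipe: dict          # Cork: the sealed, signed config that produced it
    tokenizer: dict       # Neck: the inference shape (vocab, max seq len)
    format: str           # Glass: "pt", "onnx", or "gguf"
    weights: bytes        # Amber: the trained parameters themselves
    tasting_notes: dict   # Label: the auto-generated eval report

needle = Spirit(
    recipe={"teacher": "gemini-2.5", "student": "20M decoder-only"},
    tokenizer={"vocab_size": 32000, "max_seq_len": 2048},
    format="onnx",
    weights=b"",  # placeholder for real weights
    tasting_notes={"proof": 78, "metric": "tool-name accuracy"},
)
print(needle.format)  # onnx
```

Keeping all five parts in one artifact is what makes re-distilling from the Cork reproducible: the config, tokenizer, and weights can never drift apart.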

📜 The Vocabulary

Every term means something concrete. Each maps to an ML concept — and that's deliberate.

🌾 Mash: training corpus the teacher generates from
📜 Recipe: YAML config — versioned, forkable, reproducible
Cut: train/val/test splits
🪜 Heads · Hearts · Tails: filter quality — kept / borderline / discarded
🔥 The Still: the training run itself
📈 Proof: held-out accuracy; higher = more concentrated
📝 Tasting Notes: auto-generated eval report — strengths, weaknesses, samples
🛢 Aging in Casks: continued training / RLHF / refresh
🍾 Bottling: export — .pt / .onnx / .gguf / .wasm
🏛 The Cellar: your (or the public) library of Spirits
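Under the hood, "The Still" corresponds to a standard knowledge-distillation objective: the student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that loss (illustrative only; the engine's actual objective is not specified here):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a logit vector."""
    z = logits / T
    z = z - z.max()               # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distill_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2."""
    p = softmax(teacher_logits, T)    # soft teacher targets
    q = softmax(student_logits, T)    # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

# Identical logits give zero loss; diverging logits give positive loss.
same = distill_loss(np.array([2.0, 0.5, -1.0]), np.array([2.0, 0.5, -1.0]))
diff = distill_loss(np.array([2.0, 0.5, -1.0]), np.array([-1.0, 0.5, 2.0]))
print(round(same, 6), diff > 0)  # 0.0 True
```

The temperature T softens both distributions so the student learns from the teacher's full ranking over outputs, not just its top choice; the T² factor keeps gradient magnitudes comparable across temperatures.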

🤝 Support the Lab

The Distillery is open-source, MIT-licensed, and built solo. If you find it useful, here are three ways to help.

⭐
Star the repo
Free, takes one click. Signal helps me prioritize what to build next.
🍾
Distill a Spirit
Write a recipe, distill, share. Every public Spirit ships with full Tasting Notes — including the failure cases.
💌
Sponsor / Enterprise
Need a custom Spirit, self-hosting, or compliance support? Drop a line. No salespeople.

This is not a SaaS. The CLI is free, runs locally, uses your own teacher API key, and produces models you own. Nothing reports back to us. The "Cellar" is just a public showcase you can opt into.

Built on top of the Research Radar pipeline — an autonomous research-to-product system. The Spirit you're looking at (Needle) started life as a Show HN paper that the Radar surfaced last Tuesday.