Wire net/http to an OpenAI-compatible endpoint, hand-roll the tool loop, add file and shell tools with guardrails, and ship the whole thing as a static distroless binary. You will finish with an agent you can explain line by line.
Message a mentor about fit, prerequisites, or where to start. Replies come on WhatsApp, usually within a day.
Build a local coding agent in pure Go using only the standard library. Wire net/http to an OpenAI-compatible endpoint, implement read_file, write_file, and run_bash tools, run a real tool-calling loop, add safety guards, and ship it as a static distroless binary.
Build a zero-dependency coding agent in Go with tool calling, safety guards, and a distroless binary.
What you'll ship
What you'll learn
Curriculum
Zero-dependency LLM call
Wire Go stdlib net/http to an OpenAI-compatible endpoint and confirm a plain chat round-trip works
First tool: read_file
Advertise a tool, dispatch the call, and run a single tool round-trip
The agent loop
Close the loop so tool outputs feed the next completion until the model emits a final answer
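The shape of that loop can be shown offline by abstracting the endpoint behind a function type. The `Complete`, `Reply`, and `RunLoop` names are illustrative, and the scripted model stands in for a real completion call:

```go
package main

import "fmt"

// A ToolCall is the model asking us to run a tool; a Reply carries either
// tool calls or a final answer.
type ToolCall struct{ Name, Args string }
type Reply struct {
	Content   string
	ToolCalls []ToolCall
}

// Complete abstracts the chat endpoint so the loop can run without a network.
type Complete func(history []string) Reply

// RunLoop feeds tool results back into the model until it stops calling tools.
func RunLoop(complete Complete, runTool func(ToolCall) string, prompt string) string {
	history := []string{"user: " + prompt}
	for {
		reply := complete(history)
		if len(reply.ToolCalls) == 0 {
			return reply.Content // final answer: the loop ends here
		}
		for _, tc := range reply.ToolCalls {
			out := runTool(tc)
			history = append(history, "tool "+tc.Name+": "+out)
		}
	}
}

func main() {
	// Scripted model: asks for one tool call, then answers.
	step := 0
	model := func(h []string) Reply {
		step++
		if step == 1 {
			return Reply{ToolCalls: []ToolCall{{Name: "read_file", Args: `{"path":"main.go"}`}}}
		}
		return Reply{Content: fmt.Sprintf("done after %d history entries", len(h))}
	}
	tool := func(tc ToolCall) string { return "file contents here" }
	fmt.Println(RunLoop(model, tool, "summarize main.go"))
}
```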
Writing files
Add write_file with path safety and a simple diff to show what changed
Shell execution
Add run_bash with a denylist, a hard timeout, and truncation
Iteration caps and budget
Cap steps, watch tokens, and give the model a clean way to break out
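A budget tracker for the loop could be as small as this. The `Budget` type, limits, and token estimate are illustrative assumptions; the point is that the loop checks a counter before every completion and exits cleanly when it trips:

```go
package main

import (
	"errors"
	"fmt"
)

const maxSteps = 10
const tokenBudget = 50000 // rough prompt-token allowance for the whole run

var errBudget = errors.New("budget exhausted: ask the model to wrap up")

// Budget tracks loop iterations and an approximate token count.
type Budget struct{ Steps, Tokens int }

// Charge records one loop iteration; the caller stops when it errors.
func (b *Budget) Charge(promptTokens int) error {
	b.Steps++
	b.Tokens += promptTokens
	if b.Steps > maxSteps || b.Tokens > tokenBudget {
		return errBudget
	}
	return nil
}

func main() {
	b := &Budget{}
	for {
		if err := b.Charge(6000); err != nil {
			// In the real loop this is where you send one final
			// "finish up now" message instead of another tool round.
			fmt.Printf("stopped after %d steps: %v\n", b.Steps, err)
			return
		}
	}
}
```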
Shipping
Package the agent as a single static binary inside a distroless nonroot image
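A typical two-stage build for that image looks like the sketch below. The Go version tag, flags, and binary path are assumptions; the essentials are `CGO_ENABLED=0` for a static binary and the `nonroot` distroless base:

```dockerfile
# Build stage: static binary, no cgo
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -ldflags="-s -w" -o /agent .

# Runtime: distroless, nonroot, nothing but the binary
FROM gcr.io/distroless/static-debian12:nonroot
COPY --from=build /agent /agent
ENTRYPOINT ["/agent"]
```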
Who it's for
who want to understand agents without adopting a Python stack
who have used Cursor or Claude Code and want to see what is underneath
who need a small, auditable agent binary they can drop into a CI container
FAQ
You should be comfortable with structs, interfaces, goroutines, and the standard library. The course does not teach Go itself; it teaches how to wire an agent loop in Go.
The point is to see the protocol. Once you can build the tool loop by hand, every SDK becomes obvious. You can still adopt a framework later if you want, with a much clearer mental model.
Any OpenAI-compatible endpoint works. The default is OpenRouter with a small open model like Gemma, so you can iterate cheaply. Swapping in a stronger model is a single environment variable change.
The agent ships with a shell denylist, a hard execution timeout, and an iteration cap. For real isolation, the final phase runs the binary inside a distroless container with your repo mounted as a single volume.
Pricing
Subscribe to Pro for every paid course, or buy just this one.
Unlock this course and every paid course plus workshop replays. One subscription.
You save 54% with regional pricing
One-time purchase. Lifetime access to every lesson, exercise, and update.
You save 47% with regional pricing
Still deciding? Ask Param a question
Building coding agents from scratch in Go
$79 one-time