Stop stuffing the whole conversation into one prompt and hoping the LLM remembers. Build a LangGraph state machine with a typed shared state, small nodes, an SSE stream, and thread memory that resumes across turns.
Message a mentor about fit, prerequisites, or where to start. Replies come on WhatsApp, usually within a day.
Model multi-turn chat as a LangGraph state machine. Build a typed conversation state, an LLM intent router, a domain classifier with guardrails, a SQLite data node, and stream every node transition over SSE with thread memory that resumes across turns.
Model multi-turn chat as a typed LangGraph state machine with streaming and thread memory.
What you'll ship
What you'll learn
Curriculum
Typed conversation state
Define the shared state that every node reads and writes, and decide which slots each turn fills
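As a taste of what this lesson builds toward, here is a minimal sketch of a typed shared state with illustrative slot names (`BookingState`, `collect_symptoms`, and the field names are assumptions, not the workshop's actual schema). Each node is just a function that reads the state and returns a partial update:

```python
from typing import Optional, TypedDict


class BookingState(TypedDict, total=False):
    """Shared state every node reads and writes (illustrative slots)."""
    messages: list[str]        # running transcript for this thread
    symptoms: Optional[str]    # filled by the symptom-collection turn
    specialty: Optional[str]   # filled by the classification node
    doctor_id: Optional[int]   # filled by the data-lookup node
    slot: Optional[str]        # filled once the user picks a time
    status: str                # e.g. "collecting", "proposing", "confirmed"


def collect_symptoms(state: BookingState) -> BookingState:
    # A node returns only the slots it fills; the graph merges them in.
    return {"symptoms": state["messages"][-1], "status": "collecting"}


state: BookingState = {"messages": ["I have a persistent cough"], "status": "new"}
state.update(collect_symptoms(state))
```

Keeping every slot in one typed structure is what lets later nodes (and you, while debugging) see exactly which parts of a turn have been filled.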
Intent router
Dispatch each turn to the right node with a small LLM call that returns strict JSON
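The routing idea can be sketched without any LLM at all: the model is asked for strict JSON like `{"intent": "book"}`, and a small validator decides which edge the graph follows. The intent names and the `"question"` fallback below are assumptions for illustration:

```python
import json

ALLOWED_INTENTS = {"book", "reschedule", "cancel", "question"}


def parse_intent(raw: str) -> str:
    """Validate a router LLM reply of the form {"intent": "..."}.

    Anything malformed or off-list falls back to "question" so the
    graph always has a legal edge to follow.
    """
    try:
        intent = json.loads(raw).get("intent")
    except (json.JSONDecodeError, AttributeError):
        return "question"
    return intent if intent in ALLOWED_INTENTS else "question"
```

Because the router's output is validated against a closed set, a hallucinated intent can never send the conversation down an edge that does not exist.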
Classification node
Map free-text input to a closed vocabulary with an LLM, and add guardrails when the model goes off-list
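One common guardrail shape, sketched here with a hypothetical specialty vocabulary: accept exact matches, snap near misses (a misspelled label from the model) onto the closest known entry, and return `None` otherwise so the graph can re-ask instead of storing an invalid value:

```python
import difflib

SPECIALTIES = ["cardiology", "dermatology", "pulmonology", "neurology"]


def snap_to_vocabulary(label: str, vocab=SPECIALTIES, cutoff=0.8):
    """Guardrail for a classifier LLM output against a closed vocabulary."""
    label = label.strip().lower()
    if label in vocab:
        return label
    # Tolerate small misspellings, but refuse anything genuinely off-list.
    close = difflib.get_close_matches(label, vocab, n=1, cutoff=cutoff)
    return close[0] if close else None
```

The `cutoff` threshold is the knob: too low and unrelated words get snapped onto real specialties, too high and every typo forces a re-ask.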
SQLite data node
Query a catalog and availability from a graph node without leaking SQL into orchestration
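The "no SQL in orchestration" rule can be illustrated with the standard-library `sqlite3` module: all queries live inside the node's function, and the graph only ever sees plain dicts. The table layout and doctor names below are invented for the demo:

```python
import sqlite3


def find_doctors(conn: sqlite3.Connection, specialty: str) -> list[dict]:
    """Data node body: the SQL stays here; callers get back plain dicts."""
    rows = conn.execute(
        "SELECT id, name FROM doctors WHERE specialty = ? ORDER BY name",
        (specialty,),  # parameterized to avoid string-built SQL
    ).fetchall()
    return [{"id": r[0], "name": r[1]} for r in rows]


# In-memory demo catalog (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE doctors (id INTEGER PRIMARY KEY, name TEXT, specialty TEXT)"
)
conn.executemany(
    "INSERT INTO doctors (name, specialty) VALUES (?, ?)",
    [("Dr. Rao", "pulmonology"), ("Dr. Chen", "cardiology")],
)
```

Swapping SQLite for a real database later then only touches this one function, not the graph's wiring.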
Confirm and persist
Interpret the user's slot choice, write the record, and close the turn with a confirmed status
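Interpreting the user's choice is its own small parsing problem, since people answer "2" or "the second one" rather than repeating the slot. A rough sketch (the ordinal list and return convention are assumptions):

```python
import re


def interpret_slot_choice(reply: str, proposed: list[str]):
    """Map a free-text reply onto one of the proposed slots.

    Returns the chosen slot, or None when nothing matches so the
    turn can re-ask instead of confirming the wrong appointment.
    """
    ordinals = {"first": 1, "second": 2, "third": 3}
    m = re.search(r"\d+", reply)
    idx = int(m.group()) if m else next(
        (n for word, n in ordinals.items() if word in reply.lower()), None
    )
    if idx is None or not (1 <= idx <= len(proposed)):
        return None
    return proposed[idx - 1]
```

Only after this returns a concrete slot should the node write the record and flip the state's status to confirmed.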
SSE and thread memory
Stream every node transition over Server-Sent Events and resume paused conversations with thread memory
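The wire format itself is simple: each node transition becomes one SSE frame, which is an `event:` line, a `data:` line, and a blank line that terminates the frame. A minimal serializer (the event payload shape here is illustrative):

```python
import json


def sse_event(event: str, data: dict) -> str:
    """Serialize one node transition as a Server-Sent Event frame."""
    # SSE frames are `event:` + `data:` lines followed by a blank line.
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"
```

In an async web handler, each frame is yielded to the response stream as the graph reaches the corresponding node, which is what lets the UI show progress mid-turn.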
Who it's for
who shipped a single-turn LLM endpoint and hit a wall trying to add multi-turn memory
who want to wire LLM decisions into a real workflow with a database and persistence
who have read the LangGraph docs but never built an end-to-end conversational flow from scratch
FAQ
No. You should be comfortable with Python async and have called an LLM API before. LangGraph itself is introduced from the ground up through small nodes and typed state.
A healthcare booking assistant that routes a patient through symptom collection, specialty matching, doctor lookup, slot proposal, and confirmation. The pattern is reusable for any slot-filling conversation like travel booking, intake forms, or support triage.
A plain loop gives the LLM too much freedom and almost no debuggability. A state machine has explicit transitions, a typed shared state, and a clear place to put guardrails. When a turn misbehaves, you can point at the exact node.
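The "explicit transitions" point can be made concrete independently of any framework: transitions are data, not instructions buried in a prompt, so each state names the only nodes it may hand off to. The node and outcome names below are hypothetical:

```python
# Each state lists the outcomes it can produce and where each one leads.
TRANSITIONS = {
    "collect_symptoms": {"classified": "match_specialty"},
    "match_specialty": {"found": "lookup_doctors", "off_list": "collect_symptoms"},
    "lookup_doctors": {"slots_ready": "propose_slots"},
    "propose_slots": {"picked": "confirm"},
    "confirm": {},  # terminal: no outgoing edges
}


def next_node(current: str, outcome: str):
    """Return the next node for an outcome, or None if no edge exists."""
    return TRANSITIONS[current].get(outcome)
```

When a turn misbehaves, the failing (state, outcome) pair is right there in the table, which is the debuggability a free-form loop gives up.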
The workshop ships with a provider abstraction so you can plug in OpenAI, Gemini, or Fireworks. You only need one API key.
Pricing
Subscribe to Pro for every paid course, or buy just this one.
Unlock this course and every paid course plus workshop replays. One subscription.
You save 54% with regional pricing
One-time purchase. Lifetime access to every lesson, exercise, and update.
You save 47% with regional pricing
Still deciding? Ask Param a question
Conversational state machines with LangGraph
$79 one-time