Interactive Notebooks

Run all 25 tutorial notebooks directly in your browser using JupyterLite — no installation required.

Open JupyterLite

Available Notebooks

Phase 1: Core Patterns

| Notebook | Description |
| --- | --- |
| 01_chatbot_basics.ipynb | Build your first LangGraph chatbot |
| 02_tool_calling.ipynb | Implement the ReAct pattern with tools |
| 03_memory_persistence.ipynb | Add conversation memory |
| 04_human_in_the_loop.ipynb | Human approval workflows |
| 05_reflection.ipynb | Self-improving agents |
| 06_plan_and_execute.ipynb | Planning and execution patterns |
| 07_research_assistant.ipynb | Multi-step research agent |

Phase 2: RAG Patterns

| Notebook | Description |
| --- | --- |
| 08_basic_rag.ipynb | Basic retrieval-augmented generation |
| 09_self_rag.ipynb | Self-correcting RAG |
| 10_crag.ipynb | Corrective RAG with web fallback |
| 11_adaptive_rag.ipynb | Query-adaptive retrieval |
| 12_agentic_rag.ipynb | Agent-driven RAG |
| 13_perplexity_clone.ipynb | Build a Perplexity-style search |

Phase 3: Multi-Agent Patterns

| Notebook | Description |
| --- | --- |
| 14_multi_agent_collaboration.ipynb | Agent collaboration basics |
| 15_hierarchical_teams.ipynb | Hierarchical agent teams |
| 16_subgraphs.ipynb | Modular graph composition |
| 17_agent_handoffs.ipynb | Agent-to-agent handoffs |
| 18_agent_swarm.ipynb | Swarm-style agent systems |
| 19_map_reduce_agents.ipynb | Parallel agent processing |
| 20_multi_agent_evaluation.ipynb | Evaluating multi-agent systems |

Phase 4: Advanced Reasoning

| Notebook | Description |
| --- | --- |
| 21_plan_and_execute.ipynb | Advanced planning patterns |
| 22_reflection.ipynb | Advanced reflection techniques |
| 23_reflexion.ipynb | Reflexion pattern implementation |
| 24_lats.ipynb | Language Agent Tree Search |
| 25_rewoo.ipynb | ReWOO reasoning pattern |

How It Works

JupyterLite runs entirely in your browser using WebAssembly:

  • No server required — Everything runs client-side
  • Instant startup — No waiting for kernel initialization
  • Persistent storage — Your work is saved in browser storage
  • Full Python — Powered by Pyodide (Python 3.11)
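Because Pyodide compiles CPython to WebAssembly with Emscripten, notebook code can check at runtime whether it is executing in the browser. A minimal sketch (the helper name `running_in_pyodide` is illustrative, not part of the tutorial code):

```python
import sys


def running_in_pyodide() -> bool:
    """Return True when executing under Pyodide (e.g. in JupyterLite).

    Pyodide builds CPython for WebAssembly via Emscripten, and in that
    build sys.platform reports "emscripten" instead of "linux"/"darwin".
    """
    return sys.platform == "emscripten"
```

In a regular local Python interpreter this returns `False`, so notebooks can branch on it to enable browser-only behavior.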

Limitations

JupyterLite cannot make network requests to external services. Notebooks that require Ollama or other API calls will show simulated outputs. For full functionality, run notebooks locally with `langgraph-local serve`.
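One way such a fallback can be structured is to route LLM calls through a small wrapper that returns a canned reply when no real backend is reachable. This is a sketch of the pattern, not the tutorial's actual implementation; the names `llm_call` and `backend` are hypothetical:

```python
import sys


def llm_call(prompt: str, backend=None) -> str:
    """Call a real LLM backend when one is available; otherwise simulate.

    Under Pyodide (sys.platform == "emscripten") there is no route to a
    local Ollama server, so a placeholder response is returned instead.
    """
    if sys.platform == "emscripten" or backend is None:
        return f"[simulated response to: {prompt}]"
    return backend(prompt)
```

Locally, `backend` would wrap a real Ollama or LangGraph call; in the browser the same notebook cell still runs and shows the simulated output.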

Running Locally

For the complete experience with actual LLM calls:

```bash
# Clone the repository
git clone https://github.com/AbhinaavRamesh/langgraph-ollama-tutorial
cd langgraph-ollama-tutorial

# Install dependencies
pip install -e ".[dev]"

# Start the local environment
langgraph-local serve
```

This launches Jupyter with Ollama integration for real LLM interactions.
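Before running the notebooks, it can help to confirm that Ollama is actually listening. A minimal check, assuming Ollama's default port 11434 and its standard `/api/tags` endpoint (which lists locally installed models); the helper name is illustrative:

```python
import urllib.error
import urllib.request


def ollama_available(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers on its /api/tags endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, start the server with `ollama serve` (or check that nothing else is bound to the port) before launching the notebooks.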