Use with Claude Code
LLM Dojo exposes a knowledge base MCP server. Connect it to Claude Code and any agent can query the full curriculum — finding the right notebook, concept, or Colab link for any LLM question.
01 — Install
Run this once in your terminal. Works globally across all Claude Code sessions.
claude mcp add llm-dojo --transport http https://llmdojo.dev/api/mcp -s user

Requires the Claude Code CLI. Install at claude.ai/code.
02 — What it does
Three tools are exposed. Claude uses them automatically when you ask LLM-related questions.
search_curriculum("how to implement GRPO")
Semantic search over all 101 notebooks. Returns ranked results with descriptions, concept tags, and direct Colab links.

list_stages()
Returns all 8 curriculum stages with notebook counts and descriptions. Useful for understanding what's covered.

get_notebook("62-grpo-reasoning")
Full details for a specific notebook: description, concepts, duration, and Colab link.
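Under the hood, MCP tool calls travel as JSON-RPC 2.0 requests over HTTP. A minimal sketch of the request body an agent sends when it invokes search_curriculum (the argument key "query" is an assumption inferred from the examples above; Claude Code constructs and sends this for you):

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request body for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

body = build_tool_call("search_curriculum", {"query": "how to implement GRPO"})
# POST this body to https://llmdojo.dev/api/mcp with
# Content-Type: application/json.
```

You never write this payload by hand when using Claude Code; it is shown only to demystify what crosses the wire.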
03 — Example session
After connecting, Claude automatically queries the knowledge base when relevant:
How do I train a reasoning model like DeepSeek-R1?
→ calls search_curriculum("reasoning model DeepSeek GRPO")
→ returns: Notebook 62 — GRPO: Reasoning Model Training
Group Relative Policy Optimization with verifiable rewards
Colab: github.com/nishchaysinha/llm-dojo/blob/main/notebooks/62_grpo_reasoning.ipynb

What's the difference between DPO, ORPO, and KTO?
→ calls search_curriculum("DPO ORPO KTO comparison")
→ returns: Notebook 63 — Preference Algorithm Comparison
Side-by-side DPO vs ORPO vs KTO vs IPO on the same dataset

04 — Manual config (optional)
Alternatively, add the server directly to ~/.claude/settings.json:
{
"mcpServers": {
"llm-dojo": {
"type": "http",
"url": "https://llmdojo.dev/api/mcp"
}
}
}

05 — Direct API
The search endpoint is also available as a plain HTTP API:
GET https://llmdojo.dev/api/search?q=LoRA+fine-tuning&k=5
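A short sketch of composing that request with properly encoded parameters (the "q" and "k" names come from the URL above; the JSON shape of the response is an assumption based on the tool descriptions):

```python
from urllib.parse import urlencode

def build_search_url(query: str, k: int = 5) -> str:
    """Compose the search endpoint URL with URL-encoded query parameters."""
    return "https://llmdojo.dev/api/search?" + urlencode({"q": query, "k": k})

url = build_search_url("LoRA fine-tuning", k=5)
# Fetch with urllib.request.urlopen(url) or curl; expect a JSON response body.
```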