# learners-mcp

MCP server that turns any source material into a guided learning experience:
- Orientation — material-level learning map + per-section focus briefs; cross-material concept linking.
- Notes — handwritten-style Markdown notes extracted via a map → reduce → consistency pipeline.
- Four-phase study loop — Preview → Explain → Question → Anchor (soft-guidance, not locked).
- Flashcards — SM-2 spaced repetition; no duplicates.
- Grounded Q&A — ad-hoc questions against the material with [§N] citations.
- Prerequisite checks — before a section, surface unmastered cards from sections its key concepts build on.
- Study plans — calendar-aware schedule using focus-brief time estimates + suggested path.
- Streak + weekly report — activity roll-ups across the library.
- Phase evaluation — opt-in structured assessment of a phase response (strengths, gaps, misconceptions, follow-ups).
- Exports — auto-written learner Markdown mirror, Anki .apkg, CSV, combined Markdown notes, explicit JSON artifact export, portable project JSON (full round-trip).
Host-agnostic: works with any MCP-capable agent (Claude Desktop, Claude Code, Codex, Gemini, Cursor, Zed, Continue, etc.).
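The spaced-repetition scheduling named above is SM-2. As a rough sketch of how the classic algorithm updates a card after a review (a simplified illustration, not the server's actual implementation):

```python
def sm2_update(quality, repetitions, interval_days, easiness):
    """Simplified SM-2 step. quality: 0-5 self-grade.
    Returns the new (repetitions, interval_days, easiness)."""
    if quality < 3:
        # Failed recall: reset the streak, review again tomorrow.
        return 0, 1, easiness
    if repetitions == 0:
        interval_days = 1
    elif repetitions == 1:
        interval_days = 6
    else:
        interval_days = round(interval_days * easiness)
    # Adjust the easiness factor; SM-2 clamps it at a floor of 1.3.
    easiness = max(1.3, easiness + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return repetitions + 1, interval_days, easiness
```

A perfect recall (quality 5) on a fresh card schedules it for tomorrow, then six days out, then at easiness-scaled multiples; a failed recall resets the streak without penalizing easiness further.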
## Configuring models

By default learners-mcp uses Anthropic models (haiku for fast tasks, sonnet for most work, opus for the learning map). Copy `examples/llm.yaml` to `~/.learners-mcp/llm.yaml` to change models, providers, or per-task routing.
### YAML structure

```yaml
profiles:
  default:
    model: openrouter/anthropic/claude-sonnet-4.6
    params:
      reasoning_effort: low

prompt_cache: auto  # auto|on|off

routes:
  qa: default
  learning_map: oneshot
  # ... (11 tasks total — see examples/llm.yaml for full list)
```
### Supported providers

Any provider supported by LiteLLM: Anthropic, OpenRouter, OpenAI, Gemini, Bedrock, Vertex, and custom OpenAI-compatible endpoints.

Set the matching API key in your environment — LiteLLM reads them automatically: `ANTHROPIC_API_KEY`, `OPENROUTER_API_KEY`, `OPENAI_API_KEY`, `GEMINI_API_KEY`.
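As an illustration, mixing providers is a matter of pointing profiles at different LiteLLM model strings (the identifiers below are examples, not the shipped defaults):

```yaml
# Hypothetical multi-provider llm.yaml sketch
profiles:
  default:
    model: anthropic/claude-sonnet-4-5   # example identifier
  fast:
    model: openai/gpt-4o-mini            # example identifier
routes:
  qa: fast
```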
### The 11 tasks and default profiles

| Task | Default profile | Used for |
|---|---|---|
| `notes_map`, `notes_tldr`, `focus_brief` | `fast` (haiku) | Per-chunk work, high volume |
| `notes_reduce`, `notes_polish`, `rolling_summary`, `qa`, `phase_evaluation`, `completion_report`, `flashcards` | `default` (sonnet) | Most analytic work |
| `learning_map` | `oneshot` (opus) | Material-level orientation, one call |
### Env overrides

Override without editing the YAML:

- `LEARNERS_MCP_MODEL_DEFAULT=gpt-4o-mini` — change the model for a profile
- `LEARNERS_MCP_PARAMS_DEFAULT='{"reasoning_effort":"low"}'` — change params (JSON)
- `LEARNERS_MCP_ROUTE_QA=fast` — re-route a task to a different profile
- `LEARNERS_MCP_LLM_CONFIG=/path/to/llm.yaml` — use a custom config path
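For example, a session that re-routes the `qa` task and points the server at a project-local config might look like this (paths illustrative):

```shell
# Hypothetical override session: answers to ad-hoc questions use the
# cheaper "fast" profile, and the config is read from the project dir.
export LEARNERS_MCP_ROUTE_QA=fast
export LEARNERS_MCP_LLM_CONFIG="./llm.yaml"
```

The server reads these at startup, so restart it (or your MCP host) after changing them.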
### Prompt caching

For Anthropic-family models (including via OpenRouter), `cache_control` blocks are preserved and caching applies automatically. Non-Anthropic models (OpenAI, Gemini, etc.) use flat text — no block-level caching, so map-reduce pipelines cost more. Set `prompt_cache: on` to force pass-through if you know your proxy supports it.
## Install

```shell
pip install -e .
export ANTHROPIC_API_KEY=sk-ant-...
```
## Register with a host

Claude Desktop — `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "learners": {
      "command": "learners-mcp",
      "env": { "ANTHROPIC_API_KEY": "sk-ant-..." }
    }
  }
}
```
Claude Code — `.mcp.json` in the project or `~/.claude.json`:

```json
{ "mcpServers": { "learners": { "command": "learners-mcp" } } }
```
Other MCP hosts (Codex, Gemini CLI, Cursor, Zed, Continue) follow the same command + env pattern — consult their docs for the exact config file.
## Surface

- Tools (38):
  - ingestion/prep: `ingest_material`, `prepare_material`, `get_preparation_status`, `start_background_preparation`, `get_background_status`
  - orientation: `get_material_map`, `regenerate_map`, `get_focus_brief`
  - notes: `get_notes`, `extract_notes_now`
  - library: `list_sections`, `list_materials`, `material_progress`, `library_dashboard`
  - study loop: `start_section`, `get_phase_prompt`, `record_phase_response`, `complete_phase`, `check_prerequisites`, `plan_study`, `study_streak`, `weekly_report`
  - evaluation: `evaluate_phase_response`, `list_evaluations`
  - flashcards: `suggest_flashcards`, `add_flashcard`, `list_flashcards`, `review_flashcard`, `next_due`
  - ad-hoc: `answer_from_material`, `recommend_next_action`
  - completion: `get_completion_report`, `regenerate_completion_report`
  - exports: `export_anki`, `export_notes`, `export_material_artifacts`, `export_project`, `import_project`
- Resource templates: `material://{id}`, `learning_map://{id}`, `focus_brief://{section_id}`, `notes://{id}`, `notes://{id}/{section_id}`, `section_state://{section_id}`, `completion_report://{section_id}`, `evaluations://{section_id}`, `plan://{material_id}`
- Concrete resources: `library://`, `review://due`, `streak://`, `report://weekly`
- Prompts: `preview`, `explain`, `question`, `anchor` (phase-coaching prompts the host agent executes)
## Typical flow

1. `ingest_material("/path/to/book.pdf")` → returns `material_id`.
2. `start_background_preparation(material_id)` → learning map + focus briefs + notes generate asynchronously.
3. `get_material_map(material_id)` once the map is ready → orient the learner.
4. `start_section(section_id)` → content + focus brief + phase state.
5. For each phase (`preview` → `explain` → `question` → `anchor`): the host invokes the matching prompt, the learner responds, and the server records via `record_phase_response` + `complete_phase`.
6. In Anchor: `suggest_flashcards` → `add_flashcard` × N. `complete_phase(section_id, 'anchor')` triggers a completion report.
7. Later: `next_due(material_id)` for review sessions; `review_flashcard(id, knew_it)` to grade.
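The flow above can be sketched from the host agent's side. The `call()` helper below is a hypothetical stand-in for however a real MCP host invokes tools; the tool names are the ones listed under Surface:

```python
calls = []  # record of invoked tool names, for illustration

def call(tool, **args):
    """Hypothetical stub; a real host would send this over MCP."""
    calls.append(tool)
    return {"tool": tool, **args}

# Ingest and prepare (material/section ids here are placeholders)
call("ingest_material", path="/path/to/book.pdf")
call("start_background_preparation", material_id="m1")
call("get_material_map", material_id="m1")

# Study one section through the four phases
call("start_section", section_id="s1")
for phase in ("preview", "explain", "question", "anchor"):
    call("record_phase_response", section_id="s1", phase=phase, response="...")
    call("complete_phase", section_id="s1", phase=phase)

# Later: review whatever is due
call("next_due", material_id="m1")
```

Completing the `anchor` phase is the step that triggers the completion report server-side; the host only drives the calls in order.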
## Learner artifacts

Generated study material is mirrored as readable Markdown in `./learners/<material-slug>/` by default. The SQLite DB remains the canonical state, but the learner can open files such as `learning-map.md`, `focus-briefs.md`, `notes.md`, `flashcards.md`, and `progress.md` directly from the working directory.

Set `LEARNERS_MCP_ARTIFACT_DIR=/path/to/dir` to write the mirror somewhere else, or `LEARNERS_MCP_ARTIFACT_MIRROR=off` to disable automatic Markdown writes. JSON artifacts are explicit only: call `export_material_artifacts(material_id, format="json")` or `format="all"` to write structured files under `json/`.
## State

The SQLite DB and server config live in `~/.learners-mcp/`. Override with `LEARNERS_MCP_DATA_DIR`. Delete the directory to start fresh.
## Tests

```shell
pip install -e ".[dev]"
pytest
```