FastMCP v2 server for NotebookLM automation with modern async support
🚀 NotebookLM FastMCP v2 Server
Modern FastMCP v2 server for NotebookLM automation with UV Python manager
✨ Key Features
- 🔥 FastMCP v2: Modern decorator-based MCP framework
- ⚡ UV Python Manager: Lightning-fast dependency management
- 🚀 Multiple Transports: STDIO, HTTP, SSE support
- 🎯 Type Safety: Full Pydantic validation
- 🔒 Persistent Auth: Automatic Google session management
- 📊 Rich CLI: Beautiful terminal interface with Taskfile automation
- 🐳 Production Ready: Docker support with monitoring
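The decorator-based registration highlighted above can be pictured with a toy registry in plain Python (a sketch of the idea only, not FastMCP's actual internals):

```python
# Toy stand-in for FastMCP's @mcp.tool decorator: a dict-backed registry.
TOOLS = {}

def tool(fn):
    """Register a function as a callable tool under its own name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def healthcheck() -> str:
    """Server health status."""
    return "ok"

TOOLS["healthcheck"]()  # → 'ok'
```

FastMCP adds schema generation and validation on top, but the registration model is this simple: decorate a typed function and it becomes a tool.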
🏃‍♂️ Quick Start with UV
Prerequisites
Install UV (if not already installed):
# Install UV
curl -LsSf https://astral.sh/uv/install.sh | sh
# Or with pip
pip install uv
1. Clone & Setup
git clone https://github.com/khengyun/notebooklm-mcp.git
cd notebooklm-mcp
# Complete setup with UV
task setup
2. Development Setup
# Install development dependencies
task install-dev
# Show all available tasks
task --list
3. Start Server
# STDIO (for MCP clients)
task server-stdio
# HTTP (for web testing)
task server-http
# SSE (for streaming)
task server-sse
🔧 UV Development Workflow
Core Commands
# 📦 Dependency Management
task deps-add -- requests # Add dependency
task deps-add-dev -- pytest # Add dev dependency
task deps-remove -- requests # Remove dependency
task deps-list # List dependencies
task deps-update # Update all dependencies
# 🧪 Testing
task test # Run all tests
task test-quick # Quick validation test
task test-coverage # Coverage analysis
task enforce-test # MANDATORY after function changes
# 🔍 Code Quality
task lint # Run all linting
task format # Format code (Black + isort + Ruff)
# 🏗️ Build & Release
task build # Build package
task clean # Clean artifacts
# Start server (STDIO by default)
notebooklm-mcp server
# Start HTTP server for web testing
notebooklm-mcp server --transport http --port 8001 --headless
# Start with specific notebook
notebooklm-mcp server --notebook YOUR_NOTEBOOK_ID
# Start in GUI mode for debugging
notebooklm-mcp server
🔧 Traditional Installation (Alternative)
If you prefer pip over UV:
# Install with pip
pip install notebooklm-mcp
# Initialize
notebooklm-mcp init https://notebooklm.google.com/notebook/YOUR_NOTEBOOK_ID
# Start server
notebooklm-mcp server
🛠️ Available Tools
| Tool | Description | Parameters |
|---|---|---|
| healthcheck | Server health status | None |
| send_chat_message | Send message to NotebookLM | message: str, wait_for_response: bool |
| get_chat_response | Get response with timeout | timeout: int |
| chat_with_notebook | Complete interaction | message: str, notebook_id?: str |
| navigate_to_notebook | Switch notebooks | notebook_id: str |
| get_default_notebook | Current notebook | None |
| set_default_notebook | Set default | notebook_id: str |
| get_quick_response | Instant response | None |
🌐 Transport Options
STDIO (Default)
task server-stdio
# For: LangGraph, CrewAI, AutoGen
HTTP
task server-http
# Access: http://localhost:8001/mcp
# For: Web testing, REST APIs
SSE
task server-sse
# Access: http://localhost:8002/
# For: Real-time streaming
🧪 Testing & Development
HTTP Client Testing
from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

transport = StreamableHttpTransport(url="http://localhost:8001/mcp")

async with Client(transport) as client:
    tools = await client.list_tools()
    result = await client.call_tool("healthcheck", {})
Command Line Testing
# Test with curl
curl -X POST http://localhost:8001/mcp \
-H "Content-Type: application/json" \
-H "Accept: application/json, text/event-stream" \
-d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}'
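The same tools/list request can be built from Python with only the standard library; a minimal sketch mirroring the curl example above (run it while the HTTP server is up):

```python
import json
import urllib.request

# JSON-RPC 2.0 request for the MCP tools/list method.
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

req = urllib.request.Request(
    "http://localhost:8001/mcp",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    },
)

# Uncomment with a running server:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```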
📊 Client Integration
LangGraph
from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

# HTTP transport
transport = StreamableHttpTransport(url="http://localhost:8001/mcp")

async with Client(transport) as client:
    tools = await client.list_tools()
CrewAI
from crewai_tools import BaseTool
from fastmcp import Client

class NotebookLMTool(BaseTool):
    name: str = "notebooklm"
    description: str = "Chat with NotebookLM"

    async def _arun(self, message: str):
        async with Client("http://localhost:8001/mcp") as client:
            return await client.call_tool("chat_with_notebook", {"message": message})
🔒 Authentication
Automatic Setup
# First time - opens browser for login
notebooklm-mcp init https://notebooklm.google.com/notebook/abc123
# Subsequent runs - uses saved session
notebooklm-mcp server --headless
Manual Setup
# Interactive browser login
notebooklm-mcp server
# After login, switch to headless
notebooklm-mcp server --headless
🐳 Docker Deployment
Quick Start
docker run -e NOTEBOOKLM_NOTEBOOK_ID="YOUR_ID" notebooklm-mcp
With Compose
version: '3.8'
services:
  notebooklm-mcp:
    image: notebooklm-mcp:latest
    ports:
      - "8001:8001"
    environment:
      - NOTEBOOKLM_NOTEBOOK_ID=your-notebook-id
      - TRANSPORT=http
    volumes:
      - ./chrome_profile:/app/chrome_profile
⚙️ Configuration
Config File (notebooklm-config.json)
{
  "default_notebook_id": "your-notebook-id",
  "headless": true,
  "timeout": 30,
  "auth": {
    "profile_dir": "./chrome_profile_notebooklm"
  },
  "debug": false
}
Environment Variables
export NOTEBOOKLM_NOTEBOOK_ID="your-notebook-id"
export NOTEBOOKLM_HEADLESS=true
export NOTEBOOKLM_DEBUG=false
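A loader combining both sources might look like the sketch below. It uses the field names from the sample config and the environment variables above; the exact precedence (environment overriding file) is an assumption, not necessarily what the package implements:

```python
import json
import os
from pathlib import Path

def load_config(path="notebooklm-config.json"):
    """Merge documented defaults, the JSON config file, and env vars.

    Precedence (assumed): defaults < file < environment.
    """
    cfg = {"default_notebook_id": "", "headless": True, "timeout": 30, "debug": False}
    p = Path(path)
    if p.exists():
        cfg.update(json.loads(p.read_text()))
    env_id = os.getenv("NOTEBOOKLM_NOTEBOOK_ID")
    if env_id:
        cfg["default_notebook_id"] = env_id
    # Boolean flags arrive as strings ("true"/"false") from the shell.
    for key, var in (("headless", "NOTEBOOKLM_HEADLESS"), ("debug", "NOTEBOOKLM_DEBUG")):
        val = os.getenv(var)
        if val is not None:
            cfg[key] = val.lower() == "true"
    return cfg
```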
🚀 Performance
FastMCP v2 Benefits
- ⚡ Faster tool registration with decorators
- 📋 Auto-generated schemas from Python type hints
- 🔒 Built-in validation with Pydantic
- 🧪 Better testing and debugging capabilities
- 📊 Type safety throughout the stack
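The auto-generated-schemas bullet can be illustrated with a toy sketch that derives a schema-like dict from a function's type hints (a stand-in for the idea, not FastMCP's real implementation; send_chat_message mirrors the tool table above):

```python
import typing

def tool_schema(fn):
    """Derive a minimal schema-like dict from a function's type hints."""
    type_map = {str: "string", int: "integer", bool: "boolean", float: "number"}
    hints = typing.get_type_hints(fn)
    hints.pop("return", None)  # only parameters belong in the schema
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {name: type_map.get(tp, "object") for name, tp in hints.items()},
    }

def send_chat_message(message: str, wait_for_response: bool = True) -> None:
    """Send message to NotebookLM."""

schema = tool_schema(send_chat_message)
# → {'name': 'send_chat_message', 'description': 'Send message to NotebookLM.',
#    'parameters': {'message': 'string', 'wait_for_response': 'boolean'}}
```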
Benchmarks
| Feature | Traditional MCP | FastMCP v2 |
|---|---|---|
| Tool registration | Manual schema | Auto-generated |
| Type validation | Manual | Automatic |
| Error handling | Basic | Enhanced |
| Development speed | Standard | Faster |
| HTTP support | Limited | Full |
🛠️ Development
Setup
git clone https://github.com/khengyun/notebooklm-mcp
cd notebooklm-mcp
pip install -e ".[dev]"
Testing
# Run tests
pytest
# With coverage
pytest --cov=notebooklm_mcp
# Integration tests
pytest tests/test_integration.py
Code Quality
# Format code
black src/ tests/
ruff check src/ tests/
# Type checking
mypy src/
📚 Documentation
- Quick Setup Guide - Get started in 2 minutes
- HTTP Server Guide - Web testing & integration
- FastMCP v2 Guide - Modern MCP features
- Docker Deployment - Production setup
- API Reference - Complete tool documentation
🔗 Related Projects
- FastMCP - Modern MCP framework
- MCP Specification - Official MCP spec
- NotebookLM - Google's AI notebook
📄 License
MIT License - see LICENSE file for details.
🆘 Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: Read the Docs
Built with ❤️ using FastMCP v2 - Modern MCP development made simple!
File details
Details for the file notebooklm_mcp-2.0.6.tar.gz.
File metadata
- Download URL: notebooklm_mcp-2.0.6.tar.gz
- Upload date:
- Size: 47.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 983dd1a5600b97f2c7e797a01172bb62acc366677d489c5558d28d22e8a04c34 |
| MD5 | 5c1aed22536e47ef86239c4391411d81 |
| BLAKE2b-256 | 97fa2b62802786ecc4b1ce9691ea92a7f4227c27e473309519bffdfb169cf13c |
File details
Details for the file notebooklm_mcp-2.0.6-py3-none-any.whl.
File metadata
- Download URL: notebooklm_mcp-2.0.6-py3-none-any.whl
- Upload date:
- Size: 25.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4a9c141e3bd120848c50e585bd064749b56a179beffae3fb4227beb644dcc2c6 |
| MD5 | 95ff5c552fb7be1f81b0527dd074f533 |
| BLAKE2b-256 | d96dd504b1a6bd529066cb393156f86293d521bf1d3bb0864087faaeca6a5206 |