
FastMCP v2 server for NotebookLM automation with modern async support

Project description

🚀 NotebookLM FastMCP v2 Server

Modern FastMCP v2 server for NotebookLM automation with UV Python manager

Python 3.10+ · FastMCP v2 · UV · License: MIT

✨ Key Features

  • 🔥 FastMCP v2: Modern decorator-based MCP framework
  • ⚡ UV Python Manager: Lightning-fast dependency management
  • 🚀 Multiple Transports: STDIO, HTTP, SSE support
  • 🎯 Type Safety: Full Pydantic validation
  • 🔒 Persistent Auth: Automatic Google session management
  • 📊 Rich CLI: Beautiful terminal interface with Taskfile automation
  • 🐳 Production Ready: Docker support with monitoring

🏃‍♂️ Quick Start with UV

Prerequisites

Install UV (if not already installed):

# Install UV
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or with pip
pip install uv

1. Clone & Setup

git clone https://github.com/khengyun/notebooklm-mcp.git
cd notebooklm-mcp

# Complete setup with UV
task setup

2. Development Setup

# Install development dependencies
task install-dev

# Show all available tasks
task --list

3. Start Server

# STDIO (for MCP clients)
task server-stdio

# HTTP (for web testing)
task server-http

# SSE (for streaming)
task server-sse

🔧 UV Development Workflow

Core Commands

# 📦 Dependency Management
task deps-add -- requests       # Add dependency
task deps-add-dev -- pytest     # Add dev dependency
task deps-remove -- requests    # Remove dependency
task deps-list                  # List dependencies
task deps-update                # Update all dependencies

# 🧪 Testing
task test                       # Run all tests
task test-quick                 # Quick validation test
task test-coverage              # Coverage analysis
task enforce-test               # MANDATORY after function changes

# 🔍 Code Quality
task lint                       # Run all linting
task format                     # Format code (Black + isort + Ruff)

# 🏗️ Build & Release
task build                      # Build package
task clean                      # Clean artifacts

# 🖥️ Direct CLI usage
notebooklm-mcp server           # Start STDIO server (default)

# Start HTTP server for web testing
notebooklm-mcp server --transport http --port 8001 --headless

# Start with specific notebook
notebooklm-mcp server --notebook YOUR_NOTEBOOK_ID

# Start in GUI mode for debugging  
notebooklm-mcp server

🔧 Traditional Installation (Alternative)

If you prefer pip over UV:

# Install with pip
pip install notebooklm-mcp

# Initialize
notebooklm-mcp init https://notebooklm.google.com/notebook/YOUR_NOTEBOOK_ID

# Start server
notebooklm-mcp server

🛠️ Available Tools

| Tool | Description | Parameters |
| --- | --- | --- |
| `healthcheck` | Server health status | None |
| `send_chat_message` | Send message to NotebookLM | `message: str`, `wait_for_response: bool` |
| `get_chat_response` | Get response with timeout | `timeout: int` |
| `chat_with_notebook` | Complete interaction | `message: str`, `notebook_id?: str` |
| `navigate_to_notebook` | Switch notebooks | `notebook_id: str` |
| `get_default_notebook` | Get current notebook | None |
| `set_default_notebook` | Set default notebook | `notebook_id: str` |
| `get_quick_response` | Instant response | None |
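
The parameter lists above can be transcribed into a small client-side argument check. This is an illustrative sketch only: the `TOOL_PARAMS` dict is hand-copied from the table, and a real client would rely on the server's auto-generated Pydantic schemas rather than this hypothetical helper.

```python
# Illustrative sketch: validate tool arguments against the parameter table
# above. The dict below is hand-transcribed, not part of the package.
TOOL_PARAMS = {
    "healthcheck": {},
    "send_chat_message": {"message": str, "wait_for_response": bool},
    "get_chat_response": {"timeout": int},
    "chat_with_notebook": {"message": str},  # notebook_id is optional
    "navigate_to_notebook": {"notebook_id": str},
    "get_default_notebook": {},
    "set_default_notebook": {"notebook_id": str},
    "get_quick_response": {},
}


def check_args(tool: str, args: dict) -> bool:
    """Return True if every required parameter is present with the right type."""
    required = TOOL_PARAMS[tool]
    return all(k in args and isinstance(args[k], t) for k, t in required.items())
```

For example, `check_args("send_chat_message", {"message": "hi", "wait_for_response": True})` returns `True`, while a missing or mistyped parameter returns `False`.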

🌐 Transport Options

STDIO (Default)

task server-stdio
# For: LangGraph, CrewAI, AutoGen

HTTP

task server-http  
# Access: http://localhost:8001/mcp
# For: Web testing, REST APIs

SSE

task server-sse
# Access: http://localhost:8002/
# For: Real-time streaming

🧪 Testing & Development

HTTP Client Testing

import asyncio

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport


async def main():
    transport = StreamableHttpTransport(url="http://localhost:8001/mcp")
    async with Client(transport) as client:
        tools = await client.list_tools()
        result = await client.call_tool("healthcheck", {})
        print(tools, result)

asyncio.run(main())

Command Line Testing

# Test with curl
curl -X POST http://localhost:8001/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}'

📊 Client Integration

LangGraph

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

# HTTP transport — the client must be connected via the async
# context manager before any calls
transport = StreamableHttpTransport(url="http://localhost:8001/mcp")
async with Client(transport) as client:
    tools = await client.list_tools()

CrewAI

from crewai_tools import BaseTool
from fastmcp import Client

class NotebookLMTool(BaseTool):
    name: str = "notebooklm"
    description: str = "Chat with NotebookLM"

    async def _arun(self, message: str):
        # Client accepts a URL directly and infers the HTTP transport;
        # connect via async context manager before calling tools
        async with Client("http://localhost:8001/mcp") as client:
            return await client.call_tool(
                "chat_with_notebook", {"message": message}
            )

🔒 Authentication

Automatic Setup

# First time - opens browser for login
notebooklm-mcp init https://notebooklm.google.com/notebook/abc123

# Subsequent runs - uses saved session
notebooklm-mcp server --headless

Manual Setup

# Interactive browser login
notebooklm-mcp server

# After login, switch to headless
notebooklm-mcp server --headless

🐳 Docker Deployment

Quick Start

docker run -e NOTEBOOKLM_NOTEBOOK_ID="YOUR_ID" notebooklm-mcp

With Compose

version: '3.8'
services:
  notebooklm-mcp:
    image: notebooklm-mcp:latest
    ports:
      - "8001:8001"
    environment:
      - NOTEBOOKLM_NOTEBOOK_ID=your-notebook-id
      - TRANSPORT=http
    volumes:
      - ./chrome_profile:/app/chrome_profile

⚙️ Configuration

Config File (notebooklm-config.json)

{
  "default_notebook_id": "your-notebook-id",
  "headless": true,
  "timeout": 30,
  "auth": {
    "profile_dir": "./chrome_profile_notebooklm"
  },
  "debug": false
}
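
As a sketch of how the config file above might be consumed, the snippet below loads `notebooklm-config.json` into a typed object. Field names mirror the JSON shown here; the package itself uses Pydantic models, so treat this stdlib dataclass as illustrative only.

```python
import json
from dataclasses import dataclass, field


# Illustrative stand-in for the package's Pydantic config model.
@dataclass
class ServerConfig:
    default_notebook_id: str = ""
    headless: bool = True
    timeout: int = 30
    auth: dict = field(default_factory=dict)
    debug: bool = False


def load_config(path: str = "notebooklm-config.json") -> ServerConfig:
    with open(path) as f:
        raw = json.load(f)
    # Ignore unknown keys so older config files keep working
    known = {k: raw[k] for k in ServerConfig.__dataclass_fields__ if k in raw}
    return ServerConfig(**known)
```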

Environment Variables

export NOTEBOOKLM_NOTEBOOK_ID="your-notebook-id"
export NOTEBOOKLM_HEADLESS=true
export NOTEBOOKLM_DEBUG=false

🚀 Performance

FastMCP v2 Benefits

  • ⚡ 5x faster tool registration with decorators
  • 📋 Auto-generated schemas from Python type hints
  • 🔒 Built-in validation with Pydantic
  • 🧪 Better testing and debugging capabilities
  • 📊 Type safety throughout the stack

Benchmarks

| Feature | Traditional MCP | FastMCP v2 |
| --- | --- | --- |
| Tool registration | Manual schema | Auto-generated |
| Type validation | Manual | Automatic |
| Error handling | Basic | Enhanced |
| Development speed | Standard | 5x faster |
| HTTP support | Limited | Full |

🛠️ Development

Setup

git clone https://github.com/khengyun/notebooklm-mcp
cd notebooklm-mcp
pip install -e ".[dev]"

Testing

# Run tests
pytest

# With coverage
pytest --cov=notebooklm_mcp

# Integration tests
pytest tests/test_integration.py

Code Quality

# Format code
black src/ tests/
ruff check src/ tests/

# Type checking
mypy src/

📚 Documentation

🔗 Related Projects

📄 License

MIT License - see LICENSE file for details.

🆘 Support


Built with ❤️ using FastMCP v2 - Modern MCP development made simple!

Project details


Download files

Download the file for your platform.

Source Distribution

notebooklm_mcp-2.0.7.tar.gz (47.2 kB)

Uploaded Source

Built Distribution


notebooklm_mcp-2.0.7-py3-none-any.whl (25.2 kB)

Uploaded Python 3

File details

Details for the file notebooklm_mcp-2.0.7.tar.gz.

File metadata

  • Download URL: notebooklm_mcp-2.0.7.tar.gz
  • Upload date:
  • Size: 47.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for notebooklm_mcp-2.0.7.tar.gz
Algorithm Hash digest
SHA256 8820db90f54c209b8ff59be1fd885e006b8c25e4e3ff2e8754b43824ee4aa98a
MD5 1b2423d2d60ea90b19603ff4725e5b52
BLAKE2b-256 23fafe56cbe453f9bbbd1e58e9711fd363f4e6107eb9c0c6d056c7b854feb24d


File details

Details for the file notebooklm_mcp-2.0.7-py3-none-any.whl.

File metadata

  • Download URL: notebooklm_mcp-2.0.7-py3-none-any.whl
  • Upload date:
  • Size: 25.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for notebooklm_mcp-2.0.7-py3-none-any.whl
Algorithm Hash digest
SHA256 6691fd0843b673444b4b12fcaa81a5c3dd0ead036223b76ae836d0a03b09c0f2
MD5 85c9a993a89e8930c770c5f951688ee1
BLAKE2b-256 6abbf6e7b7d996c2582b56e05403a3f59638f63e6be326f3fca4d53b3ad82489

