FastMCP v2 server for NotebookLM automation with modern async support

🚀 NotebookLM MCP

Professional Google NotebookLM automation

Python 3.10+ · FastMCP v2 · UV · MIT License

✨ Key Features

  • 🔥 FastMCP v2: Modern decorator-based MCP framework
  • ⚡ UV Python Manager: Lightning-fast dependency management
  • 🚀 Multiple Transports: STDIO, HTTP, SSE support
  • 🎯 Type Safety: Full Pydantic validation
  • 🔒 Persistent Auth: Automatic Google session management
  • 📊 Rich CLI: Beautiful terminal interface with Taskfile automation
  • 🐳 Production Ready: Docker support with monitoring

🏃‍♂️ Quick Start with UV

Prerequisites

Install UV (if not already installed):

# Install UV
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or with pip
pip install uv

1. Clone & Setup

git clone https://github.com/khengyun/notebooklm-mcp.git
cd notebooklm-mcp

# Complete setup with UV
task setup

2. Development Setup

# Install development dependencies
task install-dev

# Show all available tasks
task --list

3. Start Server

# STDIO (for MCP clients)
task server-stdio

# HTTP (for web testing)
task server-http

# SSE (for streaming)
task server-sse

🔧 UV Development Workflow

Core Commands

# 📦 Dependency Management
task deps-add -- requests       # Add dependency
task deps-add-dev -- pytest     # Add dev dependency
task deps-remove -- requests    # Remove dependency
task deps-list                  # List dependencies
task deps-update                # Update all dependencies

# 🧪 Testing
task test                       # Run all tests
task test-quick                 # Quick validation test
task test-coverage              # Coverage analysis
task enforce-test               # MANDATORY after function changes

# 🔍 Code Quality
task lint                       # Run all linting
task format                     # Format code (Black + isort + Ruff)

# 🏗️ Build & Release
task build                      # Build package
task clean                      # Clean artifacts
🚀 Direct CLI Usage

# Start server in GUI mode (default, useful for debugging)
notebooklm-mcp server

# Start HTTP server for web testing
notebooklm-mcp server --transport http --port 8001 --headless

# Start with a specific notebook
notebooklm-mcp server --notebook YOUR_NOTEBOOK_ID

🔧 Traditional Installation (Alternative)

If you prefer pip over UV:

# Install with pip
pip install notebooklm-mcp

# Initialize
notebooklm-mcp init https://notebooklm.google.com/notebook/YOUR_NOTEBOOK_ID

# Start server
notebooklm-mcp server

🛠️ Available Tools

| Tool | Description | Parameters |
|------|-------------|------------|
| `healthcheck` | Server health status | None |
| `send_chat_message` | Send message to NotebookLM | `message: str`, `wait_for_response: bool` |
| `get_chat_response` | Get response with timeout | `timeout: int` |
| `chat_with_notebook` | Complete interaction | `message: str`, `notebook_id?: str` |
| `navigate_to_notebook` | Switch notebooks | `notebook_id: str` |
| `get_default_notebook` | Current notebook | None |
| `set_default_notebook` | Set default | `notebook_id: str` |
| `get_quick_response` | Instant response | None |
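As a sketch of how the signatures above map onto tool calls, arguments can be checked client-side before a request is sent. The parameter specs come from the table; the validator itself is hypothetical, not part of the package:

```python
# Hypothetical client-side check of tool arguments against the table above.
TOOL_PARAMS = {
    "healthcheck": {},
    "send_chat_message": {"message": str, "wait_for_response": bool},
    "get_chat_response": {"timeout": int},
    "chat_with_notebook": {"message": str},  # notebook_id is optional
    "navigate_to_notebook": {"notebook_id": str},
    "get_default_notebook": {},
    "set_default_notebook": {"notebook_id": str},
    "get_quick_response": {},
}

def validate_call(tool: str, args: dict) -> None:
    """Raise if a tool call does not match the documented signature."""
    spec = TOOL_PARAMS.get(tool)
    if spec is None:
        raise ValueError(f"unknown tool: {tool}")
    for name, typ in spec.items():
        if name not in args:
            raise ValueError(f"{tool}: missing required argument {name!r}")
        if not isinstance(args[name], typ):
            raise TypeError(f"{tool}: {name} must be {typ.__name__}")

validate_call("send_chat_message", {"message": "hi", "wait_for_response": True})
```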

🌐 Transport Options

STDIO (Default)

task server-stdio
# For: LangGraph, CrewAI, AutoGen

HTTP

task server-http  
# Access: http://localhost:8001/mcp
# For: Web testing, REST APIs

SSE

task server-sse
# Access: http://localhost:8002/
# For: Real-time streaming

🧪 Testing & Development

HTTP Client Testing

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

transport = StreamableHttpTransport(url="http://localhost:8001/mcp")
async with Client(transport) as client:
    tools = await client.list_tools()
    result = await client.call_tool("healthcheck", {})

Command Line Testing

# Test with curl
curl -X POST http://localhost:8001/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}'
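The same `tools/list` request the curl command sends can be assembled in Python. This sketch only builds and checks the JSON-RPC 2.0 envelope, so no running server is needed:

```python
import json
from typing import Optional

def jsonrpc_request(method: str, params: Optional[dict] = None, req_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body like the curl example above."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params or {},
    })

body = jsonrpc_request("tools/list")
```

The resulting string is what would go into the `-d` payload of the curl call.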

📊 Client Integration

LangGraph

from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

# HTTP transport
transport = StreamableHttpTransport(url="http://localhost:8001/mcp")
client = Client(transport)
tools = await client.list_tools()

CrewAI

from crewai_tools import BaseTool
from fastmcp import Client

class NotebookLMTool(BaseTool):
    name = "notebooklm"
    description = "Chat with NotebookLM"
    
    async def _arun(self, message: str):
        client = Client("http://localhost:8001/mcp")
        result = await client.call_tool("chat_with_notebook", {"message": message})
        return result

🔒 Authentication

Automatic Setup

# First time - opens browser for login
notebooklm-mcp init https://notebooklm.google.com/notebook/abc123

# Subsequent runs - uses saved session
notebooklm-mcp server --headless

Manual Setup

# Interactive browser login
notebooklm-mcp server

# After login, switch to headless
notebooklm-mcp server --headless

🐳 Docker Deployment

Quick Start

docker run -e NOTEBOOKLM_NOTEBOOK_ID="YOUR_ID" notebooklm-mcp

With Compose

version: '3.8'
services:
  notebooklm-mcp:
    image: notebooklm-mcp:latest
    ports:
      - "8001:8001"
    environment:
      - NOTEBOOKLM_NOTEBOOK_ID=your-notebook-id
      - TRANSPORT=http
    volumes:
      - ./chrome_profile:/app/chrome_profile

⚙️ Configuration

Config File (notebooklm-config.json)

{
  "default_notebook_id": "your-notebook-id",
  "headless": true,
  "timeout": 30,
  "auth": {
    "profile_dir": "./chrome_profile_notebooklm"
  },
  "debug": false
}
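A config file like the one above can be loaded with defaults for any missing keys. The field names are taken from the sample; the loader itself is an illustrative sketch, not the package's own code:

```python
import json
from dataclasses import dataclass, field
from pathlib import Path

@dataclass
class AuthConfig:
    profile_dir: str = "./chrome_profile_notebooklm"

@dataclass
class ServerConfig:
    default_notebook_id: str = ""
    headless: bool = True
    timeout: int = 30
    debug: bool = False
    auth: AuthConfig = field(default_factory=AuthConfig)

def load_config(path: str) -> ServerConfig:
    """Load notebooklm-config.json, falling back to defaults for missing keys."""
    raw = json.loads(Path(path).read_text())
    auth = AuthConfig(**raw.pop("auth", {}))
    return ServerConfig(auth=auth, **raw)
```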

Environment Variables

export NOTEBOOKLM_NOTEBOOK_ID="your-notebook-id"
export NOTEBOOKLM_HEADLESS=true
export NOTEBOOKLM_DEBUG=false
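Since environment variables are strings, boolean flags like `NOTEBOOKLM_HEADLESS=true` need coercion when read. The variable names come from the export lines above; the reader function is a hypothetical sketch:

```python
import os

_TRUTHY = {"1", "true", "yes", "on"}

def env_bool(name: str, default: bool = False) -> bool:
    """Interpret a NOTEBOOKLM_* flag such as NOTEBOOKLM_HEADLESS=true as a boolean."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in _TRUTHY

notebook_id = os.environ.get("NOTEBOOKLM_NOTEBOOK_ID", "")
headless = env_bool("NOTEBOOKLM_HEADLESS", default=True)
debug = env_bool("NOTEBOOKLM_DEBUG", default=False)
```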

🚀 Performance

FastMCP v2 Benefits

  • ⚡ 5x faster tool registration with decorators
  • 📋 Auto-generated schemas from Python type hints
  • 🔒 Built-in validation with Pydantic
  • 🧪 Better testing and debugging capabilities
  • 📊 Type safety throughout the stack
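The auto-generated-schema idea can be illustrated with the standard library alone. This is not FastMCP's implementation, just a sketch of the mechanism: a decorator reads the function's type hints and derives a parameter spec, so no schema is written by hand:

```python
import inspect
from typing import get_type_hints

_JSON_TYPES = {str: "string", int: "integer", bool: "boolean", float: "number"}
TOOLS: dict[str, dict] = {}

def tool(fn):
    """Register fn as a tool, deriving its parameter schema from type hints."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    sig = inspect.signature(fn)
    TOOLS[fn.__name__] = {
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            name: {
                "type": _JSON_TYPES.get(hints.get(name), "object"),
                "required": sig.parameters[name].default is inspect.Parameter.empty,
            }
            for name in sig.parameters
        },
    }
    return fn

@tool
def send_chat_message(message: str, wait_for_response: bool = True) -> str:
    """Send message to NotebookLM."""
    return f"sent: {message}"
```

Parameters with defaults come out as optional, mirroring how decorator-based frameworks spare you from maintaining schemas by hand.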

Benchmarks

| Feature | Traditional MCP | FastMCP v2 |
|---------|-----------------|------------|
| Tool registration | Manual schema | Auto-generated |
| Type validation | Manual | Automatic |
| Error handling | Basic | Enhanced |
| Development speed | Standard | 5x faster |
| HTTP support | Limited | Full |

🛠️ Development

Setup

git clone https://github.com/khengyun/notebooklm-mcp
cd notebooklm-mcp
pip install -e ".[dev]"

Testing

# Run tests
pytest

# With coverage
pytest --cov=notebooklm_mcp

# Integration tests
pytest tests/test_integration.py

Code Quality

# Format code
black src/ tests/
ruff check src/ tests/

# Type checking
mypy src/

📚 Documentation

🔗 Related Projects

📄 License

MIT License - see LICENSE file for details.

🆘 Support


Built with ❤️ using FastMCP v2 - Modern MCP development made simple!

Download files


Source Distribution

notebooklm_mcp-2.0.9.tar.gz (47.2 kB, source)

Built Distribution


notebooklm_mcp-2.0.9-py3-none-any.whl (25.2 kB, Python 3 wheel)

File details

Details for the file notebooklm_mcp-2.0.9.tar.gz.

File metadata

  • Download URL: notebooklm_mcp-2.0.9.tar.gz
  • Upload date:
  • Size: 47.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for notebooklm_mcp-2.0.9.tar.gz

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | edcc334cd4a0c491cd2416f0ea9b5a6cf4d29d69434da11afd28e87c96725087 |
| MD5 | fa12d32a4aa7cd001991823f9c62f608 |
| BLAKE2b-256 | cbddf75836d17ed9291e7b6ab7db2f69db74eabba75095d1db56aa60ac57c9a7 |

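A downloaded file can be checked against the digests listed here with the standard library alone; this is a generic sketch, with the path and expected digest filled in from the table above:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Example (digest from the table above):
#   assert sha256_of("notebooklm_mcp-2.0.9.tar.gz") == \
#       "edcc334cd4a0c491cd2416f0ea9b5a6cf4d29d69434da11afd28e87c96725087"
```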

File details

Details for the file notebooklm_mcp-2.0.9-py3-none-any.whl.

File metadata

  • Download URL: notebooklm_mcp-2.0.9-py3-none-any.whl
  • Upload date:
  • Size: 25.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for notebooklm_mcp-2.0.9-py3-none-any.whl

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | dbefabb9e87558671dc7fd65a340b17e6cae1bb932bc9680556cd55591b0d56c |
| MD5 | c3bb0ce228d869c39570249a4cbf5329 |
| BLAKE2b-256 | 244dac87daa5130ba624820af1085cf9fd0f413352623c1578e1ccfeb3337ac4 |

