# mcp-content-pipeline

A YouTube video analysis and content generation pipeline exposed as MCP tools. Extract transcripts, generate key takeaways, TLDRs, and Twitter/X hook drafts — all callable by any MCP-compatible AI client like Claude Desktop.
## Why?

Manually copying YouTube transcripts into AI tools, crafting prompts, and formatting output is tedious and error-prone. This MCP server turns the entire workflow into chainable tools that any AI agent can call. List videos from a channel, analyse them in batch, and sync the results to GitHub — all in a single conversation.
## Quick Start

```shell
uvx mcp-content-pipeline
```

Or install explicitly:

```shell
uv tool install mcp-content-pipeline
mcp-content-pipeline
```
## Claude Desktop Configuration

Add to your Claude Desktop MCP config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS). The `command` below uses an absolute path; run `which uvx` to find the correct path on your system:
```json
{
  "mcpServers": {
    "content-pipeline": {
      "command": "/usr/local/bin/uvx",
      "args": ["mcp-content-pipeline"],
      "env": {
        "MCP_CP_ANTHROPIC_API_KEY": "sk-ant-...",
        "MCP_CP_GITHUB_TOKEN": "ghp_...",
        "MCP_CP_GITHUB_REPO": "your-username/your-repo",
        "MCP_CP_GEMINI_API_KEY": "your-gemini-api-key"
      }
    }
  }
}
```
## Usage

Once configured in Claude Desktop, chain the tools in a single conversation:

**Step 1 — Analyse**

> "Analyse this video: https://www.youtube.com/watch?v=..."

**Step 2 — Generate image**

> "Generate an image for this analysis"

**Step 3 — Sync to GitHub**

> "Sync the analysis and image to GitHub"

Or do it all in one prompt:

> "Analyse this video, generate the image, and sync to GitHub: https://www.youtube.com/watch?v=..."

### X Feed Digest

> "Analyse what karpathy, garrytan, and elvissun posted about AI today"

Or with the full pipeline:

> "Analyse the X feed, generate the image, and sync to GitHub"
## Tools

| Tool | Description | Requires |
|---|---|---|
| `analyse_video` | Analyse a single YouTube video — transcript, takeaways, TLDR, Twitter hook | `ANTHROPIC_API_KEY` |
| `batch_analyse` | Analyse multiple videos from a URL list or config file | `ANTHROPIC_API_KEY` |
| `list_channel_videos` | Fetch recent videos from a YouTube channel | `YOUTUBE_API_KEY` |
| `sync_to_github` | Push analyses as markdown files to a GitHub repo | `GITHUB_TOKEN`, `GITHUB_REPO` |
| `analyse_x_feed` | Analyse recent posts from curated X accounts — daily digest | `X_BEARER_TOKEN` |
| `generate_image` | Generate a comic-book infographic from an analysis result | `GEMINI_API_KEY` |
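The package does not document the exact shape of an analysis result, but the tool descriptions above suggest something like the following. This is a minimal sketch only: the field names and the `to_markdown` helper are assumptions for illustration, not the package's actual API.

```python
from dataclasses import dataclass, field


@dataclass
class VideoAnalysis:
    """Hypothetical shape of an analyse_video result (field names assumed)."""

    video_url: str
    title: str
    tldr: str
    takeaways: list[str] = field(default_factory=list)
    twitter_hook: str = ""

    def to_markdown(self) -> str:
        """Render roughly the kind of markdown sync_to_github might push."""
        lines = [f"# {self.title}", "", f"**TLDR:** {self.tldr}", "", "## Key Takeaways"]
        lines += [f"- {t}" for t in self.takeaways]
        lines += ["", "## Twitter Hook", self.twitter_hook]
        return "\n".join(lines)
```

With a structure like this, each analysed video becomes one self-contained markdown file, which matches the `sync_to_github` description of pushing analyses as markdown files.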
## Environment Variables

All prefixed with `MCP_CP_`:

| Variable | Required | Description |
|---|---|---|
| `MCP_CP_ANTHROPIC_API_KEY` | Yes | Anthropic API key for Claude analysis |
| `MCP_CP_YOUTUBE_API_KEY` | No | YouTube Data API v3 key (only for `list_channel_videos`) |
| `MCP_CP_GITHUB_TOKEN` | For sync | GitHub personal access token |
| `MCP_CP_GITHUB_REPO` | For sync | Target repo in `owner/repo` format |
| `MCP_CP_GITHUB_BRANCH` | No | Branch to push to (default: `main`) |
| `MCP_CP_GITHUB_OUTPUT_DIR` | No | Output directory in repo (default: `content/videos`) |
| `MCP_CP_CLAUDE_MODEL` | No | Claude model to use (default: `claude-sonnet-4-20250514`) |
| `MCP_CP_MAX_TRANSCRIPT_TOKENS` | No | Max transcript length in tokens (default: `100000`) |
| `MCP_CP_GEMINI_API_KEY` | For image | Google AI Studio API key for image generation |
| `MCP_CP_GEMINI_MODEL` | No | Gemini model for images (default: `gemini-3.1-flash-image-preview`) |
| `MCP_CP_X_BEARER_TOKEN` | For X digest | X API v2 bearer token |
| `MCP_CP_X_ACCOUNTS` | For X digest | Comma-separated X usernames |
| `MCP_CP_X_TOPICS` | No | Comma-separated topics (default: `AI,tech`) |
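The server presumably resolves these at startup. As an illustration of the prefix-plus-defaults scheme (this is not the package's actual code; `load_config` and `DEFAULTS` are names invented here), a loader might look like:

```python
import os

# Documented defaults from the table above.
DEFAULTS = {
    "GITHUB_BRANCH": "main",
    "GITHUB_OUTPUT_DIR": "content/videos",
    "CLAUDE_MODEL": "claude-sonnet-4-20250514",
    "MAX_TRANSCRIPT_TOKENS": "100000",
    "X_TOPICS": "AI,tech",
}


def load_config(prefix: str = "MCP_CP_") -> dict[str, str]:
    """Collect prefixed env vars, falling back to the documented defaults."""
    config = dict(DEFAULTS)
    for key, value in os.environ.items():
        if key.startswith(prefix):
            # Strip the prefix: MCP_CP_GITHUB_BRANCH -> GITHUB_BRANCH
            config[key[len(prefix):]] = value
    return config
```

Any variable you set in the Claude Desktop `env` block or your shell overrides the corresponding default.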
## Development

```shell
git clone https://github.com/your-username/mcp-content-pipeline.git
cd mcp-content-pipeline
uv sync
uv run pytest -v --cov=src/mcp_content_pipeline
uv run ruff check src/ tests/
```
## Security

- All credentials are configured via local environment variables — never committed to the repo
- The tool is open source, but your API keys, YouTube key, and GitHub token stay on your machine
- Never create a `.env` file in the repo — use shell exports or the Claude Desktop config instead
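For example, credentials can live in your shell profile instead of any file in the repository (all values below are placeholders):

```shell
# In ~/.zshrc or ~/.bashrc: keep secrets out of the working tree
export MCP_CP_ANTHROPIC_API_KEY="sk-ant-..."
export MCP_CP_GITHUB_TOKEN="ghp_..."
export MCP_CP_GITHUB_REPO="your-username/your-repo"
```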
## Contributing

- Fork the repository
- Create a feature branch (`git checkout -b feat/my-feature`)
- Commit using Conventional Commits (`feat: add new feature`)
- Push and open a Pull Request
## License