MCP server for ScrapeBadger - Twitter/X scraping API for AI agents

Project description

ScrapeBadger MCP Server

Give your AI agents access to Twitter/X data via the Model Context Protocol


What is this?

ScrapeBadger MCP Server is a Model Context Protocol (MCP) server that enables AI assistants like Claude, ChatGPT, Cursor, Windsurf, and other MCP-compatible clients to access Twitter/X data through the ScrapeBadger API.

With this MCP server, your AI can:

  • Get Twitter user profiles, followers, and following lists
  • Search and retrieve tweets
  • Access trending topics globally or by location
  • Explore Twitter lists and communities
  • Search for places and geolocated content

Quick Start

1. Get Your API Key

Sign up at scrapebadger.com and get your API key.

2. Install

# Using uvx (recommended - no installation needed)
uvx scrapebadger-mcp

# Or install globally with pip
pip install scrapebadger-mcp

# Or with uv
uv tool install scrapebadger-mcp

3. Configure Your AI Client

Claude Desktop

Add to your Claude Desktop configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "scrapebadger": {
      "command": "uvx",
      "args": ["scrapebadger-mcp"],
      "env": {
        "SCRAPEBADGER_API_KEY": "sb_live_your_api_key_here"
      }
    }
  }
}

Cursor

Add to your Cursor MCP settings (.cursor/mcp.json):

{
  "mcpServers": {
    "scrapebadger": {
      "command": "uvx",
      "args": ["scrapebadger-mcp"],
      "env": {
        "SCRAPEBADGER_API_KEY": "sb_live_your_api_key_here"
      }
    }
  }
}

Windsurf

Add to your Windsurf MCP configuration:

{
  "mcpServers": {
    "scrapebadger": {
      "command": "uvx",
      "args": ["scrapebadger-mcp"],
      "env": {
        "SCRAPEBADGER_API_KEY": "sb_live_your_api_key_here"
      }
    }
  }
}

VS Code with Copilot

Add to your VS Code settings (.vscode/mcp.json):

{
  "mcpServers": {
    "scrapebadger": {
      "command": "uvx",
      "args": ["scrapebadger-mcp"],
      "env": {
        "SCRAPEBADGER_API_KEY": "sb_live_your_api_key_here"
      }
    }
  }
}
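Trailing commas and misplaced braces are the most common reason a client silently ignores this config. Below is a quick sanity check for any of the files above — a sketch only; the required structure is inferred from the examples in this README, and the `sb_` key-prefix check is based on the sample keys shown here:

```python
import json

def check_mcp_config(text: str) -> list[str]:
    """Return a list of problems found in an MCP client config (empty = OK)."""
    try:
        config = json.loads(text)  # strict JSON: rejects trailing commas
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    servers = config.get("mcpServers", {})
    if "scrapebadger" not in servers:
        return ["no 'scrapebadger' entry under 'mcpServers'"]
    problems = []
    entry = servers["scrapebadger"]
    if entry.get("command") != "uvx":
        problems.append("'command' should be 'uvx' for the uvx-based setup")
    key = entry.get("env", {}).get("SCRAPEBADGER_API_KEY", "")
    if not key.startswith("sb_"):  # key format inferred from this README's examples
        problems.append("SCRAPEBADGER_API_KEY looks wrong (expected an 'sb_...' key)")
    return problems

sample = ('{"mcpServers": {"scrapebadger": {"command": "uvx", '
          '"args": ["scrapebadger-mcp"], '
          '"env": {"SCRAPEBADGER_API_KEY": "sb_live_abc"}}}}')
print(check_mcp_config(sample))  # []
```

Running this over your actual config file contents before restarting the client saves a debugging round-trip.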

4. Start Using It!

Once configured, simply ask your AI to fetch Twitter data:

"Get the profile of @elonmusk"

"Search for tweets about AI agents"

"What's trending on Twitter right now?"

"Find the top 10 Python developers on Twitter"


Available Tools

The MCP server provides the following tools, organized into categories:

User Tools

| Tool | Description |
|------|-------------|
| get_twitter_user_profile | Get a user's profile by username (bio, followers, following, etc.) |
| get_twitter_user_about | Get extended "About" info (account location, username history) |
| search_twitter_users | Search for users by query |
| get_twitter_followers | Get a user's followers |
| get_twitter_following | Get accounts a user follows |

Tweet Tools

| Tool | Description |
|------|-------------|
| get_twitter_tweet | Get a single tweet by ID |
| get_twitter_user_tweets | Get recent tweets from a user |
| search_twitter_tweets | Search for tweets (supports Twitter search operators) |

Trend Tools

| Tool | Description |
|------|-------------|
| get_twitter_trends | Get global trending topics (optionally by category) |
| get_twitter_place_trends | Get trends for a specific location (by WOEID) |

Geo Tools

| Tool | Description |
|------|-------------|
| search_twitter_places | Search for Twitter places by name |

List Tools

| Tool | Description |
|------|-------------|
| get_twitter_list_detail | Get details about a Twitter list |
| search_twitter_lists | Search for Twitter lists |
| get_twitter_list_tweets | Get tweets from a list |

Community Tools

| Tool | Description |
|------|-------------|
| get_twitter_community_detail | Get details about a Twitter community |
| search_twitter_communities | Search for communities |
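Under the hood, an MCP client invokes each of these tools with a JSON-RPC 2.0 `tools/call` request, as defined by the Model Context Protocol. A sketch of the payload a client would send for `get_twitter_user_profile` — the argument name `username` is an assumption based on the tool's description; the server's declared input schema is authoritative:

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# "username" is a hypothetical argument name; check the tool's input schema.
msg = build_tool_call(1, "get_twitter_user_profile", {"username": "sama"})
print(msg)
```

Your MCP client constructs and sends these messages for you; the shape is only worth knowing when debugging traffic with an MCP inspector.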

Example Conversations

Research a Twitter Account

You: Get the profile and recent tweets from @sama

Claude: I'll fetch Sam Altman's Twitter profile and recent tweets.

Uses get_twitter_user_profile and get_twitter_user_tweets

Sam Altman (@sama)
- 3.2M followers, 1,847 following
- Bio: "ceo of openai"
- Joined: December 2008

Recent tweets:
1. "excited to share..." (245K likes)
2. "the future of ai..." (180K likes)
...

Monitor Trending Topics

You: What's trending in tech on Twitter today?

Claude: I'll check the current trending topics.

Uses get_twitter_trends with category="news"

Top Tech Trends:
1. #GPT5 - 125,000 tweets
2. Apple Vision Pro - 89,000 tweets
3. #AGI - 45,000 tweets
...

Competitive Analysis

You: Compare the Twitter engagement of Stripe, Square, and PayPal

Claude: I'll gather data on all three companies' Twitter profiles.

Uses get_twitter_user_profile for each company

| Company | Followers | Following | Engagement Rate |
|---------|-----------|-----------|-----------------|
| Stripe  | 892K      | 1,245     | 2.3%           |
| Square  | 1.2M      | 567       | 1.8%           |
| PayPal  | 2.1M      | 234       | 0.9%           |

Configuration Options

Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| SCRAPEBADGER_API_KEY | Yes | Your ScrapeBadger API key |
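A server process typically fails fast when its required key is absent. A minimal sketch of that check — the helper name `require_api_key` is illustrative, not part of the package's API:

```python
import os

def require_api_key(env=None) -> str:
    """Read the ScrapeBadger key from the environment, failing loudly if absent."""
    env = os.environ if env is None else env
    key = env.get("SCRAPEBADGER_API_KEY", "").strip()
    if not key:
        raise RuntimeError("SCRAPEBADGER_API_KEY environment variable is required")
    return key

# Passing a dict instead of the real environment makes the check easy to test.
print(require_api_key({"SCRAPEBADGER_API_KEY": "sb_live_abc"}))  # sb_live_abc
```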

Using with Docker

FROM python:3.12-slim

RUN pip install scrapebadger-mcp

# Do not bake the API key into the image; pass it at runtime instead:
#   docker run -e SCRAPEBADGER_API_KEY=sb_live_your_key_here <image>

CMD ["scrapebadger-mcp"]

Using with Python Directly

# Set your API key
export SCRAPEBADGER_API_KEY="sb_live_your_key_here"

# Run the server
python -m scrapebadger_mcp.server

Error Handling

The MCP server handles common errors gracefully:

| Error | Description | Solution |
|-------|-------------|----------|
| AuthenticationError | Invalid API key | Check your SCRAPEBADGER_API_KEY |
| RateLimitError | Too many requests | Wait and retry, or upgrade your plan |
| InsufficientCreditsError | Out of credits | Purchase more at scrapebadger.com |
| NotFoundError | User/tweet not found | Verify the username or tweet ID |
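The table suggests a simple retry policy for callers: retry only rate-limit errors with backoff, and surface everything else immediately. A sketch of that policy — the `RateLimitError` class below is a local stand-in for the real exception, which lives in the client library:

```python
import time

class RateLimitError(Exception):
    """Stand-in for the client's rate-limit exception (illustrative only)."""

def call_with_backoff(fn, attempts: int = 3, base_delay: float = 1.0):
    """Retry fn on RateLimitError with exponential backoff; re-raise anything else."""
    for attempt in range(attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == attempts - 1:
                raise  # out of retries
            time.sleep(base_delay * 2 ** attempt)

# Simulate a call that is rate-limited once, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] == 1:
        raise RateLimitError()
    return "ok"

print(call_with_backoff(flaky, base_delay=0.01))  # ok
```

Authentication, credit, and not-found errors are not retried here by design: repeating those calls cannot succeed.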

Development

Setup

# Clone the repository
git clone https://github.com/scrape-badger/scrapebadger-mcp.git
cd scrapebadger-mcp

# Install dependencies
uv sync --dev

# Set your API key
export SCRAPEBADGER_API_KEY="sb_live_your_key_here"

Running Locally

# Run the MCP server directly
uv run python -m scrapebadger_mcp.server

# Or use the CLI
uv run scrapebadger-mcp

Testing

# Run tests
uv run pytest

# Run with coverage
uv run pytest --cov=src/scrapebadger_mcp

Code Quality

# Lint
uv run ruff check src/

# Format
uv run ruff format src/

# Type check
uv run mypy src/

Troubleshooting

"SCRAPEBADGER_API_KEY environment variable is required"

Make sure you've set the API key in your MCP configuration:

{
  "env": {
    "SCRAPEBADGER_API_KEY": "sb_live_your_key_here"
  }
}

Server not showing in Claude Desktop

  1. Restart Claude Desktop after changing the config
  2. Check the config file path is correct for your OS
  3. Verify JSON syntax is valid (no trailing commas)

"uvx: command not found"

Install uv first:

# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

Rate limit errors

ScrapeBadger has usage limits based on your plan. If you're hitting limits:

  1. Reduce request frequency
  2. Use pagination with smaller max_results
  3. Upgrade your plan at scrapebadger.com
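Point 2 above — pagination with smaller pages — can be sketched generically. The `fetch_page` callable and its `cursor`/`max_results` parameters are hypothetical stand-ins; adapt them to whichever paginated tool you are calling:

```python
def paginate(fetch_page, max_results: int = 20, limit: int = 100) -> list:
    """Collect items page by page until the cursor runs out or limit is reached.

    fetch_page(cursor, max_results) -> (items, next_cursor) is a hypothetical
    interface, not the package's actual API.
    """
    items, cursor = [], None
    while len(items) < limit:
        page, cursor = fetch_page(cursor, max_results)
        items.extend(page)
        if not cursor:  # no further pages
            break
    return items[:limit]

# Fake backend serving 45 items, to exercise the loop.
def fake_fetch(cursor, max_results):
    start = int(cursor or 0)
    end = min(start + max_results, 45)
    return list(range(start, end)), (str(end) if end < 45 else None)

print(len(paginate(fake_fetch, max_results=20, limit=100)))  # 45
```

Smaller `max_results` values spread the same work over more, cheaper requests, which plays better with per-request rate limits.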

Related Projects


Support


License

MIT License - see LICENSE for details.


Made with love by ScrapeBadger

Download files

Download the file for your platform.

Source Distribution

scrapebadger_mcp-0.1.1.tar.gz (14.4 kB)

Built Distribution

scrapebadger_mcp-0.1.1-py3-none-any.whl (10.0 kB)

File details

Details for the file scrapebadger_mcp-0.1.1.tar.gz.

File metadata

  • Download URL: scrapebadger_mcp-0.1.1.tar.gz
  • Upload date:
  • Size: 14.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for scrapebadger_mcp-0.1.1.tar.gz
| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | a50d2b6b22c1c1072f28b9693c8b0ff9accadca12cef62314a4ec4fb5198d478 |
| MD5 | f6877a1f13d5bc4b5ea1fe32232574aa |
| BLAKE2b-256 | ed80609a4d65eddbb460aaf40a9cfa81a0bfb7bc4820d0cc89254d5373b67b4d |

Provenance

The following attestation bundles were made for scrapebadger_mcp-0.1.1.tar.gz:

Publisher: publish.yml on scrape-badger/scrapebadger-mcp

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file scrapebadger_mcp-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for scrapebadger_mcp-0.1.1-py3-none-any.whl
| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 7202e302d89b668cca66a373a6d12db4129fc36e8743b9ff386faf2c3c1ecde7 |
| MD5 | ddb535d711e7653cca0a0853c6c2ac5d |
| BLAKE2b-256 | d23766586751f28e719f297eeb76e474fa05c3a69dbe6f1e59ef5ae8eaf59953 |

Provenance

The following attestation bundles were made for scrapebadger_mcp-0.1.1-py3-none-any.whl:

Publisher: publish.yml on scrape-badger/scrapebadger-mcp

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
