
MCP server for LinkedIn profile, company, and job scraping with Claude AI integration. Supports direct profile/company/job URL scraping with secure credential storage.


LinkedIn MCP Server


Through this LinkedIn MCP server, AI assistants like Claude can connect to your LinkedIn. Access profiles and companies, search for jobs, or get job details.

[!IMPORTANT] FAQ

Is this safe to use? Will I get banned? This tool controls a real browser session; it doesn't exploit undocumented APIs or bypass authentication. That said, LinkedIn's TOS prohibit automated tools. With normal usage (not bulk scraping!) you're not risking a ban. So far, no users have been banned for using this MCP. If you encounter any issues, let me know in the Discussions.

What if my agents execute too many actions? LinkedIn may send you a warning about automated tool usage. If that happens, reduce your automation volume. This MCP executes tool calls sequentially via a queue but has no built-in rate limits. Prompt your agents responsibly.

Installation Methods

  • uvx Install (recommended)
  • MCP Bundle
  • Docker
  • Development

Demo video: https://github.com/user-attachments/assets/eb84419a-6eaf-47bd-ac52-37bc59c83680

Features & Tool Status

Tool | Description | Status
get_person_profile | Get profile info with explicit section selection (experience, education, interests, honors, languages, certifications, skills, projects, contact_info, posts) | working
connect_with_person | Send a connection request or accept an incoming one, with optional note | #304
get_sidebar_profiles | Extract profile URLs from sidebar recommendation sections ("More profiles for you", "Explore premium profiles", "People you may know") on a profile page | working
get_inbox | List recent conversations from the LinkedIn messaging inbox | working
get_conversation | Read a specific messaging conversation by username or thread ID | #307
search_conversations | Search messages by keyword | working
send_message | Send a message to a LinkedIn user (requires confirmation) | #344
get_company_profile | Extract company information with explicit section selection (posts, jobs) | working
get_company_posts | Get recent posts from a company's LinkedIn feed | working
search_jobs | Search for jobs with keywords and location filters | working
search_people | Search for people by keywords and location | working
get_job_details | Get detailed information about a specific job posting | working
close_session | Close browser session and clean up resources | working


🚀 uvx Setup (Recommended - Universal)

Prerequisites: Install uv.

Installation

Client Configuration

{
  "mcpServers": {
    "linkedin": {
      "command": "uvx",
      "args": ["linkedin-scraper-mcp@latest"],
      "env": { "UV_HTTP_TIMEOUT": "300" }
    }
  }
}

The @latest tag ensures you always run the newest version — uvx checks PyPI on each client launch and updates automatically. The server starts quickly, prepares the shared Patchright Chromium browser cache in the background under ~/.linkedin-mcp/patchright-browsers, and opens a LinkedIn login browser window on the first tool call that needs authentication.

[!NOTE] Early tool calls may return a setup/authentication-in-progress error until browser setup or login finishes. If you prefer to create a session explicitly, run uvx linkedin-scraper-mcp@latest --login.
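The explicit session lifecycle can be driven entirely from a terminal; a minimal sketch using the documented --login and --logout flags:

```shell
# Create the LinkedIn session explicitly (opens a login browser window)
uvx linkedin-scraper-mcp@latest --login

# Later: clear the stored browser profile to start fresh
uvx linkedin-scraper-mcp@latest --logout
```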

uvx Setup Help

🔧 Configuration

Transport Modes:

  • Default (stdio): Standard communication for local MCP servers
  • Streamable HTTP: For web-based MCP server
  • If no transport is specified, the server defaults to stdio
  • An interactive terminal without explicit transport shows a chooser prompt

CLI Options:

  • --login - Open browser to log in and save persistent profile
  • --no-headless - Show browser window (useful for debugging scraping issues)
  • --log-level {DEBUG,INFO,WARNING,ERROR} - Set logging level (default: WARNING)
  • --transport {stdio,streamable-http} - Optional: force transport mode (default: stdio)
  • --host HOST - HTTP server host (default: 127.0.0.1)
  • --port PORT - HTTP server port (default: 8000)
  • --path PATH - HTTP server path (default: /mcp)
  • --logout - Clear stored LinkedIn browser profile
  • --timeout MS - Browser timeout for page operations in milliseconds (default: 5000)
  • --user-data-dir PATH - Path to persistent browser profile directory (default: ~/.linkedin-mcp/profile)
  • --chrome-path PATH - Path to Chrome/Chromium executable (for custom browser installations)

Basic Usage Examples:

# Run with debug logging
uvx linkedin-scraper-mcp@latest --log-level DEBUG

HTTP Mode Example (for web-based MCP clients):

uvx linkedin-scraper-mcp@latest --transport streamable-http --host 127.0.0.1 --port 8080 --path /mcp

Runtime server logs are emitted by FastMCP/Uvicorn.

Tool calls are serialized within a single server process to protect the shared LinkedIn browser session. Concurrent client requests queue instead of running in parallel. Use --log-level DEBUG to see scraper lock wait/acquire/release logs.

Test with mcp inspector:

  1. Install and run the MCP Inspector: bunx @modelcontextprotocol/inspector
  2. Click the pre-filled token URL to open the Inspector in your browser
  3. Select "Streamable HTTP" as the Transport Type
  4. Set the URL to http://localhost:8080/mcp
  5. Connect
  6. Test tools
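The Inspector workflow can be sketched as two terminal sessions; port 8080 matches the HTTP mode example earlier in this section:

```shell
# Terminal 1: run the server in streamable HTTP mode on port 8080
uvx linkedin-scraper-mcp@latest --transport streamable-http --port 8080 --path /mcp

# Terminal 2: launch the MCP Inspector, then connect it to http://localhost:8080/mcp
bunx @modelcontextprotocol/inspector
```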
❗ Troubleshooting

Installation issues:

  • Ensure you have uv installed: curl -LsSf https://astral.sh/uv/install.sh | sh
  • Check uv version: uv --version (should be 0.4.0 or higher)
  • On first run, uvx downloads all Python dependencies. On slow connections, uv's default 30s HTTP timeout may be too short. The recommended config above already sets UV_HTTP_TIMEOUT=300 (seconds) to avoid this.
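If you launch the server manually instead of through a client config, the same timeout can be set as a shell environment variable:

```shell
# Give uv up to 300 seconds for dependency downloads on slow connections
export UV_HTTP_TIMEOUT=300
uvx linkedin-scraper-mcp@latest --login
```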

Session issues:

  • Browser profile is stored at ~/.linkedin-mcp/profile/
  • Managed browser downloads are cached at ~/.linkedin-mcp/patchright-browsers/
  • Make sure you have only one active LinkedIn session at a time

Login issues:

  • LinkedIn may require a login confirmation in the LinkedIn mobile app for --login
  • You might get a captcha challenge if you log in frequently. Run uvx linkedin-scraper-mcp@latest --login, which opens a browser where you can solve it manually.

Timeout issues:

  • If pages fail to load or elements aren't found, try increasing the timeout: --timeout 10000
  • Users on slow connections may need higher values (e.g., 15000-30000ms)
  • Can also set via environment variable: TIMEOUT=10000
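The flag and the environment variable are interchangeable; pick one:

```shell
# Raise the page-operation timeout to 10 seconds via the CLI flag...
uvx linkedin-scraper-mcp@latest --timeout 10000

# ...or via the TIMEOUT environment variable
TIMEOUT=10000 uvx linkedin-scraper-mcp@latest
```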

Custom Chrome path:

  • If Chrome is installed in a non-standard location, use --chrome-path /path/to/chrome
  • Can also set via environment variable: CHROME_PATH=/path/to/chrome
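As a sketch (the Chromium location below is an assumption for a typical macOS install; substitute your own path):

```shell
# Hypothetical custom browser location; adjust for your system
CHROME_PATH="/Applications/Chromium.app/Contents/MacOS/Chromium" \
  uvx linkedin-scraper-mcp@latest
```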


📦 Claude Desktop MCP Bundle (formerly DXT)

Prerequisites: Claude Desktop.

One-click installation for Claude Desktop users:

  1. Download the latest .mcpb artifact from releases
  2. Click the downloaded .mcpb file to install it into Claude Desktop
  3. Call any LinkedIn tool

On startup, the MCP Bundle starts preparing the shared Patchright Chromium browser cache in the background. If you call a tool too early, Claude will surface a setup-in-progress error. On the first tool call that needs authentication, the server opens a LinkedIn login browser window and asks you to retry after sign-in.

MCP Bundle Setup Help

❗ Troubleshooting

First-time setup behavior:

  • Claude Desktop starts the bundle immediately; browser setup continues in the background
  • If the Patchright Chromium browser is still downloading, retry the tool after a short wait
  • Managed browser downloads are shared under ~/.linkedin-mcp/patchright-browsers/

Login issues:

  • Make sure you have only one active LinkedIn session at a time
  • LinkedIn may require a login confirmation in the LinkedIn mobile app for --login
  • You might get a captcha challenge if you log in frequently. Run uvx linkedin-scraper-mcp@latest --login, which opens a browser where you can solve captchas manually. See the uvx setup for prerequisites.

Timeout issues:

  • If pages fail to load or elements aren't found, try increasing the timeout: --timeout 10000
  • Users on slow connections may need higher values (e.g., 15000-30000ms)
  • Can also set via environment variable: TIMEOUT=10000


🐳 Docker Setup

Prerequisites: Make sure you have Docker installed and running, and uv installed on the host for the one-time --login step.

Authentication

Docker runs headless (no browser window), so you need to create a browser profile locally first and mount it into the container.

Step 1: Create profile on the host (one-time setup)

uvx linkedin-scraper-mcp@latest --login

This opens a browser window where you log in manually (5-minute timeout to allow for 2FA, captcha, etc.). The browser profile and cookies are saved under ~/.linkedin-mcp/. On startup, Docker derives a Linux browser profile from your host cookies and creates a fresh session each time. If you experience stability issues with Docker, consider using the uvx setup instead.

Step 2: Configure Claude Desktop with Docker

{
  "mcpServers": {
    "linkedin": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-v", "~/.linkedin-mcp:/home/pwuser/.linkedin-mcp",
        "stickerdaniel/linkedin-mcp-server:latest"
      ]
    }
  }
}

[!NOTE] Docker creates a fresh session on each startup. Sessions may expire over time — run uvx linkedin-scraper-mcp@latest --login again if you encounter authentication issues.

[!NOTE] Why can't I run --login in Docker? Docker containers don't have a display server. Create a profile on your host using the uvx setup and mount it into Docker.

Docker Setup Help

🔧 Configuration

Transport Modes:

  • Default (stdio): Standard communication for local MCP servers
  • Streamable HTTP: For a web-based MCP server
  • If no transport is specified, the server defaults to stdio
  • An interactive terminal without explicit transport shows a chooser prompt

CLI Options:

  • --log-level {DEBUG,INFO,WARNING,ERROR} - Set logging level (default: WARNING)
  • --transport {stdio,streamable-http} - Optional: force transport mode (default: stdio)
  • --host HOST - HTTP server host (default: 127.0.0.1)
  • --port PORT - HTTP server port (default: 8000)
  • --path PATH - HTTP server path (default: /mcp)
  • --logout - Clear all stored LinkedIn auth state, including source and derived runtime profiles
  • --timeout MS - Browser timeout for page operations in milliseconds (default: 5000)
  • --user-data-dir PATH - Path to persistent browser profile directory (default: ~/.linkedin-mcp/profile)
  • --chrome-path PATH - Path to Chrome/Chromium executable (rarely needed in Docker)

[!NOTE] --login and --no-headless are not available in Docker (no display server). Use the uvx setup to create profiles.

HTTP Mode Example (for web-based MCP clients):

docker run -it --rm \
  -v ~/.linkedin-mcp:/home/pwuser/.linkedin-mcp \
  -p 8080:8080 \
  stickerdaniel/linkedin-mcp-server:latest \
  --transport streamable-http --host 0.0.0.0 --port 8080 --path /mcp
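Once the container is running, you can do a quick reachability check from the host. A plain GET will not complete an MCP handshake (the protocol expects JSON-RPC POSTs), but it confirms the port mapping works:

```shell
# Print only the HTTP status line returned by the mapped port
curl -si http://localhost:8080/mcp | head -n 1
```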

Runtime server logs are emitted by FastMCP/Uvicorn.

Test with mcp inspector:

  1. Install and run the MCP Inspector: bunx @modelcontextprotocol/inspector
  2. Click the pre-filled token URL to open the Inspector in your browser
  3. Select "Streamable HTTP" as the Transport Type
  4. Set the URL to http://localhost:8080/mcp
  5. Connect
  6. Test tools
❗ Troubleshooting

Docker issues:

  • Make sure Docker is installed
  • Check if Docker is running: docker ps

Login issues:

  • Make sure you have only one active LinkedIn session at a time
  • LinkedIn may require a login confirmation in the LinkedIn mobile app for --login
  • You might get a captcha challenge if you log in frequently. Run uvx linkedin-scraper-mcp@latest --login, which opens a browser where you can solve captchas manually. See the uvx setup for prerequisites.
  • If Docker auth becomes stale after you re-login on the host, restart the container once so it can derive a fresh session from the new host profile.

Timeout issues:

  • If pages fail to load or elements aren't found, try increasing the timeout: --timeout 10000
  • Users on slow connections may need higher values (e.g., 15000-30000ms)
  • Can also set via environment variable: TIMEOUT=10000

Custom Chrome path:

  • If Chrome is installed in a non-standard location, use --chrome-path /path/to/chrome
  • Can also set via environment variable: CHROME_PATH=/path/to/chrome


🐍 Local Setup (Develop & Contribute)

Contributions are welcome! See CONTRIBUTING.md for architecture guidelines and checklists. Please open an issue first to discuss the feature or bug fix before submitting a PR.

Prerequisites: Git and uv installed

Installation

# 1. Clone repository
git clone https://github.com/stickerdaniel/linkedin-mcp-server
cd linkedin-mcp-server

# 2. Install UV package manager (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# 3. Install dependencies
uv sync
uv sync --group dev

# 4. Install pre-commit hooks
uv run pre-commit install

# 5. Start the server
uv run -m linkedin_mcp_server

The local server uses the same managed-runtime flow as MCPB and uvx: it prepares the Patchright Chromium browser cache in the background and opens LinkedIn login on the first auth-requiring tool call. You can still run uv run -m linkedin_mcp_server --login when you want to create the session explicitly.
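With the hooks from step 4 installed, the same checks can be run manually before committing:

```shell
# Run every configured pre-commit hook against the whole repository
uv run pre-commit run --all-files
```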

Local Setup Help

🔧 Configuration

CLI Options:

  • --login - Open browser to log in and save persistent profile
  • --no-headless - Show browser window (useful for debugging scraping issues)
  • --log-level {DEBUG,INFO,WARNING,ERROR} - Set logging level (default: WARNING)
  • --transport {stdio,streamable-http} - Optional: force transport mode (default: stdio)
  • --host HOST - HTTP server host (default: 127.0.0.1)
  • --port PORT - HTTP server port (default: 8000)
  • --path PATH - HTTP server path (default: /mcp)
  • --logout - Clear stored LinkedIn browser profile
  • --timeout MS - Browser timeout for page operations in milliseconds (default: 5000)
  • --status - Check if current session is valid and exit
  • --user-data-dir PATH - Path to persistent browser profile directory (default: ~/.linkedin-mcp/profile)
  • --slow-mo MS - Delay between browser actions in milliseconds (default: 0, useful for debugging)
  • --user-agent STRING - Custom browser user agent
  • --viewport WxH - Browser viewport size (default: 1280x720)
  • --chrome-path PATH - Path to Chrome/Chromium executable (for custom browser installations)
  • --help - Show help

Note: Most CLI options have environment variable equivalents. See .env.example for details.
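For instance, combining the TIMEOUT and CHROME_PATH variables documented in the troubleshooting sections (the Chromium path below is a placeholder; verify variable names against .env.example):

```shell
# Configure via environment instead of flags; /usr/bin/chromium is a placeholder
TIMEOUT=10000 CHROME_PATH=/usr/bin/chromium uv run -m linkedin_mcp_server
```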

HTTP Mode Example (for web-based MCP clients):

uv run -m linkedin_mcp_server --transport streamable-http --host 127.0.0.1 --port 8000 --path /mcp

Claude Desktop:

{
  "mcpServers": {
    "linkedin": {
      "command": "uv",
      "args": ["--directory", "/path/to/linkedin-mcp-server", "run", "-m", "linkedin_mcp_server"]
    }
  }
}

stdio is used by default for this config.

❗ Troubleshooting

Login issues:

  • Make sure you have only one active LinkedIn session at a time
  • LinkedIn may require a login confirmation in the LinkedIn mobile app for --login
  • You might get a captcha challenge if you log in frequently. The --login command opens a browser where you can solve it manually.

Scraping issues:

  • Use --no-headless to see browser actions and debug scraping problems
  • Add --log-level DEBUG to see more detailed logging

Session issues:

  • Browser profile is stored at ~/.linkedin-mcp/profile/
  • Use --logout to clear the profile and start fresh

Python/Patchright issues:

  • Check Python version: python --version (should be 3.12+)
  • Reinstall Patchright: uv run patchright install chromium
  • Reinstall dependencies: uv sync --reinstall

Timeout issues:

  • If pages fail to load or elements aren't found, try increasing the timeout: --timeout 10000
  • Users on slow connections may need higher values (e.g., 15000-30000ms)
  • Can also set via environment variable: TIMEOUT=10000

Custom Chrome path:

  • If Chrome is installed in a non-standard location, use --chrome-path /path/to/chrome
  • Can also set via environment variable: CHROME_PATH=/path/to/chrome


Acknowledgements

Built with FastMCP and Patchright.

Use in accordance with LinkedIn's Terms of Service. Web scraping may violate LinkedIn's terms. This tool is for personal use only.

License

This project is licensed under the Apache 2.0 license.

