LinkedIn Jobs MCP Server
Self-hosted MCP server that scrapes LinkedIn jobs using your authenticated session cookies, enabling job search and details retrieval. Runs in Docker. No Apify, no per-run costs.
Setup
1. Get your cookies
- Install the Cookie-Editor Chrome extension
- Log into LinkedIn (use a burner account)
- Click Cookie-Editor → Export (JSON)
- Find and copy two values:
- li_at: your session cookie
- JSESSIONID: used as the CSRF token (value looks like ajax:1234...)
2. Configure
cp .env.example .env
# Edit .env and paste your cookie values
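For reference, the finished `.env` might look like the fragment below. The variable names here are placeholders; use whatever names `.env.example` actually defines.

```shell
# Hypothetical .env contents — copy the real variable names from .env.example
LI_AT=paste_li_at_value_here
JSESSIONID="ajax:1234..."   # keep the surrounding quotes from Cookie-Editor
```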
3. Build and test
# Build the Docker image
docker build -t linkedin-mcp .
# Quick smoke test — should print the MCP server startup log
docker run --rm --env-file .env linkedin-mcp
# Ctrl+C to stop
4. Wire into Claude Desktop
Edit your Claude Desktop config file:
- Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"linkedin-jobs": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"--env-file", "/ABSOLUTE/PATH/TO/linkedin-mcp/.env",
"linkedin-mcp"
]
}
}
}
⚠️ Use the absolute path to your .env file; ~ does not expand here.
5. Restart Claude Desktop
After saving the config, fully quit and reopen Claude Desktop.
You'll see a 🔧 tools icon in the chat bar — click it to confirm
scrape_jobs, get_job_details, check_cookie, and update_cookies are listed.
6. First conversation
You: Check if my LinkedIn cookie is valid
Claude: [calls check_cookie] ✓ Authenticated as John Doe
You: Scrape 20 AI Engineer or ML Engineer jobs posted in the last 3 days in the US
Claude: [calls scrape_jobs] ...returns full job list with descriptions
Cookie refresh (every 30–60 days)
When cookies expire, re-export from Cookie-Editor and tell Claude:
Update my LinkedIn cookies: li_at is "new_value" and jsessionid is "new_value"
Claude will call update_cookies — no container restart needed.
Development
# Run with live source reloading
docker compose up
# Inspect MCP tools without Claude Desktop
npx @modelcontextprotocol/inspector docker run --rm -i --env-file .env linkedin-mcp
Project structure
linkedin-mcp/
├── src/
│ ├── server.py # MCP server — tool definitions and handlers
│ ├── scraper.py # LinkedIn Voyager API calls (httpx)
│ ├── models.py # Pydantic models for Job data
│ └── __init__.py
├── Dockerfile
├── docker-compose.yml
├── requirements.txt
├── .env.example
└── .gitignore
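As a rough illustration of the approach in scraper.py: LinkedIn's internal Voyager API authenticates with the li_at cookie and expects the JSESSIONID value repeated as a csrf-token header (which is why setup step 1 collects both). A hedged sketch, where the endpoint path and header set are assumptions rather than the project's exact code:

```python
def voyager_headers(li_at: str, jsessionid: str) -> dict:
    """Headers for a Voyager API request. The JSESSIONID value (minus its
    surrounding quotes) doubles as the csrf-token. Illustrative only."""
    token = jsessionid.strip('"')
    return {
        "csrf-token": token,
        "cookie": f'li_at={li_at}; JSESSIONID="{token}"',
    }

# With httpx (the HTTP client named in the project structure), a call
# might then look like:
#   httpx.get("https://www.linkedin.com/voyager/api/...",
#             headers=voyager_headers(li_at, jsessionid))
```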