LinkedIn MCP Server
Enables Claude to access and analyze LinkedIn profile data through the Model Context Protocol, allowing users to query their LinkedIn information directly within Claude Desktop.
LinkedIn MCP Server with Anthropic Integration
A Python-based MCP (Model Context Protocol) server that serves data from your LinkedIn profile and integrates with the Anthropic API for analysis tasks. This project follows the src layout for Python packaging.
TL;DR: Install for Claude Desktop access to the LinkedIn profile
# 1.a) Install the mcp server access in Claude Desktop
./install_claude_desktop_mcp.sh
# 1.b) or manually add this JSON snippet to the `mcpServers` section of your `claude_desktop_config.json` (e.g. `~/Library/Application\ Support/Claude/claude_desktop_config.json`)
{
"linkedin_francisco_perez_sorrosal": {
"command": "npx",
"args": ["mcp-remote", "http://localhost:10000/mcp"]
}
}
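If you prefer scripting step 1.b, the merge can be sketched in Python. This helper (`add_mcp_server` is a hypothetical name, not part of this project) reads the config, inserts the snippet under `mcpServers`, and writes it back:

```python
import json
from pathlib import Path

# The same snippet shown above, as a Python dict.
SNIPPET = {
    "linkedin_francisco_perez_sorrosal": {
        "command": "npx",
        "args": ["mcp-remote", "http://localhost:10000/mcp"],
    }
}

def add_mcp_server(config_path: Path, snippet: dict) -> dict:
    """Merge an MCP server entry into the mcpServers section of the config."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {}).update(snippet)
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

Point `config_path` at your platform's `claude_desktop_config.json` (the macOS path is shown above); existing entries in `mcpServers` are preserved.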
# 2) Restart Claude Desktop and check that the 'Add from linkedin_francisco_perez_sorrosal' option is available in the MCP servers list
# 3) Query the LinkedIn profile served from the mcp server in Claude Desktop!
e.g. TODO
Features
- Serves your LinkedIn profile from the project root
- Built with FastAPI for high performance and with Pixi for dependency management and task running
- Source code organized in the src/ directory
- Includes configurations for:
  - Docker (optional, for containerization)
  - Linting (Ruff, Black, isort)
  - Formatting
  - Type checking (MyPy)
Prerequisites
- Python 3.11+
- Pixi (for dependency management and task execution)
- Docker (optional, for containerization)
- Access to your LinkedIn profile
Project Structure
.
├── .dockerignore
├── .gitignore
├── Dockerfile
├── pyproject.toml          # Python project metadata and dependencies (PEP 621)
├── README.md
├── src/
│   └── linkedin_mcp_server/
│       ├── __init__.py
│       └── main.py         # FastAPI application logic
└── tests/                  # Test files (e.g., tests_main.py)
Setup and Installation
- Clone the repository (if applicable) or ensure you are in the project root directory.
- Install dependencies using Pixi. This command creates a virtual environment and installs all necessary dependencies:
pixi install
Running the Server
Pixi tasks are defined in pyproject.toml:
mcps (MCP Server)
pixi run mcps --transport stdio
Development Mode (with auto-reload)
# Using pixi directly
pixi run mcps --transport stdio # or sse, streamable-http
# Alternatively, using uv directly
uv run --with "mcp[cli]" mcp run src/linkedin_mcp_server/main.py --transport streamable-http
# Go to http://127.0.0.1:10000/mcp
When run with an HTTP transport (sse or streamable-http), the server starts at http://localhost:10000. In development mode it automatically reloads when files in the src/ directory change.
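A quick way to confirm the server came up is to probe the endpoint. This stdlib sketch (the helper names are hypothetical; any HTTP response, even an error status, means something is listening) builds the same URL used in the config snippets:

```python
import urllib.error
import urllib.request

def mcp_endpoint(host: str = "localhost", port: int = 10000) -> str:
    """Build the HTTP MCP endpoint URL used in the config snippets."""
    return f"http://{host}:{port}/mcp"

def is_up(url: str, timeout: float = 2.0) -> bool:
    """Best-effort check that a server is listening at the endpoint."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True  # the server responded, even if with an error status
    except OSError:
        return False  # connection refused, timeout, DNS failure, ...
```

For example, `is_up(mcp_endpoint())` after starting the server with an HTTP transport.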
MCP Inspection Mode
# Using pixi
DANGEROUSLY_OMIT_AUTH=true npx @modelcontextprotocol/inspector pixi run mcps --transport stdio
# Direct execution
DANGEROUSLY_OMIT_AUTH=true npx @modelcontextprotocol/inspector pixi run python src/linkedin_mcp_server/main.py --transport streamable-http
This starts the inspector for the MCP Server.
Web scraper
pixi run python src/linkedin_mcp_server/web_scrapper.py
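The contents of web_scrapper.py are not shown here; as an illustration of the kind of extraction such a script might do, here is a minimal stdlib sketch (the class and function names are hypothetical, and real LinkedIn pages are far messier than this) that pulls visible text out of an HTML fragment:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self) -> None:
        super().__init__()
        self.chunks: list[str] = []
        self._skip = 0  # depth inside script/style tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def extract_text(html: str) -> str:
    """Return the visible text of an HTML fragment as a single string."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

A real scraper would also need to handle authentication and dynamic content, which is why tools like Playwright are often used instead of plain HTTP fetches.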
Development Tasks
Run Tests
pixi run test
Lint and Check Formatting
pixi run lint
Apply Formatting and Fix Lint Issues
pixi run format
Build the Package
Creates sdist and wheel in dist/:
pixi run build
Docker Support (Optional)
Build the Docker Image
docker build -t linkedin-mcp-server .
Run the Docker Container
TODO: Docker support is not yet complete; this section will be rewritten once it is.
MCP Server Configuration
Local Configuration for Claude Desktop
{
"linkedin_francisco_perez_sorrosal": {
"command": "uv",
"args": [
"run",
"--with", "mcp[cli]",
"--with", "pymupdf4llm",
"mcp", "run",
"src/linkedin_mcp_server/main.py",
"--transport", "streamable-http"
]
}
}
Remote Configuration for Claude Desktop
For connecting to a remote MCP server:
{
"linkedin_francisco_perez_sorrosal": {
"command": "npx",
"args": ["mcp-remote", "http://localhost:10000/mcp"]
}
}
Note: Update the host and port as needed for your deployment.
Currently I'm using render.com to host the MCP server. The configuration is in the config/claude.json file.
Render requires requirements.txt to be present in the root directory. You can generate it using:
uv pip compile pyproject.toml > requirements.txt
Render also requires a runtime.txt file in the root directory specifying the Python version:
python-3.11.11
Remember to set the environment variables in the render.com dashboard:
TRANSPORT=sse
PORT=8000
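At startup the server presumably maps these variables onto its transport and port settings. A sketch of that mapping (the function name and defaults are assumptions; only the variable names `TRANSPORT` and `PORT` come from the configuration above):

```python
import os

def server_settings(env=None) -> tuple[str, int]:
    """Resolve transport and port from the environment, with assumed defaults."""
    env = os.environ if env is None else env
    transport = env.get("TRANSPORT", "streamable-http")
    port = int(env.get("PORT", "10000"))
    return transport, port
```

With the Render dashboard values above, this would resolve to `("sse", 8000)`; locally, with nothing set, it falls back to the defaults used elsewhere in this README.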
You can then query the linkedin_mcp_fps MCP server in Claude Desktop to get profile info:
TODO
License
This project is licensed under the MIT License. See the LICENSE file (referenced in pyproject.toml) for details.