A Model Context Protocol (MCP) server for LinkedIn. Search people, companies, and jobs, scrape profiles, and retrieve structured JSON data from any MCP-compatible AI client.
Built with FastMCP, Patchright, and a clean hexagonal architecture.
| Category | Tools |
|---|---|
| People | get_person_profile · search_people |
| Companies | get_company_profile · get_company_posts |
| Jobs | get_job_details · search_jobs |
| Browser | close_browser |
The get_person_profile tool supports granular section scraping, so you can request only the sections you need. The main profile (name, headline, location) is always included.
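A hypothetical tool call illustrating section selection. The argument names below (`linkedin_url`, `sections`) are illustrative assumptions, not taken from the server's actual schema; consult the tool's input schema for the exact parameter names:

```python
# Illustrative only: argument names are assumptions, not the
# server's actual schema.
call = {
    "tool": "get_person_profile",
    "arguments": {
        "linkedin_url": "https://www.linkedin.com/in/example/",
        # The main profile is always included; extra sections are opt-in.
        "sections": ["experience", "education"],
    },
}
```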
The search_jobs tool supports the following filters:
| Filter | Values |
|---|---|
| date_posted | past_hour, past_24_hours, past_week, past_month |
| job_type | full_time, part_time, contract, temporary, internship, other |
| experience_level | entry, associate, mid_senior, director, executive |
| work_type | on_site, remote, hybrid |
| easy_apply | true / false |
| sort_by | date, relevance |
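For example, a filtered search might combine several of the values above. The filter keys and values come from the table; the keywords parameter name is an assumption for illustration:

```python
# Illustrative search_jobs arguments; "keywords" is an assumed
# parameter name, the filter keys/values match the table above.
search_args = {
    "keywords": "python developer",
    "date_posted": "past_week",
    "job_type": "full_time",
    "work_type": "remote",
    "easy_apply": True,
    "sort_by": "date",
}
```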
```bash
git clone https://github.com/eliasbiondo/linkedin-mcp-server.git
cd linkedin-mcp-server
uv sync
uv run linkedin-mcp-server --login
```
A browser window will open. Log in to LinkedIn and the session will be persisted locally at `~/.linkedin-mcp-server/browser-data`.
stdio transport (default — for Claude Desktop, Cursor, and similar clients):
```bash
uv run linkedin-mcp-server
```
HTTP transport (for remote clients, the MCP Inspector, etc.):
```bash
uv run linkedin-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000
```
Add to your MCP configuration file:
```json
{
  "mcpServers": {
    "linkedin": {
      "command": "uv",
      "args": [
        "--directory", "/path/to/linkedin-mcp-server",
        "run", "linkedin-mcp-server"
      ]
    }
  }
}
```
```bash
npx @modelcontextprotocol/inspector
```
Then connect to http://localhost:8000/mcp if using HTTP transport.
Configuration follows a strict precedence chain: CLI args > environment variables > .env file > defaults.
| Argument | Description | Default |
|---|---|---|
| --transport | stdio or streamable-http | stdio |
| --host | Host for HTTP transport | 127.0.0.1 |
| --port | Port for HTTP transport | 8000 |
| --log-level | DEBUG, INFO, WARNING, ERROR | WARNING |
| --login | Open browser for LinkedIn login | — |
| --logout | Clear stored credentials | — |
| --status | Check session status | — |
Create a .env file in the project root:
```bash
# Server
LINKEDIN_TRANSPORT=stdio
LINKEDIN_HOST=127.0.0.1
LINKEDIN_PORT=8000
LINKEDIN_LOG_LEVEL=WARNING

# Browser
LINKEDIN_HEADLESS=true
LINKEDIN_SLOW_MO=0
LINKEDIN_TIMEOUT=5000
LINKEDIN_VIEWPORT_WIDTH=1280
LINKEDIN_VIEWPORT_HEIGHT=720
LINKEDIN_CHROME_PATH=
LINKEDIN_USER_AGENT=
LINKEDIN_USER_DATA_DIR=~/.linkedin-mcp-server/browser-data
```
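The precedence chain can be sketched as a simple merge where later, higher-priority sources overwrite earlier ones. This is an illustrative model of the behavior, not the project's actual resolver:

```python
def resolve(cli_args, env_vars, dotenv, defaults):
    """Merge config sources so that CLI > env > .env > defaults."""
    merged = dict(defaults)
    # Apply sources from lowest to highest priority; unset (None)
    # values never overwrite a lower-priority setting.
    for source in (dotenv, env_vars, cli_args):
        merged.update({k: v for k, v in source.items() if v is not None})
    return merged

config = resolve(
    cli_args={"port": 9000, "host": None},          # --port given, --host not
    env_vars={"host": "0.0.0.0"},                    # LINKEDIN_HOST
    dotenv={"port": 8000, "log_level": "INFO"},      # .env file
    defaults={"host": "127.0.0.1", "port": 8000, "log_level": "WARNING"},
)
# The CLI wins for port, the environment wins for host,
# and .env wins for log_level over the default.
```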
The project follows a hexagonal (ports and adapters) architecture with strict layer separation:
```
src/linkedin_mcp_server/
├── domain/               # Core business logic — zero external dependencies
│   ├── models/           # Data models (Person, Company, Job, Search)
│   ├── parsers/          # HTML to structured data parsers
│   ├── exceptions.py     # Domain exceptions
│   └── value_objects.py  # Immutable configuration and content objects
├── ports/                # Abstract interfaces
│   ├── auth.py           # Authentication port
│   ├── browser.py        # Browser automation port
│   └── config.py         # Configuration port
├── application/          # Use cases — orchestration layer
│   ├── scrape_person.py
│   ├── scrape_company.py
│   ├── scrape_job.py
│   ├── search_people.py
│   ├── search_jobs.py
│   └── manage_session.py
├── adapters/             # Concrete implementations
│   ├── driven/           # Infrastructure adapters (browser, auth, config)
│   └── driving/          # Interface adapters (CLI, MCP tools, serialization)
└── container.py          # Dependency injection composition root
```
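A minimal sketch of how the layers relate, using stand-in names rather than the project's actual classes: the use case depends only on an abstract port, and only the composition root knows about the concrete adapter:

```python
from abc import ABC, abstractmethod

# Port: the abstract interface the application layer depends on.
class BrowserPort(ABC):
    @abstractmethod
    def fetch_html(self, url: str) -> str: ...

# Driven adapter: a concrete implementation (a stub here, in place
# of a real Patchright-backed browser adapter).
class FakeBrowser(BrowserPort):
    def fetch_html(self, url: str) -> str:
        return f"<html><!-- fetched {url} --></html>"

# Use case: orchestrates work through the port, never the adapter.
class ScrapePerson:
    def __init__(self, browser: BrowserPort) -> None:
        self.browser = browser

    def run(self, url: str) -> str:
        return self.browser.fetch_html(url)

# Composition root: the only place concrete adapters are wired in.
class Container:
    def __init__(self) -> None:
        self.browser: BrowserPort = FakeBrowser()
        self.scrape_person = ScrapePerson(self.browser)

container = Container()
html = container.scrape_person.run("https://www.linkedin.com/in/example/")
```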
The Container class acts as the composition root and is the only place that imports concrete adapter classes.

To set up a development environment:

```bash
uv sync --group dev
uv run pre-commit install
```
Run the test suite:

```bash
uv run pytest
```
With coverage:
```bash
uv run pytest --cov=linkedin_mcp_server
```
This project uses Ruff for both linting and formatting. Pre-commit hooks will run these automatically on each commit.
```bash
# Lint
uv run ruff check .

# Lint and auto-fix
uv run ruff check . --fix

# Format
uv run ruff format .
```
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome. Please read the contributing guide for details on the development workflow and submission process.
This tool is intended for personal and educational use. Scraping LinkedIn may violate its Terms of Service. Use responsibly and at your own risk. The authors are not responsible for any misuse or consequences arising from the use of this software.