# LinkedIn MCP Server

A [Model Context Protocol](https://modelcontextprotocol.io/) (MCP) server for LinkedIn. Search people, companies, and jobs, scrape profiles, and retrieve structured JSON data from any MCP-compatible AI client.

Built with [FastMCP](https://github.com/PrefectHQ/fastmcp), [Patchright](https://github.com/Kaliiiiiiiiii-Vinyzu/patchright), and a clean hexagonal architecture.

---

## Features

| Category  | Tools                                       |
| --------- | ------------------------------------------- |
| People    | `get_person_profile` · `search_people`      |
| Companies | `get_company_profile` · `get_company_posts` |
| Jobs      | `get_job_details` · `search_jobs`           |
| Browser   | `close_browser`                             |

### Person Profile Sections

The `get_person_profile` tool supports granular section scraping. Request only the sections you need:

- **Main profile** (always included) — name, headline, location, followers, connections, about, profile image
- **Experience** — title, company, dates, duration, description, company logo
- **Education** — school, degree, dates, description, school logo
- **Contact info** — email, phone, websites, birthday, LinkedIn URL
- **Interests** — people, companies, and groups followed
- **Honors and awards** — title, issuer, description
- **Languages** — language name and proficiency level
- **Posts** — recent activity with reactions and timestamps
- **Recommendations** — received and given, with author details

### Company Profile Sections

- **About** (always included) — overview, website, industry, size, headquarters, specialties, logo
- **Posts** — recent feed posts with engagement metrics
- **Jobs** — current open positions

### Job Search Filters

The `search_jobs` tool supports the following filters:

| Filter             | Values                                                                   |
| ------------------ | ------------------------------------------------------------------------ |
| `date_posted`      | `past_hour`, `past_24_hours`, `past_week`, `past_month`                  |
| `job_type`         | `full_time`, `part_time`, `contract`, `temporary`, `internship`, `other` |
| `experience_level` | `entry`, `associate`, `mid_senior`, `director`, `executive`              |
| `work_type`        | `on_site`, `remote`, `hybrid`                                            |
| `easy_apply`       | `true` / `false`                                                         |
| `sort_by`          | `date`, `relevance`                                                      |

---

## Prerequisites

- Python 3.12 or later
- [uv](https://docs.astral.sh/uv/) package manager
- A LinkedIn account for authentication

---

## Quick Start

### 1. Clone and install

```bash
git clone https://github.com/eliasbiondo/linkedin-mcp-server.git
cd linkedin-mcp-server
uv sync
```

### 2. Authenticate with LinkedIn

```bash
uv run linkedin-mcp-server --login
```

A browser window will open. Log in to LinkedIn and the session will be persisted locally at `~/.linkedin-mcp-server/browser-data`.

### 3. Run the server

**stdio transport** (default — for Claude Desktop, Cursor, and similar clients):

```bash
uv run linkedin-mcp-server
```

**HTTP transport** (for remote clients, the MCP Inspector, etc.):

```bash
uv run linkedin-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000
```

---

## Client Integration

### Claude Desktop / Cursor

Add to your MCP configuration file:

```json
{
  "mcpServers": {
    "linkedin": {
      "command": "uv",
      "args": [
        "--directory", "/path/to/linkedin-mcp-server",
        "run", "linkedin-mcp-server"
      ]
    }
  }
}
```

### MCP Inspector

```bash
npx @modelcontextprotocol/inspector
```

Then connect to `http://localhost:8000/mcp` if using HTTP transport.

---

## Configuration

Configuration follows a strict precedence chain: **CLI args > environment variables > `.env` file > defaults**.

### CLI Arguments

| Argument      | Description                         | Default     |
| ------------- | ----------------------------------- | ----------- |
| `--transport` | `stdio` or `streamable-http`        | `stdio`     |
| `--host`      | Host for HTTP transport             | `127.0.0.1` |
| `--port`      | Port for HTTP transport             | `8000`      |
| `--log-level` | `DEBUG`, `INFO`, `WARNING`, `ERROR` | `WARNING`   |
| `--login`     | Open browser for LinkedIn login     | —           |
| `--logout`    | Clear stored credentials            | —           |
| `--status`    | Check session status                | —           |

### Environment Variables

Create a `.env` file in the project root:

```env
# Server
LINKEDIN_TRANSPORT=stdio
LINKEDIN_HOST=127.0.0.1
LINKEDIN_PORT=8000
LINKEDIN_LOG_LEVEL=WARNING

# Browser
LINKEDIN_HEADLESS=true
LINKEDIN_SLOW_MO=0
LINKEDIN_TIMEOUT=5000
LINKEDIN_VIEWPORT_WIDTH=1280
LINKEDIN_VIEWPORT_HEIGHT=720
LINKEDIN_CHROME_PATH=
LINKEDIN_USER_AGENT=
LINKEDIN_USER_DATA_DIR=~/.linkedin-mcp-server/browser-data
```

---

## Architecture

The project follows a hexagonal (ports and adapters) architecture with strict layer separation:

```
src/linkedin_mcp_server/
├── domain/              # Core business logic — zero external dependencies
│   ├── models/          # Data models (Person, Company, Job, Search)
│   ├── parsers/         # HTML to structured data parsers
│   ├── exceptions.py    # Domain exceptions
│   └── value_objects.py # Immutable configuration and content objects
├── ports/               # Abstract interfaces
│   ├── auth.py          # Authentication port
│   ├── browser.py       # Browser automation port
│   └── config.py        # Configuration port
├── application/         # Use cases — orchestration layer
│   ├── scrape_person.py
│   ├── scrape_company.py
│   ├── scrape_job.py
│   ├── search_people.py
│   ├── search_jobs.py
│   └── manage_session.py
├── adapters/            # Concrete implementations
│   ├── driven/          # Infrastructure adapters (browser, auth, config)
│   └── driving/         # Interface adapters (CLI, MCP tools, serialization)
└── container.py         # Dependency injection composition root
```

### Design Decisions

- **Ports and adapters** — Domain logic is fully decoupled from infrastructure. The browser engine, MCP framework, and configuration source can all be swapped independently.
- **Dependency injection** — A single `Container` class acts as the composition root and is the only place that imports concrete adapter classes.
- **Structured JSON output** — LinkedIn HTML is parsed into typed Python dataclasses, then serialized to JSON for reliable LLM consumption.
- **Session persistence** — Browser state is saved to disk, so authentication is required only once.

---

## Development

### Setup

```bash
uv sync --group dev
uv run pre-commit install
```

### Running tests

```bash
uv run pytest
```

With coverage:

```bash
uv run pytest --cov=linkedin_mcp_server
```

### Linting and formatting

This project uses [Ruff](https://docs.astral.sh/ruff/) for both linting and formatting. Pre-commit hooks will run these automatically on each commit.

```bash
# Lint
uv run ruff check .

# Lint and auto-fix
uv run ruff check . --fix

# Format
uv run ruff format .
```

---

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

---

## Contributing

Contributions are welcome. Please read the [contributing guide](CONTRIBUTING.md) for details on the development workflow and submission process.

---

## Disclaimer

This tool is intended for personal and educational use. Scraping LinkedIn may violate their Terms of Service. Use responsibly and at your own risk. The authors are not responsible for any misuse or consequences arising from the use of this software.
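
---

## Appendix: Structured Output Sketch

To make the "structured JSON output" design decision concrete, here is a minimal, hypothetical sketch of the dataclass-to-JSON flow. The `Person` fields and values shown are illustrative only — the project's real models live under `src/linkedin_mcp_server/domain/models/` and are richer than this.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical domain model, standing in for the project's real dataclasses.
# Frozen, so instances are immutable once built by a parser.
@dataclass(frozen=True)
class Person:
    name: str
    headline: str
    location: str

# In the real server a domain parser would build this from scraped HTML;
# here we construct an example instance directly.
person = Person(name="Ada Lovelace", headline="Analyst", location="London")

# Serializing the typed dataclass to JSON is what makes the tool output
# reliable for LLM consumption: keys and types are fixed by the model.
print(json.dumps(asdict(person), indent=2))
```

The benefit of this pattern is that parsing bugs surface as construction errors in typed Python objects rather than as malformed JSON reaching the client.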