Steve Kinney

Popular MCP Servers for Claude Code and Cursor

Model Context Protocol (MCP) is an open standard that enables different AI models and tools to talk to one another seamlessly. It functions like a universal connector, standardizing how Large Language Models (LLMs) and agentic AI frameworks interact with external systems. MCP servers are the components in this ecosystem that expose capabilities, tools, and data sources through the protocol. They run as independent processes and communicate with AI clients via transports like stdio, Server-Sent Events (SSE), or Streamable HTTP. These servers give AI agents efficient access to the endpoints, data sources, and business tools they need, replacing custom, one-off integrations for each data source.
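Under the hood, MCP messages are JSON-RPC 2.0 objects. As a rough illustration (the tool name and schema below are invented purely for demonstration), here is the shape of a `tools/list` exchange as it might travel over the stdio transport:

```python
import json

# Client -> server: ask which tools the server exposes.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Server -> client: advertise its tools, each with a JSON Schema
# describing the expected input. "search_docs" is a made-up example.
response = {
    "jsonrpc": "2.0",
    "id": 1,  # echoes the request id, per JSON-RPC 2.0
    "result": {
        "tools": [
            {
                "name": "search_docs",
                "description": "Search project documentation",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            }
        ]
    },
}

# Over the stdio transport, each message is serialized as a line of JSON.
wire = json.dumps(request)
print(wire)
print(json.dumps(response))
```

The client matches responses to requests by `id`, which is what lets a single server juggle several in-flight tool calls.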

MCP servers are highly beneficial when working with Cursor or Claude Code because they stream external knowledge like files, databases, design specs, and issue trackers directly into the LLM’s “context window,” enhancing performance and enabling more dynamic interactions. They also allow for direct integration with external services and APIs, offering a streamlined workflow and eliminating the friction of setting up local servers.

Knowledge & Documentation Servers

Context7 by Upstash

Addresses the common problem where AI coding assistants like Cursor and Windsurf struggle to write code for newer APIs or packages, leading to hallucinations or outdated solutions. Context7 pulls up-to-date documentation and code examples straight from the source libraries or websites, ingesting them directly into the app’s context so it can generate correct solutions much faster. This reduces frustration and speeds up development, especially given the rapid pace of change in the AI space. It’s described as a “super smart librarian” for Cursor.
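For reference, Cursor reads MCP servers from a `.cursor/mcp.json` file. A minimal entry might look like the following (the `@upstash/context7-mcp` package name reflects the published server at the time of writing and may change, so check the project’s README):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

Claude Code supports the same stdio style of registration via its `claude mcp add` command.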

Notion MCP

Allows AI assistants to interact directly with your Notion pages and databases, enabling tasks like creating documentation (PRDs, tech specs), searching content, managing tasks, building reports, and planning campaigns. It provides live access to documents without manual copying or syncing, making workflows cleaner and more reliable. It’s called a “killer” MCP server and is “optimized for AI” with efficient data formatting.

Code & DevOps Servers

GitHub MCP

The official GitHub MCP server provides seamless integration with GitHub’s issue tracking system and APIs, allowing LLMs to interact with issues, list pull requests, create branches, analyze code, and automate a variety of repository-centric workflows. It offers frictionless setup via remote hosting and auto-updates.
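Because the server is remotely hosted, registration can be as simple as pointing your client at the hosted endpoint. A sketch of a Cursor `mcp.json` entry might look like this (the URL is the documented hosted endpoint at the time of writing; verify it against GitHub’s docs, and note the client will prompt for authentication):

```json
{
  "mcpServers": {
    "github": {
      "url": "https://api.githubcopilot.com/mcp/"
    }
  }
}
```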

Desktop Commander

Gives AI the ability to read, write, and use command lines directly on your machine, allowing Claude to modify your computer or code for you. It’s described as “extremely powerful” for native desktop interaction.

Databases & Vector Memory Servers

Supabase MCP

The official Supabase server that connects AI tools to your Supabase projects, allowing them to manage tables, fetch configuration, and query data. It supports secure, read-only modes and project scoping. It can handle tasks like designing tables with migrations, fetching data, running reports, and spinning up new Supabase projects. It uses JWT-based authentication for cloud safety.
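A scoped, read-only setup might look like the sketch below in a client’s `mcp.json`. The package name and flags follow the server’s README at the time of writing and may change; `<project-ref>` and the token value are placeholders:

```json
{
  "mcpServers": {
    "supabase": {
      "command": "npx",
      "args": [
        "-y",
        "@supabase/mcp-server-supabase@latest",
        "--read-only",
        "--project-ref=<project-ref>"
      ],
      "env": {
        "SUPABASE_ACCESS_TOKEN": "<personal-access-token>"
      }
    }
  }
}
```

Scoping to a single project and starting in read-only mode is a sensible default until you trust the agent with write access.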

Postgres MCP Pro

Provides configurable read/write access and performance analysis for PostgreSQL databases. It allows AI agents to explore database schemas with granular row-level filters and offers an optional read-only mode, enabling safe database interactions. It includes tools for index tuning, explain plans, health checks, and secure SQL execution. It combines LLMs with “classical optimization algorithms” for reliable and flexible database tuning.
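Postgres MCP Pro’s actual safety layer is more sophisticated than this, but a naive sketch conveys the idea behind a read-only mode: reject any statement that is not a plain read before it ever reaches the database. All names below are invented for illustration:

```python
import re

# Naive illustration of a read-only SQL guard. A real implementation
# would parse the statement properly (e.g., reject data-modifying CTEs);
# this sketch only demonstrates the concept.
_READ_ONLY_PREFIX = re.compile(r"^\s*(select|explain|show)\b", re.IGNORECASE)

def is_read_only(sql: str) -> bool:
    """Allow statements that start with a read-only keyword and contain
    no extra statement separator that could smuggle in a write."""
    body = sql.rstrip().rstrip(";")  # tolerate one trailing semicolon
    return bool(_READ_ONLY_PREFIX.match(body)) and ";" not in body
```

For example, `is_read_only("EXPLAIN SELECT 1;")` passes, while `is_read_only("select 1; drop table users;")` is rejected because of the embedded second statement.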

Web Automation & Search Servers

Playwright MCP

Provides browser automation capabilities, allowing LLMs to interact with web pages through structured accessibility snapshots, bypassing the need for screenshots or visually-tuned models. It’s fast, lightweight, and LLM-friendly. It’s particularly useful for testing web applications running locally.
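Registration is another stdio-style entry; a sketch for Cursor’s `mcp.json` (the `@playwright/mcp` package name is current at the time of writing, so confirm against the project’s README):

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```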

Provides web search capabilities for AI agents. It’s highlighted for its AI-centric web search, summarizing information perfectly for LLMs, and finding supplemental resources like community forum posts that might not be in a knowledge base.

Firecrawl MCP

An open-source server that can grab text, images, or data from websites and organize it into formats like JSON or Markdown for Cursor to use. It can scan at high speeds and is great for research or content projects.
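To illustrate the kind of normalization involved, the sketch below turns scraped page records into a single Markdown document. This is not Firecrawl’s code or API; the field names are invented for illustration:

```python
# Sketch: normalize scraped page records into Markdown, the kind of
# reshaping a scraper-to-LLM pipeline performs before handing text
# to a model. Field names here are hypothetical.
def to_markdown(pages: list[dict]) -> str:
    chunks = []
    for page in pages:
        chunks.append(
            f"# {page['title']}\n\nSource: {page['url']}\n\n{page['text']}"
        )
    return "\n\n---\n\n".join(chunks)

doc = to_markdown([
    {
        "title": "Pricing",
        "url": "https://example.com/pricing",
        "text": "Plans start at $0.",
    },
])
```

Markdown works well here because headings and horizontal rules give the LLM cheap structural cues without the token overhead of raw HTML.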

A programmatic search tool that helps find real statistics and solutions to recurring issues or bugs. It offers a better development experience, results, and features compared to Perplexity, and enables strong control over the context provided to the system for more precise and accurate AI outputs.

Specialized & Emerging Servers

Magic UI MCP

This design-oriented MCP helps integrate UI components and libraries into applications, and is especially useful for landing page design. It can translate designs into fully coded applications more effectively, improving code-generation fidelity and reducing LLM usage by minimizing guesswork.

Pieces

Acts as a shared memory pool across different AI coding assistants: it watches user activity in tools like Claude, Gemini AI Studio, Perplexity, and Cursor, and saves contextual snippets to a database so any of them can recall that context later.

Sentry MCP

Provides a secure way to bring Sentry’s full issue context into MCP-leveraging systems. It allows access to Sentry issues and errors, searching for errors in specific files, querying projects, and even invoking Seer for automated issue fixes. It is remotely hosted (preferred) and supports OAuth.
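Since the hosted deployment is the preferred one, configuration is typically just the remote URL, with the OAuth flow handled by the client. A sketch (the URL follows Sentry’s docs at the time of writing; verify before use):

```json
{
  "mcpServers": {
    "sentry": {
      "url": "https://mcp.sentry.dev/mcp"
    }
  }
}
```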

Zapier

Connects AI agents to 8,000+ apps instantly, allowing for broad automation capabilities across various services.
