Adrienne Vermorel
MCP Ecosystem Overview: Servers, Clients, and SDKs
Choosing which MCP servers to install means navigating thousands of options. The main repository has 75,000+ GitHub stars, every major AI vendor has adopted the protocol, and community directories list over 5,800 servers.
This article maps the current landscape: official servers you can trust, clients worth considering, SDKs for building your own, and the database-specific servers that actually matter for our work.
Official Reference Servers
The MCP Steering Group maintains reference servers that demonstrate best practices. These are the ones you can trust for production use, and they’re good models if you end up building your own.
The actively maintained servers:
| Server | Description | Language | Repository |
|---|---|---|---|
| Everything | Reference/test server demonstrating prompts, resources, and tools | TypeScript | modelcontextprotocol/servers |
| Fetch | Web content fetching and conversion for efficient LLM usage | Python | modelcontextprotocol/servers |
| Filesystem | Secure file operations with configurable access controls | TypeScript | modelcontextprotocol/servers |
| Git | Tools to read, search, and manipulate Git repositories | Python | modelcontextprotocol/servers |
| Memory | Knowledge graph-based persistent memory system | TypeScript | modelcontextprotocol/servers |
| Sequential Thinking | Dynamic and reflective problem-solving through thought sequences | TypeScript | modelcontextprotocol/servers |
| Time | Time and timezone conversion capabilities | Python | modelcontextprotocol/servers |
The Filesystem server is one of the most useful, particularly when developing custom agents: AI assistants can read your SQL files, dbt models, and configuration without you copying and pasting everything. Git is similarly practical for repository exploration and code review assistance.
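If you're wiring the Filesystem server into a custom agent, the Python SDK's client API (covered in the SDKs section below) can launch it over stdio and list its tools. A minimal sketch, assuming the npx-distributed @modelcontextprotocol/server-filesystem package; the project path is a placeholder to replace with your own.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the reference Filesystem server over stdio, scoped to one directory.
server_params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/path/to/your/project"],
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```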
Several original reference servers have been archived and transferred to community or vendor maintenance. They still work, but the core MCP team no longer updates them:
Database Servers:
- PostgreSQL
- SQLite
- Redis
Platform Integrations:
- GitHub (now maintained by GitHub)
- Slack (now maintained by Zencoder)
- Google Drive
- Google Maps
- Brave Search (now maintained by Brave)
Development Tools:
- Puppeteer
- Sentry
- AWS KB Retrieval
The transition to vendor maintenance benefits users. When GitHub maintains the GitHub MCP server, you get better integration and faster updates when their APIs change.
Client Landscape
MCP clients are applications that connect to servers and expose their capabilities to AI models. There are over 300 clients now, with 40+ in active development. Most of us will use just one or two.
| Client | Type | Platform | Key Features |
|---|---|---|---|
| Claude Desktop | Desktop app | Windows, macOS | Full native support (stdio), creator of MCP |
| Claude Code | CLI tool | Cross-platform | Full MCP support for coding workflows |
| Cursor | Code editor | Windows/Mac/Linux | Full MCP with SSE support |
| VS Code + GitHub Copilot | Extension | Cross-platform | Auto-discovery, command-line MCP |
| Windsurf | Code editor | Windows/Mac/Linux | Full MCP support |
| Cline | VS Code extension | Windows/Mac | Agentic coding with MCP tool creation |
| Continue | VS Code/JetBrains extension | Windows/Mac | Open-source AI code assistant with MCP |
| Zed | Code editor | macOS/Linux | Native MCP integration |
The choice depends on how you work. Claude Desktop works well for exploratory analysis, documentation review, and ad-hoc queries. Claude Code fits naturally into terminal-based workflows like running dbt. Cursor or Windsurf make sense when most of your work involves editing files in an IDE, letting you query databases or check pipeline status without context-switching.
Corporate Adopters
The protocol has broad industry support. OpenAI adopting it in March 2025 signaled that MCP extends beyond Anthropic’s ecosystem.
| Company | Integration | Significance |
|---|---|---|
| OpenAI | Adopted March 2025; integrated across Agents SDK and ChatGPT desktop | Validates MCP as the cross-vendor standard |
| Microsoft | Windows 11 MCP integration announced at Build 2025; Azure MCP Server | OS-level MCP support coming |
| AWS | Official awslabs/mcp servers | Cloud provider endorsement |
| Cloudflare | Official MCP server for Workers/KV/R2/D1 | Edge computing integration |
| Atlassian | Official Jira/Confluence MCP server | Project management integration |
| GitHub | Official GitHub MCP server; MCP Registry | Code platform integration |
| Block, Bloomberg | Production deployments | Enterprise validation |
In December 2025, Anthropic donated MCP to the Agentic AI Foundation under the Linux Foundation, making it a vendor-neutral open standard.
SDKs and Repositories
Building your own MCP server requires one of the official SDKs. Most languages are covered.
| Repository | Description | Stars |
|---|---|---|
| modelcontextprotocol/servers | Reference server implementations | 75.3K |
| modelcontextprotocol/python-sdk | Python SDK | 13.5K |
| modelcontextprotocol/typescript-sdk | TypeScript/Node.js SDK | 7.2K |
| modelcontextprotocol/csharp-sdk | C# SDK | 2.3K |
For data engineers, the Python SDK is the natural choice. Installation:
```bash
# Using uv (recommended)
uv add "mcp[cli]"

# Using pip
pip install "mcp[cli]"
```

Requires Python 3.10+. The SDK includes FastMCP, a high-level framework that handles most of the boilerplate:
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("MyDataServer")


@mcp.tool()
def query_table(table_name: str, limit: int = 100) -> str:
    """Query a table and return results."""
    # Your database logic here
    return f"Results from {table_name}"


if __name__ == "__main__":
    mcp.run(transport="stdio")
```

The TypeScript SDK works for Node.js environments:
```bash
npm install @modelcontextprotocol/sdk zod
```

Requires Node.js 18+ (22.7.5+ recommended). Community SDKs also exist for Java, Kotlin, Go, Rust, Swift, Ruby, and PHP.
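Back in Python, FastMCP isn't limited to tools; it can also publish resources that clients load as read-only context. A minimal sketch building on the example above: the schema:// URI scheme and the hard-coded catalog are illustrative placeholders, while the resource-template decorator follows the Python SDK's pattern.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("MyDataServer")

# Illustrative stand-in for real warehouse metadata.
SCHEMAS = {
    "orders": "order_id INT, customer_id INT, amount NUMERIC, created_at TIMESTAMP",
}


@mcp.resource("schema://{table_name}")
def get_schema(table_name: str) -> str:
    """Return a table's column definition as read-only context."""
    return SCHEMAS.get(table_name, f"No schema recorded for {table_name}")


if __name__ == "__main__":
    mcp.run(transport="stdio")
```

A client that supports resources can attach schema://orders to a conversation without spending a tool call on it.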
Data Engineering Servers
Rather than building custom integrations for each AI assistant, these servers provide standardized access to your data infrastructure.
| Server | Databases/Tools Supported | Repository |
|---|---|---|
| Snowflake-Labs/mcp | Cortex AI, SQL execution, semantic views, RBAC | snowflake-labs/mcp |
| ClickHouse/mcp-clickhouse | Schema inspection, query capabilities | ClickHouse/mcp-clickhouse |
| googleapis/genai-toolbox | BigQuery, Cloud SQL, Spanner, AlloyDB | googleapis/genai-toolbox |
| centralmind/gateway | PostgreSQL, MySQL, ClickHouse, Snowflake, BigQuery, MSSQL, Oracle, SQLite, ElasticSearch, DuckDB | centralmind/gateway |
| MindsDB MCP | Federated queries across PostgreSQL, MySQL, MongoDB, Snowflake, BigQuery | mindsdb/mcp |
| Databricks MCP | SQL queries via Statement Execution API | Community |
| confluentinc/mcp-confluent | Confluent Kafka, Cloud REST APIs | confluentinc/mcp-confluent |
The official Snowflake MCP server from Snowflake Labs includes Cortex AI integration, SQL execution, semantic views, and RBAC support. AI assistants can help with query optimization and data exploration using your actual warehouse data while respecting your existing access controls.
The centralmind/gateway server is worth knowing about if you have multiple databases. A single server can provide access to PostgreSQL, MySQL, ClickHouse, Snowflake, BigQuery, SQL Server, Oracle, SQLite, ElasticSearch, and DuckDB. One configuration file, one server process, access to your entire data landscape.
For streaming infrastructure, mcp-confluent lets AI assistants list topics and schemas, produce and consume messages, and access Confluent Cloud REST APIs. Useful for debugging pipelines or exploring topic contents.
Discovery Resources
The official registry at registry.modelcontextprotocol.io is the authoritative source for verified servers. Community directories aggregate the broader ecosystem:
| Resource | Description | URL |
|---|---|---|
| awesome-mcp-servers | Curated GitHub list with 5,800+ servers | punkpeye/awesome-mcp-servers |
| mcpservers.org | Searchable directory with categories | mcpservers.org |
| mcp.so | Discovery platform with ratings | mcp.so |
| pulsemcp.com | Ecosystem tracker with adoption metrics | pulsemcp.com |
Getting Started
The Filesystem server is the most universally useful for custom agents. AI assistants can read your SQL files, dbt models, and documentation directly instead of you copying everything into chat windows.
After that, your primary database (Snowflake, BigQuery via MCP Toolbox, PostgreSQL) makes the biggest difference. Schema awareness and sample data access turn generic AI suggestions into ones that actually work with your tables.
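If your primary database doesn't have an official server yet, the same idea is small enough to sketch yourself. An illustrative FastMCP tool, using SQLite from the standard library as a stand-in for a real warehouse connection; the database path and table names are placeholders.

```python
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("SchemaServer")

DB_PATH = "analytics.db"  # Placeholder: point at your own database file.


@mcp.tool()
def describe_table(table_name: str) -> str:
    """Return column names and types so generated SQL matches the actual table."""
    # PRAGMA statements can't be parameterized; validate table_name against a
    # whitelist before interpolating it in real code.
    with sqlite3.connect(DB_PATH) as conn:
        rows = conn.execute(f"PRAGMA table_info({table_name})").fetchall()
    if not rows:
        return f"No table named {table_name}"
    return "\n".join(f"{name}: {col_type}" for _, name, col_type, *_ in rows)


if __name__ == "__main__":
    mcp.run(transport="stdio")
```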
Git and the dbt MCP server are worth adding if you do code review or need lineage context, but they’re refinements rather than essentials.