Browse 589 Model Context Protocol servers and integrations. Connect your AI tools to data sources, APIs, and workflows.

llm-context provides intelligent context management for LLMs, enabling focused, task-specific project context through composable rules and MCP integration.

This MCP server provides access to OpenAI's ChatGPT API, enabling conversational AI within Claude Desktop, with web search and conversation state management.

Exposes the current date and time for specified timezones as well as the system's local time, as read-only datetime data.

Provides intelligent web search capabilities using OpenAI's reasoning models, ideal for AI assistants needing up-to-date information.

This MCP server provides job search functionality with filtering options, allowing users to find relevant job postings based on keywords, location, and other criteria.

The VictoriaMetrics MCP server provides a Model Context Protocol interface for querying and interacting with VictoriaMetrics instances, enabling monitoring, observability, and debugging.

This MCP server enables web browsing via Playwright, exposing a 'navigate' tool driven by Azure OpenAI; it requires careful configuration and dependency management.

Kubernetes MCP server provides a flexible interface to interact with Kubernetes clusters, supporting read and write operations, Helm, and OpenTelemetry.

This MCP server provides read-only access to SQLite databases with query validation, table listing, and schema description, enhancing safe data exploration.
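The read-only guarantee in a server like this usually rests on two layers: validating queries before execution, and opening the database in read-only mode so even a validation gap cannot write. A rough sketch of both ideas with the standard sqlite3 module (the function names are illustrative, not this server's actual interface):

```python
import sqlite3

def is_read_only(sql: str) -> bool:
    # Naive validation: accept only a single SELECT statement.
    stripped = sql.strip().rstrip(";")
    return stripped.upper().startswith("SELECT") and ";" not in stripped

def run_query(db_path: str, sql: str):
    if not is_read_only(sql):
        raise ValueError("only SELECT statements are allowed")
    # The file: URI with mode=ro makes SQLite itself refuse writes.
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()
```

Table listing and schema description can then be built on the same path, e.g. by selecting from sqlite_master.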

This MCP server enables AI models to interact with the YCloud WhatsApp API, automating tasks like sending messages and managing contacts via API calls.

This MCP server provides real-time cross-chain bridge rates and optimal transfer routes, helping on-chain AI agents make informed decisions.

MCP Text Editor Server provides line-oriented text file editing with conflict detection, partial file access, and encoding support, optimized for LLM tools.
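Conflict detection in a line-oriented editor of this kind is commonly done by hashing the file contents when they are read and rejecting an edit if the hash no longer matches at write time, which signals a concurrent modification. A simplified sketch of that idea (the names are illustrative, not this server's actual API):

```python
import hashlib

def file_hash(lines: list[str]) -> str:
    """Fingerprint the file contents as seen at read time."""
    return hashlib.sha256("".join(lines).encode("utf-8")).hexdigest()

def apply_edit(lines: list[str], start: int, end: int,
               replacement: list[str], expected_hash: str) -> list[str]:
    """Replace lines[start:end] (0-based, end-exclusive) with `replacement`.

    Raises if the file changed since `expected_hash` was computed,
    i.e. a concurrent-edit conflict.
    """
    if file_hash(lines) != expected_hash:
        raise RuntimeError("conflict: file changed since it was read")
    return lines[:start] + replacement + lines[end:]
```

Partial file access follows naturally from the same line-range addressing, and encoding support amounts to choosing the codec used when reading and hashing.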