
Build Custom MCP Catalogs with Docker: Enterprise Control for AI Tooling

·5 mins
Nitin Kumar Singh

Docker has introduced comprehensive MCP (Model Context Protocol) tooling that lets organizations build custom catalogs with complete control over AI tool access. With more than 220 containerized MCP servers available and the ability to create private catalogs, enterprises can now deploy AI tooling with appropriate security guardrails.

The Challenge

Enterprise customers need tighter control over AI tooling access. Many organizations have strict security policies prohibiting direct pulls from Docker Hub, while others want to offer only a curated set of trusted MCP servers to their teams. Docker’s MCP ecosystem addresses these challenges head-on.

The Solution: Docker’s MCP Ecosystem

Docker’s MCP ecosystem consists of three integrated components:

1. MCP Catalog

A YAML-based index of MCP server definitions that describes how to run each server, along with metadata (description, image, repository). The official catalog hosts 220+ containerized, security-hardened MCP servers ready to run. While the official catalog is read-only, you can fork it, export it, or build a completely custom catalog of your own.
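A catalog entry is just a few lines of YAML. A minimal sketch of what one might look like (field names here are illustrative, inferred from the metadata described above; export the official catalog to see the exact schema):

```yaml
# Hypothetical catalog entry; compare against an export of the official catalog
registry:
  duckduckgo:
    description: "Web search capabilities through DuckDuckGo"
    image: mcp/duckduckgo@sha256:...   # pinned, security-hardened image
    repository: https://github.com/nickclyde/duckduckgo-mcp-server
```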

2. MCP Gateway

The open-source MCP Gateway acts as a centralized proxy between clients and servers. Instead of configuring X servers across Y clients (requiring X × Y configuration entries), the Gateway reduces this to just Y entries, one per client.

The Gateway doesn’t “host” anything; servers run as regular Docker containers with proper isolation, restricted privileges, limited network access, and capped resource usage. It provides:

  • Centralized connection point exposing multiple servers over HTTP SSE or STDIO
  • Lifecycle management for all MCP servers
  • Security isolation running servers in Docker containers with minimal privileges
  • Built-in logging and call-tracing for full visibility and governance
  • Automatic credential injection with secure secrets management

3. MCP Toolkit (GUI)

Built into Docker Desktop, the MCP Toolkit provides a graphical interface to:

  • Browse and access Docker’s MCP Catalog
  • Securely handle secrets (API keys, GitHub tokens)
  • Enable/disable MCP servers with one click
  • Connect servers to clients like Claude Desktop, Claude Code, Cursor, Continue.dev, and Gemini CLI

Key Benefits

  • Control: Host MCP server images in your own container registry
  • Security: Enforce strict policies with containerized isolation and restricted privileges
  • Simplicity: Manage all servers through a single gateway connection
  • Flexibility: Fork the existing catalog or build completely custom ones from scratch
  • Visibility: Built-in logging and tracing of all AI tool activity
  • Secrets Management: Secure credential handling via Docker Desktop

Getting Started with Custom Catalogs

Creating a custom MCP catalog involves seven key steps:

Quick Start Commands

# 1. Export the official catalog to inspect contents
docker mcp catalog show docker-mcp --format yaml > docker-mcp.yaml

# 2. Fork the official catalog for editing
docker mcp catalog fork docker-mcp my-fork

# 3. Create your private catalog
docker mcp catalog create my-private-catalog

# 4. Add specific servers from your fork
docker mcp catalog export my-fork ./my-fork.yaml
docker mcp catalog add my-private-catalog duckduckgo ./my-fork.yaml

# 5. Pull, retag, and push images to your registry
docker pull mcp/duckduckgo@sha256:...
docker image tag mcp/duckduckgo@sha256:... ghcr.io/yourorg/duckduckgo:latest
docker push ghcr.io/yourorg/duckduckgo:latest

# 6. Update catalog to point to your images
# Edit ~/.docker/mcp/catalogs/my-private-catalog.yaml

# 7. Enable and run the server
docker mcp server enable duckduckgo
docker mcp gateway run --catalog my-private-catalog
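Step 6 above boils down to swapping each image reference in the catalog file for the copy in your registry. A hedged sketch of the edit (field names are illustrative; check your exported catalog YAML for the exact schema):

```yaml
# ~/.docker/mcp/catalogs/my-private-catalog.yaml (illustrative)
registry:
  duckduckgo:
    description: "Web search capabilities through DuckDuckGo"
    image: ghcr.io/yourorg/duckduckgo:latest   # was: mcp/duckduckgo@sha256:...
```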

Client Integration

For VS Code or other clients, update the MCP configuration:

{
  "servers": {
    "docker-mcp-gateway-private": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "mcp",
        "gateway",
        "run",
        "--catalog",
        "my-private-catalog"
      ]
    }
  }
}

For Claude Code, use the simplified command:

docker mcp client connect claude-code --global

Advanced Use Cases

Building Images from Source

If you cannot pull from Docker Hub, rebuild MCP servers from their GitHub repositories:

# Find the source repository in the catalog YAML
# Example for DuckDuckGo
export SOURCE_REPO="https://github.com/nickclyde/duckduckgo-mcp-server.git"

# Build and push multi-platform image
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  "${SOURCE_REPO}" \
  -t ghcr.io/yourorg/duckduckgo:latest \
  --push

Managing Configuration and Secrets

# Manage server configuration
docker mcp config read
docker mcp config write '<yaml-config>'

# Handle secrets securely
docker mcp secret --help

# OAuth flows
docker mcp oauth --help

# Export secrets for cloud deployments
docker mcp secret export server1 server2

Tool Management

# List all available tools
docker mcp tools ls

# Inspect a specific tool
docker mcp tools inspect <tool-name>

# Call a tool directly
docker mcp tools call <tool-name> [arguments...]

Architecture

The MCP Gateway implements a centralized proxy pattern:

AI Client → MCP Gateway → MCP Servers (Docker Containers)

Each MCP server runs in an isolated Docker container with:

  • Minimal host privileges
  • Restricted network access
  • Resource usage limits
  • Automatic credential injection
  • Full logging and tracing

What’s Next?

This is transformative for enterprises deploying AI tooling with proper governance. By using Docker’s MCP ecosystem, you can:

  • Add more servers to your custom catalog
  • Set up CI/CD to rebuild and publish server images automatically
  • Share catalogs internally or with customers
  • Maintain full visibility and control over AI tool usage

Learn More

Complete Tutorial: Build Custom MCP Catalog - Step-by-step guide by Docker Staff Solutions Architect Mike Coleman

Official Documentation:

Source Code: MCP Gateway GitHub Repository - 998 stars, actively maintained

MCP Catalog: Explore 220+ Servers - Browse containerized, security-hardened MCP servers

Want more AI and MCP content? Explore additional MCP-related posts on this blog.
