Run an Azure MCP Server on Azure Kubernetes Service

AI workloads are moving fast. Large Language Models (LLMs) are no longer isolated systems. They need context from code repositories, container platforms, APIs, and internal tools. This is where the Model Context Protocol (MCP) becomes important.

MCP is a protocol that allows AI agents to connect to multiple external systems in a structured and secure way. An MCP server acts as a bridge between an AI model and real systems like GitHub, Docker, or internal services.
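
Under the hood, MCP is based on JSON-RPC 2.0. A client first asks a server which tools it exposes, then invokes one of them by name. As a minimal sketch (the transport, typically stdio or streamable HTTP, depends on the server; the get_file_contents tool shown here comes from the GitHub MCP server, and other servers expose different tools):

{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_file_contents",
    "arguments": { "owner": "my-org", "repo": "my-repo", "path": "deployment.yaml" }
  }
}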

In this article, you will learn how MCP fits into an Azure and AKS-based architecture. First, you will explore the theory behind MCP and why it matters for AI platforms. After that, you will deploy an MCP server on Azure Kubernetes Service and connect it to real-world systems like GitHub and Docker.

When you finish this article, you will have a clear understanding of MCP in an AI platform context, a running MCP server on AKS, and a practical setup that connects AI to GitHub and Docker contexts.

Prerequisites

Before diving into the next steps, ensure you have the following prerequisites in place:

  • An active Azure subscription with an existing Azure Kubernetes Service (AKS) cluster and sufficient Azure permissions
  • Azure CLI installed and logged in
  • kubectl configured to access your AKS cluster
  • Helm installed on your local machine
  • A personal or organization account on GitHub, and a GitHub Personal Access Token (PAT)
  • A local development environment, such as VS Code, for editing the YAML files

Understanding MCP in an AI and AKS Context

AI models work best when they have context. Without context, an LLM can only guess. MCP is designed to solve this problem by standardizing how context is provided to AI agents.

An MCP server exposes context sources as tools. Examples of context sources are:

  • GitHub repositories
  • Docker images and containers
  • File systems
  • Internal APIs

From an architectural view, MCP fits well in Kubernetes. It runs as a stateless service, scales horizontally, and integrates with cloud-native security models.
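
Because the server is stateless, horizontal scaling needs nothing more than a standard HorizontalPodAutoscaler. Here is a minimal sketch, written against the mcp-server Deployment you will create later in this article:

apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: mcp-server
  namespace: mcp-system
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: mcp-server
  minReplicas: 2
  maxReplicas: 5
  metrics:
    # CPU-based scaling only works if the container declares CPU requests
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70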

In AKS, MCP servers are often part of a larger AI platform:

  • AKS runs the MCP server
  • Azure OpenAI or another LLM consumes MCP context
  • GitHub and Docker provide real-time operational data

This setup allows AI agents to answer questions like: “What is running in production and which code version is deployed?”

In the next steps, you will build this architecture step by step.

Deploying an MCP Server on AKS

To make MCP more concrete, this article uses a simple but realistic scenario where an AI assistant helps platform engineers understand their runtime environment. It can read deployment manifests from GitHub, inspect Docker images used in AKS, and answer questions about running workloads.

To support this, the MCP server will connect to the GitHub MCP provider for repository context, and the Docker MCP provider for container context. This setup is common in internal developer platforms and AI-assisted operations.

First, create a namespace for the MCP server:

kubectl create namespace mcp-system

Next, create a basic deployment for the MCP server. This example uses a generic MCP server container. Create the file mcp-deployment.yaml:

[label mcp-deployment.yaml]
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mcp-server
  namespace: mcp-system
spec:
  replicas: 2
  selector:
    matchLabels:
      app: mcp-server
  template:
    metadata:
      labels:
        app: mcp-server
    spec:
      containers:
        - name: mcp-server
          image: <your_mcp_server_image>
          ports:
            - containerPort: 3333
          env:
            - name: MCP_PORT
              value: "3333"

Apply the deployment by running kubectl apply -f mcp-deployment.yaml.
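
Before moving on, verify that the rollout finished and both replicas are running:

kubectl rollout status deployment/mcp-server -n mcp-system
kubectl get pods -n mcp-system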

Now expose the MCP server inside the cluster:

[label mcp-service.yaml]
apiVersion: v1
kind: Service
metadata:
  name: mcp-server
  namespace: mcp-system
spec:
  selector:
    app: mcp-server
  ports:
    - port: 3333
      targetPort: 3333

Apply the service by running kubectl apply -f mcp-service.yaml.
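
To check that the server responds, forward the service to your local machine and send a tools/list request like the one shown earlier. This sketch assumes your MCP server image exposes an HTTP transport at the root path on port 3333; adjust the path to whatever endpoint your image documents:

kubectl port-forward -n mcp-system svc/mcp-server 3333:3333

# in a second terminal
curl -X POST http://localhost:3333/ \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'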

At this point, the MCP server is running inside AKS. In the next step, you will connect external context sources.

Connecting the GitHub MCP Provider

The GitHub MCP provider allows the MCP server to read repositories, issues, and workflows.

First, create a Kubernetes secret with your GitHub token:

kubectl create secret generic github-token \
  --from-literal=token=<your_github_token> \
  -n mcp-system
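
You can confirm the secret exists (kubectl shows its metadata, not its value) with:

kubectl get secret github-token -n mcp-system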

Now update the MCP deployment to include GitHub configuration. Edit mcp-deployment.yaml and extend the container's env block so that it looks like this:

[label mcp-deployment.yaml]
...
          env:
            - name: MCP_PORT
              value: "3333"
            - name: GITHUB_TOKEN
              valueFrom:
                secretKeyRef:
                  name: github-token
                  key: token
            - name: MCP_GITHUB_ENABLED
              value: "true"
...

Apply the updated deployment by running kubectl apply -f mcp-deployment.yaml.
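
To confirm the new configuration reached the pods, check that the feature flag is set (avoid printing GITHUB_TOKEN itself):

kubectl exec -n mcp-system deploy/mcp-server -- printenv MCP_GITHUB_ENABLED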

The MCP server can now fetch context from GitHub repositories. In the next step, you will add Docker context.

Connecting the Docker MCP Provider

Docker context gives insight into container images and metadata. This is useful for AI-assisted debugging and compliance checks.

Update the deployment again and add the Docker-related configuration. The env entries and the volumeMounts block go on the container, while the volumes block belongs at the pod spec level:

[label mcp-deployment.yaml]
...
          env:
            - name: MCP_DOCKER_ENABLED
              value: "true"
            - name: DOCKER_HOST
              value: "unix:///var/run/docker.sock"
          volumeMounts:
            - name: docker-socket
              mountPath: /var/run/docker.sock
      volumes:
        - name: docker-socket
          hostPath:
            path: /var/run/docker.sock
            type: Socket
...

Apply the changes by running kubectl apply -f mcp-deployment.yaml.

⚠️ Note: Mounting the Docker socket grants broad access to the node and should be carefully reviewed and restricted in production. Also be aware that current AKS node pools use containerd as their container runtime, so /var/run/docker.sock may not exist on your nodes; in that case, point DOCKER_HOST at an endpoint that actually runs Docker instead.

Now the MCP server can combine GitHub and Docker context.
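
As a quick end-to-end test, you can call a Docker-context tool through the same port-forward used earlier. The tool name list_containers below is hypothetical; run tools/list to see the names your server's Docker provider actually registers:

# list_containers is a hypothetical tool name; check tools/list for the real one
curl -X POST http://localhost:3333/ \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"list_containers","arguments":{}}}'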

Conclusion

In this article, you deployed an Azure MCP server on Azure Kubernetes Service and connected it to real-world context sources like GitHub and Docker. You first explored the theory behind MCP and its role in AI platforms. After that, you built a practical, hands-on setup on AKS. With MCP running, an AI agent can query it using the protocol. Example questions the AI can now answer:

  • “Which Docker image is used by this deployment?”
  • “Which GitHub commit introduced this configuration?”
  • “Is this container using an outdated base image?”

The MCP server translates these questions into structured queries to GitHub and Docker, then returns normalized context to the AI model. This makes AI responses more accurate and less speculative. You can now extend this setup by adding more MCP providers, securing MCP access with network policies, or connecting MCP to Azure OpenAI or another LLM service. Whatever your next step is, MCP helps bring AI closer to real systems. On AKS, it fits naturally into cloud-native and platform engineering patterns.
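
As a starting point for the network policies mentioned above, here is a minimal sketch that only admits traffic to the MCP server from pods in the mcp-system namespace carrying a role=mcp-client label (a hypothetical label you would put on your AI workloads). Note that this only takes effect if your AKS cluster has a network policy engine, such as Azure Network Policy Manager or Calico, enabled:

apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: mcp-server-ingress
  namespace: mcp-system
spec:
  podSelector:
    matchLabels:
      app: mcp-server
  policyTypes:
    - Ingress
  ingress:
    - from:
        # matches only pods in this namespace with the hypothetical client label
        - podSelector:
            matchLabels:
              role: mcp-client
      ports:
        - protocol: TCP
          port: 3333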

Thank you for taking the time to go through this post and making it to the end. Stay tuned, because we’ll keep providing more content on topics like this in the future.

Author: Rolf Schutten

Posted on: December 14, 2025