MCP Server Overview

Introduction

Model Context Protocol (MCP) is Anthropic's standard for connecting AI systems to external tools and data sources. Aegis leverages MCP extensively for orchestration, integration, and automation.

Architecture

Tool Naming Convention

All MCP tools follow the pattern: mcp__{server}__{tool_name}

Example: mcp__docker__list_containers, mcp__aegis__spawn_agent

This convention enables:

- Clear server identification
- Cache-friendly consistent ordering
- Easy pattern matching for dynamic discovery
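For example, the convention can be parsed mechanically, which is what makes pattern matching for discovery cheap. A small illustrative helper (not part of any SDK):

```python
def parse_tool_name(qualified: str) -> tuple[str, str]:
    """Split an mcp__{server}__{tool_name} identifier into (server, tool).

    Illustrative helper: it relies only on the double-underscore
    delimiters in the convention above, so hyphenated server names
    (google-workspace) and underscored tool names (list_containers)
    both survive intact.
    """
    parts = qualified.split("__", 2)
    if len(parts) != 3 or parts[0] != "mcp" or not parts[1] or not parts[2]:
        raise ValueError(f"not an MCP tool name: {qualified!r}")
    return parts[1], parts[2]
```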

MCP Server Types

Aegis uses three categories of MCP servers:

1. Core Anthropic Servers

Standard servers from the MCP ecosystem:

- filesystem - File operations (read, write, search)
- docker - Container lifecycle management
- postgres - Database queries with pgvector support
- github - Repository and issue management
- playwright - Browser automation

2. Third-Party Integrations

Community-maintained servers:

- discord - Guild messaging and webhooks
- telegram - Bot messaging
- ollama - Local LLM inference
- google-workspace - Gmail, Calendar, Drive
- vonage - WhatsApp, SMS, Voice

3. Custom Aegis Servers

Aegis-specific orchestration and automation:

- aegis - Core orchestration (agents, planning, workflows)
- graphiti - Knowledge graph operations
- stackwiz - Docker deployment to rbnk.uk
- memory - Entity storage and retrieval

Available MCP Servers

Communication

| Server | Tools | Purpose |
|--------|-------|---------|
| discord | 15+ | Send messages, read channels, manage webhooks |
| telegram | 8+ | Bot messaging, channel management |
| vonage | 12+ | WhatsApp (WABA), SMS, Voice, RCS |
| google-workspace | 20+ | Gmail triage, Calendar events, Drive files |

Infrastructure

| Server | Tools | Purpose |
|--------|-------|---------|
| docker | 25+ | Container, image, network, volume management |
| stackwiz | 10+ | Service stacks, DNS records, health checks |
| postgres | 5+ | SQL queries, schema inspection, pgvector ops |

Knowledge & Memory

| Server | Tools | Purpose |
|--------|-------|---------|
| graphiti | 5+ | Knowledge graph: add_episode, search_nodes, search_facts |
| memory | 8+ | Entity storage, relations, observations |

Development

| Server | Tools | Purpose |
|--------|-------|---------|
| github | 30+ | Repos, issues, PRs, code search, file operations |
| filesystem | 10+ | Read, write, search, directory operations |

Orchestration

| Server | Tools | Purpose |
|--------|-------|---------|
| aegis | 50+ | Agent spawning, HTN planning, workflows, error tracking |

Research & Automation

| Server | Tools | Purpose |
|--------|-------|---------|
| playwright | 15+ | Browser automation, screenshots, form filling |
| ollama | 5+ | Local model inference (vision, reasoning) |
| notebooklm | 37+ | Gemini research partner, source ingestion |
| twitter | 15+ | User info, tweets, search, trends |
| annas-archive | 3+ | Book search and download |

Financial

| Server | Tools | Purpose |
|--------|-------|---------|
| starling | 11+ | Balance, transactions, spaces, card controls |

AI/LLM

| Server | Tools | Purpose |
|--------|-------|---------|
| zai-vision | 5+ | UI analysis, OCR, diagram understanding |
| model-intelligence | 10+ | Model database (693 models), pricing, benchmarks |

Dynamic Discovery with mcp-cli

Problem: Context Window Bloat

Loading all MCP tools at session start consumes ~234,000 tokens (26 servers exposing ~300 tools in total). This shrinks the context available for reasoning and increases API costs.

Solution: Dynamic Tool Discovery

The mcp-cli tool enables on-demand tool loading:

```bash
# List all servers (minimal context)
mcp-cli

# Show tools for specific server
mcp-cli docker

# Search by pattern
mcp-cli grep "*deploy*"

# Get tool schema
mcp-cli docker/list_containers

# Execute tool
mcp-cli docker/list_containers '{"all": false}'
```

Token Savings: 99% reduction (234,000 → ~2,000 tokens)

Tool Profiles

Aegis uses task-aware profiles for efficient discovery:

| Profile | Servers | Token Budget |
|---------|---------|--------------|
| infrastructure | docker, stackwiz, postgres | 8,000 |
| code | github, filesystem, memory | 12,000 |
| communication | discord, telegram, vonage, gmail | 8,000 |
| research | notebooklm, twitter, gdrive | 10,000 |
| monitoring | docker, postgres, starling, aegis | 6,000 |
| essential | filesystem, postgres, memory, discord | 10,000 |
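The profile table above could be modeled as a simple mapping from profile name to servers and budget. An illustrative sketch (the PROFILES structure is an assumption, not the actual Aegis data model):

```python
# Illustrative representation of the task-aware profiles table.
# Server lists and budgets are copied from the table; the dict
# layout itself is an assumption, not Aegis's internal format.
PROFILES: dict[str, dict] = {
    "infrastructure": {"servers": ["docker", "stackwiz", "postgres"], "token_budget": 8_000},
    "code": {"servers": ["github", "filesystem", "memory"], "token_budget": 12_000},
    "communication": {"servers": ["discord", "telegram", "vonage", "gmail"], "token_budget": 8_000},
    "research": {"servers": ["notebooklm", "twitter", "gdrive"], "token_budget": 10_000},
    "monitoring": {"servers": ["docker", "postgres", "starling", "aegis"], "token_budget": 6_000},
    "essential": {"servers": ["filesystem", "postgres", "memory", "discord"], "token_budget": 10_000},
}

def servers_for_profile(profile: str) -> list[str]:
    """Look up which servers a task profile should discover."""
    return PROFILES[profile]["servers"]
```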

Python API

```python
from aegis.mcp import discover_tools, execute_tool

# (run inside an async function or event loop)

# Discover tools for a task
result = await discover_tools(["docker", "stackwiz"])
print(f"Found {result.tool_count} tools, saved {result.token_savings} tokens")

# Execute a tool
containers = await execute_tool(
    "docker",
    "list_containers",
    {"all": False}
)
```

MCP Server Development

Simple decorator-based API for rapid development:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("my-server", json_response=True)

@mcp.tool()
def calculate(operation: str, a: int, b: int) -> int:
    """Perform calculation"""
    return a + b if operation == "add" else a * b

if __name__ == "__main__":
    mcp.run(transport="stdio")
```

Low-Level API

For complex validation and custom transports:

```python
from mcp.server.lowlevel import Server
import mcp.types as types

server = Server("my-server")

@server.list_tools()
async def handle_list_tools() -> list[types.Tool]:
    return [
        types.Tool(
            name="calculate",
            description="Perform calculations",
            inputSchema={...},
            outputSchema={...}
        )
    ]

@server.call_tool()
async def handle_tool(name: str, arguments: dict) -> dict:
    if name == "calculate":
        return {"result": arguments["a"] + arguments["b"]}
    raise ValueError(f"Unknown tool: {name}")
```
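The elided schemas above could be filled in with plain JSON Schema objects. One possible version for the calculate tool (the specific constraints, such as the enum of operations, are assumptions for illustration):

```python
# Illustrative JSON Schema dicts for the "calculate" tool's
# inputSchema and outputSchema fields. The field names follow
# JSON Schema; the exact constraints are assumptions.
CALCULATE_INPUT_SCHEMA = {
    "type": "object",
    "properties": {
        "operation": {"type": "string", "enum": ["add", "multiply"]},
        "a": {"type": "integer"},
        "b": {"type": "integer"},
    },
    "required": ["operation", "a", "b"],
}

CALCULATE_OUTPUT_SCHEMA = {
    "type": "object",
    "properties": {"result": {"type": "integer"}},
    "required": ["result"],
}
```

Declaring both schemas lets clients validate arguments before a call and results after it, which is what Best Practice 4 below relies on.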

Configuration

MCP servers are configured in ~/.claude.json:

```json
{
  "mcpServers": {
    "docker": {
      "command": "docker-mcp",
      "args": []
    },
    "aegis": {
      "command": "python",
      "args": ["-m", "aegis_mcp.server"]
    }
  }
}
```
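As a quick sanity check, the configuration can be inspected programmatically. A minimal sketch, assuming the ~/.claude.json layout shown above (the helper name is illustrative):

```python
import json
from pathlib import Path

def list_configured_servers(config_path: str = "~/.claude.json") -> list[str]:
    """Return the names of MCP servers declared in the config file.

    Illustrative helper: reads the "mcpServers" mapping shown above
    and returns its keys in sorted order.
    """
    config = json.loads(Path(config_path).expanduser().read_text())
    return sorted(config.get("mcpServers", {}))
```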

Best Practices

1. Use Dynamic Discovery

Don't load all tools upfront. Use discover_tools() with task-specific patterns.

2. Consistent Naming

Follow the mcp__{server}__{tool} convention for cache efficiency.

3. Tool Grouping

Group related tools with verb_noun patterns: create_repository, list_repositories, delete_repository.

4. Schema Validation

Define inputSchema and outputSchema for all tools to prevent errors.

5. Async Operations

Use async handlers for I/O operations to avoid blocking.
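To illustrate the last point: an async handler lets multiple I/O-bound calls proceed concurrently instead of serializing them. A minimal asyncio sketch, where fetch_status is a hypothetical stand-in for a real HTTP or database call:

```python
import asyncio

async def fetch_status(name: str) -> str:
    # Hypothetical I/O call; a real handler would await an HTTP or
    # database client here instead of sleeping.
    await asyncio.sleep(0.01)
    return f"{name}: ok"

async def handle_tool(name: str, arguments: dict) -> dict:
    # Because the handler is async, the per-server checks run
    # concurrently rather than blocking one another.
    statuses = await asyncio.gather(
        *(fetch_status(s) for s in arguments["servers"])
    )
    return {"statuses": list(statuses)}
```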

Performance Metrics

| Metric | Static Loading | Dynamic Discovery |
|--------|----------------|-------------------|
| Token Usage | ~234,000 | ~2,000 |
| Session Startup | Slow | Fast |
| Context Compactions | Frequent | Rare |
| Cost per Session | High | Low |
