PostgreSQL MCP Server¶
Overview¶
The PostgreSQL MCP server provides database query execution, schema introspection, and pgvector operations for semantic search. It's used for workflow checkpointing, agent state persistence, and knowledge storage.
Tool Prefix: mcp__postgres__
Repository: https://github.com/modelcontextprotocol/servers/tree/main/src/postgres
Database Operations¶
query¶
Execute a SELECT query and return results.
Returns:
{
"rows": [
{"id": 1, "name": "aegis", "status": "active"},
{"id": 2, "name": "dashboard", "status": "running"}
],
"row_count": 2,
"columns": ["id", "name", "status"]
}
Example:
# Simple query
result = await mcp__postgres__query(
"SELECT * FROM workflows WHERE status = 'running'"
)
# Parameterized query
result = await mcp__postgres__query(
"SELECT * FROM agents WHERE template = $1 AND tokens_used < $2",
params=["researcher", 50000]
)
execute¶
Execute an INSERT, UPDATE, DELETE, or DDL statement.
Returns: The number of rows affected (the exact shape depends on the server implementation).
Example:
# Insert
await mcp__postgres__execute(
"INSERT INTO workflows (name, status) VALUES ($1, $2)",
params=["deployment-approval", "pending"]
)
# Update
await mcp__postgres__execute(
"UPDATE agents SET status = $1 WHERE agent_id = $2",
params=["completed", "agent-abc123"]
)
# Delete
await mcp__postgres__execute(
"DELETE FROM workflows WHERE created_at < NOW() - INTERVAL '30 days'"
)
transaction¶
Execute multiple statements in a single transaction: either all succeed or all are rolled back.
Example:
await mcp__postgres__transaction([
{
"sql": "INSERT INTO workflows (name) VALUES ($1) RETURNING id",
"params": ["research-pipeline"]
},
{
"sql": "INSERT INTO workflow_steps (workflow_id, step_name) VALUES ($1, $2)",
"params": ["{workflow_id}", "discover"] # Use result from first query
}
])
Schema Introspection¶
list_tables¶
List all tables in a schema.
Returns:
{
"tables": [
{"name": "workflows", "row_count": 156},
{"name": "agents", "row_count": 42},
{"name": "workflow_checkpoints", "row_count": 89}
]
}
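Example (the schema parameter name is an assumption; check your server's tool schema):
# List tables in the public schema
tables = await mcp__postgres__list_tables(schema="public")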
describe_table¶
Get table structure with column types and constraints.
Returns:
{
"table": "workflows",
"columns": [
{
"name": "id",
"type": "uuid",
"nullable": false,
"default": "gen_random_uuid()",
"is_primary_key": true
},
{
"name": "name",
"type": "text",
"nullable": false,
"unique": true
},
{
"name": "status",
"type": "text",
"nullable": false,
"default": "'pending'"
},
{
"name": "created_at",
"type": "timestamp",
"nullable": false,
"default": "NOW()"
}
],
"indexes": [
{"name": "idx_workflows_status", "columns": ["status"]}
]
}
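Example (assuming the table is identified by name):
# Inspect the workflows table shown above
schema_info = await mcp__postgres__describe_table(table="workflows")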
list_schemas¶
List all schemas in the database.
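Example (assuming no arguments are required):
schemas = await mcp__postgres__list_schemas()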
get_indexes¶
Get indexes for a table.
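Example (the table parameter name is an assumption):
indexes = await mcp__postgres__get_indexes(table="workflows")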
pgvector Operations¶
Aegis uses pgvector for semantic search and embedding storage.
Setup Extension¶
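The vector extension must be enabled once per database before the operations below will work. A minimal setup, issued through the execute tool:
# Enable pgvector (one-time; requires sufficient privileges)
await mcp__postgres__execute("CREATE EXTENSION IF NOT EXISTS vector")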
vector_search¶
mcp__postgres__vector_search(
table: str,
embedding_column: str,
query_embedding: list,
limit: int = 10,
where_clause: str = None
)
Semantic similarity search using cosine distance.
Example:
# Search knowledge graph entities
results = await mcp__postgres__vector_search(
table="entities",
embedding_column="embedding",
query_embedding=[0.1, 0.2, ...], # 384-dim vector
limit=5,
where_clause="entity_type = 'error'"
)
insert_vector¶
Insert a row with embedding.
Example:
await mcp__postgres__insert_vector(
table="entities",
columns={
"name": "ModuleNotFoundError",
"entity_type": "error",
"context": "Dashboard startup"
},
embedding_column="embedding",
embedding=error_embedding # From ollama nomic-embed-text
)
Connection Management¶
Configuration¶
PostgreSQL connection is configured via environment variables in ~/.claude.json:
{
"mcpServers": {
"postgres": {
"command": "postgres-mcp",
"env": {
"PGHOST": "localhost",
"PGPORT": "5432",
"PGDATABASE": "aegis",
"PGUSER": "aegis",
"PGPASSWORD": "secure-password"
}
}
}
}
Connection Pooling¶
The MCP server uses connection pooling for performance. Default pool size: 10 connections.
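To see how many connections are actually open, query the standard pg_stat_activity view:
# Count open connections for the aegis database
result = await mcp__postgres__query(
"SELECT count(*) AS connections FROM pg_stat_activity WHERE datname = $1",
params=["aegis"]
)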
Common Query Patterns¶
Workflow Checkpointing¶
import json

# Save workflow state
await mcp__postgres__execute(
"""
INSERT INTO workflow_checkpoints (workflow_id, node_id, state, created_at)
VALUES ($1, $2, $3, NOW())
""",
params=[workflow_id, node_id, json.dumps(state)]
)
# Load latest checkpoint
result = await mcp__postgres__query(
"""
SELECT state FROM workflow_checkpoints
WHERE workflow_id = $1
ORDER BY created_at DESC
LIMIT 1
""",
params=[workflow_id]
)
Agent State Persistence¶
# Update agent state
await mcp__postgres__execute(
"""
UPDATE agents
SET state = $1, tokens_used = $2, updated_at = NOW()
WHERE agent_id = $3
""",
params=["running", tokens, agent_id]
)
# Get agents by status
agents = await mcp__postgres__query(
"SELECT * FROM agents WHERE state = $1",
params=["running"]
)
Knowledge Graph Queries¶
# Find entities related to a concept
entities = await mcp__postgres__query(
"""
SELECT e.*, r.relation_type
FROM entities e
JOIN relations r ON r.target_id = e.id
WHERE r.source_id = $1
""",
params=[entity_id]
)
# Semantic search with filters
results = await mcp__postgres__vector_search(
table="episodes",
embedding_column="embedding",
query_embedding=query_vector,
limit=10,
where_clause="episode_type = 'error' AND created_at > NOW() - INTERVAL '7 days'"
)
Time-Series Queries¶
# Agent token usage over time
usage = await mcp__postgres__query(
"""
SELECT
DATE_TRUNC('day', created_at) as day,
template_name,
SUM(tokens_used) as total_tokens
FROM agent_executions
WHERE created_at > NOW() - INTERVAL '30 days'
GROUP BY day, template_name
ORDER BY day DESC
"""
)
Integration with Aegis¶
Workflow Storage¶
Workflows use PostgreSQL for checkpointing:
from aegis.workflows import run_workflow, resume_workflow
# Execute workflow with checkpoint persistence
result = await run_workflow(
"deployment-approval",
{"service": "nginx", "environment": "production"}
)
# If interrupted, resume from checkpoint
if result.status == "interrupted":
    resumed = await resume_workflow(
        result.workflow_id,
        response="Approve"
    )
Knowledge Graph Backend¶
FalkorDB (used by Graphiti) stores embeddings in PostgreSQL via pgvector:
from aegis.memory.graphiti_client import GraphitiClient
client = GraphitiClient()
await client.initialize()
# Add episode (stores in FalkorDB + embeddings in Postgres)
await client.add_episode(
name="deployment-event",
content="Deployed nginx to production",
source_description="journal"
)
Agent Metrics¶
The agent registry persists metrics to PostgreSQL:
from aegis.agents.registry import get_agent_registry
registry = get_agent_registry()
# Metrics automatically persisted
metrics = registry.get_all_metrics()
Performance Optimization¶
Indexes¶
Create indexes for frequently queried columns:
# Workflow status index
await mcp__postgres__execute(
"CREATE INDEX IF NOT EXISTS idx_workflows_status ON workflows(status)"
)
# Agent template index
await mcp__postgres__execute(
"CREATE INDEX IF NOT EXISTS idx_agents_template ON agents(template_name)"
)
# Time-series index
await mcp__postgres__execute(
"CREATE INDEX IF NOT EXISTS idx_executions_created ON agent_executions(created_at DESC)"
)
pgvector Indexes¶
For semantic search performance:
# IVFFlat index for approximate nearest neighbor
await mcp__postgres__execute(
"""
CREATE INDEX IF NOT EXISTS idx_entities_embedding
ON entities USING ivfflat (embedding vector_cosine_ops)
WITH (lists = 100)
"""
)
# HNSW index (more accurate, slower build)
await mcp__postgres__execute(
"""
CREATE INDEX IF NOT EXISTS idx_entities_embedding_hnsw
ON entities USING hnsw (embedding vector_cosine_ops)
WITH (m = 16, ef_construction = 64)
"""
)
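For approximate indexes, recall can be tuned at query time with pgvector's session settings (applies only if the corresponding index type is in use; the values shown are illustrative):
# Probe more IVFFlat lists (higher recall, slower queries)
await mcp__postgres__execute("SET ivfflat.probes = 10")
# Widen the HNSW search beam
await mcp__postgres__execute("SET hnsw.ef_search = 40")
Note that SET only affects the current session; with a connection pool, a persistent default can be set with ALTER DATABASE ... SET.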
Query Optimization¶
# Use EXPLAIN ANALYZE to understand query performance
explain = await mcp__postgres__query(
"""
EXPLAIN ANALYZE
SELECT * FROM workflows
WHERE status = 'running'
AND created_at > NOW() - INTERVAL '1 day'
"""
)
Security Best Practices¶
1. Parameterized Queries¶
Always use parameterized queries to prevent SQL injection:
# GOOD
await mcp__postgres__query(
"SELECT * FROM users WHERE email = $1",
params=[user_email]
)
# BAD (SQL injection risk)
await mcp__postgres__query(
f"SELECT * FROM users WHERE email = '{user_email}'"
)
2. Least Privilege¶
Grant minimal permissions to the Aegis database user:
-- Create restricted user
CREATE USER aegis WITH PASSWORD 'secure-password';
-- Grant specific permissions
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO aegis;
GRANT USAGE ON SCHEMA public TO aegis;
GRANT USAGE, SELECT ON ALL SEQUENCES IN SCHEMA public TO aegis;
-- Revoke dangerous permissions
REVOKE CREATE ON SCHEMA public FROM aegis;
3. Connection Encryption¶
Use SSL for database connections:
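A minimal example using the standard libpq environment variable, added to the server's env block in ~/.claude.json:
{
"mcpServers": {
"postgres": {
"env": {
"PGSSLMODE": "require"
}
}
}
}
Use "verify-full" together with PGSSLROOTCERT when the server certificate should be validated.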
4. Credential Management¶
Store database credentials in ~/.secure/:
{
"mcpServers": {
"postgres": {
"env": {
"PGPASSWORD_FILE": "/home/agent/.secure/postgres.env"
}
}
}
}
Backup and Recovery¶
Automated Backups¶
#!/bin/bash
# scripts/backup-postgres.sh
pg_dump -h localhost -U aegis aegis > /backups/aegis-$(date +%Y%m%d).sql
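To restore a plain-format dump into an existing database:
# Restore a specific day's dump
psql -h localhost -U aegis aegis < /backups/aegis-YYYYMMDD.sql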
Point-in-Time Recovery¶
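A sketch of the standard PostgreSQL approach: archive WAL continuously, take periodic base backups with pg_basebackup, and replay WAL to a target timestamp on restore. The archive path below is an assumption.
# postgresql.conf: enable continuous WAL archiving
wal_level = replica
archive_mode = on
archive_command = 'cp %p /backups/wal/%f'
On restore, point restore_command at the archive and set recovery_target_time to the desired timestamp.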
Troubleshooting¶
Connection Errors¶
# Test connection
try:
    result = await mcp__postgres__query("SELECT 1")
    print("Connected successfully")
except Exception as e:
    print(f"Connection failed: {e}")
Slow Queries¶
# Find the slowest queries (often a sign of missing indexes); requires the pg_stat_statements extension
slow_queries = await mcp__postgres__query(
"""
SELECT query, calls, mean_exec_time, rows
FROM pg_stat_statements
WHERE mean_exec_time > 1000
ORDER BY mean_exec_time DESC
LIMIT 10
"""
)
Lock Contention¶
# Check for locks
locks = await mcp__postgres__query(
"""
SELECT pid, usename, query, state
FROM pg_stat_activity
WHERE wait_event_type = 'Lock'
"""
)
Next Steps¶
- Graphiti MCP - Knowledge graph operations
- Aegis MCP - Workflow and agent tools
- MCP Overview - Protocol basics