Database Connections via MCP

Enable your AI assistant to understand database schemas, generate queries, and build data-aware applications by connecting directly to your databases through MCP.

Traditional database development involves constant context switching between:

  • Schema documentation that’s often outdated
  • Query tools separate from your code editor
  • Manual translation between business requirements and SQL
  • Repetitive boilerplate for CRUD operations

With database MCP servers, your AI gains:

  • Live schema introspection for accurate table and column references
  • Query generation based on natural language requirements
  • Data-aware code generation that matches your actual database
  • Migration scripting with full context of current structure

PostgreSQL

Features

  • Schema inspection and documentation
  • Query generation and optimization
  • Migration script creation
  • Index recommendations
  • Performance analysis support

Example configuration:

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:password@localhost:5432/dbname"
      ]
    }
  }
}

MySQL

Features

  • Full schema introspection
  • Stored procedure analysis
  • Character set handling
  • Replication-aware queries

Example configuration:

{
  "mcpServers": {
    "mysql": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mysql",
        "mysql://user:password@localhost:3306/database"
      ]
    }
  }
}

SQLite

Features

  • Local file-based databases
  • Zero configuration
  • Perfect for development/testing
  • Full-text search support

Example configuration:

{
  "mcpServers": {
    "sqlite": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-sqlite",
        "/path/to/database.db"
      ]
    }
  }
}

MongoDB

Features

  • NoSQL schema analysis
  • Aggregation pipeline generation
  • Index optimization
  • Collection statistics
# Community server example
claude mcp add mongodb -- npx -y mongodb-mcp-server \
"mongodb://localhost:27017/mydb"

Microsoft’s Azure MCP includes SQL Database support:

# Using Azure MCP Server
claude mcp add azure-sql -- npx -y @azure/mcp server start --namespace sql

Features:

  • Managed instance support
  • Azure AD authentication
  • Elastic pool awareness
  • Built-in security best practices

Natural language queries to understand your database:

"Show me all tables in the database"
"What columns does the users table have?"
"List all foreign key relationships"
"Which tables reference the products table?"
"Show me the indexes on the orders table"
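
Questions like these resolve to catalog queries behind the scenes. As a rough sketch (not the actual implementation of any MCP server), listing tables and columns in PostgreSQL with node-postgres might look like this:

// Illustrative only: the kind of information_schema query a database MCP server
// runs when asked to describe tables and columns.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function describeSchema(): Promise<void> {
  const { rows } = await pool.query(`
    SELECT table_name, column_name, data_type, is_nullable
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position
  `);
  for (const row of rows) {
    console.log(`${row.table_name}.${row.column_name}: ${row.data_type} (nullable: ${row.is_nullable})`);
  }
  await pool.end();
}

describeSchema().catch(console.error);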

Transform requirements into optimized SQL:

"Write a query to find all users who made purchases in the last 30 days"
"Create a report showing monthly revenue by product category"
"Find duplicate email addresses in the customers table"
"Generate a query for the top 10 best-selling products with inventory levels"

Example generated query:

-- Top customers by total purchase value in last quarter
WITH customer_purchases AS (
  SELECT
    c.customer_id,
    c.first_name,
    c.last_name,
    c.email,
    SUM(o.total_amount) AS total_spent,
    COUNT(DISTINCT o.order_id) AS order_count,
    AVG(o.total_amount) AS avg_order_value
  FROM customers c
  JOIN orders o ON c.customer_id = o.customer_id
  WHERE o.created_at >= DATE_TRUNC('quarter', CURRENT_DATE - INTERVAL '3 months')
    AND o.status = 'completed'
  GROUP BY c.customer_id, c.first_name, c.last_name, c.email
)
SELECT
  customer_id,
  first_name || ' ' || last_name AS full_name,
  email,
  total_spent,
  order_count,
  ROUND(avg_order_value, 2) AS avg_order_value
FROM customer_purchases
ORDER BY total_spent DESC
LIMIT 10;

Generate models and queries that match your actual database:

"Create a TypeScript interface for the products table"
"Generate Prisma models for all tables"
"Create a repository pattern implementation for the users table"
"Build SQLAlchemy models matching my database schema"

Example output:

// Generated from actual database schema
interface Product {
  id: number;
  name: string;
  description: string | null;
  price: number;
  category_id: number;
  sku: string;
  inventory_count: number;
  is_active: boolean;
  created_at: Date;
  updated_at: Date;
}

class ProductRepository {
  async findById(id: number): Promise<Product | null> {
    const result = await db.query(
      'SELECT * FROM products WHERE id = $1',
      [id]
    );
    return result.rows[0] || null;
  }

  async findByCategory(categoryId: number): Promise<Product[]> {
    const result = await db.query(
      'SELECT * FROM products WHERE category_id = $1 AND is_active = true',
      [categoryId]
    );
    return result.rows;
  }

  async updateInventory(id: number, count: number): Promise<void> {
    await db.query(
      'UPDATE products SET inventory_count = $1, updated_at = NOW() WHERE id = $2',
      [count, id]
    );
  }
}
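
A short usage sketch for the generated repository (this assumes db above is a configured node-postgres Pool, which the generated code does not show):

// Usage of the generated repository; error handling omitted for brevity.
async function adjustInventory(): Promise<void> {
  const repo = new ProductRepository();

  const product = await repo.findById(42);
  if (product) {
    // Decrement stock for a single unit sold
    await repo.updateInventory(product.id, product.inventory_count - 1);
  }

  const activeInCategory = await repo.findByCategory(7);
  console.log(`${activeInCategory.length} active products in category 7`);
}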

Create migration scripts with full context:

"Generate a migration to add a status column to orders table with enum values"
"Create a migration to add full-text search to the products table"
"Build a migration to create a junction table for many-to-many between users and roles"
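
For illustration, a generated migration for the first prompt might look roughly like this (PostgreSQL assumed; the enum values are placeholders, not read from a real schema):

// Hypothetical generated migration: add an enum-backed status column to orders.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

export async function up(): Promise<void> {
  await pool.query(
    "CREATE TYPE order_status AS ENUM ('pending', 'paid', 'shipped', 'cancelled')"
  );
  await pool.query(
    "ALTER TABLE orders ADD COLUMN status order_status NOT NULL DEFAULT 'pending'"
  );
}

export async function down(): Promise<void> {
  await pool.query("ALTER TABLE orders DROP COLUMN status");
  await pool.query("DROP TYPE order_status");
}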

When working with multiple databases:

"Compare the schema of staging and production databases"
"Find tables that exist in development but not in production"
"Generate a script to sync user permissions across databases"
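
A minimal sketch of the "tables in development but not in production" comparison, assuming two placeholder connection strings in DEV_DATABASE_URL and PROD_DATABASE_URL:

// Illustrative schema diff: list public tables present in dev but missing in prod.
import { Pool } from "pg";

async function listTables(connectionString: string): Promise<Set<string>> {
  const pool = new Pool({ connectionString });
  const { rows } = await pool.query(
    "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'"
  );
  await pool.end();
  return new Set(rows.map((r) => r.table_name as string));
}

async function diffSchemas(): Promise<void> {
  const dev = await listTables(process.env.DEV_DATABASE_URL!);
  const prod = await listTables(process.env.PROD_DATABASE_URL!);
  const missingInProd = [...dev].filter((table) => !prod.has(table));
  console.log("Tables in development but not in production:", missingInProd);
}

diffSchemas().catch(console.error);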

Leverage AI for database tuning:

"Analyze slow queries and suggest indexes"
"Review the orders table and recommend partitioning strategies"
"Identify missing foreign key indexes"
"Suggest query optimizations for this complex join"

Build validation rules from schema:

"Generate validation rules for all columns in the users table"
"Create a script to find data integrity issues"
"Build constraint checks for the inventory system"
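
Example output (illustrative): validation rules for a hypothetical users table expressed as a zod schema. The column names below are assumptions, not read from a real database:

// Hypothetical generated validation schema; adjust fields to your actual users table.
import { z } from "zod";

export const userSchema = z.object({
  id: z.number().int().positive(),
  email: z.string().email().max(255),
  full_name: z.string().min(1).max(100),
  is_active: z.boolean().default(true),
  created_at: z.coerce.date(),
});

export type UserInput = z.infer<typeof userSchema>;

// Validate a row before insert/update
const result = userSchema.safeParse({
  id: 1,
  email: "ada@example.com",
  full_name: "Ada Lovelace",
  created_at: "2024-01-01",
});
if (!result.success) {
  console.error(result.error.issues);
}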

Protect database credentials and data with these practices:

  1. Use Read-Only Credentials

    -- Create read-only user for MCP
    CREATE USER mcp_reader WITH PASSWORD 'secure_password';
    GRANT CONNECT ON DATABASE myapp TO mcp_reader;
    GRANT USAGE ON SCHEMA public TO mcp_reader;
    GRANT SELECT ON ALL TABLES IN SCHEMA public TO mcp_reader;
  2. Environment Variables

    # .env.local (add to .gitignore)
    DATABASE_URL=postgresql://mcp_reader:pass@localhost:5432/myapp
    # Use in configuration
    claude mcp add db -e DATABASE_URL=$DATABASE_URL -- \
    npx -y @modelcontextprotocol/server-postgres
  3. Network Restrictions

    • Use SSH tunnels for remote databases
    • Implement IP allowlisting
    • Use VPN for production access

Sensitive Data Handling

  • Avoid production data in development
  • Use data masking for PII
  • Implement row-level security
  • Audit MCP query logs

Configure minimal permissions:

-- PostgreSQL example: Grant specific table access
GRANT SELECT ON users, products, orders TO mcp_reader;

-- MySQL example: Limit to specific operations
GRANT SELECT, SHOW VIEW ON myapp.* TO 'mcp_reader'@'localhost';

-- PostgreSQL: add a query timeout for safety
ALTER USER mcp_reader SET statement_timeout = '30s';
Common issues and solutions:

  • Connection refused: check that the database is running and accepting connections
  • Authentication failed: verify credentials and user permissions
  • Timeout errors: check network connectivity and firewall rules
  • "No tools available": confirm the MCP server started successfully
  • SSL/TLS errors: configure SSL mode in the connection string

# Test database connectivity
psql "postgresql://user:pass@localhost:5432/db" -c "SELECT 1"
# Check MCP server logs
claude mcp list
claude --mcp-debug

Performance considerations:

  1. Connection Pooling

    • MCP servers typically maintain a single connection
    • For high-frequency operations, consider caching
  2. Query Limits

    • Implement timeouts for long-running queries
    • Use LIMIT clauses in generated queries
    • Consider read replicas for analytics
  3. Context Window Management

    • Large result sets can fill AI context
    • Request summaries rather than full data
    • Use aggregations where possible

Roll out database access in stages:

  1. Start with Development Database

    • Use Docker for consistent environments
    • Seed with representative data
    • Test MCP queries safely
  2. Progress to Staging

    • Mirror production schema
    • Use anonymized data
    • Validate generated queries
  3. Production Considerations

    • Read-only access only
    • Audit trail for queries
    • Time-based access windows

Share database MCP configurations safely:

// .mcp.json (committed to repo)
{
  "mcpServers": {
    "app-db": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"],
      "env": {
        "DATABASE_URL": "${DATABASE_URL}" // Resolved from environment
      }
    }
  }
}

Combine database MCP with documentation:

"Generate comprehensive documentation for the users table including relationships"
"Create a data dictionary for all tables"
"Document the business logic implemented in database constraints"
"Build an ERD diagram description for the order processing system"

The database MCP ecosystem continues to evolve:

  • Query explanation with performance insights
  • Automatic indexing recommendations
  • Data lineage tracking across systems
  • Natural language stored procedures
  • Cross-database federation support