
Author: HenkDz
PostgreSQL MCP Server
A Model Context Protocol (MCP) server that provides PostgreSQL database management capabilities. This server assists with analyzing existing PostgreSQL setups, providing implementation guidance, debugging database issues, managing schemas, migrating data, and monitoring database performance.
Features
Database Analysis and Debugging
1. Database Analysis (`analyze_database`)
Analyzes PostgreSQL database configuration and performance metrics:
- Configuration analysis
- Performance metrics
- Security assessment
- Recommendations for optimization

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "analysisType": "performance" // Optional: "configuration" | "performance" | "security"
}
```
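
A performance analysis of this kind typically derives the buffer cache hit ratio from the `blks_hit` and `blks_read` counters in `pg_stat_database`. A minimal sketch of that calculation (the function name is illustrative, not part of the server's API):

```typescript
// Cache hit ratio = hits / (hits + reads). On a warmed-up workload a ratio
// below ~0.95 usually means the working set does not fit in shared_buffers.
function cacheHitRatio(blksHit: number, blksRead: number): number {
  const total = blksHit + blksRead;
  return total === 0 ? 1 : blksHit / total; // no block I/O yet: nothing to flag
}

console.log(cacheHitRatio(9500, 500)); // 0.95
```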
2. Setup Instructions (`get_setup_instructions`)
Provides step-by-step PostgreSQL installation and configuration guidance:
- Platform-specific installation steps
- Configuration recommendations
- Security best practices
- Post-installation tasks

```jsonc
// Example usage
{
  "platform": "linux", // Required: "linux" | "macos" | "windows"
  "version": "15", // Optional: PostgreSQL version
  "useCase": "production" // Optional: "development" | "production"
}
```
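
The `platform` parameter selects which package-manager steps come back. A hypothetical sketch of that dispatch (the real tool returns full guidance, and exact commands depend on the distribution and configured repositories):

```typescript
type Platform = "linux" | "macos" | "windows";

// Illustrative first install step per platform; assumes apt with the PGDG
// repository on Linux, Homebrew on macOS, and Chocolatey on Windows.
function installCommand(platform: Platform, version = "15"): string {
  const commands: Record<Platform, string> = {
    linux: `sudo apt install postgresql-${version}`,
    macos: `brew install postgresql@${version}`,
    windows: `choco install postgresql${version}`,
  };
  return commands[platform];
}

console.log(installCommand("macos", "15")); // brew install postgresql@15
```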
3. Database Debugging (`debug_database`)
Debugs common PostgreSQL issues:
- Connection problems
- Performance bottlenecks
- Lock conflicts
- Replication status

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "issue": "performance", // Required: "connection" | "performance" | "locks" | "replication"
  "logLevel": "debug" // Optional: "info" | "debug" | "trace"
}
```
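
Each `issue` value points the debugger at a different system view. A sketch of that mapping (hypothetical helper; `pg_stat_statements` requires the extension to be installed, and the server's internal queries may differ):

```typescript
type Issue = "connection" | "performance" | "locks" | "replication";

// First diagnostic query per issue type, all against standard system views.
function diagnosticQuery(issue: Issue): string {
  const queries: Record<Issue, string> = {
    connection: "SELECT count(*) AS active FROM pg_stat_activity",
    performance:
      "SELECT query, mean_exec_time FROM pg_stat_statements " +
      "ORDER BY mean_exec_time DESC LIMIT 10",
    locks:
      "SELECT locktype, relation::regclass, mode, granted FROM pg_locks " +
      "WHERE NOT granted",
    replication: "SELECT client_addr, state, replay_lsn FROM pg_stat_replication",
  };
  return queries[issue];
}
```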
Schema Management
4. Schema Information (`get_schema_info`)
Retrieves detailed schema information for a database or a specific table:
- List of tables in a database
- Column definitions
- Constraints (primary keys, foreign keys, etc.)
- Indexes

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "tableName": "users" // Optional: specific table to get info for
}
```
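
Under the hood, a column lookup like this is usually an `information_schema.columns` query. A sketch, with the table name passed as a bind parameter rather than spliced into the SQL (hypothetical helper; the server's actual query may also join constraint and index catalogs):

```typescript
// Returns a parameterized query for one table's column metadata.
function columnInfoQuery(tableName: string): { text: string; values: string[] } {
  return {
    text:
      "SELECT column_name, data_type, is_nullable, column_default " +
      "FROM information_schema.columns " +
      "WHERE table_name = $1 ORDER BY ordinal_position",
    values: [tableName], // bound as $1 to keep the name out of the SQL text
  };
}
```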
5. Create Table (`create_table`)
Creates a new table with the specified columns:
- Define column names and types
- Set nullable constraints
- Set default values

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "tableName": "users",
  "columns": [
    { "name": "id", "type": "SERIAL", "nullable": false },
    { "name": "username", "type": "VARCHAR(100)", "nullable": false },
    { "name": "email", "type": "VARCHAR(255)", "nullable": false },
    { "name": "created_at", "type": "TIMESTAMP", "default": "NOW()" }
  ]
}
```
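
Column specs like the ones above translate mechanically into DDL. A sketch of that translation (illustrative only; real code must also validate and quote identifiers):

```typescript
interface ColumnSpec {
  name: string;
  type: string;
  nullable?: boolean;
  default?: string;
}

// Builds a CREATE TABLE statement from the tool's column specs.
function buildCreateTable(table: string, columns: ColumnSpec[]): string {
  const defs = columns.map((c) => {
    let def = `${c.name} ${c.type}`;
    if (c.nullable === false) def += " NOT NULL";
    if (c.default !== undefined) def += ` DEFAULT ${c.default}`;
    return def;
  });
  return `CREATE TABLE ${table} (${defs.join(", ")})`;
}

console.log(buildCreateTable("users", [
  { name: "id", type: "SERIAL", nullable: false },
  { name: "created_at", type: "TIMESTAMP", default: "NOW()" },
]));
// CREATE TABLE users (id SERIAL NOT NULL, created_at TIMESTAMP DEFAULT NOW())
```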
6. Alter Table (`alter_table`)
Modifies existing tables:
- Add new columns
- Modify column types or constraints
- Drop columns

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "tableName": "users",
  "operations": [
    { "type": "add", "columnName": "last_login", "dataType": "TIMESTAMP" },
    { "type": "alter", "columnName": "email", "nullable": false },
    { "type": "drop", "columnName": "temporary_field" }
  ]
}
```
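
Each operation maps to one ALTER TABLE clause. A sketch of the mapping (hypothetical helper; note that nullability changes use `SET NOT NULL` / `DROP NOT NULL` in PostgreSQL):

```typescript
type AlterOp =
  | { type: "add"; columnName: string; dataType: string }
  | { type: "alter"; columnName: string; nullable: boolean }
  | { type: "drop"; columnName: string };

// One statement per operation; identifiers are assumed pre-validated.
function buildAlterStatement(table: string, op: AlterOp): string {
  switch (op.type) {
    case "add":
      return `ALTER TABLE ${table} ADD COLUMN ${op.columnName} ${op.dataType}`;
    case "alter":
      return (
        `ALTER TABLE ${table} ALTER COLUMN ${op.columnName} ` +
        (op.nullable ? "DROP NOT NULL" : "SET NOT NULL")
      );
    case "drop":
      return `ALTER TABLE ${table} DROP COLUMN ${op.columnName}`;
  }
}
```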
Data Migration
7. Export Table Data (`export_table_data`)
Exports table data to JSON or CSV format:
- Filter data with a WHERE clause
- Limit the number of rows
- Choose the output format

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "tableName": "users",
  "outputPath": "./exports/users.json",
  "where": "created_at > '2023-01-01'", // Optional
  "limit": 1000, // Optional
  "format": "json" // Optional: "json" | "csv"
}
```
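
The optional `where` and `limit` fields simply extend the export query, and CSV output needs RFC 4180-style escaping. A sketch of both pieces (hypothetical helpers):

```typescript
// Builds the SELECT used for the export.
function buildExportQuery(table: string, where?: string, limit?: number): string {
  let sql = `SELECT * FROM ${table}`;
  if (where) sql += ` WHERE ${where}`;
  if (limit !== undefined) sql += ` LIMIT ${limit}`;
  return sql;
}

// Serializes one row as comma-delimited CSV: quote any field containing a
// comma, quote, or newline, and double embedded quotes.
function toCsvRow(values: unknown[]): string {
  return values
    .map((v) => {
      const s = v == null ? "" : String(v);
      return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
    })
    .join(",");
}
```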
8. Import Table Data (`import_table_data`)
Imports data from JSON or CSV files:
- Optionally truncate the table before import
- Support for different formats
- Custom CSV delimiters

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "tableName": "users",
  "inputPath": "./imports/users.json",
  "truncateFirst": false, // Optional
  "format": "json", // Optional: "json" | "csv"
  "delimiter": "," // Optional: for CSV files
}
```
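
Each imported record becomes a parameterized INSERT so imported values never enter the SQL text. A sketch (hypothetical helper):

```typescript
// Builds INSERT INTO t (c1, c2) VALUES ($1, $2) plus the bound values.
function buildInsert(table: string, row: Record<string, unknown>) {
  const cols = Object.keys(row);
  const placeholders = cols.map((_, i) => `$${i + 1}`);
  return {
    text: `INSERT INTO ${table} (${cols.join(", ")}) VALUES (${placeholders.join(", ")})`,
    values: cols.map((c) => row[c]),
  };
}

console.log(buildInsert("users", { id: 1, email: "a@example.com" }).text);
// INSERT INTO users (id, email) VALUES ($1, $2)
```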
9. Copy Between Databases (`copy_between_databases`)
Copies data between two PostgreSQL databases:
- Filter data with a WHERE clause
- Optionally truncate the target table

```jsonc
// Example usage
{
  "sourceConnectionString": "postgresql://user:password@localhost:5432/source_db",
  "targetConnectionString": "postgresql://user:password@localhost:5432/target_db",
  "tableName": "users",
  "where": "active = true", // Optional
  "truncateTarget": false // Optional
}
```
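
Conceptually the copy is: optionally truncate the target, read filtered rows from the source, and re-insert them on the target. A sketch of how the statements could be assembled (hypothetical; the real tool moves rows between two live connections):

```typescript
// Statements for a cross-database copy: preCopy runs on the target,
// sourceQuery on the source, and each fetched row is re-inserted on the target.
function planCopy(table: string, where?: string, truncateTarget = false) {
  return {
    preCopy: truncateTarget ? [`TRUNCATE TABLE ${table}`] : [],
    sourceQuery: `SELECT * FROM ${table}${where ? ` WHERE ${where}` : ""}`,
  };
}
```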
Monitoring
10. Monitor Database (`monitor_database`)
Real-time monitoring of a PostgreSQL database:
- Database metrics (connections, cache hit ratio, etc.)
- Table metrics (size, row counts, dead tuples)
- Active query information
- Lock information
- Replication status
- Configurable alerts

```jsonc
// Example usage
{
  "connectionString": "postgresql://user:password@localhost:5432/dbname",
  "includeTables": true, // Optional
  "includeQueries": true, // Optional
  "includeLocks": true, // Optional
  "includeReplication": false, // Optional
  "alertThresholds": { // Optional
    "connectionPercentage": 80,
    "longRunningQuerySeconds": 30,
    "cacheHitRatio": 0.95,
    "deadTuplesPercentage": 10,
    "vacuumAge": 7
  }
}
```
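
Alerting reduces to comparing each sampled metric against its threshold. A sketch of that check (field names mirror the `alertThresholds` example; the helper itself is hypothetical):

```typescript
interface Thresholds {
  connectionPercentage: number;
  longRunningQuerySeconds: number;
  cacheHitRatio: number;
  deadTuplesPercentage: number;
}

interface Sample {
  connectionPercentage: number;
  longestQuerySeconds: number;
  cacheHitRatio: number;
  deadTuplesPercentage: number;
}

// Returns the alerts that fired for one monitoring sample. Note that
// cacheHitRatio alerts when it drops BELOW its threshold.
function evaluateAlerts(s: Sample, t: Thresholds): string[] {
  const fired: string[] = [];
  if (s.connectionPercentage > t.connectionPercentage) fired.push("connections");
  if (s.longestQuerySeconds > t.longRunningQuerySeconds) fired.push("long_running_query");
  if (s.cacheHitRatio < t.cacheHitRatio) fired.push("cache_hit_ratio");
  if (s.deadTuplesPercentage > t.deadTuplesPercentage) fired.push("dead_tuples");
  return fired;
}
```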
Prerequisites
- Node.js >= 18.0.0
- PostgreSQL server (for target database operations)
- Network access to target PostgreSQL instances
Installation
- Clone the repository
- Install dependencies: `npm install`
- Build the server: `npm run build`
- Add to MCP settings file:

```json
{
  "mcpServers": {
    "postgresql-mcp": {
      "command": "node",
      "args": ["/path/to/postgresql-mcp-server/build/index.js"],
      "disabled": false,
      "alwaysAllow": []
    }
  }
}
```
Development
- `npm run dev` - Start development server with hot reload
- `npm run lint` - Run ESLint
- `npm test` - Run tests
Security Considerations
- Connection Security
  - Uses connection pooling
  - Implements connection timeouts
  - Validates connection strings
  - Supports SSL/TLS connections
- Query Safety
  - Validates SQL queries
  - Prevents dangerous operations
  - Implements query timeouts
  - Logs all operations
- Authentication
  - Supports multiple authentication methods
  - Implements role-based access control
  - Enforces password policies
  - Manages connection credentials securely
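
Connection-string validation can begin with a scheme and host check before any network attempt is made. A minimal sketch using the WHATWG URL parser (an assumption for illustration; real validation should also inspect credentials and SSL options such as `sslmode`):

```typescript
// Accepts postgresql:// or postgres:// URLs that name a host; rejects
// anything unparseable or using another scheme.
function isLikelyPostgresUrl(conn: string): boolean {
  let url: URL;
  try {
    url = new URL(conn);
  } catch {
    return false;
  }
  return (
    (url.protocol === "postgresql:" || url.protocol === "postgres:") &&
    url.hostname !== ""
  );
}
```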
Best Practices
- Always use secure connection strings with proper credentials
- Follow production security recommendations for sensitive environments
- Regularly monitor and analyze database performance
- Keep PostgreSQL version up to date
- Implement proper backup strategies
- Use connection pooling for better resource management
- Implement proper error handling and logging
- Perform regular security audits and updates
Error Handling
The server implements comprehensive error handling:
- Connection failures
- Query timeouts
- Authentication errors
- Permission issues
- Resource constraints
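
PostgreSQL reports failures via five-character SQLSTATE codes, which map naturally onto the categories above. A sketch of that classification (code classes per the PostgreSQL error-code appendix; the helper itself is illustrative):

```typescript
// Class 08 = connection exception, 57014 = query_canceled (statement timeout),
// class 28 = invalid authorization, 42501 = insufficient_privilege,
// class 53 = insufficient resources.
function classifyError(sqlState: string): string {
  if (sqlState.startsWith("08")) return "connection_failure";
  if (sqlState === "57014") return "query_timeout";
  if (sqlState.startsWith("28")) return "authentication_error";
  if (sqlState === "42501") return "permission_issue";
  if (sqlState.startsWith("53")) return "resource_constraint";
  return "unknown";
}
```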
Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
License
This project is licensed under the AGPLv3 license; see the LICENSE file for details.