PostgreSQL MCP Server
A Model Context Protocol (MCP) server that provides PostgreSQL database management capabilities. This server assists with analyzing existing PostgreSQL setups, providing implementation guidance, debugging database issues, managing schemas, migrating data, and monitoring database performance.
Features
Database Analysis and Debugging
1. Database Analysis (analyze_database)
Analyzes PostgreSQL configuration, performance, and security:
- Configuration analysis
- Performance metrics
- Security assessment
- Recommendations for optimization
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"analysisType": "performance" // Optional: "configuration" | "performance" | "security"
}
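For orientation, analysis of this kind draws on PostgreSQL's built-in catalogs and statistics views. The queries below illustrate the approach; they are not necessarily the server's exact implementation:
-- Key configuration settings
SELECT name, setting, unit
FROM pg_settings
WHERE name IN ('shared_buffers', 'work_mem', 'max_connections');
-- Per-database cache hit ratio
SELECT datname,
       blks_hit::float / NULLIF(blks_hit + blks_read, 0) AS cache_hit_ratio
FROM pg_stat_database;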
2. Setup Instructions (get_setup_instructions)
Provides step-by-step PostgreSQL installation and configuration guidance:
- Platform-specific installation steps
- Configuration recommendations
- Security best practices
- Post-installation tasks
// Example usage
{
"platform": "linux", // Required: "linux" | "macos" | "windows"
"version": "15", // Optional: PostgreSQL version
"useCase": "production" // Optional: "development" | "production"
}
3. Database Debugging (debug_database)
Debugs common PostgreSQL issues:
- Connection problems
- Performance bottlenecks
- Lock conflicts
- Replication status
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"issue": "performance", // Required: "connection" | "performance" | "locks" | "replication"
"logLevel": "debug" // Optional: "info" | "debug" | "trace"
}
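As a point of reference, lock conflicts and replication status are visible through standard statistics views. Queries along these lines show the idea (illustrative, not necessarily what the server runs):
-- Sessions blocked on locks, and who is blocking them
SELECT pid, pg_blocking_pids(pid) AS blocked_by, state, query
FROM pg_stat_activity
WHERE cardinality(pg_blocking_pids(pid)) > 0;
-- Replication status, as seen from the primary
SELECT client_addr, state, sent_lsn, replay_lsn
FROM pg_stat_replication;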
Schema Management
4. Schema Information (get_schema_info)
Gets detailed schema information for a database or a specific table:
- List of tables in a database
- Column definitions
- Constraints (primary keys, foreign keys, etc.)
- Indexes
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users" // Optional: specific table to get info for
}
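This information lives in the standard information_schema views. For example, the column definitions for the users table above could be read with a query like this (the public schema is assumed):
SELECT column_name, data_type, is_nullable, column_default
FROM information_schema.columns
WHERE table_schema = 'public' AND table_name = 'users'
ORDER BY ordinal_position;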
5. Create Table (create_table)
Creates a new table with the specified columns:
- Define column names and types
- Set nullable constraints
- Set default values
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"columns": [
{ "name": "id", "type": "SERIAL", "nullable": false },
{ "name": "username", "type": "VARCHAR(100)", "nullable": false },
{ "name": "email", "type": "VARCHAR(255)", "nullable": false },
{ "name": "created_at", "type": "TIMESTAMP", "default": "NOW()" }
]
}
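Assuming the server emits standard DDL, the example above corresponds roughly to:
CREATE TABLE users (
  id         SERIAL       NOT NULL,
  username   VARCHAR(100) NOT NULL,
  email      VARCHAR(255) NOT NULL,
  created_at TIMESTAMP    DEFAULT NOW()
);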
6. Alter Table (alter_table)
Modifies an existing table:
- Add new columns
- Modify column types or constraints
- Drop columns
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"operations": [
{ "type": "add", "columnName": "last_login", "dataType": "TIMESTAMP" },
{ "type": "alter", "columnName": "email", "nullable": false },
{ "type": "drop", "columnName": "temporary_field" }
]
}
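Under the same assumption of standard DDL, the three operations above translate roughly to:
ALTER TABLE users ADD COLUMN last_login TIMESTAMP;
ALTER TABLE users ALTER COLUMN email SET NOT NULL;
ALTER TABLE users DROP COLUMN temporary_field;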
Data Migration
7. Export Table Data (export_table_data)
Exports table data to JSON or CSV:
- Filter data with WHERE clause
- Limit number of rows
- Choose output format
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"outputPath": "./exports/users.json",
"where": "created_at > '2023-01-01'", // Optional
"limit": 1000, // Optional
"format": "json" // Optional: "json" | "csv"
}
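The rows selected for export correspond to a query of this shape:
SELECT *
FROM users
WHERE created_at > '2023-01-01'
LIMIT 1000;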
8. Import Table Data (import_table_data)
Imports data from JSON or CSV files:
- Optionally truncate table before import
- Support for different formats
- Custom CSV delimiters
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"tableName": "users",
"inputPath": "./imports/users.json",
"truncateFirst": false, // Optional
"format": "json", // Optional: "json" | "csv"
"delimiter": "," // Optional: for CSV files
}
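Conceptually this resembles psql's client-side \copy, shown here only for comparison (how the server actually loads the rows is an implementation detail):
\copy users FROM './imports/users.csv' WITH (FORMAT csv, DELIMITER ',')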
9. Copy Between Databases (copy_between_databases)
Copies data between two PostgreSQL databases:
- Filter data with WHERE clause
- Optionally truncate target table
// Example usage
{
"sourceConnectionString": "postgresql://user:password@localhost:5432/source_db",
"targetConnectionString": "postgresql://user:password@localhost:5432/target_db",
"tableName": "users",
"where": "active = true", // Optional
"truncateTarget": false // Optional
}
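Plain SQL cannot span two clusters, so the copy is conceptually two steps over two connections. A sketch, not the exact implementation:
-- 1. On the source database:
SELECT * FROM users WHERE active = true;
-- 2. On the target database, only when truncateTarget is true:
TRUNCATE TABLE users;
-- 3. Still on the target database, re-insert the fetched rows,
--    e.g. as batched INSERT ... VALUES statements.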
Monitoring
10. Monitor Database (monitor_database)
Monitors a PostgreSQL database in real time:
- Database metrics (connections, cache hit ratio, etc.)
- Table metrics (size, row counts, dead tuples)
- Active query information
- Lock information
- Replication status
- Configurable alerts
// Example usage
{
"connectionString": "postgresql://user:password@localhost:5432/dbname",
"includeTables": true, // Optional
"includeQueries": true, // Optional
"includeLocks": true, // Optional
"includeReplication": false, // Optional
"alertThresholds": { // Optional
"connectionPercentage": 80,
"longRunningQuerySeconds": 30,
"cacheHitRatio": 0.95,
"deadTuplesPercentage": 10,
"vacuumAge": 7
}
}
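The alert thresholds map onto PostgreSQL's statistics views. Illustrative queries for three of them (the server's exact queries may differ):
-- connectionPercentage: connections in use vs. max_connections
SELECT count(*) * 100.0 / current_setting('max_connections')::int AS connection_pct
FROM pg_stat_activity;
-- longRunningQuerySeconds: active queries running longer than 30s
SELECT pid, now() - query_start AS runtime, query
FROM pg_stat_activity
WHERE state = 'active' AND now() - query_start > interval '30 seconds';
-- deadTuplesPercentage: dead tuples per table
SELECT relname,
       n_dead_tup * 100.0 / NULLIF(n_live_tup + n_dead_tup, 0) AS dead_pct
FROM pg_stat_user_tables;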
Prerequisites
- Node.js >= 18.0.0
- PostgreSQL server (for target database operations)
- Network access to target PostgreSQL instances
Installation
- Clone the repository
- Install dependencies:
  npm install
- Build the server:
  npm run build
- Add the server to your MCP settings file:
  {
    "mcpServers": {
      "postgresql-mcp": {
        "command": "node",
        "args": ["/path/to/postgresql-mcp-server/build/index.js"],
        "disabled": false,
        "alwaysAllow": []
      }
    }
  }
Development
- npm run dev - Start the development server with hot reload
- npm run lint - Run ESLint
- npm test - Run tests
Security Considerations
Connection Security
- Uses connection pooling
- Implements connection timeouts
- Validates connection strings
- Supports SSL/TLS connections (e.g. append ?sslmode=require to the connection URI)
Query Safety
- Validates SQL queries
- Prevents dangerous operations
- Implements query timeouts
- Logs all operations
Authentication
- Supports multiple authentication methods
- Implements role-based access control
- Enforces password policies
- Manages connection credentials securely
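To take least privilege further, the server's connections can run under a dedicated database role. The role name and grants below are illustrative assumptions, not something the server creates for you:
-- A login role limited to monitoring plus read access
CREATE ROLE mcp_readonly LOGIN PASSWORD 'change-me';
GRANT pg_monitor TO mcp_readonly;  -- built-in monitoring role (PostgreSQL 10+)
GRANT SELECT ON ALL TABLES IN SCHEMA public TO mcp_readonly;
A role like this covers the analysis and monitoring tools; the schema and data migration tools naturally need broader privileges.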
Best Practices
- Always use secure connection strings with proper credentials
- Follow production security recommendations for sensitive environments
- Regularly monitor and analyze database performance
- Keep PostgreSQL version up to date
- Implement proper backup strategies
- Use connection pooling for better resource management
- Implement proper error handling and logging
- Perform regular security audits and updates
Error Handling
The server implements comprehensive error handling:
- Connection failures
- Query timeouts
- Authentication errors
- Permission issues
- Resource constraints
Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
License
This project is licensed under the GNU Affero General Public License v3.0 (AGPLv3) - see the LICENSE file for details.