DynamoDB MCP Server
Mirror of MCP-Mirror
A Model Context Protocol server for managing Amazon DynamoDB resources. It provides tools for table management, index and capacity management, and data operations.
Author
Iman Kamyabi (ikmyb@icloud.com)
Features
Table Management
- Create new DynamoDB tables with customizable configurations
- List existing tables
- Get detailed table information
- Configure table settings
Index Management
- Create and manage Global Secondary Indexes (GSI)
- Update GSI capacity
- Create Local Secondary Indexes (LSI)
Capacity Management
- Update provisioned read/write capacity units
- Manage table throughput settings
Data Operations
- Insert or replace items in tables
- Retrieve items by primary key
- Update specific item attributes
- Query tables with conditions
- Scan tables with filters
Note: Delete operations are not supported to prevent accidental data loss.
Setup
- Install dependencies:

  ```shell
  npm install
  ```

- Configure AWS credentials as environment variables:

  ```shell
  export AWS_ACCESS_KEY_ID="your_access_key"
  export AWS_SECRET_ACCESS_KEY="your_secret_key"
  export AWS_REGION="your_region"
  ```

- Build the server:

  ```shell
  npm run build
  ```

- Start the server:

  ```shell
  npm start
  ```
Tools
create_table
Creates a new DynamoDB table with specified configuration.
Parameters:
- `tableName`: Name of the table to create
- `partitionKey`: Name of the partition key
- `partitionKeyType`: Type of partition key (S = String, N = Number, B = Binary)
- `sortKey`: (Optional) Name of the sort key
- `sortKeyType`: (Optional) Type of sort key
- `readCapacity`: Provisioned read capacity units
- `writeCapacity`: Provisioned write capacity units

Example:

```json
{
  "tableName": "Users",
  "partitionKey": "userId",
  "partitionKeyType": "S",
  "readCapacity": 5,
  "writeCapacity": 5
}
```
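Under the hood, parameters like these map onto DynamoDB's low-level `CreateTable` request, which splits the key definition into a `KeySchema` (which attribute plays which role) and `AttributeDefinitions` (each key attribute's type). The helper below is a hypothetical sketch of that mapping, not the server's actual implementation:

```python
def build_create_table_request(tableName, partitionKey, partitionKeyType,
                               readCapacity, writeCapacity,
                               sortKey=None, sortKeyType=None):
    """Sketch: translate the tool's parameters into a CreateTable-shaped request."""
    # The partition key is always the HASH element of the key schema.
    key_schema = [{"AttributeName": partitionKey, "KeyType": "HASH"}]
    attr_defs = [{"AttributeName": partitionKey, "AttributeType": partitionKeyType}]
    if sortKey:
        # An optional sort key becomes the RANGE element.
        key_schema.append({"AttributeName": sortKey, "KeyType": "RANGE"})
        attr_defs.append({"AttributeName": sortKey, "AttributeType": sortKeyType})
    return {
        "TableName": tableName,
        "KeySchema": key_schema,
        "AttributeDefinitions": attr_defs,
        "ProvisionedThroughput": {
            "ReadCapacityUnits": readCapacity,
            "WriteCapacityUnits": writeCapacity,
        },
    }

req = build_create_table_request("Users", "userId", "S", 5, 5)
```

Only key attributes appear in `AttributeDefinitions`; non-key attributes are schemaless and need no declaration.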
list_tables
Lists all DynamoDB tables in the account.
Parameters:
- `limit`: (Optional) Maximum number of tables to return
- `exclusiveStartTableName`: (Optional) Name of the table to start from for pagination

Example:

```json
{
  "limit": 10
}
```
describe_table
Gets detailed information about a DynamoDB table.
Parameters:
- `tableName`: Name of the table to describe

Example:

```json
{
  "tableName": "Users"
}
```
create_gsi
Creates a global secondary index on a table.
Parameters:
- `tableName`: Name of the table
- `indexName`: Name of the new index
- `partitionKey`: Partition key for the index
- `partitionKeyType`: Type of partition key
- `sortKey`: (Optional) Sort key for the index
- `sortKeyType`: (Optional) Type of sort key
- `projectionType`: Type of projection (ALL, KEYS_ONLY, INCLUDE)
- `nonKeyAttributes`: (Optional) Non-key attributes to project
- `readCapacity`: Provisioned read capacity units
- `writeCapacity`: Provisioned write capacity units

Example:

```json
{
  "tableName": "Users",
  "indexName": "EmailIndex",
  "partitionKey": "email",
  "partitionKeyType": "S",
  "projectionType": "ALL",
  "readCapacity": 5,
  "writeCapacity": 5
}
```
update_gsi
Updates the provisioned capacity of a global secondary index.
Parameters:
- `tableName`: Name of the table
- `indexName`: Name of the index to update
- `readCapacity`: New read capacity units
- `writeCapacity`: New write capacity units

Example:

```json
{
  "tableName": "Users",
  "indexName": "EmailIndex",
  "readCapacity": 10,
  "writeCapacity": 10
}
```
create_lsi
Creates a local secondary index on a table (must be done during table creation).
Parameters:
- `tableName`: Name of the table
- `indexName`: Name of the new index
- `partitionKey`: Partition key for the table
- `partitionKeyType`: Type of partition key
- `sortKey`: Sort key for the index
- `sortKeyType`: Type of sort key
- `projectionType`: Type of projection (ALL, KEYS_ONLY, INCLUDE)
- `nonKeyAttributes`: (Optional) Non-key attributes to project
- `readCapacity`: (Optional) Provisioned read capacity units
- `writeCapacity`: (Optional) Provisioned write capacity units

Example:

```json
{
  "tableName": "Users",
  "indexName": "CreatedAtIndex",
  "partitionKey": "userId",
  "partitionKeyType": "S",
  "sortKey": "createdAt",
  "sortKeyType": "N",
  "projectionType": "ALL"
}
```
update_capacity
Updates the provisioned capacity of a table.
Parameters:
- `tableName`: Name of the table
- `readCapacity`: New read capacity units
- `writeCapacity`: New write capacity units

Example:

```json
{
  "tableName": "Users",
  "readCapacity": 10,
  "writeCapacity": 10
}
```
put_item
Inserts or replaces an item in a table.
Parameters:
- `tableName`: Name of the table
- `item`: Item to put into the table (as a JSON object)

Example:

```json
{
  "tableName": "Users",
  "item": {
    "userId": "123",
    "name": "John Doe",
    "email": "john@example.com"
  }
}
```
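Note that the `item` here is plain JSON. DynamoDB's low-level API instead uses typed attribute values (e.g. `{"S": "123"}` for strings, `{"N": "30"}` for numbers), so a server like this one presumably marshals between the two. A minimal sketch of that marshalling, assuming the plain-JSON convention above (hypothetical helper, for illustration only):

```python
def to_attribute_value(value):
    """Convert a plain JSON value into DynamoDB's typed AttributeValue format."""
    if isinstance(value, bool):          # check bool before int: bool subclasses int
        return {"BOOL": value}
    if isinstance(value, str):
        return {"S": value}
    if isinstance(value, (int, float)):
        return {"N": str(value)}         # DynamoDB transmits numbers as strings
    if value is None:
        return {"NULL": True}
    if isinstance(value, list):
        return {"L": [to_attribute_value(v) for v in value]}
    if isinstance(value, dict):
        return {"M": {k: to_attribute_value(v) for k, v in value.items()}}
    raise TypeError(f"unsupported type: {type(value)!r}")

item = {k: to_attribute_value(v)
        for k, v in {"userId": "123", "name": "John Doe", "age": 30}.items()}
```

The same conversion in reverse would apply to values returned by `get_item`, `query_table`, and `scan_table`.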
get_item
Retrieves an item from a table by its primary key.
Parameters:
- `tableName`: Name of the table
- `key`: Primary key of the item to retrieve

Example:

```json
{
  "tableName": "Users",
  "key": {
    "userId": "123"
  }
}
```
update_item
Updates specific attributes of an item in a table.
Parameters:
- `tableName`: Name of the table
- `key`: Primary key of the item to update
- `updateExpression`: Update expression
- `expressionAttributeNames`: Attribute name mappings
- `expressionAttributeValues`: Values for the update expression
- `conditionExpression`: (Optional) Condition for the update
- `returnValues`: (Optional) Which values to return

Example:

```json
{
  "tableName": "Users",
  "key": {
    "userId": "123"
  },
  "updateExpression": "SET #n = :name",
  "expressionAttributeNames": {
    "#n": "name"
  },
  "expressionAttributeValues": {
    ":name": "Jane Doe"
  }
}
```
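The three expression parameters work together: the expression itself only contains placeholders, `expressionAttributeNames` maps `#`-prefixed placeholders to real attribute names (useful because `name` is a DynamoDB reserved word), and `expressionAttributeValues` maps `:`-prefixed placeholders to values. A sketch of generating all three from a plain dict of attribute updates (hypothetical helper, not part of the server):

```python
def build_update(updates):
    """Build updateExpression plus name/value maps from {attribute: new_value}.

    Generated placeholders (#a0, :v0, ...) sidestep reserved-word clashes."""
    names, values, sets = {}, {}, []
    for i, (attr, value) in enumerate(sorted(updates.items())):
        name_ph, value_ph = f"#a{i}", f":v{i}"
        names[name_ph] = attr
        values[value_ph] = value
        sets.append(f"{name_ph} = {value_ph}")
    return {
        "updateExpression": "SET " + ", ".join(sets),
        "expressionAttributeNames": names,
        "expressionAttributeValues": values,
    }

req = build_update({"name": "Jane Doe", "email": "jane@example.com"})
```

Beyond `SET`, DynamoDB update expressions also support `REMOVE`, `ADD`, and `DELETE` clauses; this sketch covers only the common `SET` case.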
query_table
Queries a table using key conditions and optional filters.
Parameters:
- `tableName`: Name of the table
- `keyConditionExpression`: Key condition expression
- `expressionAttributeValues`: Values for the key condition expression
- `expressionAttributeNames`: (Optional) Attribute name mappings
- `filterExpression`: (Optional) Filter expression applied to results
- `limit`: (Optional) Maximum number of items to return

Example:

```json
{
  "tableName": "Users",
  "keyConditionExpression": "userId = :id",
  "expressionAttributeValues": {
    ":id": "123"
  }
}
```
scan_table
Scans an entire table with optional filters.
Parameters:
- `tableName`: Name of the table
- `filterExpression`: (Optional) Filter expression
- `expressionAttributeValues`: (Optional) Values for the filter expression
- `expressionAttributeNames`: (Optional) Attribute name mappings
- `limit`: (Optional) Maximum number of items to return

Example:

```json
{
  "tableName": "Users",
  "filterExpression": "age > :minAge",
  "expressionAttributeValues": {
    ":minAge": 21
  }
}
```
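Keep in mind that DynamoDB scans (and queries) are paginated: each response returns at most 1 MB of data plus a `LastEvaluatedKey` when more items remain, which the caller feeds back as the next exclusive start key. A sketch of the drain loop, using a fake two-page scan function as a stand-in for repeated `scan_table` calls:

```python
def scan_all(scan_page, limit=None):
    """Drain a paginated scan. `scan_page` stands in for one scan_table call
    and returns {"Items": [...]} plus "LastEvaluatedKey" while pages remain."""
    items, start_key = [], None
    while True:
        page = scan_page(exclusive_start_key=start_key)
        items.extend(page["Items"])
        if limit is not None and len(items) >= limit:
            return items[:limit]
        start_key = page.get("LastEvaluatedKey")
        if start_key is None:       # no key means this was the last page
            return items

# Fake two-page table for illustration.
pages = [
    {"Items": [{"userId": "1"}, {"userId": "2"}],
     "LastEvaluatedKey": {"userId": "2"}},
    {"Items": [{"userId": "3"}]},
]

def fake_scan(exclusive_start_key=None):
    return pages[0] if exclusive_start_key is None else pages[1]

all_items = scan_all(fake_scan)   # gathers items across both pages
```

Also note that a `filterExpression` is applied after items are read, so a filtered scan still consumes read capacity for every item examined, not just those returned.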
Sample Questions
Here are some example questions you can ask Claude when using this DynamoDB MCP server:
Table Management
- "Create a new DynamoDB table called 'Products' with a partition key 'productId' (string) and sort key 'timestamp' (number)"
- "List all DynamoDB tables in my account"
- "What's the current configuration of the Users table?"
- "Add a global secondary index on the email field of the Users table"
Capacity Management
- "Update the Users table capacity to 20 read units and 15 write units"
- "Scale up the EmailIndex GSI capacity on the Users table"
- "What's the current provisioned capacity for the Orders table?"
Data Operations
- "Insert a new user with ID '123', name 'John Doe', and email 'john@example.com'"
- "Get the user with ID '123'"
- "Update the email address for user '123' to 'john.doe@example.com'"
- "Find all orders placed by user '123'"
- "List all users who are over 21 years old"
- "Query the EmailIndex to find the user with email 'john@example.com'"
Configuration
Setting up AWS Credentials
- Obtain AWS access key ID, secret access key, and region from the AWS Management Console.
- If using temporary credentials (e.g., IAM role), also obtain a session token.
- Ensure these credentials have appropriate permissions for DynamoDB operations.
Usage with Claude Desktop
Add this to your `claude_desktop_config.json`:
Docker (Recommended)
```json
{
  "mcpServers": {
    "dynamodb": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "AWS_ACCESS_KEY_ID",
        "-e", "AWS_SECRET_ACCESS_KEY",
        "-e", "AWS_REGION",
        "-e", "AWS_SESSION_TOKEN",
        "mcp/dynamodb-mcp-server"
      ],
      "env": {
        "AWS_ACCESS_KEY_ID": "your_access_key",
        "AWS_SECRET_ACCESS_KEY": "your_secret_key",
        "AWS_REGION": "your_region",
        "AWS_SESSION_TOKEN": "your_session_token"
      }
    }
  }
}
```
Building
Docker:

```shell
docker build -t mcp/dynamodb-mcp-server -f Dockerfile .
```
Development
To run in development mode with auto-reloading:

```shell
npm run dev
```
License
This MCP server is licensed under the MIT License: you are free to use, modify, and distribute the software under its terms. For details, see the LICENSE file in the project repository.