Loop MCP Server
An MCP (Model Context Protocol) server that enables LLMs to process arrays item by item or in batches with a specific task, storing the result of each step and optionally summarizing the results once all items have been processed.
Overview
This MCP server provides tools for:
- Initializing an array with a task description
- Fetching items one by one or in batches for processing
- Storing results for each processed item or batch
- Retrieving all results (only after all items are processed)
- Optional result summarization
- Configurable batch size for efficient processing
Installation
npm install
Usage
Running the Server
npm start
Available Tools
- initialize_array - Set up the array and task
  - array: The array of items to process
  - task: Description of what to do with each item
  - batchSize (optional): Number of items to process in each batch (default: 1)
- get_next_item - Get the next item to process
  - Returns: Current item, index, task, and remaining count
- get_next_batch - Get the next batch of items based on batch size
  - Returns: Array of items, indices, task, and remaining count
- store_result - Store the result of processing
  - result: The processing result (single value or array for batch processing)
- get_all_results - Get all results after completion
  - summarize (optional): Include a summary of the results
  - Note: This will error if processing is not complete
- reset - Clear the current processing state
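All of the tools operate on a single in-memory job. As a rough illustration only (the actual server.js may structure this differently), the state and batch logic could look something like this:
// Illustrative sketch of the in-memory state behind the tools; the names
// here are hypothetical and do not necessarily match server.js.
let state = null;

function initializeArray({ array, task, batchSize = 1 }) {
  state = { array, task, batchSize, index: 0, results: [] };
}

function getNextBatch() {
  if (!state) throw new Error('Call initialize_array first.');
  if (state.index >= state.array.length) {
    return { text: 'All items have been processed.' };
  }
  // Take up to batchSize items starting at the current cursor.
  const items = state.array.slice(state.index, state.index + state.batchSize);
  const indices = items.map((_, offset) => state.index + offset);
  state.index += items.length;
  return { items, indices, task: state.task, remaining: state.array.length - state.index };
}

function storeResult({ result }) {
  state.results.push(result);
}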
Example Workflows
Single Item Processing
// 1. Initialize
await callTool('initialize_array', {
array: [1, 2, 3, 4, 5],
task: 'Square each number'
});
// 2. Process each item
while (true) {
const item = await callTool('get_next_item');
if (item.text === 'All items have been processed.') break;
// Process the item (e.g., square it)
const result = item.value * item.value;
await callTool('store_result', { result });
}
// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });
Batch Processing
// 1. Initialize with batch size
await callTool('initialize_array', {
array: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
task: 'Double each number',
batchSize: 3
});
// 2. Process in batches
while (true) {
const batch = await callTool('get_next_batch');
if (batch.text === 'All items have been processed.') break;
// Process the batch
const results = batch.items.map(item => item * 2);
await callTool('store_result', { result: results });
}
// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });
Running the Example
node example-client.js
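example-client.js is not reproduced here, but a minimal client along these lines, built on the official @modelcontextprotocol/sdk, would be enough to drive the workflows above (the callTool helper and the result parsing are assumptions, not the project's actual code):
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn the server as a child process and talk to it over stdio.
const transport = new StdioClientTransport({ command: 'node', args: ['server.js'] });
const client = new Client({ name: 'loop-example-client', version: '1.0.0' }, { capabilities: {} });
await client.connect(transport);

// Thin wrapper matching the callTool(...) calls used in the workflows above.
async function callTool(name, args = {}) {
  const response = await client.callTool({ name, arguments: args });
  // MCP tool results arrive as { content: [{ type: 'text', text: '...' }] };
  // the workflows above assume that text has already been parsed into plain objects.
  return response.content?.[0]?.text;
}

await callTool('initialize_array', { array: [1, 2, 3], task: 'Square each number' });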
Integration with Claude Desktop
Add to your Claude Desktop configuration:
{
"mcpServers": {
"loop-processor": {
"command": "node",
"args": ["/path/to/loop_mcp/server.js"]
}
}
}
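This goes in Claude Desktop's claude_desktop_config.json file; use an absolute path to server.js and restart Claude Desktop after editing the configuration.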