LibSQL Memory

A high-performance MCP server utilizing libSQL for persistent memory and vector search capabilities, enabling efficient entity management and semantic knowledge storage.

By spences10

Tools

  • create_entities: Create new entities with observations and optional embeddings
  • search_nodes: Search for entities and their relations using text or vector similarity
  • read_graph: Get recent entities and their relations
  • create_relations: Create relations between entities
  • delete_entity: Delete an entity and all of its associated data (observations and relations)
  • delete_relation: Delete a specific relation between entities
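As a concrete illustration, a create_entities call might carry a payload shaped like the sketch below. The argument names (entities, entityType, observations, embedding) are assumptions modeled on the standard MCP memory interface that this server implements; they are not confirmed by this page:

```json
{
	"name": "create_entities",
	"arguments": {
		"entities": [
			{
				"name": "libsql",
				"entityType": "technology",
				"observations": ["Fork of SQLite with built-in vector search"],
				"embedding": [0.1, 0.2, 0.3]
			}
		]
	}
}
```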

README

mcp-memory-libsql

A high-performance, persistent memory system for the Model Context Protocol (MCP) powered by libSQL. This server provides vector search capabilities and efficient knowledge storage using libSQL as the backing store.

<a href="https://glama.ai/mcp/servers/22lg4lq768"> <img width="380" height="200" src="https://glama.ai/mcp/servers/22lg4lq768/badge" alt="Glama badge" /> </a>

Features

  • 🚀 High-performance vector search using libSQL
  • 💾 Persistent storage of entities and relations
  • 🔍 Semantic search capabilities
  • 🔄 Knowledge graph management
  • 🌐 Compatible with local and remote libSQL databases
  • 🔒 Secure token-based authentication for remote databases

Configuration

This server is designed to be used as part of an MCP configuration. Here are examples for different environments:

Cline Configuration

Add this to your Cline MCP settings:

{
	"mcpServers": {
		"mcp-memory-libsql": {
			"command": "npx",
			"args": ["-y", "mcp-memory-libsql"],
			"env": {
				"LIBSQL_URL": "file:/path/to/your/database.db"
			}
		}
	}
}

Claude Desktop with WSL Configuration

For a detailed guide on setting up this server with Claude Desktop in WSL, see Getting MCP Server Working with Claude Desktop in WSL.

Add this to your Claude Desktop configuration for WSL environments:

{
	"mcpServers": {
		"mcp-memory-libsql": {
			"command": "wsl.exe",
			"args": [
				"bash",
				"-c",
				"source ~/.nvm/nvm.sh && LIBSQL_URL=file:/path/to/database.db /home/username/.nvm/versions/node/v20.12.1/bin/npx mcp-memory-libsql"
			]
		}
	}
}

Database Configuration

The server supports both local SQLite and remote libSQL databases through the LIBSQL_URL environment variable:

For local SQLite databases:

{
	"env": {
		"LIBSQL_URL": "file:/path/to/database.db"
	}
}

For remote libSQL databases (e.g., Turso):

{
	"env": {
		"LIBSQL_URL": "libsql://your-database.turso.io",
		"LIBSQL_AUTH_TOKEN": "your-auth-token"
	}
}

Note: When using WSL, ensure the database path uses the Linux filesystem format (e.g., /home/username/...) rather than Windows format.

By default, if no LIBSQL_URL is provided, the server uses file:/memory-tool.db in the current directory.
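Because of that default, a quick local setup can omit the env block entirely. This minimal sketch assumes the same Cline-style mcpServers layout shown above:

```json
{
	"mcpServers": {
		"mcp-memory-libsql": {
			"command": "npx",
			"args": ["-y", "mcp-memory-libsql"]
		}
	}
}
```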

API

The server implements the standard MCP memory interface with additional vector search capabilities:

  • Entity Management
    • Create/Update entities with embeddings
    • Delete entities
    • Search entities by similarity
  • Relation Management
    • Create relations between entities
    • Delete relations
    • Query related entities
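For example, a text search through search_nodes could look like the following; the query argument name is an assumption based on typical MCP memory servers, not confirmed by this page:

```json
{
	"name": "search_nodes",
	"arguments": {
		"query": "vector databases"
	}
}
```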

Architecture

The server uses a libSQL database with the following schema:

  • Entities table: Stores entity information and embeddings
  • Relations table: Stores relationships between entities
  • Vector search capabilities implemented using libSQL's built-in vector operations

Development

Publishing

Due to npm 2FA requirements, publishing needs to be done manually:

  1. Create a changeset (documents your changes):
     pnpm changeset
  2. Version the package (updates version and CHANGELOG):
     pnpm changeset version
  3. Publish to npm (will prompt for a 2FA code):
     pnpm release

Contributing

Contributions are welcome! Please read our contributing guidelines before submitting pull requests.

License

MIT License - see the LICENSE file for details.
