# Nativ MCP Server
mcp-name: io.github.Nativ-Technologies/nativ
AI-powered localization for any MCP-compatible tool — Claude Code, Cursor, Windsurf, and more.
Nativ is a localization platform that uses AI to translate content while respecting your brand voice, translation memory, glossaries, and style guides. This MCP server brings Nativ's full localization engine into your AI coding workflow.
<a href="https://smithery.ai/server/@nativ-ai/nativ-mcp"><img alt="Smithery" src="https://smithery.ai/badge/@nativ-ai/nativ-mcp"></a>
## Why use Nativ via MCP?
- **Translate in-context** — localize strings, copy, and content directly from your editor without switching to a browser
- **Translation Memory aware** — every translation checks your TM first, ensuring consistency across your project
- **Brand voice built-in** — your team's tone, formality, and style guides are applied automatically
- **Review and approve** — add approved translations to TM from your editor, building quality over time
- **Multi-format** — JSON, CSV, Markdown, or freeform text — Nativ handles it all
## Quick Start
### 1. Get a Nativ API Key
Sign up at [dashboard.usenativ.com](https://dashboard.usenativ.com), go to **Settings → API Keys**, and create a key. It looks like `nativ_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx`.
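The exact key format isn't documented beyond the example above. As a convenience, a minimal sanity check before wiring a key into your MCP config — the `nativ_` prefix plus 32 characters is an assumption inferred from the example, not an official validation rule:

```python
import re

# Assumed pattern based on the example key shown above: "nativ_"
# followed by 32 alphanumeric characters. Illustrative only.
KEY_PATTERN = re.compile(r"^nativ_[A-Za-z0-9]{32}$")

def looks_like_nativ_key(key: str) -> bool:
    """Cheap local sanity check; the API is the real authority."""
    return KEY_PATTERN.fullmatch(key) is not None

print(looks_like_nativ_key("nativ_" + "a" * 32))  # True
print(looks_like_nativ_key("sk-not-a-nativ-key"))  # False
```

A failing check here only means the key doesn't match the example's shape; when in doubt, test the key against the API itself.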
### 2. Install
Add to your MCP configuration:
**Claude Code / Claude Desktop** (`~/.claude/claude_desktop_config.json`)

```json
{
  "mcpServers": {
    "nativ": {
      "command": "npx",
      "args": ["-y", "nativ-mcp"],
      "env": {
        "NATIV_API_KEY": "nativ_your_api_key_here"
      }
    }
  }
}
```
**Cursor** (`.cursor/mcp.json` in your project or `~/.cursor/mcp.json` globally)

```json
{
  "mcpServers": {
    "nativ": {
      "command": "npx",
      "args": ["-y", "nativ-mcp"],
      "env": {
        "NATIV_API_KEY": "nativ_your_api_key_here"
      }
    }
  }
}
```
**Windsurf**

```json
{
  "mcpServers": {
    "nativ": {
      "command": "npx",
      "args": ["-y", "nativ-mcp"],
      "env": {
        "NATIV_API_KEY": "nativ_your_api_key_here"
      }
    }
  }
}
```
> **Note:** `npx` auto-downloads the package on first run — no manual install needed. If `uv` isn't already on your machine, it will be installed automatically on first launch.

<details><summary>Alternative: use <code>uvx</code> directly</summary>

If you already have `uv` installed and prefer to skip the npm wrapper:

```json
{
  "mcpServers": {
    "nativ": {
      "command": "uvx",
      "args": ["nativ-mcp"],
      "env": {
        "NATIV_API_KEY": "nativ_your_api_key_here"
      }
    }
  }
}
```

**macOS tip:** If you get `spawn uvx ENOENT` in Cursor or Claude Desktop, GUI apps don't inherit your shell PATH. Use the full path (e.g. `"command": "/Users/you/.local/bin/uvx"`) or wrap in a login shell: `"command": "/bin/sh", "args": ["-lc", "uvx nativ-mcp"]`.

</details>
### 3. Use it
Ask your AI assistant things like:
- "Translate 'Welcome back!' to French and German"
- "Check our translation memory for existing translations of 'Sign up'"
- "What are our style guides for localization?"
- "Localize these i18n strings to all configured languages"
- "Review this German translation against our TM and brand voice"
## Tools

| Tool | Description |
|---|---|
| `translate` | Translate text using the full localization engine (TM, style guides, brand voice, glossary) |
| `translate_batch` | Translate multiple texts to a target language in one call |
| `search_translation_memory` | Fuzzy-search the translation memory for existing translations |
| `add_translation_memory_entry` | Add an approved translation to TM for future reuse |
| `get_languages` | List all configured languages with formality and style settings |
| `get_translation_memory_stats` | Get TM statistics — total entries, sources, and breakdown |
| `get_style_guides` | List all style guides with their content and status |
| `get_brand_voice` | Get the brand voice prompt that shapes all translations |
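Under MCP, your assistant invokes each tool above with a JSON-RPC `tools/call` request over stdio. A sketch of what a `translate` call could look like on the wire — the argument names (`text`, `target_language`) are illustrative assumptions, not the server's documented schema; a real client discovers the actual schema via `tools/list`:

```python
import json

# Hypothetical tools/call request for the `translate` tool.
# Argument names below are assumptions for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "translate",
        "arguments": {
            "text": "Welcome back!",
            "target_language": "fr",
        },
    },
}

# MCP stdio transport sends one JSON message per line.
wire = json.dumps(request)
print(wire)
```

Your MCP client builds and sends these messages for you; the sketch is only to show what "tools as native capabilities" means at the protocol level.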
## Resources

| URI | Description |
|---|---|
| `nativ://languages` | Configured languages (JSON) |
| `nativ://style-guides` | All style guides (JSON) |
| `nativ://brand-prompt` | Brand voice prompt (JSON) |
| `nativ://tm/stats` | Translation memory statistics (JSON) |
## Prompts

| Prompt | Description |
|---|---|
| `localize-content` | Guided workflow to localize content into target languages |
| `review-translation` | Review a translation against TM, style guides, and brand voice |
| `batch-localize-strings` | Batch-localize i18n strings with structured output |
## Examples
### Translate a marketing headline

```text
You: Translate "The future of luxury, delivered" to French and Japanese

AI: [calls translate tool for each language]

Translation (French): "L'avenir du luxe, livré chez vous"
TM Match: 0% — new translation, no prior TM entries
Rationale: "Livré chez vous" adds a personal touch absent from the literal
"livré", aligning with the brand's premium yet approachable voice.

Translation (Japanese): "ラグジュアリーの未来を、あなたの元へ"
TM Match: 45% partial — similar pattern found in TM from brand_voice source
```
### Check existing translations

```text
You: Do we have translations for "Add to cart" in our TM?

AI: [calls search_translation_memory]

TM Search Results for "Add to cart" (3 matches):
- 95% [strong]  "Add to cart" → "Ajouter au panier" (source: approved)
- 95% [strong]  "Add to cart" → "In den Warenkorb" (source: brand_voice)
- 72% [partial] "Add items to cart" → "Ajouter des articles" (source: phrase_tm)
```
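Nativ's actual fuzzy-matching algorithm isn't documented here, but similarity percentages like the ones in the transcript above can be approximated with Python's stdlib `difflib` — a sketch of the idea, not the server's real scoring:

```python
from difflib import SequenceMatcher

def tm_score(query: str, entry: str) -> int:
    """Rough case-insensitive similarity percentage (illustrative only)."""
    ratio = SequenceMatcher(None, query.lower(), entry.lower()).ratio()
    return round(ratio * 100)

print(tm_score("Add to cart", "Add to cart"))        # 100 — exact match
print(tm_score("Add to cart", "Add items to cart"))  # strong partial match
```

Exact matches score 100; near-misses land in the "strong" band and looser paraphrases in the "partial" band, mirroring how the search results above are labeled.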
### Batch localize i18n strings

```text
You: Localize these to French:
- "Sign up"
- "Log in"
- "Forgot password?"
- "Continue with Google"

AI: [calls translate_batch]

Batch translation to French (4 items):
1. "Sign up" → "S'inscrire" (TM 100%)
2. "Log in" → "Se connecter" (TM 100%)
3. "Forgot password?" → "Mot de passe oublié ?" (TM 92%)
4. "Forgot password?" is followed by "Continue with Google" → "Continuer avec Google" (TM 85%)
```
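For real i18n files you'd typically collect the source strings first, send them to `translate_batch` in one call, and re-key the results. A sketch of the collect-and-rekey steps for a flat JSON locale file — the file layout and key names here are hypothetical, and the French strings are copied from the transcript above rather than fetched from the API:

```python
import json

# Hypothetical flat en.json locale file.
en_json = """{
  "auth.signup": "Sign up",
  "auth.login": "Log in",
  "auth.forgot": "Forgot password?",
  "auth.google": "Continue with Google"
}"""

strings = json.loads(en_json)
keys = list(strings)                  # keep key order for re-keying
texts = [strings[k] for k in keys]    # this list is the batch payload

# Stand-in for the translate_batch response (same order as `texts`).
translated = ["S'inscrire", "Se connecter",
              "Mot de passe oublié ?", "Continuer avec Google"]

fr_locale = dict(zip(keys, translated))
print(json.dumps(fr_locale, ensure_ascii=False, indent=2))
```

Because the batch result comes back positionally, preserving key order when flattening the file is what lets you zip translations back onto the right keys.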
## Configuration

| Environment Variable | Required | Description |
|---|---|---|
| `NATIV_API_KEY` | Yes | Your Nativ API key (`nativ_xxx...`) |
| `NATIV_API_URL` | No | API base URL (defaults to `https://api.usenativ.com`) |
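The table's defaulting behavior amounts to a few lines of environment handling. A sketch of how the server plausibly resolves its configuration — a guess for illustration, not the package's actual source:

```python
# Illustrative config resolution matching the table above:
# NATIV_API_KEY is mandatory, NATIV_API_URL falls back to the default.
def load_config(env: dict) -> dict:
    api_key = env.get("NATIV_API_KEY")
    if not api_key:
        raise RuntimeError("NATIV_API_KEY is required")
    return {
        "api_key": api_key,
        "api_url": env.get("NATIV_API_URL", "https://api.usenativ.com"),
    }

cfg = load_config({"NATIV_API_KEY": "nativ_example"})
print(cfg["api_url"])  # https://api.usenativ.com
```

In the real server the `env` dict would be `os.environ`, populated from the `"env"` block of your MCP configuration.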
## How It Works
This MCP server acts as a bridge between your AI coding assistant and the Nativ API:
```text
┌─────────────────────┐      ┌──────────────┐      ┌─────────────────┐
│  Claude / Cursor /  │────▶│  Nativ MCP   │────▶│   Nativ API     │
│  Windsurf / etc.    │◀────│  Server      │◀────│  (Translation,  │
│                     │      │  (stdio)     │      │   TM, Styles)   │
└─────────────────────┘      └──────────────┘      └─────────────────┘
```
The MCP server runs locally via stdio. It authenticates with your API key and calls the Nativ REST API on your behalf. Your AI assistant sees Nativ's tools, resources, and prompts as native capabilities.
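Concretely, "runs locally via stdio" means the client writes JSON-RPC messages to the server's stdin and reads responses from its stdout, one JSON message per line, starting with an `initialize` handshake. A sketch of that first message — the field values (client name, version) are illustrative placeholders:

```python
import json

# The MCP handshake opens with an `initialize` request over stdio.
# Clients like Claude Desktop send their own clientInfo; the values
# below are placeholders for illustration.
initialize = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Newline-delimited framing: one JSON message per line on stdin.
line = json.dumps(initialize) + "\n"
print(line, end="")
```

After the handshake, the client issues `tools/list`, `resources/list`, and `prompts/list` to discover everything described in the tables above, which is how Nativ's capabilities show up as native ones in your assistant.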
## Development

```bash
# Clone the repo
git clone https://github.com/nativ-ai/nativ-mcp.git
cd nativ-mcp

# Set up environment
uv venv
source .venv/bin/activate
uv pip install -e ".[dev]"

# Run the server (for testing)
NATIV_API_KEY=nativ_xxx nativ-mcp

# Run with MCP Inspector
NATIV_API_KEY=nativ_xxx npx @modelcontextprotocol/inspector uv run nativ-mcp
```
## License

MIT — see [LICENSE](LICENSE).
## Links