lamatok-mcp


MCP server for LamaTok — TikTok data API. Available on npm: lamatok-mcp.

Auto-generates MCP tools from the LamaTok OpenAPI spec at startup, so every non-deprecated GET endpoint is exposed without hand-written wrappers. Tools map 1:1 to REST endpoints (GET /v1/user/by/username → get_v1_user_by_username).
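A minimal sketch of the name derivation implied by the mapping above (lowercase method, slashes and path-parameter braces folded into underscores) — this is an illustration, not the package's actual implementation:

```typescript
// Hypothetical reconstruction of the endpoint → tool-name mapping.
// The real lamatok-mcp code may derive names differently.
function toolNameFor(method: string, path: string): string {
  return (
    method.toLowerCase() +
    path
      .replace(/[{}]/g, "") // drop path-parameter braces
      .replace(/\//g, "_") // slashes become underscores
  );
}

console.log(toolNameFor("GET", "/v1/user/by/username"));
// get_v1_user_by_username
```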

Get 100 Free API Requests

Sign up with this link and get 100 free LamaTok requests — no credit card required. Enough to wire up the MCP server, try a few prompts in Claude/Cursor/Codex, and evaluate the data quality before committing.

Get your free 100 requests here

Quick start

  1. Get an API key at lamatok.com.
  2. Add the server to your AI assistant.
  3. Ask your assistant something like:
    • "Get the TikTok profile for @nasa."
    • "List the last 10 videos by user_id 6707206320333226502."
    • "Find recent TikTok videos for the hashtag photography."

Claude Code

claude mcp add lamatok -e LAMATOK_KEY=your-api-key -- npx -y lamatok-mcp

Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "lamatok": {
      "command": "npx",
      "args": ["-y", "lamatok-mcp"],
      "env": {
        "LAMATOK_KEY": "your-api-key"
      }
    }
  }
}

Cursor / Windsurf

Same shape as Claude Desktop — put the block under mcpServers in the app's MCP config file.
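As a concrete sketch: for Cursor the global MCP config typically lives at ~/.cursor/mcp.json (the path is an assumption — check your editor's MCP documentation), with the same block as above:

```json
{
  "mcpServers": {
    "lamatok": {
      "command": "npx",
      "args": ["-y", "lamatok-mcp"],
      "env": {
        "LAMATOK_KEY": "your-api-key"
      }
    }
  }
}
```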

Zed

Add to ~/.config/zed/settings.json:

{
  "context_servers": {
    "lamatok": {
      "command": "npx",
      "args": ["-y", "lamatok-mcp"],
      "env": {
        "LAMATOK_KEY": "your-api-key"
      }
    }
  }
}

OpenAI Codex

Append to ~/.codex/config.toml:

[mcp_servers.lamatok]
command = "npx"
args = ["-y", "lamatok-mcp"]

[mcp_servers.lamatok.env]
LAMATOK_KEY = "your-api-key"

Tools

Tools are generated at startup from the live LamaTok OpenAPI spec, so the list always matches the current API. At the time of writing there are ~19 tools across these groups:

| Group | Tools | Examples |
| --- | --- | --- |
| v1/user | 9 | get_v1_user_by_username, get_v1_user_by_id, get_v1_user_medias |
| v1/media | 8 | get_v1_media_info_by_id, get_v1_media_comments |
| v1/hashtag | 2 | get_v1_hashtag_medias_recent |

Each tool name mirrors its endpoint (GET /v1/user/by/username → get_v1_user_by_username). Your assistant can call tools/list over MCP to get the full, up-to-date list with parameter schemas. /sys, Legacy, and System tag groups are excluded by default.
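For example, a raw MCP client can fetch the generated tool list with a standard JSON-RPC request over stdio (this is the MCP protocol's built-in method, not something lamatok-mcp-specific):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

The response enumerates every generated tool with its input schema.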

Configuration

| Variable | Description | Required |
| --- | --- | --- |
| LAMATOK_KEY | Your LamaTok access key (sent as the x-access-key header) | yes |
| LAMATOK_URL | Base URL. Default: https://api.lamatok.com | no |
| LAMATOK_SPEC_URL | OpenAPI spec URL. Default: ${LAMATOK_URL}/openapi.json | no |
| LAMATOK_TAGS | Whitelist: only include operations with these tags (comma-separated) | no |
| LAMATOK_EXCLUDE_TAGS | Blacklist: additional tags to exclude (on top of Legacy, System, /sys) | no |
| LAMATOK_TIMEOUT_MS | Per-request timeout for API calls. Default: 30000 | no |
| LAMATOK_SPEC_TIMEOUT_MS | Timeout for the startup spec fetch. Default: 60000 | no |
| LAMATOK_MAX_RESPONSE_BYTES | Max bytes read from each API response. Default: 10485760 (10 MB) | no |
| LAMATOK_MAX_SPEC_BYTES | Max bytes read from the OpenAPI spec. Default: 8388608 (8 MB) | no |

Legacy, System, and /sys tags are excluded by default. Deprecated operations are also skipped.

If LAMATOK_URL points to a host other than api.lamatok.com, the server prints a warning on startup — your key will be sent there, so only use it for a self-hosted or proxied LamaTok.
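A hedged example combining a few of these variables — the tag names here are illustrative (check the actual tags in your spec, e.g. via tools/list):

```shell
# Only generate tools for the user and media tag groups,
# and fail API calls faster than the 30 s default.
LAMATOK_KEY=your-api-key \
LAMATOK_TAGS="v1/user,v1/media" \
LAMATOK_TIMEOUT_MS=10000 \
npx -y lamatok-mcp
```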

How it works

AI Assistant ←stdio→ lamatok-mcp ──https──> api.lamatok.com
                          │
                          └─ fetches /openapi.json once on startup,
                             builds one MCP tool per GET endpoint

Tool arguments map to the endpoint's query and path parameters. The response body is returned as-is (JSON text). Non-2xx responses are surfaced as tool errors with the HTTP status and body.
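The argument-to-URL mapping described above can be sketched as follows — a minimal illustration under the assumption that path parameters are substituted in place and all remaining arguments become query parameters, not the package's actual code:

```typescript
// Hypothetical sketch: build the request URL for a generated tool
// from its path template and the tool-call arguments.
function buildUrl(
  base: string,
  pathTemplate: string,
  args: Record<string, string>
): string {
  const url = new URL(base);
  const used = new Set<string>();
  // Substitute {name} path parameters from the arguments.
  url.pathname = pathTemplate.replace(/\{(\w+)\}/g, (_m: string, name: string) => {
    used.add(name);
    return encodeURIComponent(args[name]);
  });
  // Everything not consumed by the path becomes a query parameter.
  for (const [key, value] of Object.entries(args)) {
    if (!used.has(key)) url.searchParams.set(key, value);
  }
  return url.toString();
}

console.log(
  buildUrl("https://api.lamatok.com", "/v1/user/by/username", { username: "nasa" })
);
// https://api.lamatok.com/v1/user/by/username?username=nasa
```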

Development

git clone https://github.com/subzeroid/lamatok-mcp.git
cd lamatok-mcp
npm install
npm run build
LAMATOK_KEY=your-key node dist/index.js

Run in watch mode:

LAMATOK_KEY=your-key npm run dev

Run tests (unit + stdio smoke tests against a local mock server, no network/API key required):

npm test

License

MIT
