
<p align='center'> <img src='./build/icon.png' width="150" height="150" alt="logo" /> </p>
<h1 align="center">DeepChat</h1>
<p align="center">Dolphins are good friends of whales, and DeepChat is your good assistant</p>
<div align="center"> <a href="./README.zh.md">中文</a> / English </div>
## Reasoning
<p align='center'> <img src='./build/screen.jpg'/> </p>
## Search
<p align='center'> <img src='./build/screen.search.jpg'/> </p>
## LaTeX
<p align='center'> <img src='./build/screen.latex.jpg'/> </p>
## Artifacts Support
<p align='center'> <img src='./build/screen.artifacts.jpg'/> </p>
## Main Features
- 🌐 Supports multiple model cloud services: DeepSeek, OpenAI, Silicon Flow, etc.
- 🏠 Supports local model deployment: Ollama
- 🚀 Concurrent multi-session chat: switch to another conversation without waiting for the model to finish generating
- 💻 Supports multiple platforms: Windows, macOS, Linux
- 📄 Complete Markdown rendering, including excellent code block rendering
- 🌟 Easy to use, with a complete guide page so you can get started immediately without learning complex concepts
## Currently Supported Model Providers
<table> <tr align="center"> <td> <img src="./src/renderer/src/assets/llm-icons/ollama.svg" width="50" height="50"><br/> <a href="https://ollama.com">Ollama</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/deepseek-color.svg" width="50" height="50"><br/> <a href="https://deepseek.com/">Deepseek</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/siliconcloud.svg" width="50" height="50"><br/> <a href="https://www.siliconflow.cn/">Silicon</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/qwen-color.svg" width="50" height="50"><br/> <a href="https://chat.qwenlm.ai">QwenLM</a> </td> </tr> <tr align="center"> <td> <img src="./src/renderer/src/assets/llm-icons/doubao-color.svg" width="50" height="50"><br/> <a href="https://console.volcengine.com/ark/">Doubao</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/minimax-color.svg" width="50" height="50"><br/> <a href="https://platform.minimaxi.com/">MiniMax</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/fireworks-color.svg" width="50" height="50"><br/> <a href="https://fireworks.ai/">Fireworks</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/ppio-color.svg" width="50" height="50"><br/> <a href="https://ppinfra.com/">PPIO</a> </td> </tr> <tr align="center"> <td> <img src="./src/renderer/src/assets/llm-icons/openai.svg" width="50" height="50"><br/> <a href="https://openai.com/">OpenAI</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/gemini-color.svg" width="50" height="50"><br/> <a href="https://gemini.google.com/">Gemini</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/github.svg" width="50" height="50"><br/> <a href="https://github.com/marketplace/models">GitHub Models</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/moonshot.svg" width="50" height="50"><br/> <a href="https://moonshot.ai/">Moonshot</a> </td> </tr> <tr align="center"> <td> <img src="./src/renderer/src/assets/llm-icons/openrouter.svg" width="50" 
height="50"><br/> <a href="https://openrouter.ai/">OpenRouter</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/azure-color.svg" width="50" height="50"><br/> <a href="https://azure.microsoft.com/en-us/products/ai-services/openai-service">Azure OpenAI</a> </td> <td> <img src="./src/renderer/src/assets/llm-icons/qiniu.svg" width="50" height="50"><br/> <a href="https://www.qiniu.com/products/ai-token-api">Qiniu</a> </td> <td colspan="1"> Compatible with any model provider in openai/gemini API format </td> </tr> </table>
## Other Features
- Support for local model management with Ollama
- Support for local file processing
- Artifacts support
- Customizable search engines (parsed through models, no API adaptation required)
- MCP support (built-in npx, no additional node environment installation needed)
- Support for multimodal models
- Local chat data backup and recovery
- Compatibility with any model provider in OpenAI, Gemini, and Anthropic API formats
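
Providers that speak the OpenAI-style chat completions format can be plugged in without code changes. As a rough sketch of the request shape such a provider must accept (the endpoint URL and model name below are placeholders, not DeepChat defaults):

```typescript
// Minimal sketch of an OpenAI-compatible chat completions request.
// The URL and model name are hypothetical examples, not DeepChat defaults.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[], stream = true) {
  return {
    // Any provider exposing this /v1/chat/completions shape can be added.
    url: "https://api.example.com/v1/chat/completions",
    body: { model, messages, stream },
  };
}

const req = buildChatRequest("my-model", [{ role: "user", content: "Hello" }]);
console.log(JSON.stringify(req.body));
```

Gemini- and Anthropic-format providers are handled the same way, with their respective request shapes.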
## Development
Please read the Contribution Guidelines first. Windows and Linux packages are built by GitHub Actions. For Mac signing and packaging, refer to the Mac Release Guide.
### Install dependencies

```shell
$ npm install
$ npm run installRuntime
# if you get the error: No module named 'distutils'
$ pip install setuptools
# for Windows x64
$ npm install --cpu=x64 --os=win32 sharp
# for macOS Apple Silicon
$ npm install --cpu=arm64 --os=darwin sharp
# for macOS Intel
$ npm install --cpu=x64 --os=darwin sharp
# for Linux x64
$ npm install --cpu=x64 --os=linux sharp
```
### Start development

```shell
$ npm run dev
```
### Build

```shell
# for Windows
$ npm run build:win
# for macOS
$ npm run build:mac
# for Linux
$ npm run build:linux

# specify the target architecture
$ npm run build:win:x64
$ npm run build:win:arm64
$ npm run build:mac:x64
$ npm run build:mac:arm64
$ npm run build:linux:x64
$ npm run build:linux:arm64
```
## Star History
## Contributors
Thank you for considering contributing to DeepChat! The contribution guide can be found in the Contribution Guidelines.
<a href="https://github.com/ThinkInAIXYZ/deepchat/graphs/contributors"> <img src="https://contrib.rocks/image?repo=ThinkInAIXYZ/deepchat" /> </a>
## 📃 License