HyperChat is an open Chat client that can use various LLM APIs to provide the best chat experience and implement productivity tools through the MCP protocol.
- Supports OpenAI-style LLMs: OpenAI, Claude (via OpenRouter), Qwen, Deepseek, GLM, Ollama.
- Built-in MCP plugin market with user-friendly MCP installation and configuration, one-click installation; submissions of HyperChatMCP are welcome.
- Also supports manual installation of third-party MCPs; simply fill in the command, args, and env (see the configuration sketch below).
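As a rough sketch of what such a manual entry looks like, the snippet below follows the `mcpServers` JSON convention used by common MCP clients, with the public `@modelcontextprotocol/server-filesystem` server as an example; HyperChat's exact dialog or file layout may differ.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"],
      "env": {}
    }
  }
}
```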
HyperChat can be installed via winget, making it easy to integrate into your workflow.
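A minimal install sketch for Windows, assuming winget is available; the package identifier is not given in this README, so it is shown as a placeholder — use `winget search` to find the exact ID.

```powershell
# Find the exact package identifier (the placeholder below is hypothetical)
winget search HyperChat
# Install using the identifier reported by the search
winget install <HyperChat-package-id>
```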
- Supports English and Chinese
- Supports Artifacts, SVG, HTML, and Mermaid rendering (see the example after this list)
- Supports defining Agents, with preset prompts and a choice of which MCPs each Agent may use
- Supports scheduled tasks: assign Agents to complete tasks on a schedule and view their completion status
- Supports KaTeX for displaying mathematical formulas, plus code rendering with syntax highlighting and quick copy
- Added RAG, based on the MCP knowledge base
- Introduced the ChatSpace concept, supporting simultaneous chats across multiple conversations
- Supports model comparison in chat
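To illustrate the Artifacts/Mermaid rendering mentioned above: a fenced Mermaid block like the one below (a made-up example, not taken from the project) is rendered as a diagram in the chat.

```mermaid
graph TD
    A[User prompt] --> B[LLM]
    B --> C{"Tool call needed?"}
    C -- yes --> D[MCP server]
    D --> B
    C -- no --> E[Rendered answer]
```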
TODO:
- Support the official Claude protocol
LLM

| LLM | Usability | Remarks |
| --- | --- | --- |
| claude | ⭐⭐⭐⭐⭐ | No explanation needed |
| openai | ⭐⭐⭐⭐ | Also handles multi-step function calls perfectly (gpt-4o-mini works too) |
| gemini flash 2.0 | ⭐⭐⭐⭐ | Very usable |
| qwen | ⭐⭐⭐⭐ | Very usable |
| doubao | ⭐⭐⭐ | Feels okay to use |
| deepseek | ⭐⭐⭐ | Multi-step function calls may have issues |
Usage
- Configure your API key, and make sure your LLM service is compatible with the OpenAI API style (a quick compatibility check is sketched after this list).
- Ensure that uv, Node.js, and the other prerequisites are installed on your system.
- You can access HyperChat via the web from anywhere, on any device, and set a password.
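A minimal sketch for checking that a service is OpenAI-compatible, assuming it exposes the standard `/v1/chat/completions` endpoint; the base URL, model name, and `$API_KEY` variable below are placeholders.

```bash
curl -X POST "https://your-llm-service.example.com/v1/chat/completions" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```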
Demos:
- Calling the terminal MCP to automatically analyze ASAR files and help decompress them
- Calling the terminal to view the interface
- Gaode Map MCP
- One-click webpage writing and publishing (to Cloudflare)
- Calling Google Search to ask what the TGA Game of the Year is
- Asking which games are currently free for a limited time, with the model visiting the website via the tool
- Opening web pages, analyzing the results, and writing them to files
- Using web tools + command-line tools to open a GitHub README for learning, git clone the repo, and set up a development environment
- Multi-chat workspace + night mode
- Scheduled task list + scheduled messages sent to an Agent to complete tasks
- Installing third-party MCPs (any MCP is supported)
- H5 (mobile web) interface
- Testing model capabilities
- Knowledge base
Disclaimer
This project is for learning and exchange purposes only. Any actions you take with it, such as web crawling, are unrelated to the developers of this project.