chatmcp

AI chat with MCP servers, using any LLM model

Usage

Make sure uvx or npx is installed on your system:

# uvx
brew install uv

# npx
brew install node 
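
An optional sanity check that both launchers are on your PATH before continuing:

# both commands should print a path if the install succeeded
command -v uvx
command -v npx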
  1. Configure your LLM API key and endpoint on the Settings page
  2. Install an MCP server from the MCP Server page
  3. Chat with the MCP server

Install

Download macOS | ~Windows~

Debug

  • logs

~/Library/Application Support/run.daodao.chatmcp/logs

  • chatmcp.db: chat history (see the inspection sketch after this list)

~/Documents/chatmcp.db

  • mcp_server.json: MCP server config

~/Documents/mcp_server.json
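
The chat history database is plain SQLite, so it can be opened with the sqlite3 CLI; the table layout is not documented here, so the commands below are only a sketch for looking around:

# list the tables in the chat history database, then peek at their schema
sqlite3 ~/Documents/chatmcp.db ".tables"
sqlite3 ~/Documents/chatmcp.db ".schema"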

To reset the app, run:

rm -rf ~/Library/Application\ Support/run.daodao.chatmcp
rm -rf ~/Documents/chatmcp.db
rm -rf ~/Documents/mcp_server.json

Development

flutter pub get
flutter run -d macos

Download test.db to test the SQLite MCP server.

~/Documents/mcp_server.json is the configuration file for MCP servers.
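
A minimal sketch of what this file might contain, assuming chatmcp follows the common mcpServers JSON layout used by other MCP clients; the server name, command, and --db-path value below are illustrative assumptions, not the confirmed schema:

{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/Users/you/Documents/test.db"]
    }
  }
}

If the actual schema differs, treat this only as a pointer to where the file lives and what kind of information it holds.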

Features

  • [x] Chat with MCP Server
  • [ ] MCP Server Market
  • [ ] Auto install MCP Server
  • [ ] SSE MCP Transport Support
  • [x] Auto Choose MCP Server
  • [x] Chat History
  • [x] OpenAI LLM Model
  • [ ] Claude LLM Model
  • [ ] Ollama LLM Model
  • [ ] RAG
  • [ ] Better UI Design

Feature requests are welcome; submit your ideas or bug reports in Issues.

MCP Server Market

You can install MCP servers from the MCP Server Market, a collection of MCP servers that lets you chat with different data sources.

Thanks

License

This project is licensed under the GNU General Public License v3.0 (GPL-3.0).

[Screenshots: Chat, History, OpenAI, Multimodel, Market, AutoServer, Configuration]