Model Context Protocol CLI
This repository contains a protocol-level CLI designed to interact with a Model Context Protocol server. The client allows users to send commands, query data, and interact with various resources provided by the server.
Features
- Protocol-level communication with the MCP Server.
- Dynamic tool and resource exploration.
- Support for multiple providers and models:
- Providers: OpenAI, Ollama.
- Default models:
  `gpt-4o` for OpenAI, `qwen2.5-coder` for Ollama.
Prerequisites
- Python 3.8 or higher.
- Required dependencies (see Installation).
- If using Ollama, make sure Ollama is installed and running.
- If using OpenAI, set an API key in your environment variables (`OPENAI_API_KEY=yourkey`).
Installation
- Clone the repository:
git clone https://github.com/chrishayuk/mcp-cli
cd mcp-cli
- Install uv:
pip install uv
- Resynchronize dependencies:
uv sync --reinstall
Usage
To start the client and interact with the SQLite server, run the following command:
uv run mcp-cli --server sqlite
Command-line Arguments
- `--server`: Specifies the server configuration to use. Required.
- `--config-file`: (Optional) Path to the JSON configuration file. Defaults to `server_config.json`.
- `--provider`: (Optional) Specifies the provider to use (`openai` or `ollama`). Defaults to `openai`.
- `--model`: (Optional) Specifies the model to use. Defaults depend on the provider: `gpt-4o` for OpenAI, `llama3.2` for Ollama.
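For reference, a minimal `server_config.json` might look like the sketch below. This is an assumption based on the common MCP server-configuration shape, not a verified schema for this project; the `uvx` command, the `mcp-server-sqlite` package, and the `test.db` path are illustrative placeholders:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    }
  }
}
```

Under this shape, the key under `mcpServers` (here `sqlite`) is the name you pass to `--server`.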
Examples
Run the client with the default OpenAI provider and model:
uv run mcp-cli --server sqlite
Run the client with a specific configuration and Ollama provider:
uv run mcp-cli --server sqlite --provider ollama --model llama3.2
Interactive Mode
The client supports interactive mode, allowing you to execute commands dynamically. Type `help` for a list of available commands or `quit` to exit the program.
Supported Commands
- `ping`: Check if the server is responsive.
- `list-tools`: Display available tools.
- `list-resources`: Display available resources.
- `list-prompts`: Display available prompts.
- `chat`: Enter interactive chat mode.
- `clear`: Clear the terminal screen.
- `help`: Show a list of supported commands.
- `quit` / `exit`: Exit the client.
Chat Mode
To enter chat mode and interact with the server:
uv run mcp-cli --server sqlite
In chat mode, you can use tools and query the server interactively. The provider and model used are specified during startup and displayed as follows:
Entering chat mode using provider 'ollama' and model 'llama3.2'...
Using OpenAI Provider:
If you wish to use OpenAI models, set the `OPENAI_API_KEY` environment variable before running the client, either in a `.env` file or as a regular environment variable.
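As a sketch, on macOS or Linux you can export the key for the current shell session before launching the client (the key value below is a placeholder, not a real key):

```shell
# Export the OpenAI API key for the current shell session.
# Replace the placeholder with your actual key.
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the variable is set without printing the key itself.
echo "${OPENAI_API_KEY:+OPENAI_API_KEY is set}"
```

Exporting in the shell only lasts for that session; putting the line in a `.env` file keeps it out of your shell history.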
Contributing
Contributions are welcome! Please open an issue or submit a pull request with your proposed changes.
License
This project is licensed under the MIT License.