Talk to any LLM from your terminal.
Supports OpenAI, Anthropic, Gemini, Vertex AI, the OpenAI Responses API, and OpenClaw, all with tool-calling support.
Tokens appear in real time as the model generates them, with a loading spinner while you wait.
Send images, PDFs, and code files alongside messages. Tab-completion for file paths.
Full conversation history, interactive model selection, and color-coded terminal output.
Non-interactive mode with the -m flag. Use -m - to read the message from stdin, which makes it pipe-friendly.
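For example, non-interactive usage might look like this (the provider name and prompts are illustrative; the invocation pattern follows the config example later in this document):

```shell
# One-off message, no interactive session
chatchain openai -m "What does EACCES mean?"

# Read the message from stdin (-m -), pipe-friendly
git diff | chatchain openai -m -
```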
Persistent API keys, default models, and custom provider aliases via ~/.chatchain.yaml.
Connect external MCP servers (filesystem, GitHub, databases) and let AI use them as tools during chat.
Headings, bold, italic, code, and tables are highlighted with ANSI colors in streaming output.
Use as a Claude Code plugin to call other LLMs directly within your coding workflow.
Each provider uses its official SDK. Pass your API key via flag or environment variable (OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY, OPENCLAW_GATEWAY_TOKEN).
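A typical invocation with the key supplied via an environment variable might look like this (a sketch; the model-listing behavior comes from the --list flag documented below):

```shell
# Key is picked up from ANTHROPIC_API_KEY; no -k flag needed
export ANTHROPIC_API_KEY=sk-ant-...

chatchain anthropic -l    # list available models for this provider
chatchain anthropic       # start an interactive chat with model selection
```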
Attach files in interactive mode with the /file command. Tab-completion helps you navigate paths. Files are sent with your next message, then cleared automatically.
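A session using /file might look like the following (the exact output wording is illustrative):

```
> /file src/main.py
Attached: src/main.py
> Why does this module leak file handles?
```

After the reply arrives, the attachment list is cleared, so subsequent messages are sent without the file.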
Supported file types: images, PDFs, and text/code files.
ChatChain works as a Claude Code plugin. Install it to call other LLMs directly within Claude Code.
| Flag | Short | Description |
|---|---|---|
| --key | -k | API key (or set via env var) |
| --url | -u | Custom base URL |
| --model | -M | Model name (skip interactive selection) |
| --temperature | -t | Sampling temperature (0.0-2.0) |
| --message | -m | Non-interactive: send a single message (use - to read from stdin) |
| --system | -s | System prompt (omit the value for interactive input) |
| --list | -l | List configured providers, or models for a given provider |
| --mcp | | MCP server (command or URL, repeatable) |
| --config | -c | Path to config file (default: ~/.chatchain.yaml) |
| --verbose | -v | Verbose: print HTTP request/response bodies |
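As a sketch, connecting MCP servers with the repeatable --mcp flag might look like this (the server package names and URL are assumptions, not part of chatchain itself):

```shell
# Spawn a local MCP server by command (package name is an example)
chatchain openai --mcp "npx -y @modelcontextprotocol/server-filesystem ."

# Point at a remote server by URL, repeating --mcp for each server
chatchain openai \
  --mcp https://example.com/mcp \
  --mcp "npx -y @modelcontextprotocol/server-github"
```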
Save API keys, default models, and custom provider aliases in ~/.chatchain.yaml. Priority: CLI flag > env var > config file.
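A sketch of what such a config might contain, wired up for a DeepSeek alias (the field names and YAML schema here are illustrative assumptions, not the tool's documented format):

```yaml
# ~/.chatchain.yaml (field names illustrative)
providers:
  deepseek:
    provider: openai              # reuse the OpenAI-compatible client
    key: sk-...                   # DeepSeek API key
    url: https://api.deepseek.com # custom base URL
    model: deepseek-chat          # default model for this alias
mcp:
  - npx -y @modelcontextprotocol/server-filesystem .
```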
With a provider alias configured this way, chatchain deepseek -m "hello" uses the OpenAI provider with DeepSeek's key, URL, and model. MCP servers listed in the config are connected automatically on startup. Config files are loaded from ~/ (global) and ./ (project-local), or from an explicit path via -c <path>.
| Command | Description |
|---|---|
| /file <path> | Attach a file (with Tab completion for paths) |
| /files | List currently attached files |
| /clear | Remove all attached files |
| /save [path] | Save conversation to Markdown (default: history.md) |
| /import [path] | Import conversation from a saved Markdown file (default: history.md) |
| /mcp | Show connected MCP servers and their tools |