A command-line interface for interacting with Ollama and Anthropic models.
## Features

- Store conversations as files, allowing easy storage and editing
- Add context to a session with the `-f`/`--file` flag and change the context file mid-conversation
- Use commands to modify and customize the current session
- Newlines are supported with ALT + ENTER
- Reuse and modify prompts
- Define multiple profiles with up to three models per profile (fast, balanced, deep)
- Switch between profiles and models on the fly
- Let models use tools (a limited set)
## Message structure

How the messages array is formed in the request JSON:

| Role | Content |
|---|---|
| system/assistant | cforge system prompt |
| user/assistant | conversation history |
| user | current prompt (+ optional context file) |
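
For instance, the request body might carry a messages array shaped roughly like this (an illustrative sketch mirroring the table above; the exact roles and field names depend on the provider's API):

```json
{
  "messages": [
    { "role": "system", "content": "<cforge system prompt>" },
    { "role": "user", "content": "<earlier prompt from the history file>" },
    { "role": "assistant", "content": "<earlier reply from the history file>" },
    { "role": "user", "content": "<current prompt + optional context file>" }
  ]
}
```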
## Quick start

```shell
# Note: requires the default model to be present for ollama (gemma3:12b)
git clone https://github.com/mituuz/convo-forge.git
cd convo-forge

# When running the command for the first time, it generates a config file with the default values
cargo run -- chat.md

# Or install the binary using cargo
cargo install --path .
```
## Basic commands

```shell
:help # Show available commands
:list # List chat files
:q    # Exit
```

## Prerequisites

- Rust (latest stable)
- Ollama or access to Anthropic API
## Building from source

```shell
git clone https://github.com/mituuz/convo-forge.git
cd convo-forge
cargo build --release
```

## Usage

cforge uses XDG paths for default chat and configuration.
```shell
# First time / optional
cforge <HISTORY_FILE> [OPTIONS]

# After first time
cforge [OPTIONS]
```

### Arguments

`<HISTORY_FILE>`
- Path to the file that acts as chat history (will be created if it doesn't exist)
- If a relative path is provided, it will be created inside the data directory (according to XDG)
- If an absolute path is provided, it will be used as-is
- Mandatory the first time; after that, `.cforge.toml` contains a reference to the previously opened history file

### Options

`-f, --file <INPUT_FILE>`
- Optional file to be used as context for each chat message. The context file is reloaded with each message

`-h, --help`
- Print help

`-v, --version`
- Print version
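
For example, history-file paths resolve like this (the `cforge` subdirectory under the XDG data directory is an assumption; the exact location may differ on your system):

```shell
# Relative path: created inside the XDG data directory
cforge notes.md        # e.g. ~/.local/share/cforge/notes.md

# Absolute path: used as-is
cforge /tmp/scratch.md
```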
## Examples

```shell
# Start a new conversation saving history to chat.txt
cforge chat.txt

# Continue a conversation with additional context from code.rs
cforge chat.txt -f code.rs
```

For a full list of commands, see docs/commands.md.
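
As an illustration, you can change files mid-session with commands (the command names are taken from the completion-prefix configuration below; the exact argument syntax is an assumption):

```shell
:switch @c/other-chat.md   # switch to a different chat file
:context @k/notes.md       # swap the context file
```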
## Tools

cforge supports two types of tools:

- Built-in tools
- User tools
  - gitignored by default
  - included dynamically at build time by build.rs
  - implemented in Rust

For the full documentation, see docs/tools.md.
## Configuration

You can configure cforge by creating and modifying the TOML configuration located at `~/.config/cforge/cforge.toml`.

An example TOML file populated with the default values:

```toml
# Path to the knowledge directory.
# Aliased to `@k/`
knowledge_dir = ""
# System prompt that configures the AI assistant's behavior.
system_prompt = """
You are an AI assistant receiving input from a command-line
application called convo-forge (cforge). The user may include additional context from another file.
Your responses are displayed in the terminal and saved to the history file.
Keep your answers helpful, concise, and relevant to both the user's direct query and any file context provided.
"""
# On each prompt, show the estimated token count compared to the model's, if the provider supports it (ollama: yes; anthropic: no)
token_estimation = true
# Control the token limit for anthropic models
max_tokens = 1024
# Modify default prefixes for command completion
# Options support path aliases and absolute paths
# e.g. `:swi <tab> :switch @c/`
# e.g. `:swi <tab> :switch /home/user/my_dir`
[command_prefixes]
switch = "@c/"
list = "@c/"
context = "@k/"
prompt = "@p/"
# You can define multiple profiles with up to three model types per profile (fast, balanced, deep)
[profiles_config]
[[profiles_config.profiles]]
name = "local"
provider = "ollama"
[[profiles_config.profiles.models]]
model = "gemma3:12b"
model_type = "balanced"
[rustyline]
# Switch rustyline input mode between `emacs` and `vi`.
mode = "emacs"
# Switch completion type between `circular` and `list`.
completion_mode = "circular"- ANTHROPIC_API_KEY - Valid API key to use Anthropic's models
## Privacy

If you want to keep everything under your own control, use only your local ollama; nothing has to leave your machine.

Keep in mind that chat files are stored on your machine and there is no option for temporary chats.

Keep your API keys safe.

## Changelog

You can find the changelog here.