
Quickstart

Start an interactive session by running caboose in your project directory:

cd your-project
caboose

Caboose launches a TUI with syntax highlighting, markdown rendering, and an embedded terminal.

Before you can chat, you need to configure at least one LLM provider. The fastest way is the /connect command:

/connect anthropic

This prompts you for an API key and stores it securely. You can also set a key via environment variable:

export ANTHROPIC_API_KEY=sk-ant-...

Caboose supports 15+ providers out of the box: Anthropic, OpenAI, Gemini, OpenRouter, xAI, Together AI, Fireworks AI, Cerebras, SambaNova, Perplexity, Cohere, Qwen, DeepSeek, Groq, Mistral — plus Ollama, LM Studio, and llama.cpp for local models. See Providers for the full list.
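Each provider reads its key from its own environment variable. ANTHROPIC_API_KEY is confirmed above; the names below follow the usual PROVIDER_API_KEY convention but are assumptions, so check the Providers page for the exact spelling:

```shell
# Assumed variable names (PROVIDER_API_KEY convention); verify the exact
# names on the Providers page before relying on them.
export OPENAI_API_KEY=sk-...
export GROQ_API_KEY=gsk_...
export DEEPSEEK_API_KEY=...
```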

Once connected, type a message and press Enter. Caboose streams the response token by token. The agent can read files, make edits, run shell commands, and fetch web content — all rendered inline in the TUI.

Caboose has four permission modes that control what the agent can do without asking. Press Tab to cycle between them:

Mode       Behavior
Plan       Read-only. The agent can explore but cannot write files or run commands.
Create     Reads are automatic. Writes and shell commands require your approval.
AutoEdit   File edits are automatic. Shell commands still require approval.
Chug       Fully autonomous. All tool calls execute without prompting.

The current mode is always visible in the status bar at the bottom of the screen.

Slash commands

/connect # add or update a provider API key
/model # switch the active model
/sessions # list saved sessions
/new # start a fresh session
/init # generate a CABOOSE.md project context file
/status # show current provider, model, token usage, and cost
/suggest # scan the codebase for issues and improvements

For scripting or one-off queries, pass a prompt directly:

caboose --prompt "Explain the authentication flow in this codebase"

Combine with other flags:

caboose -p "Add unit tests for utils.rs" --mode chug -m claude-sonnet-4
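Non-interactive invocations compose well with ordinary shell loops for batch work. A minimal sketch, assuming -p, -m, and --mode behave as in the example above; echo prints each command instead of running it, so you can preview the batch and drop the echo once it looks right:

```shell
# Dry-run a batch of one-off prompts: echo prints each invocation rather
# than executing it. Remove `echo` to actually run caboose on each file.
for f in src/main.rs src/utils.rs; do
  echo caboose -p "Add unit tests for $f" --mode chug -m claude-sonnet-4
done
```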

Use -f json for structured output with token counts and tool calls:

caboose --prompt "List all API endpoints" -f json
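The exact JSON schema isn't documented here, so the field names below (response, usage, output_tokens) are assumptions; the pattern of piping the output into a small Python snippet works the same once you know the real keys. This sketch simulates the output with a sample payload:

```shell
# Simulate `caboose ... -f json` output with a sample payload (field names
# are assumptions) and extract one value with python3. In practice, pipe
# the real command into the same snippet.
sample='{"response": "GET /users, POST /login", "usage": {"input_tokens": 120, "output_tokens": 45}}'
echo "$sample" | python3 -c 'import json, sys; print(json.load(sys.stdin)["usage"]["output_tokens"])'
```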