# Configuration Schema
Caboose uses TOML configuration files with a layered override system. Project-level settings take precedence over global defaults, and CLI flags override both.
Most of these values are managed automatically through the TUI — `/connect` saves API keys, `/model` updates the default provider, `/mcp` toggles servers, and `/settings` adjusts behavior options. This reference is for when you want to edit the files directly or understand what each field does.
## Config file locations

| File | Purpose |
|---|---|
| `~/.config/caboose/config.toml` | Global defaults (apply to all projects) |
| `.caboose/config.toml` | Project overrides (checked into version control) |
| `~/.config/caboose/auth.json` | API keys and secrets (never commit this) |
Project config is merged on top of global config field-by-field. Unset project fields inherit the global value.
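As an illustration of the field-by-field merge (the values below are hypothetical; both fields appear in the schema reference):

```toml
# ~/.config/caboose/config.toml (global defaults)
[provider]
default = "anthropic"
model = "claude-sonnet-4"

[behavior]
max_session_cost = 5.0

# .caboose/config.toml (project overrides, merged on top)
[behavior]
max_session_cost = 20.0   # overrides the global value for this project
# [provider] is unset here, so both provider fields inherit the global values
```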
## Authentication

API keys are stored in `~/.config/caboose/auth.json`, not in config files. You can also set them via environment variables.

```json
{
  "anthropic": "sk-ant-...",
  "openai": "sk-..."
}
```

Supported environment variables: `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, `GOOGLE_API_KEY`, `OLLAMA_HOST`.
## Full schema reference

### [provider]

Top-level provider settings.

| Field | Type | Default | Description |
|---|---|---|---|
| `default` | string | `"anthropic"` | Default provider to use. One of `anthropic`, `openai`, `gemini`, `ollama`. |
| `model` | string | `"claude-sonnet-4"` | Default model across all providers. |
### [provider.<name>]

Per-provider overrides. Supported sections: `provider.anthropic`, `provider.openai`, `provider.gemini`, `provider.ollama`.

| Field | Type | Default | Description |
|---|---|---|---|
| `model` | string | (inherits `provider.model`) | Model to use for this specific provider. |
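Per-provider overrides let each provider keep its own model while `provider.default` selects which one is active. A sketch using the model names from the complete example below:

```toml
[provider]
default = "anthropic"        # which provider is active

[provider.anthropic]
model = "claude-sonnet-4"    # used when default = "anthropic"

[provider.openai]
model = "gpt-4.1"            # used if you switch default to "openai"
```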
### [tools]

| Field | Type | Default | Description |
|---|---|---|---|
| `allow` | string[] | (all built-in tools) | Allowlist of tool names the agent may use. Built-in tools: `read_file`, `write_file`, `edit_file`, `glob`, `grep`, `bash`, `list_directory`, `fetch`. |
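For example, a read-only configuration might restrict the agent to inspection tools (a sketch; any subset of the built-in names is valid):

```toml
[tools]
# Omit write_file, edit_file, and bash to keep the agent read-only.
allow = ["read_file", "glob", "grep", "list_directory"]
```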
### [behavior]

| Field | Type | Default | Description |
|---|---|---|---|
| `auto_handoff_prompt` | bool | `true` | Prompt the user to hand off when context reaches 90% capacity. |
| `max_session_cost` | float | (off) | Pause the agent when cumulative session cost reaches this dollar amount. |
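Note that `max_session_cost` has no default: cost limiting is off unless you set it. A minimal sketch (the dollar amount is illustrative):

```toml
[behavior]
auto_handoff_prompt = true
max_session_cost = 10.0   # pause the agent once the session has spent $10
```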
### [memory]

| Field | Type | Default | Description |
|---|---|---|---|
| `enabled` | bool | `true` | Enable persistent memory across sessions. |
### [skills]

| Field | Type | Default | Description |
|---|---|---|---|
| `enabled` | bool | `true` | Enable the skills system. |
### [mcp.servers.<name>]

Register custom MCP servers. Each server entry requires a `command` and optional arguments.

| Field | Type | Default | Description |
|---|---|---|---|
| `command` | string | (required) | Executable to launch the MCP server. |
| `args` | string[] | `[]` | Arguments passed to the command. |
| `disabled` | bool | `false` | Temporarily disable this server without removing its config. |
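For instance, registering the filesystem server from the complete example below (the directory path is a placeholder):

```toml
[mcp.servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
# disabled = true   # uncomment to keep the entry but stop launching it
```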
### [mcp.presets.<name>]

Override built-in MCP presets (e.g. `fetch`, `context7`).

| Field | Type | Default | Description |
|---|---|---|---|
| `disabled` | bool | `false` | Disable a built-in preset. |
| `removed` | bool | `false` | Hide a preset entirely so it never loads. |
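A sketch contrasting the two flags, using the preset names mentioned above:

```toml
# Keep the fetch preset in config but inactive:
[mcp.presets.fetch]
disabled = true

# Hide the context7 preset so it never loads:
[mcp.presets.context7]
removed = true
```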
## Complete example

```toml
[provider]
default = "anthropic"
model = "claude-sonnet-4"

[provider.anthropic]
model = "claude-sonnet-4"

[provider.openai]
model = "gpt-4.1"

[tools]
allow = ["read_file", "write_file", "edit_file", "glob", "grep", "bash", "list_directory", "fetch"]

[behavior]
auto_handoff_prompt = true
max_session_cost = 10.0

[memory]
enabled = true

[skills]
enabled = true

[mcp.servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]

[mcp.servers.context7]
disabled = false

[mcp.presets.fetch]
removed = true
```