
TOML Config

Cosine CLI can be configured through multiple sources: TOML configuration files, command-line flags, and environment variables. This guide covers all configuration options with complete examples.

The user config file (~/.cosine.toml) holds your personal settings, API tokens, and global preferences. Keep this file private: it contains secrets.

# ============================================================
# COSINE CLI USER CONFIGURATION (~/.cosine.toml)
# ============================================================
# ------------------------------------------------------------
# AUTHENTICATION (Auto-populated via `cos login`)
# ------------------------------------------------------------
[auth]
user_id = "usr_123456789"
token = "tok_abcdef123456"
team_id = "team_abc123"
team_name = "My Organization"
team_slug = "my-org"
# OpenAI OAuth for ChatGPT subscription users
[auth.openai]
auth_method = "oauth"
oauth_redirect_uri = "http://localhost:1455/auth/callback"
# ------------------------------------------------------------
# API & INFERENCE
# ------------------------------------------------------------
[api]
base_url = "https://api.cosine.wtf"
[inference]
base_url = "https://api.cosine.wtf"
model = "gpt-5"
small_model = "claude-sonnet-4-6-1m"
micro_model = "claude-haiku-4-5"
review_model = "gemini-3.1-pro"
max_context_tokens = 128000
max_turns = 100
# ------------------------------------------------------------
# GENERAL SETTINGS
# ------------------------------------------------------------
[config]
system_prompt_id = "lumen"
reasoning_level = "medium"
shell = "/bin/zsh"
agent_commits = true
memory_recall_mode = "every"
memory_recall_detail = "snippet"
theme = "dark"
# ------------------------------------------------------------
# BROWSER & MCP
# ------------------------------------------------------------
[browser]
cdp_url = "http://localhost:9222"
# ------------------------------------------------------------
# ULTRA DAEMON (Background task suggestions)
# ------------------------------------------------------------
[ultra.inactivity_minutes]
repo = 15
slack = 0
linear = 0
gmail = 5
# ------------------------------------------------------------
# PROJECT MAPPINGS
# ------------------------------------------------------------
[projects]
"/Users/me/work/api" = { project_id = "proj_abc123" }
# ------------------------------------------------------------
# TOOL CONFIGURATION
# ------------------------------------------------------------
[agent]
disabled_tools = ["edit", "mcp_slack_*"]

Configuration is resolved in this order (highest to lowest priority):

  1. Command-line flags - Override everything else
  2. Repository config - Project-specific settings in cosine.toml
  3. User config - Personal settings in ~/.cosine.toml
  4. Environment variables - System-level overrides
  5. Default values - Built-in defaults
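As a concrete sketch of this precedence (paths and values illustrative): a key set in the repository's cosine.toml wins over the same key in ~/.cosine.toml, and a command-line flag wins over both.

```toml
# ~/.cosine.toml (user config)
[inference]
model = "gpt-5"

# cosine.toml in the repository root (wins over the user config)
[inference]
model = "codex"
```

Running `cos start --model gemini-3.1-pro` would then override both, but only for that session.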
File                         Purpose                                          Scope
~/.cosine.toml               User settings, auth tokens, global preferences   User-wide
cosine.toml or .cosine.toml  Project-specific settings                        Repository
cosine.<profile>.toml        Profile-specific project config                  Repository
~/.cosine.<profile>.toml     Profile-specific user config                     User-wide
~/.cosine/mcp.json           MCP server definitions                           User-wide

Create multiple profiles for different contexts:

# Use work profile (looks for cosine.work.toml or ~/.cosine.work.toml)
cos --profile work start
# Use personal profile
cos --profile personal start

Profile resolution order:

  1. cosine.<profile>.toml in repo (project-specific)
  2. ~/.cosine.<profile>.toml in home (user-specific)

The sections below cover the TOML keys you are most likely to edit directly.

[auth]

Populated automatically via cos login; manual configuration is rarely needed.

Field                           Type    Description
user_id                         string  Your Cosine user ID
token                           string  API authentication token
team_id                         string  Your team ID
team_name                       string  Your team name
team_slug                       string  Your team slug
auth.openai.auth_method         string  "cosine" or "oauth"
auth.openai.oauth_redirect_uri  string  OAuth callback URL
[api]

Field     Type    Default                  Description
base_url  string  https://api.cosine.wtf  Base URL for the Cosine API

[inference]

Field               Type    Default                  Description
base_url            string  https://api.cosine.wtf  OpenAI-compatible endpoint
model               string  gpt-5.4                  Default agent model
small_model         string  claude-sonnet-4-6-1m     Fast model for quick tasks
micro_model         string  claude-haiku-4-5         Tiny model for memory queries
review_model        string  gemini-3.1-pro           Model for code reviews
max_context_tokens  int     0 (unlimited)            Max context window
max_turns           int     0 (unlimited)            Max conversation turns

Available models: gpt-5, codex, lumen, claude-sonnet, gemini-3.1-pro, etc.
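For example, to assign a different model to each role (a sketch using model names from the list above; pick whatever suits your workload):

```toml
[inference]
model = "gpt-5"                       # main agent loop
small_model = "claude-sonnet-4-6-1m"  # fast model for quick tasks
review_model = "gemini-3.1-pro"       # code reviews
```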

[config]

Field                 Type    Default  Description
system_prompt_id      string  lumen    Default prompt: lumen, judge, orchestrator
reasoning_level       string  medium   none, low, medium, high, xhigh, adaptive
shell                 string  $SHELL   Preferred shell for terminal commands
agent_commits         bool    true     Auto-create Agent Commits per turn
memory_recall_mode    string  every    every or first (when to recall)
memory_recall_detail  string  snippet  snippet or full (memory detail)
theme                 string  dark     UI theme: dark or light
ultra_enabled         bool    false    Enable the Ultra daemon

Reasoning levels: none (minimal internal reasoning), low (fast), medium (balanced), high (thorough), xhigh (supported GPT/Codex models), adaptive (supported Claude 4.6 models)

For guidance on when to use each level, see Reasoning.

Older configs that use checkpointing are still read for backward compatibility, but new configs should use agent_commits instead.
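A sketch of the migration (assuming the legacy key was also a boolean under [config]):

```toml
[config]
# Before (legacy key, still read):
# checkpointing = true

# After:
agent_commits = true
```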

Shell paths:

  • Bash: /bin/bash
  • Zsh: /bin/zsh
  • Fish: /opt/homebrew/bin/fish
[browser]

Field    Type    Description
cdp_url  string  Chrome DevTools URL, e.g., http://localhost:9222

[ultra.inactivity_minutes] — Ultra Daemon


Minutes of inactivity before suggesting tasks from each source:

[ultra.inactivity_minutes]
repo = 15 # Local coding (default: 15)
slack = 0 # Slack messages
linear = 0 # Linear issues
gmail = 5 # Gmail
Field           Type    Description
model           string  Model override for this repo
extra_context   string  Path to a context file (e.g., AGENTS.md)
disabled_tools  array   Tools to disable: ["edit", "mcp_*"]

Tool rules:

  • Exact match: "edit"
  • Wildcard: "mcp_*" (disables all MCP tools)
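Both rule types can be combined in one list under the [agent] section (a sketch; the tool names are illustrative):

```toml
[agent]
disabled_tools = [
  "edit",   # exact match: disables only the edit tool
  "mcp_*",  # wildcard: disables every MCP tool
]
```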

Manage via the terminal user interface (TUI): press Ctrl+P → Tools → Space to toggle

[hooks]

Field    Type    Description
on_save  string  Command to run after file writes

Placeholders:

  • {file} or {{file}} — replaced with saved file path

Environment variables:

  • COSINE_ON_SAVE_FILE — Path to saved file
  • COSINE_ON_SAVE_CONFIG — Path to config file
[hooks]
on_save = "go test ./... -run TestRelatedTo{file}"

Map local directories to Cosine project IDs:

[projects]
"/Users/me/work/api" = { project_id = "proj_abc123" }
"/Users/me/work/web" = { project_id = "proj_def456" }

TOML config is the long-lived source of truth, but you can override it per session with CLI flags.

Common examples:

cos start --model gpt-5 --reasoning high
cos start --profile work --debug
cos start --cdp-url http://localhost:9222 --auto-accept

For the full flag reference, examples, and in-page jump links, see Commands.

Override the default config file location:

export COSINE_CONFIG_FILE=/path/to/custom/config.toml
cos start

Some MCP servers need API keys as environment variables:

export OPENAI_API_KEY=sk-...
export GITHUB_PERSONAL_ACCESS_TOKEN=ghp_...
export SLACK_BOT_TOKEN=xoxb-...
cos start

API tokens go in ~/.cosine.toml (never commit). Project settings go in cosine.toml (safe to commit).

cos --profile work start # Work settings
cos --profile personal start # Personal settings
cos start --model codex --debug # One session only

Mark root config to stop parent directory searches:

is_topmost_config = true
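For instance, in a monorepo root (the path is illustrative), this stops Cosine from walking above the repository when searching for config files:

```toml
# /monorepo/cosine.toml
is_topmost_config = true
```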
cos start --debug # See loaded config
cos start # Check MCP status in TUI