## Configuration commands

```sh
term-llm config
term-llm config edit
term-llm config path
term-llm config get default_provider
term-llm config set default_provider zen
term-llm config reset
```

The main config file lives at:

```
~/.config/term-llm/config.yaml
```
## Configuration shape

A typical config has a few major parts:

- `default_provider` for the global LLM default
- `providers` for model-specific credentials and routing
- per-command blocks such as `exec`, `ask`, and `edit`
- feature-specific blocks such as `image`, `embed`, `search`, `sessions`, and `tools`
## Example

```yaml
default_provider: anthropic

providers:
  anthropic:
    model: claude-sonnet-4-6
  openai:
    model: gpt-5.2
    credentials: codex
  xai:
    model: grok-4-1-fast
  openrouter:
    model: x-ai/grok-code-fast-1
    app_url: https://github.com/samsaffron/term-llm
    app_title: term-llm

exec:
  suggestions: 3
  instructions: |
    I use Arch Linux with zsh.
    I prefer ripgrep over grep and fd over find.

ask:
  instructions: |
    Be concise. I'm an experienced developer.

edit:
  model: gpt-5.2-codex
  diff_format: auto

search:
  provider: duckduckgo

tools:
  max_tool_output_chars: 20000
```
## Per-command overrides

Each command can override the provider and model independently of the global default:

```yaml
default_provider: anthropic

providers:
  anthropic:
    model: claude-sonnet-4-6
  openai:
    model: gpt-5.2
  zen:
    model: glm-4.7-free

exec:
  provider: zen
  model: glm-4.7-free

ask:
  model: claude-opus-4

edit:
  provider: openai
  model: gpt-5.2-codex
```
Precedence, from highest to lowest:

1. CLI flags such as `--provider openai:gpt-5.2`
2. Per-command config such as `exec.provider` or `ask.model`
3. Global provider selection via `default_provider` and `providers.<name>.model`
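The precedence chain can be sketched as a small lookup function. This is an illustration of the resolution order, not term-llm's actual internals; the function name and config dict shape are assumptions mirroring the YAML above.

```python
def resolve_provider_model(config, command, cli_flag=None):
    """Pick (provider, model) for a command: CLI flag > per-command > global."""
    # 1. CLI flag such as --provider openai:gpt-5.2 wins outright
    if cli_flag:
        provider, _, model = cli_flag.partition(":")
        return provider, model or config["providers"][provider]["model"]
    # 2. Per-command config such as exec.provider / ask.model
    section = config.get(command, {})
    provider = section.get("provider", config["default_provider"])
    model = section.get("model", config["providers"][provider]["model"])
    # 3. Anything unset falls through to default_provider / providers.<name>.model
    return provider, model

config = {
    "default_provider": "anthropic",
    "providers": {
        "anthropic": {"model": "claude-sonnet-4-6"},
        "openai": {"model": "gpt-5.2"},
        "zen": {"model": "glm-4.7-free"},
    },
    "exec": {"provider": "zen", "model": "glm-4.7-free"},
    "ask": {"model": "claude-opus-4"},
}

print(resolve_provider_model(config, "exec"))                  # ('zen', 'glm-4.7-free')
print(resolve_provider_model(config, "ask"))                   # ('anthropic', 'claude-opus-4')
print(resolve_provider_model(config, "ask", "openai:gpt-5.2")) # ('openai', 'gpt-5.2')
```

Note how `ask` inherits the global `anthropic` provider while still overriding the model.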
## Sessions config

```yaml
sessions:
  enabled: true
  max_age_days: 0
  max_count: 0
  path: ""
```

Use this block to control whether sessions are persisted, how long they are kept, and where the SQLite database lives.
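As an illustration of how `max_age_days` and `max_count` might interact, here is a hypothetical pruning sketch. It assumes `0` means "no limit" (which the defaults above suggest, but the source does not state), and it is not term-llm's actual retention logic.

```python
from datetime import datetime, timedelta

def prune(sessions, max_age_days=0, max_count=0, now=None):
    """Return the sessions to keep, newest first; 0 means no limit."""
    now = now or datetime.now()
    kept = sorted(sessions, key=lambda s: s["updated_at"], reverse=True)
    if max_age_days:  # drop sessions older than the age cutoff
        cutoff = now - timedelta(days=max_age_days)
        kept = [s for s in kept if s["updated_at"] >= cutoff]
    if max_count:     # then keep only the newest N
        kept = kept[:max_count]
    return kept

now = datetime(2024, 1, 10)
sessions = [
    {"id": 1, "updated_at": now - timedelta(days=1)},
    {"id": 2, "updated_at": now - timedelta(days=5)},
    {"id": 3, "updated_at": now - timedelta(days=40)},
]
print([s["id"] for s in prune(sessions, max_age_days=30, now=now)])  # [1, 2]
```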
## Search config

```yaml
search:
  provider: perplexity
  force_external: false
  perplexity:
    api_key: ${PERPLEXITY_API_KEY}
  exa:
    api_key: ${EXA_API_KEY}
  brave:
    api_key: ${BRAVE_API_KEY}
```

Search is large enough to deserve its own page; see Search.
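The `${VAR}` references above can be expanded from the environment with a small substitution pass. This sketch shows one common way to do it; it is an assumption about the mechanism, not term-llm's implementation.

```python
import os
import re

_VAR = re.compile(r"\$\{(\w+)\}")

def expand_env(value):
    """Replace ${VAR} references in a config string with environment values."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["PERPLEXITY_API_KEY"] = "pplx-test-123"
print(expand_env("${PERPLEXITY_API_KEY}"))  # pplx-test-123
print(expand_env("plain-literal-key"))      # plain-literal-key
```

Keeping keys in the environment rather than in `config.yaml` means the file can be committed or shared without leaking secrets.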
## Image and embedding config

```yaml
image:
  provider: gemini
  output_dir: ~/Pictures/term-llm

embed:
  provider: gemini
```

Each feature block can hold provider-specific credentials and defaults. The image and embedding providers are independent of the main text provider.
## Dynamic secrets and endpoints

term-llm supports dynamic resolution for some config values:

- `op://...` for 1Password secret references
- `srv://...` for DNS SRV-based endpoint discovery
- `$()` for command-based resolution

Example:

```yaml
providers:
  production-llm:
    type: openai_compatible
    model: Qwen/Qwen3-30B-A3B
    url: "srv://_vllm._tcp.ml.company.com/v1/chat/completions"
    api_key: "op://Infrastructure/vLLM Cluster/credential?account=company.1password.com"
```

These values are resolved lazily, when term-llm actually needs them.
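Conceptually this is prefix-based dispatch: inspect the value, pick a resolver, and only call it when the value is used. The sketch below uses stub resolvers (the real ones would shell out to the 1Password CLI, perform a DNS SRV lookup, or run the command); none of this reflects term-llm's actual code.

```python
def resolve(value, resolvers):
    """Dispatch a config value to a resolver based on its prefix."""
    if value.startswith("op://"):
        return resolvers["op"](value)          # 1Password secret reference
    if value.startswith("srv://"):
        return resolvers["srv"](value)         # DNS SRV endpoint discovery
    if value.startswith("$(") and value.endswith(")"):
        return resolvers["cmd"](value[2:-1])   # command-based resolution
    return value                               # plain literal, nothing to do

# Stub resolvers standing in for the real integrations.
resolvers = {
    "op": lambda ref: "<secret for %s>" % ref,
    "srv": lambda ref: "<endpoint for %s>" % ref,
    "cmd": lambda cmd: "<output of %s>" % cmd,
}

print(resolve("sk-plain-key", resolvers))            # sk-plain-key
print(resolve("$(pass show llm/api-key)", resolvers))
```

Lazy resolution matters here: a config can reference secrets for many providers without paying a lookup (or requiring access) for providers that are never used in a given run.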
## Diagnostics

```yaml
diagnostics:
  enabled: true
```

When edit retries fail, diagnostics can capture prompts, partial responses, and failure context for inspection.