term-llm

What it is

term-llm is a terminal-first AI runtime.

It turns natural language into command suggestions, supports persistent chat, runs agents, edits files, calls tools, schedules jobs, and works with local or hosted models.

If you are new, install it, configure one provider, run the quickstart, and then come back for the workflow or reference page that matches the job.

Install

Copy the command and run it:

curl -fsSL https://raw.githubusercontent.com/samsaffron/term-llm/main/install.sh | sh

The full command is shown inline, so you can see exactly what you are about to paste.
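If you would rather not pipe a remote script straight into sh, a two-step variant of the same install (same URL as above; the filename here is just a local choice) lets you read the script before running it:

```shell
# Fetch the installer to a local file so you can inspect it first
curl -fsSL https://raw.githubusercontent.com/samsaffron/term-llm/main/install.sh -o term-llm-install.sh

# Review it, then run it
less term-llm-install.sh
sh term-llm-install.sh
```

Once it finishes, running `term-llm` with no arguments (or a version flag, if the build provides one) is a quick sanity check that the binary landed on your PATH.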

Read the right page

Pick the lane that matches the job

Start here

New here

  • Quickstart for the shortest path from install to a working first run
  • Installation if you just want the install command and platform notes
  • Providers and setup to wire up OpenAI, Anthropic, Gemini, Ollama, OpenRouter, and the rest
  • Usage guide when you want the mental model for exec, ask, and chat
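As a rough sketch of the three modes the usage guide covers (the subcommand names and prompts here are assumptions for illustration; check the guide for the real syntax):

```shell
# exec: turn natural language into a command suggestion you can review and run
term-llm exec "show the ten largest files under the current directory"

# ask: a one-shot question, answer printed to the terminal
term-llm ask "what does the -f flag to tail do?"

# chat: a persistent conversation that keeps context between turns
term-llm chat
```

The split is the mental model to keep: exec produces commands, ask produces answers, chat keeps state.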

Do a thing

Common workflows

  • Skills for portable instruction bundles that add task-specific context
  • Agents for built-in agents and when to use them
  • File editing for safe edit workflows from the terminal
  • MCP servers to connect external tools and services
  • Job runner for scheduled and background automation

Exact details

Reference shortcuts

Model underneath

Architecture and internals

  • Architecture overview for how runtimes, tools, routing, skills, and memory fit together
  • Memory guide for persistent memory behavior and fragment management
  • Web UI and API for browser access and HTTP endpoints
  • Debugging when something feels off and you need the shortest path to evidence