CLI

The CLI is the canonical surface. Everything else wraps the same engine.

Install

npm install -g basalted
basalt about

You should see the engine version, the index path, and an ok status for the embeddings provider.
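If the provider status is anything other than ok, you can probe Ollama directly. A minimal sketch, with the caveat that check_ollama is a hypothetical helper for this page; /api/tags is Ollama's model-listing endpoint and BASALT_OLLAMA_URL is the CLI's URL override, both described later in this document:

```shell
# Hedged sketch: probe the embeddings provider the same way the CLI does.
# check_ollama is a hypothetical helper name; /api/tags lists installed
# models, and BASALT_OLLAMA_URL overrides the default Ollama URL.
check_ollama() {
  url="${BASALT_OLLAMA_URL:-http://localhost:11434}"
  if curl -fsS --max-time 2 "$url/api/tags" >/dev/null 2>&1; then
    echo "ollama reachable at $url"
  else
    echo "ollama unreachable at $url" >&2
    return 1
  fi
}

# Run it with: check_ollama
```

A nonzero return means the engine will fall back to the mock embedder described under Troubleshooting.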

First brief

cd /path/to/your/vault
basalt init # walks the vault, builds the index
basalt brief # writes today's brief next to your notes

The first init is the slowest step — it embeds every note. On a 1k-note vault with nomic-embed-text via Ollama, expect ~30 seconds.

Common flags

basalt brief # generate the full brief
basalt thesis --top 3 # run a single verb
basalt brief --llm ollama # add v1 LLM augmentation (named thesis + contradiction verdict)
basalt brief --llm anthropic --llm-model claude-sonnet-4-6
basalt audit # falsification pass over recent findings
basalt audit --drift-v1 # also re-evaluate Drift findings against current window
basalt snapshot push # upload local index to the API as a VaultSnapshot

Config + diagnostics

basalt config show # print the resolved config (file + defaults)
basalt doctor # pre-flight checks

basalt config show reads ~/.basalt/config.toml (or %APPDATA%/basalt/Config/config.toml on Windows), merges it with built-in defaults, and prints every resolved key: vault path, db path, embedding model, Ollama URL, promote-to folder, LLM provider/model, and API URL. The API token appears only as a (set) / (unset) indicator; the raw value is never printed.
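For orientation, a config file might look like the sketch below. The key names here are illustrative assumptions, not the authoritative schema; basalt config show is the source of truth for what the engine actually resolved:

```toml
# ~/.basalt/config.toml -- illustrative sketch; key names are assumptions,
# check `basalt config show` for the real resolved set.
vault = "/Users/you/notes"            # vault path
db = "/Users/you/.basalt/basalt.db"   # index db path

[embeddings]
model = "nomic-embed-text"
ollama_url = "http://localhost:11434"

[promote]
folder = "Promoted"                   # promote-to folder

[llm]
provider = "ollama"                   # or "anthropic" plus a model name

[api]
url = ""   # needed for snapshot push; the token is only ever shown
           # as (set)/(unset), never as its raw value
```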

basalt doctor is a one-shot health check:

Basalt doctor — running pre-flight checks
✓ vault path /Users/you/notes
✓ index db /Users/you/.basalt/basalt.db
✓ ollama reachable http://localhost:11434
✓ embedding model nomic-embed-text
✓ api credentials none configured (Open tier — that's fine)
All 5 checks passed.

It checks: vault path exists, index DB is built, Ollama is reachable, the configured embedding model is installed (via /api/tags), and — if you’ve set an API token — that apiUrl is also set. Exit code 1 if any check fails, so the command is CI-friendly.
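Because of that exit-code contract, doctor drops straight into a CI pipeline. A minimal sketch; preflight is a hypothetical wrapper whose only job is to make pass/fail explicit in the job log:

```shell
# Hedged sketch: gate a CI job on doctor's exit code. preflight is a
# hypothetical wrapper; doctor itself already exits 1 on any failed check.
preflight() {
  if "$@"; then
    echo "pre-flight passed"
  else
    echo "pre-flight failed: $*" >&2
    return 1
  fi
}

# In a CI step you would run: preflight basalt doctor
```

A bare basalt doctor step works just as well, since most CI runners fail the job on any nonzero exit code.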

Troubleshooting

  • Ollama not running. Start it: ollama serve in another terminal. The CLI uses http://localhost:11434 by default; override with BASALT_OLLAMA_URL.
  • No embeddings provider. Briefs still run with a mock embedder; verb output quality will degrade, but the engine works end-to-end.
  • Permission errors writing the brief. The CLI writes via O_CREAT|O_EXCL; it will not overwrite an existing file. Specify --out if the default path collides.
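The no-overwrite guarantee is easy to reproduce from the shell: set -C (noclobber) gives redirections the same O_CREAT|O_EXCL semantics the CLI uses when writing a brief. An illustration only, using a scratch file rather than a real brief:

```shell
# Illustration only: noclobber makes > refuse to replace an existing file,
# the same O_CREAT|O_EXCL behavior the CLI relies on.
set -C                                 # POSIX spelling of noclobber
brief="/tmp/basalt-brief-demo.$$"      # scratch path for this demo
rm -f "$brief"                         # start from a clean slate
echo "first run" > "$brief"            # succeeds: file did not exist
echo "second run" 2>/dev/null > "$brief" \
  || echo "refused to overwrite"       # fails: file already exists
cat "$brief"                           # still contains "first run"
```

The same principle applies to the CLI: delete or move the existing brief, or pass --out with a fresh path.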