diff --git a/docs/concepts/memory.md b/docs/concepts/memory.md
index 9b4655c4e5..6da6be1d4c 100644
--- a/docs/concepts/memory.md
+++ b/docs/concepts/memory.md
@@ -79,6 +79,12 @@ Defaults:
 - Uses remote embeddings (OpenAI) unless configured for local.
 - Local mode uses node-llama-cpp and may require `pnpm approve-builds`.
 
+Remote embeddings **require** an OpenAI API key (`OPENAI_API_KEY` or
+`models.providers.openai.apiKey`). Codex OAuth only covers chat/completions and
+does **not** grant embeddings access for memory search. If you don't want to
+set an API key, use `memorySearch.provider = "local"` or set
+`memorySearch.fallback = "none"`.
+
 Config example:
 
 ```json5
diff --git a/docs/start/faq.md b/docs/start/faq.md
index 7f8862fc38..8554712107 100644
--- a/docs/start/faq.md
+++ b/docs/start/faq.md
@@ -231,6 +231,14 @@ Clawdbot also runs a **silent pre-compaction memory flush** to remind the model
 to write durable notes before auto-compaction. This only runs when the
 workspace is writable (read-only sandboxes skip it). See [Memory](/concepts/memory).
 
+### Why does memory search need an OpenAI API key if I already signed in with Codex?
+
+Vector memory search uses **embeddings**. Codex OAuth only covers
+chat/completions and does **not** grant embeddings access, so the upstream
+memory indexer needs a real OpenAI API key (`OPENAI_API_KEY` or
+`models.providers.openai.apiKey`). If you don't want to set a key, switch to
+`memorySearch.provider = "local"` or set `memorySearch.fallback = "none"`.
+
 ## Where things live on disk
 
 ### Where does Clawdbot store its data?
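
For quick reference, a minimal sketch of the two options the new text describes, assuming the dotted keys (`memorySearch.provider`, `memorySearch.fallback`, `models.providers.openai.apiKey`) map onto nested objects in the JSON5 config file shown in memory.md's `Config example:` block; the exact nesting and the placeholder key value are illustrative, not taken from the docs:

```json5
{
  // Either supply a real OpenAI key for remote embeddings
  // (same effect as exporting OPENAI_API_KEY in the environment)...
  // models: { providers: { openai: { apiKey: "sk-..." } } },

  // ...or avoid the key requirement for memory search entirely:
  memorySearch: {
    provider: "local", // local embeddings via node-llama-cpp
    // alternative per the docs above: fallback: "none"
  },
}
```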