feat(memory): document Voyage embeddings + VOYAGE_API_KEY (#7078) (thanks @mcinteerj) (#10699)

Author: Tak Hoffman (committed by GitHub)
Date: 2026-02-06 15:51:47 -06:00
parent 6965a2cc9d
commit 40425db0f1
5 changed files with 32 additions and 3 deletions


@@ -88,7 +88,8 @@ Defaults:
 1. `local` if a `memorySearch.local.modelPath` is configured and the file exists.
 2. `openai` if an OpenAI key can be resolved.
 3. `gemini` if a Gemini key can be resolved.
-4. Otherwise memory search stays disabled until configured.
+4. `voyage` if a Voyage key can be resolved.
+5. Otherwise memory search stays disabled until configured.
 - Local mode uses node-llama-cpp and may require `pnpm approve-builds`.
 - Uses sqlite-vec (when available) to accelerate vector search inside SQLite.
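
The resolution order above only applies when no provider is configured. To pin a provider explicitly, a minimal sketch of the `memorySearch` block (assuming a JSON5-style config file; only the `memorySearch.provider` key and the `"voyage"` value come from the docs above) could look like:

```json5
{
  memorySearch: {
    // Bypass auto-detection and always use Voyage embeddings.
    provider: "voyage",
  },
}
```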
@@ -96,7 +97,8 @@ Remote embeddings **require** an API key for the embedding provider. OpenClaw
 resolves keys from auth profiles, `models.providers.*.apiKey`, or environment
 variables. Codex OAuth only covers chat/completions and does **not** satisfy
 embeddings for memory search. For Gemini, use `GEMINI_API_KEY` or
-`models.providers.google.apiKey`. When using a custom OpenAI-compatible endpoint,
+`models.providers.google.apiKey`. For Voyage, use `VOYAGE_API_KEY` or
+`models.providers.voyage.apiKey`. When using a custom OpenAI-compatible endpoint,
 set `memorySearch.remote.apiKey` (and optional `memorySearch.remote.headers`).

 ### QMD backend (experimental)
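
To illustrate the config-file variant of the Voyage key resolution just documented, here is a hedged sketch (the nesting mirrors the `models.providers.voyage.apiKey` path from the diff; the value is a placeholder, and exporting `VOYAGE_API_KEY` in the environment works as an alternative source):

```json5
{
  models: {
    providers: {
      voyage: {
        // Resolved for memory-search embeddings when VOYAGE_API_KEY
        // is not present in the environment.
        apiKey: "<your-voyage-api-key>",
      },
    },
  },
}
```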


@@ -66,7 +66,8 @@ Semantic memory search uses **embedding APIs** when configured for remote provid
 - `memorySearch.provider = "openai"` → OpenAI embeddings
 - `memorySearch.provider = "gemini"` → Gemini embeddings
-- Optional fallback to OpenAI if local embeddings fail
+- `memorySearch.provider = "voyage"` → Voyage embeddings
+- Optional fallback to a remote provider if local embeddings fail
 You can keep it local with `memorySearch.provider = "local"` (no API usage).
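
For the fully local option in the last context line, `memorySearch.provider = "local"` pairs with the `memorySearch.local.modelPath` setting from the first file (a sketch under the same JSON5 assumption; the model path is a placeholder):

```json5
{
  memorySearch: {
    provider: "local", // no API usage
    local: {
      // Path to a local embedding model file; placeholder value.
      modelPath: "/models/embeddings.gguf",
    },
  },
}
```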