
# notesearch

Local vector search over markdown notes using LlamaIndex + Ollama.

Point it at an Obsidian vault (or any folder of .md files), build a vector index, and search by meaning — not just keywords.

## Setup

```sh
cd notesearch
uv sync
```

Requires Ollama running locally with an embedding model pulled:

```sh
ollama pull qwen3-embedding:0.6b
```

## Usage

### Build the index

```sh
./notesearch.sh index --vault /path/to/vault
```

### Search

```sh
./notesearch.sh search "where do I get my allergy shots"
```

Output:

```
[0.87] Health/allergy.md
Started allergy shots in March 2026. Clinic is at 123 Main St.

[0.72] Daily/2026-03-25.md
Went to allergy appointment today.
```

## Configuration

Edit `config.json`:

```json
{
  "vault": "/home/lyx/Documents/obsidian-yanxin",
  "index_dir": null,
  "ollama_url": "http://localhost:11434",
  "embedding_model": "qwen3-embedding:0.6b"
}
```

Values can also be set via flags or env vars. Priority: flag > env var > config.json > fallback.

| Flag | Env var | Config key | Default |
|------|---------|------------|---------|
| `--vault` | `NOTESEARCH_VAULT` | `vault` | `/home/lyx/Documents/obsidian-yanxin` |
| `--index-dir` | `NOTESEARCH_INDEX_DIR` | `index_dir` | `<vault>/.index/` |
| `--ollama-url` | `NOTESEARCH_OLLAMA_URL` | `ollama_url` | `http://localhost:11434` |
| `--embedding-model` | `NOTESEARCH_EMBEDDING_MODEL` | `embedding_model` | `qwen3-embedding:0.6b` |
| `--top-k` | | | `5` |
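
The flag > env var > config.json > fallback precedence can be sketched as a small resolver (`resolve` is a hypothetical helper for illustration; the actual CLI may structure this differently):

```python
import os

def resolve(flag_value, env_name, config, config_key, fallback):
    """Return the first value set, in priority order:
    CLI flag > environment variable > config.json key > built-in fallback."""
    if flag_value is not None:
        return flag_value
    env_value = os.environ.get(env_name)
    if env_value is not None:
        return env_value
    if config.get(config_key) is not None:
        return config[config_key]
    return fallback

# Example: no flag given, env var set -> env var wins over config.json.
os.environ["NOTESEARCH_VAULT"] = "/tmp/vault"
config = {"vault": "/home/lyx/Documents/obsidian-yanxin"}
print(resolve(None, "NOTESEARCH_VAULT", config, "vault", "~/notes"))  # /tmp/vault
```

Note that `index_dir` is `null` in the sample config, so the resolver falls through to the `<vault>/.index/` default.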

## Tests

```sh
uv run pytest
```

## How it works

  1. Index: reads all .md files, splits on markdown headings, embeds each chunk via Ollama, stores vectors locally
  2. Search: embeds your query, finds the most similar chunks, returns them with file paths and relevance scores
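
The two steps can be sketched roughly as below. This is a toy illustration, not the real implementation (which uses LlamaIndex and Ollama embeddings): the helper names are hypothetical, and hand-written vectors stand in for real embeddings.

```python
import math
import re

def split_on_headings(markdown_text):
    """Split a markdown document into chunks, starting a new chunk at each heading."""
    chunks, current = [], []
    for line in markdown_text.splitlines():
        if re.match(r"^#{1,6}\s", line) and current:
            chunks.append("\n".join(current).strip())
            current = []
        current.append(line)
    if current:
        chunks.append("\n".join(current).strip())
    return [c for c in chunks if c]

def cosine(a, b):
    """Cosine similarity between two vectors; 0.0 for a zero vector."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    if na == 0 or nb == 0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def search(query_vec, index, top_k=5):
    """index: list of (path, chunk_vec, chunk_text).
    Returns the top_k entries as (score, path, text), highest score first."""
    scored = [(cosine(query_vec, vec), path, text) for path, vec, text in index]
    scored.sort(key=lambda t: t[0], reverse=True)
    return scored[:top_k]
```

In the real tool, `chunk_vec` and `query_vec` would come from the Ollama embedding model, and the `[0.87] Health/allergy.md` lines in the sample output correspond to the `(score, path, text)` tuples here.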