GNO
Local search for your second brain
Your 15,000 documents of notes, journals, and reference material—finally searchable. Hybrid search combines keywords and semantics. Everything runs locally.
BM25. Vectors. HyDE. Reranking. One CLI that unlocks your second brain.
THE PROBLEM
You built a second brain. 15,000 files of accumulated knowledge. But when you ask your AI assistant about it:
- grep is fast but dumb; it misses conceptual matches
- Obsidian MCP servers return irrelevant results
- Cloud search tools send your data to external servers
- AI assistants hallucinate instead of citing your actual notes
GNO gives your AI long-term memory over your local files.
COMMANDS
gno search     BM25 full-text keyword search
gno vsearch    Vector similarity (semantic) search
gno query      Hybrid search with RRF fusion
gno ask        Search plus an AI-generated answer
SEARCH PIPELINE
How gno query works: the hybrid search pipeline combines multiple retrieval strategies for best-in-class results:
1. Query Expansion: an LLM generates keyword variants, semantic variants, and a HyDE passage.
2. Parallel Search: BM25 and vector searches run simultaneously on all variants.
3. RRF Fusion: Reciprocal Rank Fusion merges the result lists (k=60).
4. Reranking: a cross-encoder rescores the top 20 results for the final ordering.
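The fusion step is simple enough to sketch. Below is a minimal TypeScript illustration of Reciprocal Rank Fusion with k=60; the function name and result shape are illustrative, not GNO's internal API.

```typescript
// Reciprocal Rank Fusion (illustrative sketch, not GNO's internal API).
// Each document's fused score is the sum of 1 / (k + rank) over every
// ranked list it appears in; k = 60 dampens the influence of top ranks.
function rrfFuse(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((docId, index) => {
      const rank = index + 1; // ranks are 1-based
      scores.set(docId, (scores.get(docId) ?? 0) + 1 / (k + rank));
    });
  }
  // Sort documents by fused score, highest first.
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([docId]) => docId);
}

// Example: fuse a BM25 ranking with a vector-search ranking.
const fused = rrfFuse([
  ["doc-a", "doc-b", "doc-c"], // BM25 results
  ["doc-b", "doc-d", "doc-a"], // vector results
]);
console.log(fused); // => ["doc-b", "doc-a", "doc-d", "doc-c"]
```

Documents that appear high in both lists win, which is why the hybrid query beats either keyword or vector search alone.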
SEARCH MODES
gno search: Full-text search via SQLite FTS5. Best for exact terms and code.
  Example: gno search "authentication JWT"
gno vsearch: Embedding-based search via sqlite-vec. Finds conceptual matches.
  Example: gno vsearch "how to protect my app"
gno query: BM25 + vector + HyDE expansion + cross-encoder reranking.
  Example: gno query "best practices for error handling"
gno ask: Retrieval-augmented generation. Get a cited answer from your docs.
  Example: gno ask "summarize the API design" --answer
FEATURES
- BM25 for exact terms, vectors for concepts. RRF fusion combines both for the best results.
- An LLM generates a hypothetical answer to your question, then searches for documents similar to it (HyDE; see the sketch after this list).
- Embed, rerank, and generate with GGUF models via node-llama-cpp. No API keys needed.
- Connect to Claude Desktop or Cursor. Your AI can search and cite your local files.
- Install as a skill for Claude Code or Codex. CLI integration with zero context pollution.
- Index Markdown, PDF, code, and more. Automatic chunking and content-addressed deduplication.
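To make the HyDE step concrete, here is a rough TypeScript sketch. The llm, embed, and vectorSearch helpers are hypothetical stand-ins passed in as parameters, not GNO functions; GNO performs these steps with local GGUF models via node-llama-cpp.

```typescript
// HyDE (Hypothetical Document Embeddings), sketched with stand-in helpers.
// The idea: embed an LLM-written *answer* rather than the raw question,
// because the answer sits closer to relevant passages in embedding space.
async function hydeSearch(
  question: string,
  llm: { complete(prompt: string): Promise<string> },
  embed: (text: string) => Promise<number[]>,
  vectorSearch: (vector: number[], limit: number) => Promise<string[]>,
): Promise<string[]> {
  // 1. Ask the local LLM to write a plausible (possibly wrong) answer.
  const hypothetical = await llm.complete(
    `Write a short passage that answers: ${question}`,
  );
  // 2. Embed the hypothetical passage instead of the question itself.
  const vector = await embed(hypothetical);
  // 3. Retrieve real documents whose embeddings are closest to it.
  return vectorSearch(vector, 10);
}
```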
AI INTEGRATION
GNO integrates with AI tools via MCP (Model Context Protocol) or CLI skills. Your assistant can search and cite your local documents.
Claude Desktop / Cursor (MCP):  gno mcp
Claude Code (skill):            gno skill install
Codex (skill):                  gno skill install --target codex
A sample Claude Desktop configuration follows the example prompts below.
Example prompts for your AI:
"Search my notes for the project roadmap and summarize Q4 goals"
"Find what I wrote about database migrations last month"
"Have I worked on something like this before?"
INSTALLATION
Install:
  bun install -g @gmickel/gno
Initialize and index:
  gno init ~/notes --name notes
  gno update
Add more collections:
  gno collection add ~/Documents/Obsidian --name vault
  gno collection add ~/code --name code --exclude node_modules
Then run gno update to index.
MODELS
Choose a preset: slim (~1GB), balanced (~2GB), or quality (~2.5GB). Run gno models use <preset> to switch.
TECH STACK
- Bun: runtime and package manager
- SQLite + FTS5: BM25 full-text search
- sqlite-vec: vector KNN search
- node-llama-cpp: local GGUF models
- MCP: Model Context Protocol
WHO IT'S FOR
Developers, researchers, and writers with years of accumulated notes who want their AI assistants to actually use that knowledge instead of hallucinating.
If you've ever asked Claude "have I worked on this before?" and wished it could search your notes—GNO makes that real.