Switching from mcp-local-rag to Pathfinder
mcp-local-rag is excellent for local, private, single-user doc search with zero setup. Pathfinder is for when you need a shared server, multi-source indexing, and team-wide agent access.
Why Switch
- Shared server. Deploy once, and every agent on your team connects. No per-developer setup, no duplicated indexes.
- Multi-source indexing. Go beyond local files — index docs, code, Slack threads, Discord forums, and Notion pages in one server.
- Filesystem exploration. Bash tools alongside semantic search — agents can `grep`, `cat`, and `find` your indexed content, not just vector-search it.
- Webhook auto-reindex. Stays current on every git push. No manual re-ingestion, no stale results.
- Knowledge and FAQ. Distill Slack threads and Discord forums into searchable Q&A pairs automatically. Agents get answers, not raw chat logs.
- Production deployment. Docker, Railway, persistent PostgreSQL. Built to run as infrastructure, not a local process.
- PDF/DOCX ingestion. Index binary documents directly with the `document` source type — no conversion needed.
What You Gain
Team-Wide Access
One server, every agent. No per-developer index setup. Deploy to Docker or Railway and your whole team's agents connect instantly.
Multi-Source Search
Docs, Slack threads, Discord forums, Notion pages — all searchable from one MCP server. mcp-local-rag only indexes local files.
Filesystem + Semantic
Agents choose the right tool: grep for precise matches, qmd for meaning-based search. Both paradigms, composable.
Webhook Auto-Reindex
Push to git, docs reindex automatically. No manual re-ingestion, no stale search results.
Knowledge / FAQ
Slack threads and Discord forums are distilled into Q&A pairs. Agents get direct answers, not raw conversation logs.
Production-Ready
Docker, Railway, persistent PostgreSQL, health checks, telemetry. Built to run as team infrastructure, not a local process.
When to Stay with mcp-local-rag
mcp-local-rag is a great tool. Here's when it's the better choice:
Personal / single-user search
If you're the only one searching your docs and don't need to share a server, local-rag's simplicity is hard to beat.
Fully offline operation
mcp-local-rag uses local embedding models — no API keys, no network calls. Pathfinder supports local embeddings via Ollama and @xenova/transformers — no API keys needed — but requires a running server process.
Zero dependencies
`npx mcp-local-rag` and you're done. No Docker, no PostgreSQL, no config files. Pathfinder's zero-infra mode (PGlite + bash-only) is similar but still needs a running server process.
Migration Walkthrough
Install Pathfinder
CLI:
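A sketch of the CLI path — the package name and `init` subcommand are assumptions; check the setup guide for the real invocation:

```shell
# Hypothetical: scaffold a pathfinder.yaml in the current directory.
npx pathfinder init
```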
Docker:
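Or via Docker — a sketch, assuming a published image and a default port of 3000 (both assumptions; substitute the image name and port from the setup guide):

```shell
# Mount your config into the container and expose the server port.
docker run -d \
  -v "$(pwd)/pathfinder.yaml:/app/pathfinder.yaml" \
  -p 3000:3000 \
  pathfinder/pathfinder:latest
```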
See the full setup guide for detailed instructions.
Point at the same docs local-rag was indexing
Open pathfinder.yaml and add your docs directory as a source. If local-rag was indexing ~/projects/my-app/docs:
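A minimal sketch of that source entry — the field names (`sources`, `name`, `type`, `path`) are illustrative and may differ from Pathfinder's actual schema:

```yaml
sources:
  - name: my-app-docs
    type: directory            # illustrative: a local-path source
    path: ~/projects/my-app/docs
```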
Or point at a git repo instead of a local path:
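Sketched with the same illustrative field names, a git-backed source might look like this (the `url` is a placeholder for your repo):

```yaml
sources:
  - name: my-app-docs
    type: git                  # illustrative: a git-repo source
    url: https://github.com/your-org/my-app.git
    paths:
      - docs/
```

Pointing at the repo instead of a checkout is what makes webhook auto-reindex possible later.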
Start serving
First boot indexes your sources automatically. Subsequent starts are instant.
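As a sketch, assuming the CLI exposes a `serve` subcommand (this guide references `npx serve`; the exact flags are an assumption):

```shell
# First run builds the index; later runs reuse it.
npx pathfinder serve
```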
Update your MCP client config
Replace the mcp-local-rag entry in your MCP client config:
Before (mcp-local-rag):
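Your existing entry likely looks roughly like this — a sketch; the exact `command` and `args` depend on how you installed mcp-local-rag:

```json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": ["mcp-local-rag"]
    }
  }
}
```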
After (Pathfinder):
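A sketch of the replacement entry, assuming your MCP client supports URL-based servers and Pathfinder is serving on localhost port 3000 with an `/mcp` endpoint (the host, port, and path are all assumptions — use your deployment's actual URL):

```json
{
  "mcpServers": {
    "pathfinder": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```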
Add sources local-rag couldn't do
Now that you're on Pathfinder, add the sources that weren't possible before:
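For example — the source types (slack, discord, notion, `document`) come from this guide, but every field name below is illustrative:

```yaml
sources:
  - name: eng-slack
    type: slack                # distilled into Q&A pairs
    channels: ["#help", "#eng"]
  - name: community
    type: discord              # forum threads, also distilled
  - name: product-wiki
    type: notion
  - name: contracts
    type: document             # PDF/DOCX, indexed directly
    path: ./legal
```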
What's Different
Pathfinder is more capable, but it's also more involved. Here's what changes:
OpenAI embeddings are the default, but local options are available
mcp-local-rag uses local embedding models with zero API dependencies. Pathfinder uses OpenAI embeddings by default, but also supports Ollama and @xenova/transformers for fully local, API-key-free embeddings. If you only use bash tools (no semantic search), you can skip the embedding provider entirely.
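A sketch of what switching to local embeddings might look like in pathfinder.yaml — the key names are illustrative, and `nomic-embed-text` is just one example of an embedding model Ollama can serve:

```yaml
embeddings:
  provider: ollama           # or a @xenova/transformers provider
  model: nomic-embed-text    # runs locally; no API key
```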
Server process stays running
mcp-local-rag starts on-demand when your MCP client invokes it. Pathfinder runs as a persistent server process — via npx serve, Docker, or a cloud deployment.
More setup than npx mcp-local-rag
There's a config file (pathfinder.yaml), environment variables, and optionally PostgreSQL. The tradeoff is significantly more capability — multi-source, shared access, auto-reindex, and production deployment.