


Genuine question for folks working on knowledge tools.

Obsidian Markdown vaults are built on a clean idea: your notes are just files on disk. Portable, backup-friendly, git-able. Works great at small scale. But past ~3-5k notes, Dataview starts choking. Past 10k, sync (iCloud, Obsidian Sync) gets laggy. The index is rebuilt in memory on every startup; there's no real persistent query layer.

This used to be a niche problem, because only power users had vaults that big. Now, with LLMs pumping out notes on demand (research briefs, meeting summaries, digests of whatever you're reading), everyone is about to hit that wall. My own vault doubled in six months and I'm barely trying.

The clean fix feels obvious: files stay the source of truth, and SQLite or DuckDB becomes a derived index rebuilt on file change. SQL for queries; sync just the files. But nobody really owns that model. Am I missing something? Is anyone building this properly?
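To make the "derived index" idea concrete, here's a minimal sketch of the shape I mean, in Python with stdlib `sqlite3`. Everything here is invented for illustration (the schema, the mtime-based change check, the naive `#tag` regex); a real implementation would want proper frontmatter parsing and a file watcher instead of a full walk.

```python
import os
import re
import sqlite3

def build_index(vault_dir: str, db_path: str = ":memory:") -> sqlite3.Connection:
    """Rebuild a derived SQLite index from a vault of Markdown files.

    The .md files remain the source of truth; this table can be deleted
    and regenerated at any time. Unchanged files (same mtime) are skipped,
    so repeat runs only re-parse what actually changed.
    """
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS notes ("
        "  path  TEXT PRIMARY KEY,"
        "  mtime REAL,"
        "  title TEXT,"
        "  tags  TEXT"  # comma-joined for simplicity; a real index would normalize
        ")"
    )
    for root, _, files in os.walk(vault_dir):
        for name in files:
            if not name.endswith(".md"):
                continue
            path = os.path.join(root, name)
            mtime = os.path.getmtime(path)
            row = con.execute("SELECT mtime FROM notes WHERE path = ?", (path,)).fetchone()
            if row and row[0] == mtime:
                continue  # file unchanged since last index run
            with open(path, encoding="utf-8") as f:
                text = f.read()
            # Crude inline-tag extraction (#tag); stands in for real parsing.
            tags = ",".join(sorted(set(re.findall(r"#([\w/-]+)", text))))
            con.execute(
                "INSERT OR REPLACE INTO notes VALUES (?, ?, ?, ?)",
                (path, mtime, name[:-3], tags),
            )
    con.commit()
    return con
```

Then "queries" are just SQL against the index, e.g. `SELECT path FROM notes WHERE tags LIKE '%meeting%'`, and sync never has to know the database exists.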
TL;DR: Obsidian Markdown vaults break past ~3-5k notes. Dataview chokes, sync gets flaky. Used to be a power-user problem; with LLMs pumping out notes on demand, everyone's vault is about to get huge. Files-as-truth + derived SQLite/DuckDB index feels obvious. Who's building it?