SQLite goes local for vector work. Embedding Explorer runs entirely in the browser and uses libSQL WASM to store and query embeddings, so you can compare models without a server round-trip. Everything stays on-device, with privacy baked in.
What Embedding Explorer does
• Data sources: upload CSVs or point at a SQLite database, all in-browser for quick iteration. [1]
• Templates: build document bodies with a tiny mustache syntax and preview the resulting IDs and bodies. [1]
• Providers: configure multiple models — OpenAI, Google Gemini, and Ollama — and run batch embeddings for side-by-side comparisons. [1]
• Storage/search: vectors and metadata live in libSQL WASM, persisted to OPFS; k-NN and cosine-similarity queries power the UI. [1]
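The templating step described above can be sketched as follows. This is a hedged illustration only, not the tool's actual API: the function name `renderTemplate` and the sample row fields are invented for the example, but it shows the general idea of filling `{{column}}` placeholders from a data row to build a document body.

```typescript
// Illustrative sketch of a "tiny mustache" templating step:
// fill {{column}} placeholders from a row of source data.
// Names here (renderTemplate, the row fields) are assumptions, not the tool's API.

type Row = Record<string, string>;

function renderTemplate(template: string, row: Row): string {
  // Replace each {{key}} with the row's value; unknown keys become empty strings.
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (_match, key: string) => row[key] ?? "");
}

const row: Row = { id: "42", title: "Intro to OPFS", body: "Persist data in the browser." };
const doc = renderTemplate("{{title}}: {{body}}", row);
// doc is "Intro to OPFS: Persist data in the browser."
```

Previewing IDs and bodies before embedding, as the tool does, then amounts to running such a template over each row and showing the results.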
Why client-side vector workflows matter
Storing vectors in libSQL WASM and persisting them to OPFS means embedding tasks can run offline and data stays private, with zero server round-trips. [1]
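To make the "cosine queries" concrete, here is a minimal in-memory sketch of the cosine-similarity k-NN such a UI computes. In the real tool this runs against vectors stored in libSQL WASM; the plain-TypeScript version below is an illustration of the math, not the project's implementation.

```typescript
// Cosine similarity between two equal-length vectors:
// dot(a, b) / (|a| * |b|).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// k-NN: return the ids of the k stored vectors most similar to the query.
function knn(
  query: number[],
  store: { id: string; vec: number[] }[],
  k: number,
): string[] {
  return store
    .map((row) => ({ id: row.id, score: cosineSimilarity(query, row.vec) }))
    .sort((x, y) => y.score - x.score) // highest similarity first
    .slice(0, k)
    .map((row) => row.id);
}
```

A linear scan like this is fine at browser scale; databases with native vector support can instead push the distance computation into the SQL query itself.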
Tech stack & privacy
The project leans on Dart + Jaspr for the UI and workers, with a focus on local storage and no telemetry. [1]
Closing thought: the WASM-backed vector store movement is quietly accelerating, bringing embeddings and privacy-preserving AI closer to the edge. [1]
References
[1] Embedding Explorer — web tool to compare embeddings; uses SQLite via libSQL WASM for local persistence and in-browser storage.