
SQLite as a Local Vector Store: Lessons from a WASM-Backed Embeddings Tool


SQLite goes local for vector work. Embedding Explorer runs entirely in the browser and uses libSQL WASM to store and query embeddings, so you can compare models without a server round-trip. Everything stays on-device, with privacy baked in.

What Embedding Explorer does

• Data sources: upload CSVs or point at a SQLite database, all in-browser for quick iteration. [1]
• Templates: build document bodies with a tiny mustache syntax and preview the resulting IDs and bodies. [1]
• Providers: configure multiple models (OpenAI, Google Gemini, and Ollama) and run batch embeddings for side-by-side comparisons. [1]
• Storage/search: vectors and metadata live in libSQL WASM, persisted to OPFS; k-NN and cosine-similarity queries power the UI (see the sketch after this list). [1]
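
A rough sense of how the storage/search piece fits together: each vector can be kept as a raw Float32 BLOB next to its document metadata, and a brute-force cosine scan ranks nearest neighbours. The sketch below illustrates that pattern in TypeScript; the `embeddings` schema, column names, and the `@libsql/client-wasm` import are illustrative assumptions rather than the tool's actual code, and it assumes the WASM client mirrors the standard libSQL JS `createClient`/`execute` interface.

```typescript
// Minimal sketch: store embeddings as BLOBs in a local libSQL table and run a
// brute-force cosine k-NN query entirely in the browser. Schema and package
// name are assumptions, not Embedding Explorer's actual code.
import { createClient } from "@libsql/client-wasm";

// A local file URL; with the WASM build this lives in browser storage (OPFS).
const db = createClient({ url: "file:embeddings.db" });

// One row per (document, model) pair; the vector is a raw Float32 byte blob.
await db.execute(`
  CREATE TABLE IF NOT EXISTS embeddings (
    doc_id TEXT NOT NULL,
    model  TEXT NOT NULL,
    body   TEXT NOT NULL,
    vector BLOB NOT NULL,
    PRIMARY KEY (doc_id, model)
  )
`);

// Copy a Float32Array's bytes so it round-trips through a BLOB column.
const toBlob = (v: Float32Array) =>
  new Uint8Array(v.buffer, v.byteOffset, v.byteLength).slice();
const fromBlob = (b: ArrayBuffer) => new Float32Array(b);

export async function insertEmbedding(
  docId: string, model: string, body: string, vector: Float32Array,
): Promise<void> {
  await db.execute({
    sql: "INSERT OR REPLACE INTO embeddings (doc_id, model, body, vector) VALUES (?, ?, ?, ?)",
    args: [docId, model, body, toBlob(vector)],
  });
}

// Plain cosine similarity between two vectors of equal length.
function cosine(a: Float32Array, b: Float32Array): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Brute-force k-NN: fetch all vectors for one model and rank by cosine score.
export async function knn(model: string, query: Float32Array, k: number) {
  const { rows } = await db.execute({
    sql: "SELECT doc_id, body, vector FROM embeddings WHERE model = ?",
    args: [model],
  });
  return rows
    .map((r) => ({
      docId: r.doc_id as string,
      body: r.body as string,
      score: cosine(query, fromBlob(r.vector as ArrayBuffer)),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}
```

A full scan like this is perfectly adequate at the scale of a local comparison tool; an approximate index only becomes worthwhile once the corpus outgrows what a browser tab can comfortably hold.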

Why client-side vector workflows matter

Storing vectors in libSQL WASM and persisting them to OPFS means embedding tasks can run offline and data stays private, with zero server round-trips. [1]
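
To make "persisted to OPFS" concrete: the Origin Private File System is a browser-managed, per-origin filesystem, so a database file written there is only visible to the app that created it and is never sent over the network. The snippet below uses only standard web APIs (navigator.storage) to inspect that storage; it is an illustration of the persistence layer, not project code, and the file name is hypothetical.

```typescript
// Sketch: peek into the Origin Private File System (OPFS), where a libSQL WASM
// database file such as "embeddings.db" could be persisted between sessions.
async function inspectLocalStore(): Promise<void> {
  // The origin-private root directory; its contents never leave the device.
  const root = await navigator.storage.getDirectory();

  // List whatever the app has persisted there, e.g. the database file.
  for await (const [name, handle] of root.entries()) {
    if (handle.kind === "file") {
      const file = await (handle as FileSystemFileHandle).getFile();
      console.log(`${name}: ${file.size} bytes, stored on-device`);
    }
  }

  // Check how much origin storage is used vs. the browser-granted quota.
  const { usage, quota } = await navigator.storage.estimate();
  console.log(`Origin storage: ${usage} of ${quota} bytes`);
}
```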

Tech stack & privacy

The project leans on Dart + Jaspr for the UI and workers, with a focus on local storage and no telemetry. [1]

Closing thought: WASM-backed vector stores are quietly gaining momentum, bringing embeddings and privacy-preserving AI closer to the edge. [1]

References

[1] HackerNews: Web tool to compare embeddings; uses SQLite via libSQL WASM for local persistence and in-browser storage.

