Storing embedding vectors is no longer limited to servers. In one browser experiment, libSQL WASM stores vectors locally in the browser, persists them to OPFS, and serves fast k-NN and cosine queries straight to the UI [1].
• Browser pattern (local persistence and search): Everything runs offline in the browser; vectors and metadata live in libSQL WASM and are persisted to OPFS. You can ingest data, embed with multiple providers such as OpenAI, Google Gemini, and Ollama, and query via k-NN/cosine right in the UI [1]. A browser-side sketch follows this list.
• On-device extension pattern: The vector story moves closer to the device via dedicated SQLite extensions. A discussion of one such extension spotlights projects like sqlite-vector and sqlite-vec, and flags licensing and open-source questions to weigh before deciding what to ship on-device [2]. The same thread notes that DuckDB can read a SQLite file directly, enabling cross-tool workflows [2]. An on-device sketch also follows this list.
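To make the browser pattern concrete, here is a minimal TypeScript sketch. It assumes the @libsql/client-wasm package and libSQL's native vector support (F32_BLOB columns, vector32(), vector_distance_cos()); the database URL, table name, and tiny 3-dimensional vectors are illustrative, not taken from [1].

```ts
// Minimal sketch, assuming @libsql/client-wasm and libSQL's vector functions.
// Names ("embeddings.db", "docs") are illustrative.
import { createClient } from "@libsql/client-wasm";

const db = createClient({ url: "file:embeddings.db" }); // local file, persisted in the browser

async function setup() {
  // 3-dimensional vectors keep the example readable; real embeddings are far wider.
  await db.execute(
    "CREATE TABLE IF NOT EXISTS docs (id INTEGER PRIMARY KEY, body TEXT, embedding F32_BLOB(3))"
  );
}

async function insertDoc(id: number, body: string, embedding: number[]) {
  // vector32() parses a JSON array into libSQL's packed float32 representation.
  await db.execute({
    sql: "INSERT INTO docs (id, body, embedding) VALUES (?, ?, vector32(?))",
    args: [id, body, JSON.stringify(embedding)],
  });
}

async function topK(query: number[], k: number) {
  // Brute-force cosine k-NN: ordering by distance is fine for a few thousand rows in-browser.
  const res = await db.execute({
    sql: `SELECT id, body, vector_distance_cos(embedding, vector32(?)) AS dist
          FROM docs ORDER BY dist LIMIT ?`,
    args: [JSON.stringify(query), k],
  });
  return res.rows;
}
```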
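For the on-device pattern, a comparable sketch assuming the sqlite-vec loadable extension via its Node bindings (better-sqlite3 plus the sqlite-vec npm package); the vec0 virtual table and names below are illustrative. Note that sqlite-vec scans brute-force rather than using HNSW, one of the trade-offs raised in [2].

```ts
// Minimal on-device sketch, assuming better-sqlite3 and the sqlite-vec npm package.
// Table and column names are illustrative.
import Database from "better-sqlite3";
import * as sqliteVec from "sqlite-vec";

const db = new Database("vectors.db");
sqliteVec.load(db); // loads the sqlite-vec extension into this connection

db.exec("CREATE VIRTUAL TABLE IF NOT EXISTS vec_docs USING vec0(embedding float[3])");

// Vectors can be passed as JSON text; sqlite-vec also accepts packed float32 blobs.
const insert = db.prepare("INSERT INTO vec_docs (rowid, embedding) VALUES (?, ?)");
insert.run(1, JSON.stringify([0.1, 0.2, 0.3]));
insert.run(2, JSON.stringify([0.9, 0.1, 0.4]));

// k-NN query: MATCH drives the scan, and distance comes back as a hidden column.
const rows = db
  .prepare(
    "SELECT rowid, distance FROM vec_docs WHERE embedding MATCH ? ORDER BY distance LIMIT 2"
  )
  .all(JSON.stringify([0.1, 0.2, 0.3]));
console.log(rows);
```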
Cross-environment patterns to consider:
- In-browser: libSQL WASM with OPFS persistence handles storage and fast similarity search [1].
- Native on-device: SQLite extensions (sqlite-vector, sqlite-vec) carry the load, with licensing considerations and ecosystem options such as DuckDB interoperability [2].
- In both cases, aim for standard formats and a shared data store so on-device and browser deployments can pair or transition without re-encoding vectors (see the sketch below).
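A small, hypothetical helper sketch for that shared format, assuming both libSQL's F32_BLOB columns and sqlite-vec accept vectors as packed little-endian float32 blobs; the function names are mine, not from either project.

```ts
// Sketch of a shared on-disk format: store raw float32 bytes so the same blob
// can move between a browser libSQL database and an on-device sqlite-vec table
// without re-encoding. (Assumption: the embedding provider returns number[].)
export function toF32Blob(embedding: number[]): Uint8Array {
  return new Uint8Array(Float32Array.from(embedding).buffer);
}

export function fromF32Blob(blob: Uint8Array): number[] {
  // Copy into a fresh buffer so the float view is aligned and independent of the source.
  const bytes = blob.slice();
  return Array.from(new Float32Array(bytes.buffer, 0, bytes.byteLength / 4));
}

const blob = toF32Blob([0.1, 0.2, 0.3]);
console.log(fromF32Blob(blob)); // ~[0.1, 0.2, 0.3] at float32 precision
```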
Closing thought: SQLite-based vector storage is converging—browser and native paths share ideas, but licensing and interoperability will shape how you ship embeddings next.
References
[1] Web tool to compare embeddings; uses SQLite via libSQL WASM for local persistence and in-browser storage.
[2] Ultra efficient vector extension for SQLite. Discussion covering the extension's licensing debates, performance versus sqlite-vec, brute-force versus HNSW indexing, and on-device use for AI apps.