
SQL, Vectors, and Graphs for AI Memory: The Relational Case for Memory Architectures

1 min read
229 words
Topics: Database Debates, Vectors, Graphs

SQL, vectors, and graphs collide in AI memory—and the twist is surprisingly simple: relational databases might outlast flashy stores for persistence. In 2025, the argument to go back to SQL is getting louder as teams chase reliable memory across long conversations [1].

What failed and what works today

- Vector databases – strong semantic recall, with Pinecone and Weaviate, but retrieval can be noisy and lose structure [1].
- Graph databases – great for reasoning, yet scaling and maintenance stay tricky [1].
- Hybrid systems – flexible, but coordinating vectors, graphs, and relational data adds complexity [1].
- Relational databases – the Memori approach uses SQL tables to separate short-term from long-term memory, store entities, rules, and preferences, promote important facts into permanent memory, and rely on joins and indexes for retrieval [1]. The project comes from Gibson as an open-source memory engine [1].
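The relational approach above can be sketched in a few lines of SQLite. This is a minimal illustration, not Memori's actual schema: the table and column names (`short_term_memory`, `long_term_memory`, `importance`) and the promotion threshold are assumptions for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE short_term_memory (
    id INTEGER PRIMARY KEY,
    entity TEXT,          -- who or what the fact is about
    fact TEXT,            -- the remembered content
    importance REAL       -- hypothetical score used to decide promotion
);
CREATE TABLE long_term_memory (
    id INTEGER PRIMARY KEY,
    entity TEXT,
    fact TEXT
);
-- An ordinary index makes entity lookups fast, no vector store required.
CREATE INDEX idx_lt_entity ON long_term_memory(entity);
""")

conn.executemany(
    "INSERT INTO short_term_memory (entity, fact, importance) VALUES (?, ?, ?)",
    [("user", "prefers concise answers", 0.9),
     ("user", "asked about the weather", 0.2)],
)

# Promote important facts into permanent memory with plain SQL.
conn.execute("""
INSERT INTO long_term_memory (entity, fact)
SELECT entity, fact FROM short_term_memory WHERE importance >= 0.8
""")

# Retrieval is a plain indexed query.
rows = conn.execute(
    "SELECT fact FROM long_term_memory WHERE entity = ?", ("user",)
).fetchall()
print(rows)  # [('prefers concise answers',)]
```

The point of the sketch is that promotion and retrieval both reduce to standard SQL statements, so the database's existing indexing and join machinery does the heavy lifting.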

The Relational Pivot

Relational memory isn't nostalgia; it's practical. Memori aims to give AI agents human-like memory by keeping memories in structured records and leveraging classic SQL querying to recombine context [1].

Under the hood

A memory search agent shows how retrieval is planned: a SYSTEM_PROMPT guides intent analysis, search-parameter extraction, planning, and filtering to surface relevant memories [1].
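Those pipeline stages can be sketched as plain functions. In Memori an LLM guided by the SYSTEM_PROMPT performs the analysis; here each stage is a hypothetical stub, and the prompt text, function names, and matching rules are all assumptions made for illustration.

```python
# Hypothetical stand-in for the prompt that steers the search agent.
SYSTEM_PROMPT = (
    "Analyze the query's intent, extract search parameters, "
    "plan the lookup, and filter retrieved memories for relevance."
)

def analyze_intent(query: str) -> str:
    # Stub intent analysis: classify what kind of memory is needed.
    return "preference" if "prefer" in query else "fact"

def extract_parameters(query: str) -> dict:
    # Stub parameter extraction: derive filters for the SQL WHERE clause.
    return {"entity": "user", "keyword": query.split()[-1].lower()}

def plan_and_filter(params: dict, memories: list[dict]) -> list[str]:
    # Plan: match on entity first, then filter by keyword relevance.
    return [m["fact"] for m in memories
            if m["entity"] == params["entity"]
            and params["keyword"] in m["fact"].lower()]

memories = [
    {"entity": "user", "fact": "prefers concise answers"},
    {"entity": "user", "fact": "lives in Berlin"},
]
query = "what does the user prefer about answers"
intent = analyze_intent(query)            # -> "preference"
params = extract_parameters(query)        # -> {"entity": "user", "keyword": "answers"}
results = plan_and_filter(params, memories)
print(results)  # ['prefers concise answers']
```

The separation into stages mirrors the prompt's structure: each step narrows the search space before any rows are touched, which is what lets retrieval stay a cheap, deterministic query.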

Closing thought

Maybe 50 years of "SELECT * FROM reality" still beating the latest semantic embeddings is the plot twist we needed [1].


References

[1] HackerNews: "Everyone's trying vectors and graphs for AI memory. We went back to SQL". Discusses AI memory storage options (SQL vs vector vs graph); argues relational memory can beat embeddings; explores retrieval and scalability.
