LLMs are moving into the database, not just the app layer. The buzz: real-time in-database preprocessing via triggers and a hot debate about how we design SQL for AI.
In-database AI: Triggers that preprocess on insert/update — The Postgres-LLM project attaches a trigger to specific columns and runs an LLM on insert/update, writing the output to another column or back into the original [1]. It works with any LLM that is OpenAI Chat API compatible, and it cites Interfaze.ai as an example, showing how OCR, translation, and classification can live where the data lands [1].
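To make the trigger pattern concrete, here is a minimal, self-contained sketch. It uses SQLite as a stand-in for Postgres so it runs anywhere, with a stubbed classify_text() where the extension would instead call an OpenAI-compatible chat endpoint inside the database; the table, trigger, and function names are illustrative assumptions, not the extension's actual API.

```python
import sqlite3

# Sketch of the trigger-preprocessing pattern described above, using SQLite
# as a stand-in for Postgres so the example is self-contained and runnable.
# classify_text() is a placeholder for a call to any OpenAI-compatible chat
# endpoint; the real extension wires that call up inside Postgres itself.

def classify_text(raw: str) -> str:
    # Placeholder: in the in-database version this would POST the text to an
    # OpenAI-compatible /v1/chat/completions endpoint and return the model's
    # label (e.g. "invoice", "receipt", "contract").
    return "invoice" if "total due" in raw.lower() else "other"

conn = sqlite3.connect(":memory:")
# Register the Python callable so SQL (and the trigger below) can call it.
conn.create_function("classify_text", 1, classify_text)

conn.executescript("""
CREATE TABLE documents (
    id        INTEGER PRIMARY KEY,
    raw_text  TEXT,
    label     TEXT          -- filled in automatically on insert
);

-- The trigger runs the model call as the row lands and writes the output
-- to another column, the same shape as the extension's insert/update flow.
CREATE TRIGGER documents_classify AFTER INSERT ON documents
BEGIN
    UPDATE documents
    SET label = classify_text(NEW.raw_text)
    WHERE id = NEW.id;
END;
""")

conn.execute("INSERT INTO documents (raw_text) VALUES (?)",
             ("Invoice #42: total due $1,200 by March 1",))
print(conn.execute("SELECT raw_text, label FROM documents").fetchall())
```

The point of the sketch is only the shape of the flow: the preprocessing happens where the data lands, triggered by the write itself rather than by application code.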
The Text2SQL debate and hybrid futures — On the broader front, Text2SQL is drawing governance-heavy scrutiny. One opinion piece proclaims "Text2SQL is dead – long live text2SQL" [2], questioning how much value translating natural language to SQL delivers in environments where metadata and governance matter. Some push back ("it's more likely that LLMs will be phased out for SQL"), while others argue for new query languages or a bytecode-style API that compiles queries into a database-friendly form rather than handing over SQL strings [2]. memelang.net is cited as one such alternative language direction [2].
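One way to read the "bytecode API" camp is sketched below: the model produces a small structured query object instead of a SQL string, and the application validates it against schema metadata before compiling it to SQL itself. The QuerySpec shape and the schema dictionary here are purely illustrative assumptions, not a design from the linked discussion.

```python
from dataclasses import dataclass

# Illustrative reading of "compile queries into a database-friendly form
# rather than strings": the LLM emits a constrained spec, the app compiles it.

SCHEMA = {"orders": {"id", "customer", "total", "created_at"}}  # allowed tables/columns

@dataclass
class QuerySpec:
    table: str
    columns: list[str]
    filter_column: str
    filter_value: str

def compile_query(spec: QuerySpec) -> tuple[str, tuple]:
    # Validate against schema metadata before any SQL string exists at all.
    cols = SCHEMA.get(spec.table)
    if cols is None:
        raise ValueError(f"unknown table: {spec.table}")
    for c in spec.columns + [spec.filter_column]:
        if c not in cols:
            raise ValueError(f"unknown column: {c}")
    sql = (f"SELECT {', '.join(spec.columns)} FROM {spec.table} "
           f"WHERE {spec.filter_column} = ?")
    return sql, (spec.filter_value,)

# A Text2SQL layer might produce a spec like this for "show totals for acme":
spec = QuerySpec("orders", ["id", "total"], "customer", "acme")
print(compile_query(spec))
```

The takeaway is simply that validation happens against metadata before SQL is ever generated, which is roughly what the "compile, don't generate strings" argument is after.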
Bottom line: AI is pushing SQL interfaces toward in-database processing and hybrid syntax layers, blending trigger-based preprocessing with Text2SQL and potential bytecode-like APIs for future interfaces.
References
[1] Show HN: I built an open source LLM integration for PostgreSQL
Open-source Postgres extension that enables LLM-based triggers on inserts/updates; supports any OpenAI-compatible LLM; demonstrates in-database preprocessing for OCR and translation workflows.
[2] Text2SQL is dead – long live text2SQL
Debates Text-to-SQL viability, SQL's stickiness, and LLM-based query languages; argues the future mixes bytecode-style APIs with practical and political concerns about value.