Developer experience for SQL queries

· 6 min read

I don't like ORMs much. I've worked professionally with Hibernate for a few years, and I've tried SQLAlchemy on personal projects in the past, and I've never really liked either of them, because I just don't find it all that hard to write SQL. Beyond that, ORMs have drawbacks: the way they "just work" saves a little time up front but makes things harder to understand when you go to optimize or debug your code. ORMs often make it difficult for a developer to tell when a getter on their object will take a few nanoseconds to dereference memory the CPU already has in L3 cache, and when it will take a few milliseconds to issue a query to the database.
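
As a rough sketch of what I mean, here's what a lazy association looks like in a typical Hibernate/JPA setup; the entities and fields are invented for illustration:

```java
import jakarta.persistence.*;

@Entity
class Customer {
    @Id Long id;
    String name;
    String getName() { return name; }
}

@Entity
class Order {
    @Id Long id;

    // Lazily fetched: Hibernate hands back a proxy instead of the real row.
    @ManyToOne(fetch = FetchType.LAZY)
    Customer customer;

    Customer getCustomer() { return customer; }
}

class Example {
    static String customerName(EntityManager em, long orderId) {
        Order order = em.find(Order.class, orderId); // SELECT ... FROM orders ...

        // Reads like an in-memory getter chain, but initializing the lazy proxy
        // fires a second SELECT against the customers table at this point.
        return order.getCustomer().getName();
    }
}
```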

I've looked at OpenTelemetry traces for simple operations and seen Hibernate generate dozens of transactions against the database. Using the @Transactional annotation at too high a level also forces the service to demand a large number of database connections to achieve concurrency, because handlers hold on to transactions, and the connections backing them, far longer than they need to. It also keeps developers from reasoning easily about how long a transaction actually needs to live.
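
Here's a sketch of the pattern I mean, assuming Spring's @Transactional; the service, repository, and payment call are all made up for illustration:

```java
import java.util.Optional;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.client.RestTemplate;

record Order(long id, long amountCents) {}

interface OrderRepository {
    Optional<Order> findById(long id);
    void markPaid(long id);
}

@Service
class CheckoutService {
    private final OrderRepository orders;
    private final RestTemplate paymentGateway = new RestTemplate();

    CheckoutService(OrderRepository orders) {
        this.orders = orders;
    }

    // The transaction, and the pooled connection backing it, stays open for the
    // whole method, including the slow external HTTP call, even though only the
    // two repository calls actually need the database.
    @Transactional
    public void checkout(long orderId) {
        Order order = orders.findById(orderId).orElseThrow();   // needs the connection
        paymentGateway.postForObject(
                "https://payments.example/charge", order, Void.class); // doesn't, but holds it anyway
        orders.markPaid(order.id());                            // needs the connection again
    }
}
```

Narrowing the transaction to just the repository work, or wrapping it in a programmatic TransactionTemplate, would let the connection go back to the pool while the slow external call is in flight.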

I see the other side, though. Most frameworks don't provide a great developer experience for writing raw SQL either, and in a lot of cases code written with an object-relational mapping framework really does make the business logic easier to understand than building dynamic SQL queries with string builders, which reviewers then have to audit to make sure you only ever concatenate strings you can trust to contain safe SQL, with any runtime data bound properly as parameters.
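
For instance, a dynamic filter written against plain JDBC with a string builder tends to look something like this (the table and columns are invented):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

class UserSearch {
    // Every appended fragment must be a trusted constant, and every runtime
    // value must go through a bind parameter. Reviewers have to verify both
    // properties by hand on every change.
    static List<String> findEmails(Connection conn, String nameFilter, Integer minAge)
            throws SQLException {
        StringBuilder sql = new StringBuilder("SELECT email FROM users WHERE 1 = 1");
        List<Object> params = new ArrayList<>();

        if (nameFilter != null) {
            sql.append(" AND name LIKE ?");
            params.add("%" + nameFilter + "%");
        }
        if (minAge != null) {
            sql.append(" AND age >= ?");
            params.add(minAge);
        }

        try (PreparedStatement stmt = conn.prepareStatement(sql.toString())) {
            for (int i = 0; i < params.size(); i++) {
                stmt.setObject(i + 1, params.get(i));
            }
            List<String> emails = new ArrayList<>();
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    emails.add(rs.getString(1));
                }
            }
            return emails;
        }
    }
}
```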

I made sqlt to solve that problem in line with my own opinions about writing SQL, so that writing readable SQL doesn't take more effort than using an object-relational mapper.

Offline semantic search

· 5 min read

Lately I've started examining the problem of offline search for static websites.

Static sites have lots of convenient qualities, chief among them that they incur virtually no hosting costs and stay cheap to host at any scale compared to dynamic sites. In the current state of the art, most static site generators use lunr.js to build a traditional keyword search index with tried and tested technology, but with advances in language models, "semantic search" now exists as a possibility.

I don't have a lot of expertise in the language model space, so I had lots of questions about this approach. I didn't know if I could get good search results out of a language model small enough that client devices could reasonably be expected to run it in the browser.