Composing Queries
Understanding how to compose and transform SqlQuery instances
ExoQuery `sql` blocks allow you to apply functions to SqlQuery instances such as filter, map, distinct, groupBy, orderBy, limit, offset, etc. These functions are not available outside of a `sql { ... }` block because the sql block delimits the boundary of the query. SqlQuery instances created by the Table<MyRow> function represent simple table elements. Following functional-programming principles, these transformations work on any SqlQuery instance, and work the same way.
For example:
@Serializable
data class Person(val id: Int, val name: String, val age: Int)
// A simple base query defined once
val people: SqlQuery<Person> = sql { Table<Person>() }
// Later, we *compose* that query inside another `sql { ... }` block:
val pageOfAdultNames: SqlQuery<String> = sql {
  people                          // SqlQuery<Person>
    .filter { p -> p.age >= 18 }  // keep only adults
    .map { p -> p.name }          // project just the name
    .distinct()                   // remove duplicates
    .offset(20)                   // skip the first 20 rows
    .limit(10)                    // take the next 10
}
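Because these combinators follow ordinary functional-programming conventions, the pipeline above behaves much like the same operations on plain Kotlin collections. As a rough analogy only (plain Kotlin stdlib, not ExoQuery; `offset`/`limit` correspond to `drop`/`take`, and `adultNamesPage` is a name invented for this sketch):

```kotlin
data class Person(val id: Int, val name: String, val age: Int)

// In-memory analogue of the SqlQuery pipeline:
// filter -> map -> distinct -> offset -> limit
fun adultNamesPage(people: List<Person>): List<String> =
    people
        .filter { p -> p.age >= 18 }  // keep only adults
        .map { p -> p.name }          // project just the name
        .distinct()                   // remove duplicates
        .drop(0)                      // offset analogue (0 here: the list is small)
        .take(10)                     // limit analogue

fun main() {
    val people = listOf(
        Person(1, "Alice", 30),
        Person(2, "Bob", 15),
        Person(3, "Alice", 40),
        Person(4, "Carol", 22),
    )
    println(adultNamesPage(people))  // [Alice, Carol]
}
```

The difference is that the ExoQuery version never materializes rows in memory: the pipeline is a description that compiles to SQL, so the filtering, projection, and pagination happen in the database.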
// Outside of a `sql { ... }` block, these combinators are *not* available:
// val invalid = people.filter { p -> p.age >= 18 }      // ❌ does not compile
// val alsoInvalid = Table<Person>().map { p -> p.name } // ❌ does not compile
ExoQuery builds on top of the Terpal-SQL Database Controller in order to encode and decode Kotlin data classes in a fully cross-platform way. Under the hood, when you call something like query.buildFor.Postgres().runOn(controller), ExoQuery hands the compiled SQL and a Serializer<T> for your row type to a Database Controller (for example a JdbcController), which is responsible for turning Kotlin values into database parameters and decoding result-set rows back into Kotlin objects on any supported platform.
Although your data classes do not need to be @Serializable in order to build the actual queries, they do need to be in order to run them, so that the Database Controller (via kotlinx.serialization) knows how to map between Kotlin types and the underlying database types.
For example, on the JVM:
@Serializable
data class Person(val id: Int, val name: String, val age: Int)
// 1. Create a JDBC DataSource however you normally would (HikariCP, your framework, etc.)
val dataSource: DataSource = ...
// 2. Wrap it in a Terpal/ExoQuery controller for your database (Postgres in this case)
val controller = JdbcControllers.Postgres(dataSource)
// 3. Define an ExoQuery query as usual
val query: SqlQuery<Person> = sql {
  Table<Person>().filter { p -> p.age >= 18 }
}
// 4. Build the query for a specific dialect and run it on the controller
val adults: List<Person> = query
  .buildFor.Postgres() // compile-time SQL → a compiled query object
  .runOn(controller)   // hand it + the serializer to the controller to execute and decode rows
- The `DataSource` gives the controller a way to talk to your database (connection pooling, credentials, etc.).
- `JdbcControllers.Postgres(dataSource)` creates a Postgres-aware controller that knows how to send SQL and read results.
- The `sql { ... }` block describes what you want to query in terms of `SqlQuery<Person>`.
- `buildFor.Postgres().runOn(controller)` turns that description into real SQL for Postgres, executes it through the controller, and uses the generated `Serializer<Person>` to map each row in the result set into a `Person` instance.
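To make the controller's decoding role concrete, here is a loose, hypothetical sketch in plain Kotlin. None of these names come from ExoQuery or Terpal: the `Row` alias and `decodeRow` function are inventions standing in for the generated `Serializer<Person>` and the result-set handling the real controller performs:

```kotlin
data class Person(val id: Int, val name: String, val age: Int)

// Hypothetical stand-in for a JDBC result-set row: column name -> value.
typealias Row = Map<String, Any?>

// Hand-written decoder playing the role that the generated
// Serializer<Person> plays inside the real Database Controller.
fun decodeRow(row: Row): Person = Person(
    id = row["id"] as Int,
    name = row["name"] as String,
    age = row["age"] as Int,
)

fun main() {
    // Pretend these rows came back from executing the compiled SQL.
    val resultSet: List<Row> = listOf(
        mapOf("id" to 1, "name" to "Alice", "age" to 30),
        mapOf("id" to 2, "name" to "Bob", "age" to 42),
    )
    val adults: List<Person> = resultSet.map(::decodeRow)
    println(adults)
}
```

In the real library this mapping is derived automatically from the `@Serializable` annotation, which is why the annotation is required to run a query even though it is not required to build one.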