jOOQ Manual 3.11 PDF
Overview
Table of contents
1. Preface................................................................................................................................................................................................................ 10
2. Copyright, License, and Trademarks.......................................................................................................................................................... 12
3. Getting started with jOOQ............................................................................................................................................................................ 17
3.1. How to read this manual........................................................................................................................................................................... 17
3.2. The sample database used in this manual........................................................................................................................................... 18
3.3. Different use cases for jOOQ................................................................................................................................................................... 19
3.3.1. jOOQ as a SQL builder........................................................................................................................................................................... 20
3.3.2. jOOQ as a SQL builder with code generation.................................................................................................................................. 21
3.3.3. jOOQ as a SQL executor........................................................................................................................................................................ 21
3.3.4. jOOQ for CRUD......................................................................................................................................................................................... 22
3.3.5. jOOQ for PROs.......................................................................................................................................................................................... 23
3.4. Tutorials........................................................................................................................................................................................................... 23
3.4.1. jOOQ in 7 easy steps.............................................................................................................................................................................. 23
3.4.1.1. Step 1: Preparation............................................................................................................................................................................... 23
3.4.1.2. Step 2: Your database.......................................................................................................................................................................... 25
3.4.1.3. Step 3: Code generation..................................................................................................................................................................... 25
3.4.1.4. Step 4: Connect to your database................................................................................................................................................... 27
3.4.1.5. Step 5: Querying.................................................................................................................................................................................... 28
3.4.1.6. Step 6: Iterating..................................................................................................................................................................................... 28
3.4.1.7. Step 7: Explore!...................................................................................................................................................................................... 29
3.4.2. Using jOOQ in modern IDEs................................................................................................................................................................. 29
3.4.3. Using jOOQ with Spring and Apache DBCP...................................................................................................................................... 29
3.4.4. Using jOOQ with Flyway.......................................................................................................................................................................... 34
3.4.5. Using jOOQ with JAX-RS.......................................................................................................................................................................... 40
3.4.6. A simple web application with jOOQ.................................................................................................................................................. 45
3.5. jOOQ and Java 8.......................................................................................................................................................................................... 45
3.6. jOOQ and JavaFX.......................................................................................................................................................................................... 47
3.7. jOOQ and Nashorn...................................................................................................................................................................................... 50
3.8. jOOQ and Scala............................................................................................................................................................................................ 50
3.9. jOOQ and Groovy......................................................................................................................................................................................... 51
3.10. jOOQ and Kotlin......................................................................................................................................................................................... 51
3.11. jOOQ and NoSQL...................................................................................................................................................................................... 52
3.12. jOOQ and JPA.............................................................................................................................................................................................. 52
3.13. Dependencies............................................................................................................................................................................................. 53
3.14. Build your own........................................................................................................................................................................................... 53
3.15. jOOQ and backwards-compatibility...................................................................................................................................................... 54
4. SQL building...................................................................................................................................................................................................... 56
4.1. The query DSL type..................................................................................................................................................................................... 56
4.1.1. DSL subclasses.......................................................................................................................................................................................... 57
4.2. The DSLContext class.................................................................................................................................................................................. 57
4.2.1. SQL Dialect................................................................................................................................................................................................. 58
4.2.2. SQL Dialect Family.................................................................................................................................................................................... 59
4.2.3. Connection vs. DataSource.................................................................................................................................................................... 60
4.2.4. Custom data............................................................................................................................................................................................... 61
4.2.5. Custom ExecuteListeners....................................................................................................................................................................... 61
4.2.6. Custom Settings........................................................................................................................................................................................ 62
4.2.6.1. Object qualification............................................................................................................................................................................... 63
4.2.6.2. Runtime schema and table mapping............................................................................................................................................... 63
4.2.6.3. Identifier style......................................................................................................................................................................................... 66
4.2.6.4. Keyword style.......................................................................................................................................................................................... 67
1. Preface
- No typesafety
- No syntax safety
- No bind value index safety
- Verbose SQL String concatenation
- Boring bind value indexing techniques
- Verbose resource and exception handling in JDBC
- A very "stateful", not very object-oriented JDBC API, which is hard to use
For these many reasons, other frameworks have tried to abstract JDBC away in the past in one way or
another. Unfortunately, many have completely abstracted SQL away as well.
jOOQ has come to fill this gap.
jOOQ is different
SQL was never meant to be abstracted. To be confined in the narrow boundaries of heavy mappers,
hiding the beauty and simplicity of relational data. SQL was never meant to be object-oriented. SQL
was never meant to be anything other than... SQL!
- If you're using this work with Open Source databases, you may choose
either ASL or jOOQ License.
- If you're using this work with at least one commercial database, you must
choose jOOQ License
http://www.apache.org/licenses/LICENSE-2.0
This library is distributed with a LIMITED WARRANTY. See the jOOQ License
and Maintenance Agreement for more details: http://www.jooq.org/licensing
- GSP and General SQL Parser are trademarks by Gudu Software Limited
- SQL 2 jOOQ is a trademark by Data Geekery™ GmbH and Gudu Software Limited
- Flyway is a trademark by Snow Mountain Labs UG (haftungsbeschränkt)
Contributions
The following are authors and contributors of jOOQ or parts of jOOQ in alphabetical order:
- Aaron Digulla
- Andreas Franzén
- Anuraag Agrawal
- Arnaud Roger
- Art O Cathain
- Artur Dryomov
- Ben Manes
- Brent Douglas
- Brett Meyer
- Christian Stein
- Christopher Deckers
- Ed Schaller
- Eric Peters
- Ernest Mishkin
- Espen Stromsnes
- Eugeny Karpov
- Fabrice Le Roy
- Gonzalo Ortiz Jaureguizar
- Gregory Hlavac
- Henrik Sjöstrand
- Ivan Dugic
- Javier Durante
- Johannes Bühler
- Joseph B Phillips
- Joseph Pachod
- Laurent Pireyn
- Luc Marchaud
- Lukas Eder
- Matti Tahvonen
- Michael Doberenz
- Michael Simons
- Michał Kołodziejski
- Miguel Gonzalez Sanchez
- Nathaniel Fischer
- Oliver Flege
- Peter Ertl
- Richard Bradley
- Robin Stocker
- Samy Deghou
- Sander Plas
- Sean Wellington
- Sergey Epik
- Sergey Zhuravlev
- Stanislas Nanchen
- Stephan Schroevers
- Sugiharto Lim
- Sven Jacobs
- Szymon Jachim
- Terence Zhang
- Timothy Wilson
- Timur Shaidullin
- Thomas Darimont
- Tsukasa Kitachi
- Victor Bronstein
- Victor Z. Peng
- Vladimir Kulev
- Vladimir Vinogradov
- Vojtech Polivka
- Wang Gaoyuan
Code blocks
The following are code blocks:
These are useful to provide examples in code. Often, with jOOQ, it is even more useful to compare SQL
code with its corresponding Java/jOOQ code. When this is done, the blocks are aligned side-by-side,
with SQL usually being on the left, and an equivalent jOOQ DSL query in Java usually being on the right:
-- SQL assumptions
------------------
// Java assumptions
// ----------------
// Whenever you see "standalone functions", assume they were static imported from org.jooq.impl.DSL
// "DSL" is the entry point of the static query DSL
exists(); max(); min(); val(); inline(); // correspond to DSL.exists(); DSL.max(); DSL.min(); etc...
// Whenever you see BOOK/Book, AUTHOR/Author and similar entities, assume they were (static) imported from the generated schema
BOOK.TITLE, AUTHOR.LAST_NAME // correspond to com.example.generated.Tables.BOOK.TITLE, com.example.generated.Tables.AUTHOR.LAST_NAME
FK_BOOK_AUTHOR // corresponds to com.example.generated.Keys.FK_BOOK_AUTHOR
// Whenever you see "create" being used in Java code, assume that this is an instance of org.jooq.DSLContext.
// The reason why it is called "create" is the fact that a jOOQ QueryPart is being created from the DSL object.
// "create" is thus the entry point of the non-static query DSL
DSLContext create = DSL.using(connection, SQLDialect.ORACLE);
Your naming may differ, of course. For instance, you could name the "create" instance "db", instead.
Execution
When you're coding PL/SQL, T-SQL or some other procedural SQL language, SQL statements are always
executed immediately at the semi-colon. This is not the case in jOOQ, because as an internal DSL, jOOQ
can never be sure that your statement is complete until you call fetch() or execute(). The manual tries
to apply fetch() and execute() as thoroughly as possible. Where they are omitted, execution is implied.
Degree (arity)
jOOQ records (and many other API elements) have a degree N between 1 and 22. The variable degree
of an API element is denoted as [N], e.g. Row[N] or Record[N]. The term "degree" is preferred over arity,
as "degree" is the term used in the SQL standard, whereas "arity" is used more often in mathematics
and relational theory.
Settings
jOOQ allows you to override runtime behaviour using org.jooq.conf.Settings. If nothing is specified, the
default runtime settings are assumed.
Sample database
jOOQ query examples run against the sample database. See the manual's section about the sample
database used in this manual to learn more about the sample database.
More entities, types (e.g. UDTs, ARRAY types, ENUM types, etc.), stored procedures and packages are
introduced for specific examples.
In addition to the above, you may assume the following sample data:
INSERT INTO language (id, cd, description) VALUES (1, 'en', 'English');
INSERT INTO language (id, cd, description) VALUES (2, 'de', 'Deutsch');
INSERT INTO language (id, cd, description) VALUES (3, 'fr', 'Français');
INSERT INTO language (id, cd, description) VALUES (4, 'pt', 'Português');
- Typesafe database object referencing through generated schema, table, column, record,
procedure, type, dao, pojo artefacts (see the chapter about code generation)
- Typesafe SQL construction / SQL building through a complete querying DSL API modelling SQL
as a domain specific language in Java (see the chapter about the query DSL API)
- Convenient query execution through an improved API for result fetching (see the chapters about
the various types of data fetching)
- SQL dialect abstraction and SQL clause emulation to improve cross-database compatibility and
to enable missing features in simpler databases (see the chapter about SQL dialects)
- SQL logging and debugging using jOOQ as an integral part of your development process (see the
chapters about logging)
Effectively, jOOQ was originally designed to replace any other database abstraction framework short of
the ones handling connection pooling (and more sophisticated transaction management).
- Using Hibernate for 70% of the queries (i.e. CRUD) and jOOQ for the remaining 30% where SQL
is really needed
- Using jOOQ for SQL building and JDBC for SQL execution
- Using jOOQ for SQL building and Spring Data for SQL execution
- Using jOOQ without the source code generator to build the basis of a framework for dynamic
SQL execution.
The following sections explain various use cases for jOOQ in your application.
// Fetch a SQL string from a jOOQ Query in order to manually execute it with another tool.
// For simplicity reasons, we're using the API to construct case-insensitive object references, here.
String sql = create.select(field("BOOK.TITLE"), field("AUTHOR.FIRST_NAME"), field("AUTHOR.LAST_NAME"))
.from(table("BOOK"))
.join(table("AUTHOR"))
.on(field("BOOK.AUTHOR_ID").eq(field("AUTHOR.ID")))
.where(field("BOOK.PUBLISHED_IN").eq(1948))
.getSQL();
The SQL string built with the jOOQ query DSL can then be executed using JDBC directly, using
Spring's JdbcTemplate, using Apache DbUtils and many other tools (note that since jOOQ uses
PreparedStatement by default, this will generate a bind variable for "1948". Read more about bind
variables here).
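For instance, a sketch of executing that string with plain JDBC: since jOOQ rendered a bind variable for 1948, the value has to be bound explicitly (Query.getBindValues() could supply all bind values generically):

try (PreparedStatement stmt = connection.prepareStatement(sql)) {
    stmt.setInt(1, 1948); // the value jOOQ turned into a bind variable
    try (ResultSet rs = stmt.executeQuery()) {
        while (rs.next())
            System.out.println(rs.getString(1) + ", " + rs.getString(2) + " " + rs.getString(3));
    }
}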
If you wish to use jOOQ only as a SQL builder, the following sections of the manual will be of interest
to you:
- SQL building: This section contains a lot of information about creating SQL statements using the
jOOQ API
- Plain SQL: This section contains information useful in particular to those that want to supply
table expressions, column expressions, etc. as plain SQL to jOOQ, rather than through
generated artefacts
// Fetch a SQL string from a jOOQ Query in order to manually execute it with another tool.
String sql = create.select(BOOK.TITLE, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
.from(BOOK)
.join(AUTHOR)
.on(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
.where(BOOK.PUBLISHED_IN.eq(1948))
.getSQL();
The SQL string that you can generate as such can then be executed using JDBC directly, using Spring's
JdbcTemplate, using Apache DbUtils and many other tools.
If you wish to use jOOQ only as a SQL builder with code generation, the following sections of the manual
will be of interest to you:
- SQL building: This section contains a lot of information about creating SQL statements using the
jOOQ API
- Code generation: This section contains the necessary information to run jOOQ's code generator
against your developer database
By having jOOQ execute your SQL, the jOOQ query DSL becomes truly embedded SQL.
jOOQ doesn't stop here, though! You can execute any SQL with jOOQ. In other words, you can use any
other SQL building tool and run the SQL statements with jOOQ. An example is given here:
// Fetch results from the SQL string directly with jOOQ:
Result<Record> result = create.fetch(sql);

// Or execute that SQL with JDBC, fetching the ResultSet with jOOQ:
ResultSet rs = connection.createStatement().executeQuery(sql);
Result<Record> result2 = create.fetch(rs);
If you wish to use jOOQ as a SQL executor with (or without) code generation, the following sections of
the manual will be of interest to you:
- SQL building: This section contains a lot of information about creating SQL statements using the
jOOQ API
- Code generation: This section contains the necessary information to run jOOQ's code generator
against your developer database
- SQL execution: This section contains a lot of information about executing SQL statements using
the jOOQ API
- Fetching: This section contains some useful information about the various ways of fetching data
with jOOQ
// Fetch an author
AuthorRecord author = create.fetchOne(AUTHOR, AUTHOR.ID.eq(1));
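Generated UpdatableRecords then allow for persisting such records without writing any SQL; a sketch (the setters follow the generated naming for the sample AUTHOR table):

// Modify the record and store it again (executes an UPDATE, or an INSERT for new records)
author.setFirstName("George");
author.setLastName("Orwell");
author.store();

// Or delete it
author.delete();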
If you wish to use all of jOOQ's features, the following sections of the manual will be of interest to you
(including all sub-sections):
- SQL building: This section contains a lot of information about creating SQL statements using the
jOOQ API
- Code generation: This section contains the necessary information to run jOOQ's code generator
against your developer database
- SQL execution: This section contains a lot of information about executing SQL statements using
the jOOQ API
- jOOQ's Execute Listeners: jOOQ allows you to hook your custom execute listeners into jOOQ's
SQL statement execution lifecycle in order to centrally coordinate any arbitrary operation
performed on SQL being executed. Use this for logging, identity generation, SQL tracing,
performance measurements, etc.
- Logging: jOOQ has a standard DEBUG logger built-in, for logging and tracing all your executed
SQL statements and fetched result sets
- Stored Procedures: jOOQ supports stored procedures and functions of your favourite database.
All routines and user-defined types are generated and can be included in jOOQ's SQL building
API as function references.
- Batch execution: Batch execution is important when executing a big load of SQL statements.
jOOQ simplifies these operations compared to JDBC
- Exporting and Importing: jOOQ ships with an API to easily export/import data in various formats
If you're a power user of your favourite, feature-rich database, jOOQ will help you access all of your
database's vendor-specific features, such as OLAP features, stored procedures, user-defined types,
vendor-specific SQL, functions, etc. Examples are given throughout this manual.
3.4. Tutorials
Don't have time to read the full manual? Here are a couple of tutorials that will get you into the most
essential parts of jOOQ as quickly as possible.
<dependency>
<groupId>org.jooq</groupId>
<artifactId>jooq</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq</groupId>
<artifactId>jooq-meta</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen</artifactId>
<version>3.11.9</version>
</dependency>
<!-- Note: These aren't hosted on Maven Central. Import them manually from your distribution -->
<dependency>
<groupId>org.jooq.pro</groupId>
<artifactId>jooq</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq.pro</groupId>
<artifactId>jooq-meta</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq.pro</groupId>
<artifactId>jooq-codegen</artifactId>
<version>3.11.9</version>
</dependency>
<!-- Note: These aren't hosted on Maven Central. Import them manually from your distribution -->
<dependency>
<groupId>org.jooq.pro-java-6</groupId>
<artifactId>jooq</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq.pro-java-6</groupId>
<artifactId>jooq-meta</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq.pro-java-6</groupId>
<artifactId>jooq-codegen</artifactId>
<version>3.11.9</version>
</dependency>
<!-- Note: These aren't hosted on Maven Central. Import them manually from your distribution -->
<dependency>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq-meta</artifactId>
<version>3.11.9</version>
</dependency>
<dependency>
<groupId>org.jooq.trial</groupId>
<artifactId>jooq-codegen</artifactId>
<version>3.11.9</version>
</dependency>
Note that only the jOOQ Open Source Edition is available from Maven Central. If you're using the jOOQ
Professional Edition or the jOOQ Enterprise Edition, you will have to manually install jOOQ in your local
Nexus, or in your local Maven cache. For more information, please refer to the licensing pages.
Please refer to the manual's section about Code generation configuration to learn how to use jOOQ's
code generator with Maven.
For this example, we'll be using MySQL. If you haven't already downloaded MySQL Connector/J,
download it here:
http://dev.mysql.com/downloads/connector/j/
If you don't have a MySQL instance up and running yet, get XAMPP now! XAMPP is a simple installation
bundle for Apache, MySQL, PHP and Perl.
USE `library`;
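jOOQ's code generator is configured in an XML file (library.xml in this tutorial), which contains a <jdbc> connection element next to the <generator> element shown below. A sketch of that connection element, assuming a local MySQL instance and Connector/J 8 (older driver versions use com.mysql.jdbc.Driver):

<jdbc>
  <driver>com.mysql.cj.jdbc.Driver</driver>
  <url>jdbc:mysql://localhost:3306/library</url>
  <user>root</user>
  <password></password>
</jdbc>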
<generator>
<!-- The default code generator. You can override this one, to generate your own code style.
Supported generators:
- org.jooq.codegen.JavaGenerator
- org.jooq.codegen.ScalaGenerator
Defaults to org.jooq.codegen.JavaGenerator -->
<name>org.jooq.codegen.JavaGenerator</name>
<database>
<!-- The database type. The format here is:
org.jooq.meta.[database].[database]Database -->
<name>org.jooq.meta.mysql.MySQLDatabase</name>
<!-- The database schema (or in the absence of schema support, in your RDBMS this
can be the owner, user, database name) to be generated -->
<inputSchema>library</inputSchema>
<target>
<!-- The destination package of your generated classes (within the destination directory) -->
<packageName>test.generated</packageName>
<!-- The destination directory of your generated classes. Using Maven directory layout here -->
<directory>C:/workspace/MySQLTest/src/main/java</directory>
</target>
</generator>
</configuration>
Replace the username with whatever user has the appropriate privileges to query the database meta
data. You'll also want to look at the other values and replace as necessary. Here are the two interesting
properties:
generator.target.package - set this to the parent package you want to create for the generated classes.
The setting of test.generated will cause the test.generated.Author and test.generated.AuthorRecord to
be created
generator.target.directory - the directory to output to.
Once you have the JAR files and library.xml in your temp directory, type this on a Windows machine:
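A sketch of the command (the jar file names are placeholders for the files from your jOOQ distribution and the MySQL Connector/J download):
java -cp jooq-3.11.9.jar;jooq-meta-3.11.9.jar;jooq-codegen-3.11.9.jar;mysql-connector-java.jar;. org.jooq.codegen.GenerationTool library.xml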
... or type this on a UNIX / Linux / Mac system (colons instead of semi-colons):
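The same sketch, with colons as the classpath separator:
java -cp jooq-3.11.9.jar:jooq-meta-3.11.9.jar:jooq-codegen-3.11.9.jar:mysql-connector-java.jar:. org.jooq.codegen.GenerationTool library.xml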
Note: jOOQ will try loading the library.xml from your classpath. This is also why there is a trailing period
(.) on the classpath. If the file cannot be found on the classpath, jOOQ will look on the file system from
the current working directory.
Replace the filenames with your actual filenames. In this example, jOOQ 3.11.9 is being used. If
everything has worked, you should see this in your console output:
// For convenience, always static import your generated tables and jOOQ functions to decrease verbosity:
import static test.generated.Tables.*;
import static org.jooq.impl.DSL.*;
import java.sql.*;

public class Main {
    public static void main(String[] args) {
        // Connection is the only JDBC resource that we need;
        // PreparedStatement and ResultSet are handled by jOOQ, internally
        try (Connection conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/library", "root", "")) {
            // ... querying and iterating follow in the next steps
        }

        // For the sake of this tutorial, let's keep exception handling simple
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}
First get an instance of DSLContext so we can write a simple SELECT query. We pass an instance of
the MySQL connection to DSL. Note that the DSLContext doesn't close the connection. We'll have to
do that ourselves.
We then use jOOQ's query DSL to return an instance of Result. We'll be using this result in the next step.
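A sketch of that code (Step 5 of the tutorial; the dialect follows from the MySQL setup above):

DSLContext create = DSL.using(conn, SQLDialect.MYSQL);
Result<Record> result = create.select().from(AUTHOR).fetch();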
System.out.println("ID: " + id + " first name: " + firstName + " last name: " + lastName);
}
package test;

// For convenience, always static import your generated tables and jOOQ functions to decrease verbosity:
import static test.generated.Tables.*;
import static org.jooq.impl.DSL.*;

import java.sql.*;
import org.jooq.*;
import org.jooq.impl.*;

public class Main {

    /**
     * @param args
     */
    public static void main(String[] args) {
        String userName = "root";
        String password = "";
        String url = "jdbc:mysql://localhost:3306/library";

        // Connection is the only JDBC resource that we need
        try (Connection conn = DriverManager.getConnection(url, userName, password)) {
            DSLContext create = DSL.using(conn, SQLDialect.MYSQL);

            for (Record r : create.select().from(AUTHOR).fetch()) {
                Integer id = r.getValue(AUTHOR.ID);
                String firstName = r.getValue(AUTHOR.FIRST_NAME);
                String lastName = r.getValue(AUTHOR.LAST_NAME);

                System.out.println("ID: " + id + " first name: " + firstName + " last name: " + lastName);
            }
        }

        // For the sake of this tutorial, let's keep exception handling simple
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}
- Apache DBCP (but you may as well use some other connection pool, like BoneCP, C3P0,
HikariCP, and various others).
- Spring TX as the transaction management library.
- jOOQ as the SQL building and execution library.
Before you copy the manual examples, consider also these further resources:
<dependencies>
Note that only the jOOQ Open Source Edition is available from Maven Central. If you're using the jOOQ
Professional Edition or the jOOQ Enterprise Edition, you will have to manually install jOOQ in your local
Nexus, or in your local Maven cache. For more information, please refer to the licensing pages.
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:tx="http://www.springframework.org/schema/tx"
xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.2.xsd">
<!-- This is needed if you want to use the @Transactional annotation -->
<tx:annotation-driven transaction-manager="transactionManager"/>
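<!-- A sketch of the DataSource, transaction manager and jOOQ Configuration beans referenced in this file.
     The DBCP class, driver, URL and credentials are placeholder assumptions: -->
<bean id="dataSource" class="org.apache.commons.dbcp2.BasicDataSource" destroy-method="close">
    <property name="driverClassName" value="org.h2.Driver" />
    <property name="url" value="jdbc:h2:~/jooq-spring-test" />
    <property name="username" value="sa" />
    <property name="password" value="" />
</bean>

<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <constructor-arg ref="dataSource" />
</bean>

<bean id="connectionProvider" class="org.jooq.impl.DataSourceConnectionProvider">
    <constructor-arg>
        <!-- Wrapping the DataSource makes jOOQ participate in Spring-managed transactions -->
        <bean class="org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy">
            <constructor-arg ref="dataSource" />
        </bean>
    </constructor-arg>
</bean>

<bean id="config" class="org.jooq.impl.DefaultConfiguration">
    <property name="SQLDialect" value="H2" />
    <property name="connectionProvider" ref="connectionProvider" />
</bean>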
<!-- Configure the DSL object, optionally overriding jOOQ Exceptions with Spring Exceptions -->
<bean id="dsl" class="org.jooq.impl.DefaultDSLContext">
<constructor-arg ref="config" />
</bean>
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/jooq-spring.xml"})
public class QueryTest {
@Autowired
DSLContext create;
@Test
public void testJoin() throws Exception {
// All of these tables were generated by jOOQ's Maven plugin
Book b = BOOK.as("b");
Author a = AUTHOR.as("a");
BookStore s = BOOK_STORE.as("s");
BookToBookStore t = BOOK_TO_BOOK_STORE.as("t");
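// A sketch of the query producing the result asserted below; countDistinct() is statically
// imported from DSL, and the join columns follow the manual's sample database:
Result<Record3<String, String, Integer>> result =
create.select(a.FIRST_NAME, a.LAST_NAME, countDistinct(s.NAME))
      .from(a)
      .join(b).on(b.AUTHOR_ID.eq(a.ID))
      .join(t).on(t.BOOK_ID.eq(b.ID))
      .join(s).on(t.BOOK_STORE_NAME.eq(s.NAME))
      .groupBy(a.FIRST_NAME, a.LAST_NAME)
      .orderBy(countDistinct(s.NAME).desc())
      .fetch();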
assertEquals(2, result.size());
assertEquals("Paulo", result.getValue(0, a.FIRST_NAME));
assertEquals("George", result.getValue(1, a.FIRST_NAME));
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/jooq-spring.xml"})
@TransactionConfiguration(transactionManager="transactionManager")
public class TransactionTest {
@After
public void teardown() {
@Test
public void testExplicitTransactions() {
boolean rollback = false;
Assert.fail();
}
assertEquals(4, dsl.fetchCount(BOOK));
assertTrue(rollback);
}
}
/**
* Create a new book.
* <p>
* The implementation of this method has a bug, which causes this method to
* fail and roll back the transaction.
*/
@Transactional
void create(int id, int authorId, String title);
@Test
public void testDeclarativeTransactions() {
boolean rollback = false;
try {
assertEquals(4, dsl.fetchCount(BOOK));
assertTrue(rollback);
}
When performing database migrations, we at Data Geekery recommend using jOOQ with Flyway - Database
Migrations Made Easy. In this chapter, we're going to look into a simple way to get started with the two
frameworks.
Philosophy
There are a variety of ways how jOOQ and Flyway could interact with each other in various development
setups. In this tutorial we're going to show just one variant of such framework team play - a variant that
we find particularly compelling for most use cases.
The general philosophy behind the following approach can be summarised as this:
- 1. Database increment
- 2. Database migration
- 3. Code re-generation
- 4. Development
The four steps above can be repeated time and again, every time you need to modify something in your
database. More concretely, let's consider:
- 1. Database increment - You need a new column in your database, so you write the necessary
DDL in a Flyway script
- 2. Database migration - This Flyway script is now part of your deliverable, which you can share
with all developers who can migrate their databases with it, the next time they check out your
change
- 3. Code re-generation - Once the database is migrated, you regenerate all jOOQ artefacts (see
code generation), locally
- 4. Development - You continue developing your business logic, writing code against the updated,
generated database schema
<properties>
<db.url>jdbc:h2:~/flyway-test</db.url>
<db.username>sa</db.username>
</properties>
<!-- We'll add the latest version of jOOQ and our JDBC driver - in this case H2 -->
<dependency>
<!-- Use org.jooq for the Open Source Edition
     org.jooq.pro for commercial editions,
     org.jooq.pro-java-6 for commercial editions with Java 6 support,
     org.jooq.trial for the free trial edition -->
<groupId>org.jooq</groupId>
<artifactId>jooq</artifactId>
<version>3.11.9</version>
</dependency>
<!-- For improved logging, we'll be using log4j via slf4j to see what's going on during migration and code generation -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.5</version>
</dependency>
<plugin>
<groupId>org.flywaydb</groupId>
<artifactId>flyway-maven-plugin</artifactId>
<version>3.0</version>
<!-- Note that we're executing the Flyway plugin in the "generate-sources" phase -->
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>migrate</goal>
</goals>
</execution>
</executions>
<!-- Note that we need to prefix the db/migration path with filesystem: to prevent Flyway
from looking for our migration scripts only on the classpath -->
<configuration>
<url>${db.url}</url>
<user>${db.username}</user>
<locations>
<location>filesystem:src/main/resources/db/migration</location>
</locations>
</configuration>
</plugin>
The above Flyway Maven plugin configuration will read and execute all database migration scripts
from src/main/resources/db/migration prior to compiling Java source code. While the official Flyway
documentation suggests that migrations be done in the compile phase, the jOOQ code generator relies
on such migrations having been done prior to code generation.
After the Flyway plugin, we'll add the jOOQ Maven Plugin. For more details, please refer to the manual's
section about the code generation configuration.
<plugin>
<!-- Use org.jooq for the Open Source Edition
     org.jooq.pro for commercial editions,
     org.jooq.pro-java-6 for commercial editions with Java 6 support,
     org.jooq.trial for the free trial edition -->
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>3.11.9</version>
<!-- The jOOQ code generation plugin is also executed in the generate-sources phase, prior to compilation -->
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
<!-- This is a minimal working configuration. See the manual's section about the code generator for more details -->
<configuration>
<jdbc>
<url>${db.url}</url>
<user>${db.username}</user>
</jdbc>
<generator>
<database>
<includes>.*</includes>
<inputSchema>FLYWAY_TEST</inputSchema>
</database>
<target>
<packageName>org.jooq.example.flyway.db.h2</packageName>
<directory>target/generated-sources/jooq-h2</directory>
</target>
</generator>
</configuration>
</plugin>
This configuration will now read the FLYWAY_TEST schema and reverse-engineer it into the target/
generated-sources/jooq-h2 directory, and within that, into the org.jooq.example.flyway.db.h2 package.
1. Database increments
Now, when we start developing our database. For that, we'll create database increment scripts, which we
put into the src/main/resources/db/migration directory, as previously configured for the Flyway plugin.
We'll add these files:
- V1__initialise_database.sql
- V2__create_author_table.sql
- V3__create_book_table_and_records.sql
These three scripts model our schema versions 1-3 (note the capital V!). Here are the scripts' contents:
-- V1__initialise_database.sql
DROP SCHEMA flyway_test IF EXISTS;
CREATE SCHEMA flyway_test;
-- V2__create_author_table.sql
CREATE SEQUENCE flyway_test.s_author_id START WITH 1;
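The V2 script also creates the AUTHOR table. A sketch of that table, with the columns implied by the INSERT statements in V3 (the types and the last column's name are assumptions):

CREATE TABLE flyway_test.author (
  id INT NOT NULL,
  first_name VARCHAR(50),
  last_name VARCHAR(50) NOT NULL,
  date_of_birth DATE,
  year_of_birth INT,
  address VARCHAR(50),

  CONSTRAINT pk_author PRIMARY KEY (id)
);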
-- V3__create_book_table_and_records.sql
CREATE TABLE flyway_test.book (
  id INT NOT NULL,
  author_id INT NOT NULL,
  title VARCHAR(400) NOT NULL,

  CONSTRAINT pk_book PRIMARY KEY (id)
);
INSERT INTO flyway_test.author VALUES (next value for flyway_test.s_author_id, 'George', 'Orwell', '1903-06-25', 1903, null);
INSERT INTO flyway_test.author VALUES (next value for flyway_test.s_author_id, 'Paulo', 'Coelho', '1947-08-24', 1947, null);
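The integration test further below expects four books with IDs 1 through 4; inserts along these lines complete the V3 script (the titles are taken from the manual's sample database):

INSERT INTO flyway_test.book VALUES (1, 1, '1984');
INSERT INTO flyway_test.book VALUES (2, 1, 'Animal Farm');
INSERT INTO flyway_test.book VALUES (3, 2, 'O Alquimista');
INSERT INTO flyway_test.book VALUES (4, 2, 'Brida');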
4. Development
Note that all of the previous steps are executed automatically, every time someone adds new migration
scripts to the Maven module. For instance, a team member might have committed a new migration
script, you check it out, rebuild and get the latest jOOQ-generated sources for your own development
or integration-test database.
Now that these steps are done, you can proceed to write your database queries. Imagine the following
test case:
import static java.util.Arrays.asList;
import static org.jooq.example.flyway.db.h2.Tables.*;
import static org.junit.Assert.assertEquals;

import java.sql.*;
import org.jooq.Result;
import org.jooq.impl.DSL;
import org.junit.Test;
@Test
public void testQueryingAfterMigration() throws Exception {
try (Connection c = DriverManager.getConnection("jdbc:h2:~/flyway-test", "sa", "")) {
Result<?> result =
DSL.using(c)
.select(
AUTHOR.FIRST_NAME,
AUTHOR.LAST_NAME,
BOOK.ID,
BOOK.TITLE
)
.from(AUTHOR)
.join(BOOK)
.on(AUTHOR.ID.eq(BOOK.AUTHOR_ID))
.orderBy(BOOK.ID.asc())
.fetch();
assertEquals(4, result.size());
assertEquals(asList(1, 2, 3, 4), result.getValues(BOOK.ID));
}
}
}
Reiterate
The power of this approach becomes clear once you start performing database modifications this way.
Let's assume that the French guy on our team prefers to have things his way:
-- V4__le_french.sql
ALTER TABLE flyway_test.book ALTER COLUMN title RENAME TO le_titre;
They check it in, you check out the new database migration script, and run the build again.
When we go back to our Java integration test, we can immediately see that the TITLE column is still
being referenced, but it no longer exists:
@Test
public void testQueryingAfterMigration() throws Exception {
try (Connection c = DriverManager.getConnection("jdbc:h2:~/flyway-test", "sa", "")) {
Result<?> result =
DSL.using(c)
.select(
AUTHOR.FIRST_NAME,
AUTHOR.LAST_NAME,
BOOK.ID,
BOOK.TITLE
// ^^^^^ This column no longer exists. We'll have to rename it to LE_TITRE
)
.from(AUTHOR)
.join(BOOK)
.on(AUTHOR.ID.eq(BOOK.AUTHOR_ID))
.orderBy(BOOK.ID.asc())
.fetch();
assertEquals(4, result.size());
assertEquals(asList(1, 2, 3, 4), result.getValues(BOOK.ID));
}
}
}
Conclusion
This tutorial shows very easily how you can build a rock-solid development process using Flyway and
jOOQ to prevent SQL-related errors very early in your development lifecycle - immediately at compile
time, rather than in production!
Please, visit the Flyway website for more information about Flyway.
-- Columns of the license table:
LICENSE_DATE TIMESTAMP NOT NULL, -- The date when the license was issued
LICENSEE TEXT NOT NULL, -- The e-mail address of the licensee
LICENSE TEXT NOT NULL, -- The license key
VERSION VARCHAR(50) NOT NULL DEFAULT '.*', -- The licensed version(s), a regular expression

-- Columns of the LOG_VERIFY table:
LICENSEE TEXT NOT NULL, -- The licensee whose license is being verified
LICENSE TEXT NOT NULL, -- The license key that is being verified
REQUEST_IP VARCHAR(50) NOT NULL, -- The request IP verifying the license
VERSION VARCHAR(50) NOT NULL, -- The version that is being verified
MATCH BOOLEAN NOT NULL -- Whether the verification was successful
To make things a bit more interesting (and secure), we'll also push license key generation into the
database, by generating it from a stored function as such:
The actual algorithm might be using a secret salt to hash the function arguments. For the sake of a
tutorial, a constant string will suffice.
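A minimal sketch of such a function in PostgreSQL (the function name, signature and hashing are assumptions):

CREATE OR REPLACE FUNCTION generate_key(license_date TIMESTAMP, email TEXT)
RETURNS VARCHAR
AS $$
BEGIN
  RETURN MD5('constant-secret-salt' || license_date || email);
END;
$$ LANGUAGE plpgsql;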
<groupId>org.jooq</groupId>
<artifactId>jooq-webservices</artifactId>
<packaging>war</packaging>
<version>1.0</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.0.2</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.mortbay.jetty</groupId>
<artifactId>maven-jetty-plugin</artifactId>
<version>6.1.26</version>
<configuration>
<reload>manual</reload>
<stopKey>stop</stopKey>
<stopPort>9966</stopPort>
</configuration>
</plugin>
<plugin>
<!-- Use org.jooq for the Open Source Edition
     org.jooq.pro for commercial editions,
     org.jooq.pro-java-6 for commercial editions with Java 6 support,
     org.jooq.trial for the free trial edition -->
<dependencies>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-server</artifactId>
<version>1.0.2</version>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-json</artifactId>
<version>1.0.2</version>
</dependency>
<dependency>
<groupId>com.sun.jersey.contribs</groupId>
<artifactId>jersey-spring</artifactId>
<version>1.0.2</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<!-- Use org.jooq for the Open Source Edition
     org.jooq.pro for commercial editions,
     org.jooq.pro-java-6 for commercial editions with Java 6 support,
     org.jooq.trial for the free trial edition -->
With the above setup, we're now pretty ready to start developing our license service as a JAX-RS service.
/**
* The license server.
*/
@Path("/license/")
@Component
@Scope("request")
public class LicenseService {
/**
* <code>/license/generate</code> generates and returns a new license key.
*
* @param mail The input email address of the licensee.
*/
@GET
@Produces("text/plain")
@Path("/generate")
public String generate(
final @QueryParam("mail") String mail
) {
return run(new CtxRunnable() {
@Override
public String run(DSLContext ctx) {
Timestamp licenseDate = new Timestamp(System.currentTimeMillis());
/**
* <code>/license/verify</code> checks if a given licensee has access to version using a license.
*
* @param request The servlet request from the JAX-RS context.
* @param mail The input email address of the licensee.
* @param license The license used by the licensee.
* @param version The product version being accessed.
*/
@GET
@Produces("text/plain")
@Path("/verify")
public String verify(
final @Context HttpServletRequest request,
final @QueryParam("mail") String mail,
final @QueryParam("license") String license,
final @QueryParam("version") String version
) {
return run(new CtxRunnable() {
@Override
public String run(DSLContext ctx) {
String v = (version == null || version.equals("")) ? "" : version;
// [...]
}
The INSERT INTO LOG_VERIFY query is actually rather interesting. In plain SQL, it would look like this:
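-- A sketch (table and column names follow the schema above; the verification predicate is an assumption):
INSERT INTO log_verify (licensee, license, request_ip, version, match)
SELECT ?, ?, ?, ?, EXISTS (
  SELECT 1
  FROM license
  WHERE license.licensee = ?
  AND license.license = ?
  AND ? SIMILAR TO license.version
)
RETURNING match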
Apart from the foregoing, the LicenseService also contains a couple of simple utilities:
/**
* This method encapsulates a transaction and initialises a jOOQ DSLContext.
* This could also be achieved with Spring and DBCP for connection pooling.
*/
private String run(CtxRunnable runnable) {
try (Connection c = getConnection("jdbc:postgresql:postgres", "postgres", System.getProperty("pw", "test"))) {
DSLContext ctx = DSL.using(new DefaultConfiguration()
.set(new DefaultConnectionProvider(c))
.set(SQLDialect.POSTGRES)
.set(new Settings().withExecuteLogging(false)));
return runnable.run(ctx);
}
catch (Exception e) {
e.printStackTrace();
Response.status(Status.SERVICE_UNAVAILABLE);
return "Service Unavailable - Please contact support@datageekery.com for help";
}
}
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-2.5.xsd">
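<!-- A sketch of the component scan that lets Spring discover the @Component-annotated LicenseService
     (the base package name is an assumption): -->
<context:component-scan base-package="org.jooq.example.jaxrs" />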
</beans>
<context-param>
<param-name>contextConfigLocation</param-name>
<param-value>classpath:applicationContext.xml</param-value>
</context-param>
<listener>
<listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
<listener>
<listener-class>org.springframework.web.context.request.RequestContextListener</listener-class>
</listener>
<servlet>
<servlet-name>Jersey Spring Web Application</servlet-name>
<servlet-class>com.sun.jersey.spi.spring.container.servlet.SpringServlet</servlet-class>
</servlet>
<servlet-mapping>
<servlet-name>Jersey Spring Web Application</servlet-name>
<url-pattern>/*</url-pattern>
</servlet-mapping>
</web-app>
... and we're done! We can now run the server with the following command:
mvn jetty:run
http://localhost:8088/jooq-jax-rs-example/license/generate?mail=test@example.com
-> license-key
http://localhost:8088/jooq-jax-rs-example/license/verify?mail=test@example.com&license=license-key&version=3.2.0
-> true
http://localhost:8088/jooq-jax-rs-example/license/verify?mail=test@example.com&license=wrong&version=3.2.0
-> false
DSL.using(c)
   .fetch(sql)
   // Result.map() can then map each record with a lambda (Author here is a hypothetical POJO):
   .map(r -> new Author(r.getValue("FIRST_NAME", String.class), r.getValue("LAST_NAME", String.class)));
The above example shows how jOOQ's Result.map() method can receive a lambda expression that
implements RecordMapper to map from jOOQ Records to your custom types.
DSL.using(c)
.select(
COLUMNS.TABLE_NAME,
COLUMNS.COLUMN_NAME,
COLUMNS.TYPE_NAME
)
.from(COLUMNS)
.orderBy(
COLUMNS.TABLE_CATALOG,
COLUMNS.TABLE_SCHEMA,
COLUMNS.TABLE_NAME,
COLUMNS.ORDINAL_POSITION
)
.fetch() // jOOQ ends here
.stream() // JDK 8 Streams start here
.collect(groupingBy(
r -> r.getValue(COLUMNS.TABLE_NAME),
LinkedHashMap::new,
mapping(
r -> new Column(
r.getValue(COLUMNS.COLUMN_NAME),
r.getValue(COLUMNS.TYPE_NAME)
),
toList()
)
))
.forEach(
    (table, columns) -> {
        // Just emit a CREATE TABLE statement
        System.out.println(
            "CREATE TABLE " + table + " (");
        // Print the columns (Column is the small value class from the collector above; its field names are assumed)
        System.out.println(
            columns.stream()
                   .map(col -> "  " + col.name + " " + col.type)
                   .collect(joining(",\n")));
        System.out.println(");");
    }
);
The above example is explained more in depth in this blog post: http://blog.jooq.org/2014/04/11/java-8-friday-no-more-need-for-orms/. For more information about Java 8, consider these resources:
Once this data is set up (e.g. in an H2 or PostgreSQL database), we'll run jOOQ's code generator and
implement the following code to display our chart:
The above example uses basic SQL-92 syntax where the countries are ordered using aggregate
information from a nested SELECT, which is supported in all databases. If you're using a database that
supports window functions, e.g. PostgreSQL or any commercial database, you could have also written
a simpler query like this:
DSL.using(connection)
.select(
COUNTRIES.YEAR,
COUNTRIES.CODE,
COUNTRIES.GOVT_DEBT)
.from(COUNTRIES)
return bc;
More details about how to use jOOQ, JDBC, and SQL with Nashorn can be seen here.
All of the above heavily improve jOOQ's querying DSL API experience for Scala developers.
A short example jOOQ application in Scala might look like this:
For more details about jOOQ's Scala integration, please refer to the manual's section about SQL building
with Scala.
As the above graph gets more complex, a lot of tricky questions arise like:
- What's the optimal order of SQL DML operations for loading and storing entities?
- How can we batch the commands more efficiently?
- How can we keep the transaction footprint as low as possible without compromising on ACID?
- How can we implement optimistic locking?
- You run reports and analytics on large data sets directly in the database
- You import / export data using ETL
- You run complex business logic as SQL queries
Whenever SQL is a good fit, jOOQ is a good fit. Whenever you're operating and persisting the object
graph, JPA is a good fit.
And sometimes, it's best to combine both
3.13. Dependencies
Dependencies are a big hassle in modern software. Many libraries depend on other, non-JDK library
parts that come in different, incompatible versions, potentially causing trouble in your runtime
environment. jOOQ has no external dependencies on any third-party libraries.
However, the above rule has some exceptions:
- logging APIs are referenced as "optional dependencies". jOOQ tries to find slf4j or log4j on the
classpath. If it fails, it will use the java.util.logging.Logger
- Oracle ojdbc types used for array creation are loaded using reflection. The same applies to
Postgres PG* types.
- Small libraries with compatible licenses are incorporated into jOOQ. These include jOOR, jOOU,
parts of OpenCSV, json simple, parts of commons-lang
- javax.persistence and javax.validation will be needed if you activate the relevant code generation
flags
* mvn eclipse:eclipse
Semantic versioning
jOOQ's understanding of backwards compatibility is inspired by the rules of semantic versioning
according to http://semver.org. Those rules impose a versioning scheme [X].[Y].[Z] that can be
summarised as follows:
- If a patch release includes bugfixes, performance improvements and API-irrelevant new features,
[Z] is incremented by one.
- If a minor release includes backwards-compatible, API-relevant new features, [Y] is incremented
by one and [Z] is reset to zero.
- If a major release includes backwards-incompatible, API-relevant new features, [X] is
incremented by one and [Y], [Z] are reset to zero.
It becomes obvious that it would be impossible to add new language elements (e.g. new SQL functions,
new SELECT clauses) to the API without breaking any client code that actually implements those
interfaces. Hence, the following rules should be observed:
- jOOQ's DSL interfaces should not be implemented by client code! Extend only those extension
points that are explicitly documented as "extendable" (e.g. custom QueryParts).
- Binary compatibility can be expected from patch releases, but not from minor releases as it is
not practical to maintain binary compatibility in an internal DSL.
- Source compatibility can be expected from patch and minor releases.
- Behavioural compatibility can be expected from patch and minor releases.
- Any jOOQ SPI XYZ that is meant to be implemented ships with a DefaultXYZ or AbstractXYZ,
which can be used safely as a default implementation.
4. SQL building
SQL is a declarative language that is hard to integrate into procedural, object-oriented, functional or
any other type of programming languages. jOOQ's philosophy is to give SQL the credit it deserves and
integrate SQL itself as an "internal domain specific language" directly into Java.
With this philosophy in mind, SQL building is the main feature of jOOQ. All other features (such as SQL
execution and code generation) are mere convenience built on top of jOOQ's SQL building capabilities.
This section explains all about the various syntax elements involved with jOOQ's SQL building
capabilities. For a complete overview of all syntax elements, please refer to the manual's sections about
SQL to DSL mapping rules.
- Interface-driven design. This allows for modelling queries in a fluent API most efficiently
- Reduction of complexity for client code.
- API guarantee. You only depend on the exposed interfaces, not concrete (potentially dialect-
specific) implementations.
The org.jooq.impl.DSL class is the main class from where you will create all jOOQ objects. It serves as a
static factory for table expressions, column expressions (or "fields"), conditional expressions and many
other QueryParts.
Note that when working with Eclipse, you could also add the DSL class to your favourites. This will allow
you to access its functions even more fluently:
concat(trim(FIRST_NAME), trim(LAST_NAME));
If you do not have a reference to a pre-existing Configuration object (e.g. created from
org.jooq.impl.DefaultConfiguration), the various overloaded DSL.using() methods will create one for
you.
- org.jooq.SQLDialect : The dialect of your database. This may be any of the currently supported
database types (see SQL Dialect for more details)
- org.jooq.conf.Settings : An optional runtime configuration (see Custom Settings for more details)
- org.jooq.ExecuteListenerProvider : An optional reference to a provider class that can provide
execute listeners to jOOQ (see ExecuteListeners for more details)
- org.jooq.RecordMapperProvider : An optional reference to a provider class that can provide
record mappers to jOOQ (see POJOs with RecordMappers for more details)
- Any of these:
* java.sql.Connection : An optional JDBC Connection that will be re-used for the whole
lifecycle of your Configuration (see Connection vs. DataSource for more details). For
simplicity, this is the use-case referenced from this manual, most of the time.
* java.sql.DataSource : An optional JDBC DataSource that will be re-used for the whole
lifecycle of your Configuration. If you prefer using DataSources over Connections, jOOQ will
internally fetch new Connections from your DataSource, conveniently closing them again
after query execution. This is particularly useful in J2EE or Spring contexts (see Connection
vs. DataSource for more details)
* org.jooq.ConnectionProvider : A custom abstraction that is used by jOOQ to "acquire"
and "release" connections. jOOQ will internally "acquire" new Connections from your
ConnectionProvider, conveniently "releasing" them again after query execution. (see
Connection vs. DataSource for more details)
Wrapping a Configuration object, a DSLContext can construct statements, for later execution. An
example is given here:
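A sketch of the construction part of that example (BOOK is a generated table from the sample database):

// Create the statement without executing it:
DSLContext create = DSL.using(connection, SQLDialect.ORACLE);
Select<?> select = create.select(BOOK.TITLE).from(BOOK).where(BOOK.ID.eq(1));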
// Using the internally referenced Configuration, the select statement can now be executed:
Result<?> result = select.fetch();
Note that you do not need to keep a reference to a DSLContext. You may as well inline your local variable,
and fluently execute a SQL statement as such:
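For instance (a sketch):

Result<?> result = DSL.using(connection, SQLDialect.ORACLE)
                      .select(BOOK.TITLE)
                      .from(BOOK)
                      .fetch();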
/**
* Add an Oracle-specific <code>CONNECT BY</code> clause to the query
*/
@Support({ SQLDialect.CUBRID, SQLDialect.ORACLE })
SelectConnectByConditionStep<R> connectBy(Condition condition);
jOOQ API methods which are not annotated with the org.jooq.Support annotation, or which are
annotated with the Support annotation, but without any SQL dialects can be safely used in all SQL
dialects. An example for this is the SELECT statement factory method:
/**
* Create a new DSL select statement.
*/
@Support
SelectSelectStep<R> select(Field<?>... fields);
A IS DISTINCT FROM B
Nevertheless, the IS DISTINCT FROM predicate is supported by jOOQ in all dialects, as its semantics can
be expressed with an equivalent CASE expression. For more details, see the manual's section about
the DISTINCT predicate.
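The emulation follows this general idea (a sketch of the semantics, not jOOQ's exact rendering):

CASE WHEN A IS     NULL AND B IS     NULL THEN FALSE
     WHEN A IS     NULL AND B IS NOT NULL THEN TRUE
     WHEN A IS NOT NULL AND B IS     NULL THEN TRUE
     WHEN A = B                           THEN FALSE
     ELSE TRUE
END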
jOOQ has a historic affinity to Oracle's SQL extensions. If something is supported in Oracle SQL, it has
a high probability of making it into the jOOQ API
- SQL Server: The "version-less" SQL Server version. This always maps to the latest supported
version of SQL Server
- SQL Server 2012: The SQL Server version 2012
- SQL Server 2008: The SQL Server version 2008
In the above list, SQLSERVER is both a dialect and a family of three dialects. This distinction is used
internally by jOOQ to distinguish whether to use the OFFSET .. FETCH clause (SQL Server 2012), or
whether to emulate it using ROW_NUMBER() OVER() (SQL Server 2008).
- Custom ExecuteListeners
- Custom QueryParts
Here is an example of how to use the custom data API. Let's assume that you have written an
ExecuteListener, that prevents INSERT statements, when a given flag is set to true:
// Implement an ExecuteListener
public class NoInsertListener extends DefaultExecuteListener {
    @Override
    public void start(ExecuteContext ctx) {
        // This listener is active only, when your custom flag is set to true
        if (Boolean.TRUE.equals(ctx.configuration().data("com.example.my-namespace.no-inserts"))) {
            if (ctx.query() instanceof Insert) {
                throw new DataAccessException("No INSERT statements allowed");
            }
        }
    }
}
See the manual's section about ExecuteListeners to learn more about how to implement an
ExecuteListener.
Now, the above listener can be added to your Configuration, but you will also need to pass the flag to
the Configuration, in order for the listener to work:
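For example, a sketch that creates such a Configuration inline:

// Register the listener provider and set the custom flag on the Configuration:
DSLContext create = DSL.using(new DefaultConfiguration()
    .set(connection)
    .set(SQLDialect.ORACLE)
    .set(new DefaultExecuteListenerProvider(new NoInsertListener())));

create.configuration().data("com.example.my-namespace.no-inserts", true);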
Using the data() methods, you can store and retrieve custom data in your Configurations.
See the manual's section about ExecuteListeners to see examples of such listener implementations.
- In the DSLContext constructor (DSL.using()). This will override default settings below
- in the org.jooq.impl.DefaultConfiguration constructor. This will override default settings below
- From a location specified by a JVM parameter: -Dorg.jooq.settings
- From the classpath at /jooq-settings.xml
- From the settings defaults, as specified in http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd
Example
For example, if you want to indicate to jOOQ, that it should inline all bind variables, and execute static
java.sql.Statement instead of binding its variables to java.sql.PreparedStatement, you can do so by
creating the following DSLContext:
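For instance (a sketch):

Settings settings = new Settings()
    .withStatementType(StatementType.STATIC_STATEMENT); // Defaults to PREPARED_STATEMENT

DSLContext create = DSL.using(connection, SQLDialect.ORACLE, settings);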
More details
Please refer to the jOOQ runtime configuration XSD for more details:
http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd
While the jOOQ code is also implicitly fully qualified (see implied imports), it may not be desirable to
use fully qualified object names in SQL. The renderCatalog and renderSchema settings are used for this.
Programmatic configuration
new Settings()
.withRenderCatalog(false) // Defaults to true
.withRenderSchema(false) // Defaults to true
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderCatalog>false</renderCatalog>
<renderSchema>false</renderSchema>
</settings>
By turning off the rendering of full qualification as can be seen above, it will be possible to use code
generated from one schema on an entirely different schema of the same structure, e.g. for multitenancy
purposes.
More sophisticated multitenancy approaches are available through the render mapping feature.
- DEV: Your development schema. This will be the schema that you base code generation upon,
with jOOQ
- MY_BOOK_WORLD: The schema instance for My Book World
- BOOKS_R_US: The schema instance for Books R Us
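The equivalent programmatic configuration might look like the following sketch, using the Settings and RenderMapping builders:

Settings settings = new Settings()
    .withRenderMapping(new RenderMapping()
        .withSchemata(
            new MappedSchema().withInput("DEV")
                              .withOutput("MY_BOOK_WORLD"),
            new MappedSchema().withInput("LOG")
                              .withOutput("MY_BOOK_WORLD_LOG")));

// Use the settings when creating the DSLContext
DSLContext create = DSL.using(connection, SQLDialect.ORACLE, settings);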
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderMapping>
<schemata>
<schema>
<input>DEV</input>
<output>MY_BOOK_WORLD</output>
</schema>
<schema>
<input>LOG</input>
<output>MY_BOOK_WORLD_LOG</output>
</schema>
</schemata>
</renderMapping>
</settings>
The query executed with a Configuration equipped with the above mapping will in fact produce this
SQL statement:
This works because AUTHOR was generated from the DEV schema, which is mapped to the
MY_BOOK_WORLD schema by the above settings.
Mapping of tables
Not only schemata can be mapped, but also tables. If you are not the owner of the database
your application connects to, you might need to install your schema with some sort of prefix to
every table. In our examples, this might mean that you will have to map DEV.AUTHOR to something like
MY_BOOK_WORLD.MY_APP__AUTHOR, where MY_APP__ is a prefix applied to all of your tables. This can
be achieved by creating the following mapping:
Programmatic configuration
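A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withRenderMapping(new RenderMapping()
        .withSchemata(new MappedSchema()
            .withInput("DEV")
            .withTables(new MappedTable()
                .withInput("AUTHOR")
                .withOutput("MY_APP__AUTHOR"))));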
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderMapping>
<schemata>
<schema>
<input>DEV</input>
<tables>
<table>
<input>AUTHOR</input>
<output>MY_APP__AUTHOR</output>
</table>
</tables>
</schema>
</schemata>
</renderMapping>
</settings>
The query executed with a Configuration equipped with the above mapping will in fact produce this
SQL statement:
Table mapping and schema mapping can be applied independently, by specifying several
MappedSchema entries in the above configuration. jOOQ will process them in order of appearance and
map at first match. Note that you can always omit a MappedSchema's output value, in case of which,
only the table mapping is applied. If you omit a MappedSchema's input value, the table mapping is
applied to all schemata!
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderMapping>
<schemata>
<schema>
<inputExpression>DEV_(.*)</inputExpression>
<output>PROD_$1</output>
<tables>
<table>
<inputExpression>DEV_(.*)</inputExpression>
<output>PROD_$1</output>
</table>
</tables>
</schema>
</schemata>
</renderMapping>
</settings>
The only difference to the constant version is that the input field is replaced by the inputExpression field
of type java.util.regex.Pattern, in which case the output field is interpreted as a pattern replacement,
not as a constant replacement.
Quoting has the following effect on identifiers in most (but not all) databases:
- It allows for using reserved names as object names, e.g. a table called "FROM" is usually possible
only when quoted.
- It allows for using special characters in object names, e.g. a column called "FIRST NAME" can be
achieved only with quoting.
- It turns what are mostly case-insensitive identifiers into case-sensitive ones, e.g. "name" and
"NAME" are different identifiers, whereas name and NAME are not. Please consult your
database manual to learn what the proper default case and default case sensitivity are.
The renderNameStyle setting allows for overriding the name of all identifiers in jOOQ to a consistent
style. Possible options are:
- QUOTED (the default): This will generate all names in their proper case with quotes around
them.
- AS_IS: This will generate all names in their proper case without quotes.
- LOWER: This will transform all names to lower case.
- UPPER: This will transform all names to upper case.
Programmatic configuration
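A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withRenderNameStyle(RenderNameStyle.AS_IS);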
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderNameStyle>AS_IS</renderNameStyle>
</settings>
- AS_IS (the default): Generate keywords as they are defined in the codebase (mostly lower case).
- LOWER: Generate keywords in lower case.
- UPPER: Generate keywords in upper case.
- PASCAL: Generate keywords in pascal case.
Programmatic configuration
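The programmatic equivalent, as a sketch:

Settings settings = new Settings()
    .withRenderKeywordStyle(RenderKeywordStyle.UPPER);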
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderKeywordStyle>UPPER</renderKeywordStyle>
</settings>
An example:
Programmatic configuration
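A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withParamType(ParamType.NAMED);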
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<paramType>NAMED</paramType>
</settings>
- java.sql.PreparedStatement: This allows for sending bind variables to the server. jOOQ uses
prepared statements by default.
- java.sql.Statement: Also "static statement". These do not support bind variables and may be
useful for one-shot commands like DDL statements.
The statementType setting allows for overriding the default of using prepared statements internally.
There are two possible options for this setting:
Programmatic configuration
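The programmatic equivalent, as a sketch:

Settings settings = new Settings()
    .withStatementType(StatementType.STATIC_STATEMENT);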
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<statementType>STATIC_STATEMENT</statementType>
</settings>
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<executeLogging>false</executeLogging>
</settings>
Programmatic configuration
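A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withExecuteWithOptimisticLocking(true)
    .withExecuteWithOptimisticLockingExcludeUnversioned(false);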
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<executeWithOptimisticLocking>true</executeWithOptimisticLocking>
<executeWithOptimisticLockingExcludeUnversioned>false</executeWithOptimisticLockingExcludeUnversioned>
</settings>
For more details, please refer to the manual's section about the optimistic locking feature.
AuthorRecord author =
DSL.using(configuration) // This configuration will be attached to any record produced by the below query.
.selectFrom(AUTHOR)
.where(AUTHOR.ID.eq(1))
.fetchOne();
author.setLastName("Smith");
author.store(); // This store call operates on the "attached" configuration.
In some cases (e.g. when serialising records), it may be desirable not to attach the Configuration that
created a record to the record. This can be achieved with the attachRecords setting:
Programmatic configuration
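The programmatic equivalent, as a sketch:

Settings settings = new Settings()
    .withAttachRecords(false);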
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<attachRecords>false</attachRecords>
</settings>
AuthorRecord author =
DSL.using(configuration) // This configuration will be attached to any record produced by the below query.
.selectFrom(AUTHOR)
.where(AUTHOR.ID.eq(1))
.fetchOne();
author.setId(2);
author.store(); // The behaviour of this store call is governed by the updatablePrimaryKeys flag
The above store call depends on the value of the updatablePrimaryKeys flag:
- false (the default): Since immutability of primary keys is assumed, the store call will create a new
record (a copy) with the new primary key value.
- true: Since mutability of primary keys is allowed, the store call will change the primary key value
from 1 to 2.
Programmatic configuration
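A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withUpdatablePrimaryKeys(true);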
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<updatablePrimaryKeys>true</updatablePrimaryKeys>
</settings>
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<reflectionCaching>false</reflectionCaching>
</settings>
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<fetchWarnings>false</fetchWarnings>
</settings>
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<returnAllOnUpdatableRecord>true</returnAllOnUpdatableRecord>
</settings>
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<mapJPAAnnotations>false</mapJPAAnnotations>
</settings>
All of these flags are JDBC-only features with no direct effect on jOOQ. jOOQ only passes them through
to the underlying statement.
Programmatic configuration
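The programmatic equivalent, as a sketch:

Settings settings = new Settings()
    .withQueryTimeout(5)
    .withMaxRows(1000)
    .withFetchSize(20);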
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<queryTimeout>5</queryTimeout>
<maxRows>1000</maxRows>
<fetchSize>20</fetchSize>
</settings>
This problem may not be obvious to Java / jOOQ developers, as the differently sized SQL strings are all
produced from the same jOOQ statement:
Depending on the possible sizes of the collection, it may be worth exploring using arrays or temporary
tables as a workaround, or to reuse the original query that produced the set of IDs in the first place
(through a semi-join). But sometimes, this is not possible. In this case, users can opt in to a third
workaround: enabling the inListPadding setting. If enabled, jOOQ will "pad" the IN list to a length that
is a power of two. So, the original queries would look like this instead:
-- Original -- Padded
SELECT * FROM AUTHOR WHERE ID IN (?) SELECT * FROM AUTHOR WHERE ID IN (?)
SELECT * FROM AUTHOR WHERE ID IN (?, ?) SELECT * FROM AUTHOR WHERE ID IN (?, ?)
SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?) SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?)
SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?) SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?)
SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?, ?) SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?, ?, ?, ?, ?)
SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?, ?, ?) SELECT * FROM AUTHOR WHERE ID IN (?, ?, ?, ?, ?, ?, ?, ?)
This technique will drastically reduce the number of possible SQL strings without impairing too much
the usual cases where the IN list is small. When padding, the last bind variable will simply be repeated
many times.
Usually, there is a better way - use this as a last resort!
Programmatic configuration
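A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withInListPadding(true);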
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<inListPadding>true</inListPadding>
</settings>
SELECT 'I''m sure this is OK' AS val -- Standard SQL escaping of apostrophe by doubling it.
SELECT 'I\'m certain this causes trouble' AS val -- Vendor-specific escaping of apostrophe by using a backslash.
As most databases don't support backslash escaping (and MySQL also allows for turning it off!), jOOQ
by default doesn't support it either when inlining bind variables. However, this can lead to SQL injection
vulnerabilities and syntax errors when not dealt with carefully!
For historic reasons, this feature is turned on by default for MySQL and MariaDB.
- DEFAULT (the - surprise! - default): Turns the feature ON for MySQL and MariaDB and OFF for all
other dialects
- ON: Turn the feature on.
- OFF: Turn the feature off.
Programmatic configuration
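The programmatic equivalent, as a sketch:

Settings settings = new Settings()
    .withBackslashEscaping(BackslashEscaping.OFF);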
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<backslashEscaping>OFF</backslashEscaping>
</settings>
-- Rendered SQL
SELECT
  (SELECT my_package.format(LANGUAGE_ID) FROM dual)
FROM BOOK

// Java
DSL.using(configuration)
   .select(MyPackage.format(BOOK.LANGUAGE_ID))
   .from(BOOK)
If our table contains thousands of books, but only a dozen of LANGUAGE_ID values, then with scalar
subquery caching, we can avoid most of the function calls and cache the result per LANGUAGE_ID.
Programmatic configuration
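A sketch of the corresponding Settings object:

Settings settings = new Settings()
    .withRenderScalarSubqueriesForStoredFunctions(true);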
XML configuration
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.11.2.xsd">
<renderScalarSubqueriesForStoredFunctions>true</renderScalarSubqueriesForStoredFunctions>
</settings>
We'll see how the aliasing works later in the section about aliased tables
jOOQ as an internal domain specific language in Java (a.k.a. the DSL API)
Many other frameworks have similar APIs with similar feature sets. Yet, what makes jOOQ special is its
informal BNF notation modelling a unified SQL dialect suitable for many vendor-specific dialects, and
implementing that BNF notation as a hierarchy of interfaces in Java. This concept is extremely powerful,
when using jOOQ in modern IDEs with syntax completion. Not only can you code much faster, your
SQL code will be compile-checked to a certain extent. An example of a DSL query equivalent to the
previous one is given here:
Unlike other, simpler frameworks that use "fluent APIs" or "method chaining", jOOQ's BNF-based
interface hierarchy will not allow bad query syntax. The following will not compile, for instance:
History of SQL building and incremental query building (a.k.a. the model
API)
Historically, jOOQ started out as an object-oriented SQL builder library like any other. This meant that
all queries and their syntactic components were modeled as so-called QueryParts, which delegate SQL
rendering and variable binding to child components. This part of the API will be referred to as the
model API (or non-DSL API), which is still maintained and used internally by jOOQ for incremental query
building. An example of incremental query building is given here:
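A minimal sketch of what such incremental building might look like with the model API (table and column names follow the sample database):

// Create the query object first, then add its parts step by step
SelectQuery<Record> query = create.selectQuery();
query.addFrom(AUTHOR);
query.addConditions(AUTHOR.LAST_NAME.eq("Orwell"));
query.addOrderBy(AUTHOR.FIRST_NAME.asc());

Result<Record> result = query.fetch();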
This query is equivalent to the one shown before using the DSL syntax. In fact, internally, the DSL API
constructs precisely this SelectQuery object. Note, that you can always access the SelectQuery object
to switch between DSL and model APIs:
Mutability
Note that, for historic reasons, the DSL API mixes mutable and immutable behaviour with respect to
the internal representation of the QueryPart being constructed. While creating conditional expressions
and column expressions (such as functions) assumes immutable behaviour, creating SQL statements does
not. In other words, the following can be said:
// Statements (mutable)
// --------------------
SelectFromStep<?> s1 = select();
SelectJoinStep<?> s2 = s1.from(BOOK);
SelectJoinStep<?> s3 = s1.from(AUTHOR);
On the other hand, beware that you can always extract and modify bind values from any QueryPart.
-- Pseudo-SQL for a common table expression specification
"t1" ("f1", "f2") AS (SELECT 1, 'a')

// Code for creating a CommonTableExpression instance
name("t1").fields("f1", "f2").as(select(val(1), val("a")));
The above expression can be assigned to a variable in Java and then be used to create a full SELECT
statement:
CommonTableExpression<Record2<Integer, String>> t1 =
name("t1").fields("f1", "f2").as(select(val(1), val("a")));
CommonTableExpression<Record2<Integer, String>> t2 =
name("t2").fields("f3", "f4").as(select(val(2), val("b")));
-- SQL
WITH "t1" ("f1", "f2") AS (SELECT 1, 'a'),
     "t2" ("f3", "f4") AS (SELECT 2, 'b')
SELECT
  "t1"."f1" + "t2"."f3" AS "add",
  "t1"."f2" || "t2"."f4" AS "concat"
FROM "t1", "t2"
;

// Java
Result<?> result2 =
create.with(t1)
      .with(t2)
      .select(
          t1.field("f1").add(t2.field("f3")).as("add"),
          t1.field("f2").concat(t2.field("f4")).as("concat"))
      .from(t1, t2)
      .fetch();
Note that the org.jooq.CommonTableExpression type extends the commonly used org.jooq.Table type,
and can thus be used wherever a table can be used.
create.with("a").as(select(
WITH "a" AS (SELECT val(1).as("x"),
1 AS "x", val("a").as("y")
'a' AS "y" ))
) .select()
SELECT .from(table(name("a")))
FROM "a" .fetch();
;
-- get all authors' first and last names, and the number
-- of books they've written in German, if they have written
-- more than five books in German in the last three years
-- (from 2011), and sort those authors by last names
-- limiting results to the second and third row, locking
-- the rows for a subsequent update... whew!
SELECT AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME, COUNT(*)
FROM AUTHOR
JOIN BOOK ON AUTHOR.ID = BOOK.AUTHOR_ID
WHERE BOOK.LANGUAGE = 'DE'
AND BOOK.PUBLISHED > '2008-01-01'
GROUP BY AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME
HAVING COUNT(*) > 5
ORDER BY AUTHOR.LAST_NAME ASC NULLS FIRST
LIMIT 2
OFFSET 1
FOR UPDATE

// And with jOOQ...
DSLContext create = DSL.using(connection, dialect);

create.select(AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME, count())
      .from(AUTHOR)
      .join(BOOK).on(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
      .where(BOOK.LANGUAGE.eq("DE"))
      .and(BOOK.PUBLISHED.gt("2008-01-01"))
      .groupBy(AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
      .having(count().gt(5))
      .orderBy(AUTHOR.LAST_NAME.asc().nullsFirst())
      .limit(2)
      .offset(1)
      .forUpdate()
      .fetch();
Details about the various clauses of this query will be provided in subsequent sections.
As you can see, there is no way to further restrict/project the selected fields. This just selects all known
TableFields in the supplied Table, and it also binds <R extends Record> to your Table's associated
Record. An example of such a Query would then be:
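A hedged sketch of what such a query could look like, using the generated BookRecord type from the sample database:

// selectFrom() binds the result type to BookRecord
BookRecord book =
create.selectFrom(BOOK)
      .where(BOOK.PUBLISHED_IN.eq(1948))
      .fetchOne();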
The "reduced" SELECT API is limited in the way that it skips DSL access to any of these clauses:
In most parts of this manual, it is assumed that you do not use the "reduced" SELECT API. For more
information about the simple SELECT API, see the manual's section about fetching strongly or weakly
typed records.
-- The SELECT clause
SELECT BOOK.ID, BOOK.TITLE
SELECT BOOK.ID, TRIM(BOOK.TITLE)

// Provide a varargs Fields list to the SELECT clause:
Select<?> s1 = create.select(BOOK.ID, BOOK.TITLE);
Select<?> s2 = create.select(BOOK.ID, trim(BOOK.TITLE));
Some commonly used projections can be easily created using convenience methods:
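For example (a hedged selection of the convenience methods available on DSLContext):

Select<?> s1 = create.selectOne();    // SELECT 1
Select<?> s2 = create.selectZero();   // SELECT 0
Select<?> s3 = create.selectCount();  // SELECT count(*)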
See more details about functions and expressions in the manual's section about Column expressions
SELECT *
jOOQ supports the asterisk operator in projections both as a qualified asterisk (through Table.asterisk())
and as an unqualified asterisk (through DSL.asterisk()). It is also possible to omit the projection entirely,
in case of which an asterisk may appear in generated SQL, if not all column names are known to jOOQ.
// Explicitly selects all columns available from BOOK and AUTHOR - No asterisk
create.select().from(BOOK, AUTHOR).fetch();
create.select().from(BOOK).crossJoin(AUTHOR).fetch();
// Renders a SELECT * statement, as columns are unknown to jOOQ - Implicit unqualified asterisk
create.select().from(table(name("BOOK"))).fetch();
With all of the above syntaxes, the row type (as discussed below) is unknown to jOOQ and to the Java
compiler.
It is worth mentioning that in many cases, using an asterisk is a sign of an inefficient query because if
not all columns are needed, too much data is transferred between client and server, plus some joins
that could be eliminated otherwise, cannot.
Since the generic R type is bound to some Record[N], the associated T type information can be used in
various other contexts, e.g. the IN predicate. Such a SELECT statement can be assigned typesafely:
For more information about typesafe record types with degree up to 22, see the manual's section about
Record1 to Record22.
Read more about aliasing in the manual's section about aliased tables.
-- SQL
SELECT *
FROM TABLE(
  DBMS_XPLAN.DISPLAY_CURSOR(null, null, 'ALLSTATS')
);

// Java
create.select()
      .from(table(
          DbmsXplan.displayCursor(null, null, "ALLSTATS")
      )).fetch();
Note, in order to access the DbmsXplan package, you can use the code generator to generate Oracle's
SYS schema.
Read more about dual or dummy tables in the manual's section about the DUAL table. The following
are examples of how to form normal FROM clauses:
- [ INNER ] JOIN
- LEFT [ OUTER ] JOIN
- RIGHT [ OUTER ] JOIN
- FULL OUTER JOIN
- LEFT SEMI JOIN
- LEFT ANTI JOIN
- CROSS JOIN
- NATURAL JOIN
- NATURAL LEFT [ OUTER ] JOIN
- NATURAL RIGHT [ OUTER ] JOIN
All of these JOIN methods can be called on org.jooq.Table types, or directly after the FROM clause for
convenience. The following example joins AUTHOR and BOOK:
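A sketch of the two equivalent syntaxes:

// 1) Convenience: call join() directly after the FROM clause
create.select()
      .from(AUTHOR)
      .join(BOOK).on(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
      .fetch();

// 2) Call join() on the org.jooq.Table type
create.select()
      .from(AUTHOR.join(BOOK).on(BOOK.AUTHOR_ID.eq(AUTHOR.ID)))
      .fetch();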
The two syntaxes will produce the same SQL statement. However, calling "join" on org.jooq.Table objects
allows for more powerful, nested JOIN expressions (if you can handle the parentheses):
-- SQL
SELECT *
FROM AUTHOR
LEFT OUTER JOIN (
  BOOK JOIN BOOK_TO_BOOK_STORE
  ON BOOK_TO_BOOK_STORE.BOOK_ID = BOOK.ID
)
ON BOOK.AUTHOR_ID = AUTHOR.ID

// Nest joins and provide JOIN conditions only at the end
create.select()
      .from(AUTHOR
          .leftOuterJoin(BOOK
              .join(BOOK_TO_BOOK_STORE)
              .on(BOOK_TO_BOOK_STORE.BOOK_ID.eq(BOOK.ID)))
          .on(BOOK.AUTHOR_ID.eq(AUTHOR.ID)))
      .fetch();
- See the section about conditional expressions to learn more about the many ways to create
org.jooq.Condition objects in jOOQ.
- See the section about table expressions to learn about the various ways of referencing
org.jooq.Table objects in jOOQ
-- SQL
SELECT *
FROM AUTHOR
JOIN BOOK ON BOOK.AUTHOR_ID = AUTHOR.ID

// Java
create.select()
      .from(AUTHOR)
      .join(BOOK).onKey()
      .fetch();
In case of ambiguity, you can also supply field references for your foreign keys, or the generated foreign
key reference to the onKey() method.
Note that formal support for the Sybase JOIN ON KEY syntax is on the roadmap.
In schemas with high degrees of normalisation, you may also choose to use NATURAL JOIN, which takes
no JOIN arguments as it joins using all fields that are common to the table expressions to the left and
to the right of the JOIN operator. An example:
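A sketch of such a NATURAL JOIN between the sample tables (assuming the shared column names make this meaningful):

create.select()
      .from(AUTHOR)
      .naturalJoin(BOOK)
      .fetch();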
-- SQL
SELECT *
FROM AUTHOR
LEFT OUTER JOIN BOOK
  PARTITION BY (PUBLISHED_IN)
  ON BOOK.AUTHOR_ID = AUTHOR.ID

// Java
create.select()
      .from(AUTHOR)
      .leftOuterJoin(BOOK)
      .partitionBy(BOOK.PUBLISHED_IN)
      .on(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
      .fetch();
DSL.using(configuration)
.select()
.from(AUTHOR,
lateral(select(count().as("c"))
.from(BOOK)
.where(BOOK.AUTHOR_ID.eq(AUTHOR.ID)))
)
.fetch("c", int.class);
The above example shows standard usage of the LATERAL keyword to connect a derived table to the
previous table in the FROM clause. A similar statement can be written in T-SQL:
DSL.using(configuration)
   .select()
   .from(AUTHOR)
   .crossApply(
        select(count().as("c"))
        .from(BOOK)
        .where(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
   )
   .fetch("c", int.class);
While not all forms of LATERAL JOIN have an equivalent APPLY syntax, the inverse is true, and jOOQ can
thus emulate OUTER APPLY and CROSS APPLY using LATERAL JOIN.
LATERAL JOIN or CROSS APPLY are particularly useful together with table valued functions, which are
also supported by jOOQ.
There is quite a bit of syntactic ceremony (or we could even call it "noise") to get a relatively simple job
done. A much simpler notation would be using implicit joins:
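A hedged sketch of such an implicit join, assuming the code generator produced an author() navigation method on the BOOK table:

create.select(
          BOOK.author().FIRST_NAME,
          BOOK.author().LAST_NAME,
          BOOK.TITLE)
      .from(BOOK)
      .fetch();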
Notice how this alternative notation (depending on your taste) may look more tidy and straightforward,
as the semantics of accessing a table's parent table (or an entity's parent entity) is straightforward.
From jOOQ 3.11 onwards, this syntax is supported for to-one relationship navigation. The code
generator produces relevant navigation methods on generated tables, which can be used in a type safe
way. The navigation method names are:
- The parent table name, if there is only one foreign key between child table and parent table
- The foreign key name, if there is more than one foreign key between child table and parent
table
The generated SQL is almost identical to the original one - there is no performance penalty to this
syntax.
How it works
During the SQL generation phase, implicit join paths are replaced by generated aliases for the path's
last table. The paths are translated to a join graph, which is always LEFT JOINed to the path's "root table".
If two paths share a common prefix, that prefix is also shared in the join graph.
Future versions of jOOQ may choose to generate correlated subqueries or inner joins where this seems
more appropriate, as long as the query semantics do not change as a result.
Known limitations
- Implicit JOINs are currently only supported in SELECT statements (including any type of
subquery), but not in the WHERE clause of UPDATE statements or DELETE statements, for
instance.
- Implicit JOINs can currently only be used to access columns, not to produce joins. I.e. it is not
possible to write things like FROM book IMPLICIT JOIN book.author
- Implicit JOINs are added to the SQL string after the entire SQL statement is available, for
performance reasons. This means that VisitListener SPI implementations cannot observe
implicitly joined tables.
-- SQL
SELECT *
FROM BOOK
WHERE AUTHOR_ID = 1
AND TITLE = '1984'

// Java
create.select()
      .from(BOOK)
      .where(BOOK.AUTHOR_ID.eq(1))
      .and(BOOK.TITLE.eq("1984"))
      .fetch();
The above syntax is convenience provided by jOOQ, allowing you to connect the org.jooq.Condition
supplied in the WHERE clause with another condition using an AND operator. You can of course also
create a more complex condition and supply that to the WHERE clause directly (observe the different
placing of parentheses). The results will be the same:
-- SQL
SELECT *
FROM BOOK
WHERE AUTHOR_ID = 1
AND TITLE = '1984'

// Java
create.select()
      .from(BOOK)
      .where(BOOK.AUTHOR_ID.eq(1).and(
             BOOK.TITLE.eq("1984")))
      .fetch();
You will find more information about creating conditional expressions later in the manual.
-- SELECT ..
-- FROM ..
-- WHERE ..
CONNECT BY [ NOCYCLE ] condition [ AND condition, ... ] [ START WITH condition ]
-- GROUP BY ..
-- ORDER [ SIBLINGS ] BY ..
An example for an iterative query, iterating through values between 1 and 5 is this:
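A minimal sketch of such an iterative query (jOOQ renders the necessary dummy table where required):

create.select(level())
      .connectBy(level().le(5))
      .fetch();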
Here's a more complex example where you can recursively fetch directories in your database, and
concatenate them to a path:
-- SQL
SELECT
  SUBSTR(SYS_CONNECT_BY_PATH(DIRECTORY.NAME, '/'), 2)
FROM DIRECTORY
CONNECT BY
  PRIOR DIRECTORY.ID = DIRECTORY.PARENT_ID
START WITH DIRECTORY.PARENT_ID IS NULL
ORDER BY 1

// Java
create.select(
          sysConnectByPath(DIRECTORY.NAME, "/").substring(2))
      .from(DIRECTORY)
      .connectBy(
          prior(DIRECTORY.ID).eq(DIRECTORY.PARENT_ID))
      .startWith(DIRECTORY.PARENT_ID.isNull())
      .orderBy(1)
      .fetch();
+------------------------------------------------+
|substring |
+------------------------------------------------+
|C: |
|C:/eclipse |
|C:/eclipse/configuration |
|C:/eclipse/dropins |
|C:/eclipse/eclipse.exe |
+------------------------------------------------+
|...21 record(s) truncated...
Some of the supported functions and pseudo-columns are these (available from the DSL):
- LEVEL
- CONNECT_BY_IS_CYCLE
- CONNECT_BY_IS_LEAF
- CONNECT_BY_ROOT
- SYS_CONNECT_BY_PATH
- PRIOR
Note that this syntax is also supported in the CUBRID database and might be emulated in other dialects
supporting common table expressions in the future.
ORDER SIBLINGS
The Oracle database allows for specifying a SIBLINGS keyword in the ORDER BY clause. Instead of
ordering the overall result, this will only order siblings among each other, keeping the hierarchy intact.
An example is given here:
According to the SQL standard, you may omit the GROUP BY clause and still issue a HAVING clause. This
will implicitly GROUP BY (). jOOQ also supports this syntax. The following example selects one record,
only if there are at least 4 books in the books table:
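A sketch of such a query against the sample database:

create.selectOne()
      .from(BOOK)
      .having(count().ge(4))
      .fetch();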
- H2
- MySQL
- PostgreSQL
- SQLite
- Sybase SQL Anywhere
-- SQL
SELECT
  LAG(first_name, 1) OVER w "prev",
  first_name,
  LEAD(first_name, 1) OVER w "next"
FROM author
WINDOW w AS (ORDER BY first_name)
ORDER BY first_name DESC

// Java
WindowDefinition w = name("w").as(
    orderBy(AUTHOR.FIRST_NAME));

create.select(
          lag(AUTHOR.FIRST_NAME, 1).over(w).as("prev"),
          AUTHOR.FIRST_NAME,
          lead(AUTHOR.FIRST_NAME, 1).over(w).as("next"))
      .from(AUTHOR)
      .window(w)
      .orderBy(AUTHOR.FIRST_NAME.desc())
      .fetch();
Note that in order to create such a window definition, we need to first create a name reference using
DSL.name().
Even if only PostgreSQL and Sybase SQL Anywhere natively support this great feature, jOOQ can
emulate it by expanding any org.jooq.WindowDefinition and org.jooq.WindowSpecification types that
you pass to the window() method - if the database supports window functions at all.
Some more information about window functions and the WINDOW clause can be found on our blog:
http://blog.jooq.org/2013/11/03/probably-the-coolest-sql-feature-window-functions/
Any jOOQ column expression (or field) can be transformed into an org.jooq.SortField by calling the asc()
and desc() methods.
-- SQL
SELECT
  AUTHOR.FIRST_NAME,
  AUTHOR.LAST_NAME
FROM AUTHOR
ORDER BY LAST_NAME ASC,
         FIRST_NAME ASC NULLS LAST

// Java
create.select(
          AUTHOR.FIRST_NAME,
          AUTHOR.LAST_NAME)
      .from(AUTHOR)
      .orderBy(AUTHOR.LAST_NAME.asc(),
               AUTHOR.FIRST_NAME.asc().nullsLast())
      .fetch();
If your database doesn't support this syntax, jOOQ emulates it using a CASE expression as follows
SELECT
AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME
FROM AUTHOR
ORDER BY LAST_NAME ASC,
CASE WHEN FIRST_NAME IS NULL
THEN 1 ELSE 0 END ASC,
FIRST_NAME ASC
-- SQL
SELECT *
FROM BOOK
ORDER BY CASE TITLE
         WHEN '1984' THEN 0
         WHEN 'Animal Farm' THEN 1
         ELSE 2 END ASC

// Java
create.select()
      .from(BOOK)
      .orderBy(choose(BOOK.TITLE)
          .when("1984", 0)
          .when("Animal Farm", 1)
          .otherwise(2).asc())
      .fetch();
But writing these things can become quite verbose. jOOQ supports a convenient syntax for specifying
sort mappings. The same query can be written in jOOQ as such:
create.select()
.from(BOOK)
.orderBy(BOOK.TITLE.sortAsc("1984", "Animal Farm"))
.fetch();
create.select()
.from(BOOK)
.orderBy(BOOK.TITLE.sort(new HashMap<String, Integer>() {{
put("1984", 1);
put("Animal Farm", 13);
put("The jOOQ book", 10);
}}))
.fetch();
Of course, you can combine this feature with the previously discussed NULLS FIRST / NULLS LAST
feature. So, if in fact these two books are the ones you like least, you can put all NULLS FIRST (all the
other books):
create.select()
.from(BOOK)
.orderBy(BOOK.TITLE.sortAsc("1984", "Animal Farm").nullsFirst())
.fetch();
create.select().from(BOOK).limit(1).offset(2).fetch();
This will limit the result to 1 book, starting with the 2nd book (offsets start at 0!). limit() is supported
in all dialects, offset() in all but Sybase ASE, which has no reasonable means to emulate it. This is how
jOOQ trivially emulates the above query in various SQL dialects with native OFFSET pagination support:
-- Firebird
SELECT * FROM BOOK ROWS 2 TO 3
Things get a little more tricky in those databases that have no native idiom for OFFSET pagination (actual
queries may vary):
As you can see, jOOQ will take care of the incredibly painful ROW_NUMBER() OVER() (or ROWNUM for
Oracle) filtering in subselects for you; you just have to write limit(1).offset(2) in any dialect.
Side-note: If you're interested in understanding why we chose ROWNUM for Oracle, please refer to this
very interesting benchmark, comparing the different approaches of doing pagination in Oracle: http://
www.inf.unideb.hu/~gabora/pagination/results.html.
By default, most users will use the semantics of the ONLY keyword, meaning a LIMIT 5 expression (or
FETCH NEXT 5 ROWS ONLY expression) will result in at most 5 rows. The alternative clause WITH TIES
will return at most 5 rows, except if the 5th row and the 6th row (and so on) are "tied" according to the
ORDER BY clause, meaning that the ORDER BY clause does not deterministically produce a 5th or 6th
row. For example, let's look at our book table:
-- SQL
SELECT *
FROM book
ORDER BY actor_id
FETCH NEXT 1 ROWS WITH TIES

// Java
DSL.using(configuration)
   .selectFrom(BOOK)
   .orderBy(BOOK.ACTOR_ID)
   .limit(1).withTies()
   .fetch();
Resulting in:
id actor_id title
---------------------
1 1 1984
2 1 Animal Farm
We're now getting two rows because both rows "tied" when ordering them by ACTOR_ID. The database
cannot really pick the next 1 row, so they're both returned. If we omit the WITH TIES clause, then only
a random one of the rows would be returned.
Not all databases support WITH TIES. Oracle 12c supports the clause as specified in the SQL standard,
and SQL Server knows TOP n WITH TIES without OFFSET support.
| ID | VALUE | PAGE_BOUNDARY |
|------|-------|---------------|
| ... | ... | ... |
| 474 | 2 | 0 |
| 533 | 2 | 1 | <-- Before page 6
| 640 | 2 | 0 |
| 776 | 2 | 0 |
| 815 | 2 | 0 |
| 947 | 2 | 0 |
| 37 | 3 | 1 | <-- Last on page 6
| 287 | 3 | 0 |
| 450 | 3 | 0 |
| ... | ... | ... |
Now, if we want to display page 6 to the user, instead of going to page 6 by using a record OFFSET, we
could just fetch the record strictly after the last record on page 5, which yields the values (533, 2). This
is how you would do it with SQL or with jOOQ:
-- SQL
SELECT id, value
FROM t
WHERE (value, id) > (2, 533)
ORDER BY value, id
LIMIT 5

// Java
DSL.using(configuration)
   .select(T.ID, T.VALUE)
   .from(T)
   .orderBy(T.VALUE, T.ID)
   .seek(2, 533)
   .limit(5)
   .fetch();
As you can see, the jOOQ SEEK clause is a synthetic clause that does not really exist in SQL. However,
the jOOQ syntax is far more intuitive for a variety of reasons. The above query yields:
| ID | VALUE |
|-----|-------|
| 640 | 2 |
| 776 | 2 |
| 815 | 2 |
| 947 | 2 |
| 37 | 3 |
Note that you cannot combine the SEEK clause with the OFFSET clause.
More information about this great feature can be found in the jOOQ blog:
- http://blog.jooq.org/2013/10/26/faster-sql-paging-with-jooq-using-the-seek-method/
- http://blog.jooq.org/2013/11/18/faster-sql-pagination-with-keysets-continued/
Further information about offset pagination vs. keyset pagination performance can be found on our
partner page:
-- SQL
SELECT *
FROM BOOK
WHERE ID = 3
FOR UPDATE

// Java
create.select()
      .from(BOOK)
      .where(BOOK.ID.eq(3))
      .forUpdate()
      .fetch();
The above example will produce a record-lock, locking the whole record for updates. Some databases
also support cell-locks using FOR UPDATE OF ..
-- SQL
SELECT *
FROM BOOK
WHERE ID = 3
FOR UPDATE OF TITLE

// Java
create.select()
      .from(BOOK)
      .where(BOOK.ID.eq(3))
      .forUpdate().of(BOOK.TITLE)
      .fetch();
Oracle goes a bit further and also allows specifying the actual locking behaviour. It features these
additional clauses, which are all supported by jOOQ:
- FOR UPDATE NOWAIT: This is the default behaviour. If the lock cannot be acquired, the query
fails immediately
- FOR UPDATE WAIT n: Try to wait for [n] seconds for the lock acquisition. The query will fail only
afterwards
- FOR UPDATE SKIP LOCKED: This peculiar syntax will skip all locked records. This is particularly
useful when implementing queue tables with multiple consumers
create.select().from(BOOK).where(BOOK.ID.eq(3)).forUpdate().nowait().fetch();
create.select().from(BOOK).where(BOOK.ID.eq(3)).forUpdate().wait(5).fetch();
create.select().from(BOOK).where(BOOK.ID.eq(3)).forUpdate().skipLocked().fetch();
try (
    PreparedStatement stmt = connection.prepareStatement(
        "SELECT * FROM author WHERE id IN (3, 4, 5)",
        ResultSet.TYPE_SCROLL_SENSITIVE,
        ResultSet.CONCUR_UPDATABLE);
    ResultSet rs = stmt.executeQuery()
) {
    while (rs.next()) {
        // UPDATE the primary key for row-locks, or any other columns for cell-locks
        rs.updateObject(1, rs.getObject(1));
        rs.updateRow();
    }
}
The main drawback of this approach is the fact that the database has to maintain a scrollable cursor,
whose records are locked one by one. This can cause a major risk of deadlocks or race conditions if
the JDBC driver can recover from the unsuccessful locking, e.g. when two Java threads execute the following
statements:
-- thread 1
SELECT * FROM author ORDER BY id ASC;
-- thread 2
SELECT * FROM author ORDER BY id DESC;
So use this technique with care, possibly only ever locking single rows!
jOOQ's set operators and how they're different from standard SQL
As previously mentioned in the manual's section about the ORDER BY clause, jOOQ has slightly changed
the semantics of these set operators. While in SQL, a subselect may not contain any ORDER BY clause
or LIMIT clause (unless you wrap the subselect into a nested SELECT), jOOQ allows you to do so. In
order to select both the youngest and the oldest author from the database, you can issue the following
statement with jOOQ (rendered to the MySQL dialect):
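A hedged sketch of such a statement, selecting the youngest and the oldest author using ordered, limited UNION subselects:

create.selectFrom(AUTHOR)
      .orderBy(AUTHOR.DATE_OF_BIRTH.asc()).limit(1)
      .union(
       selectFrom(AUTHOR)
      .orderBy(AUTHOR.DATE_OF_BIRTH.desc()).limit(1))
      .orderBy(1)
      .fetch();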
In case your database doesn't support ordered UNION subselects, the subselects are nested in derived
tables:
SELECT * FROM (
SELECT * FROM AUTHOR
ORDER BY DATE_OF_BIRTH ASC LIMIT 1
)
UNION
SELECT * FROM (
SELECT * FROM AUTHOR
ORDER BY DATE_OF_BIRTH DESC LIMIT 1
)
ORDER BY 1
This can be done in jOOQ using the .hint() clause in your SELECT statement:
create.select(AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
.hint("/*+ALL_ROWS*/")
.from(AUTHOR)
.fetch();
Note that you can pass any string in the .hint() clause. If you use that clause, the passed string will always
be put in between the SELECT [DISTINCT] keywords and the actual projection list. This can be useful in
other databases too, such as MySQL, for instance:
- The FROM clause: First, all data sources are defined and joined
- The WHERE clause: Then, data is filtered as early as possible
- The CONNECT BY clause: Then, data is traversed iteratively or recursively, to produce new tuples
- The GROUP BY clause: Then, data is reduced to groups, possibly producing new tuples if
grouping functions like ROLLUP(), CUBE(), GROUPING SETS() are used
- The HAVING clause: Then, data is filtered again
- The SELECT clause: Only now, the projection is evaluated. In case of a SELECT DISTINCT
statement, data is further reduced to remove duplicates
- The UNION clause: Optionally, the above is repeated for several UNION-connected subqueries.
Unless this is a UNION ALL clause, data is further reduced to remove duplicates
- The ORDER BY clause: Now, all remaining tuples are ordered
- The LIMIT clause: Then, a paginating view is created for the ordered tuples
- The FOR UPDATE clause: Finally, pessimistic locking is applied
The SQL Server documentation also explains this, with slightly different clauses:
- FROM
- ON
- JOIN
- WHERE
- GROUP BY
- WITH CUBE or WITH ROLLUP
- HAVING
- SELECT
- DISTINCT
- ORDER BY
- TOP
As can be seen, databases have to logically reorder a SQL statement in order to determine the best
execution plan.
// WHERE clause
Where p.UnitsInStock <= p.ReorderLevel AndAlso Not p.Discontinued
// SELECT clause
Select p
A SLICK example:
While this looks like a good idea at first, it only complicates translation to more advanced SQL statements
while impairing readability for those users that are used to writing SQL. jOOQ is designed to look just
like SQL. This is specifically true for SLICK, which not only changed the SELECT clause order, but also
heavily "integrated" SQL clauses with the Scala language.
For these reasons, the jOOQ DSL API is modelled in SQL's lexical order.
Note that for explicit degrees up to 22, the VALUES() constructor provides additional typesafety. The
following example illustrates this:
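A sketch of that additional typesafety, where the row degree and column types are captured in the resulting Table type:

// The resulting table is typed as Table<Record2<Integer, String>>
Table<Record2<Integer, String>> t = values(
    row(1, "a"),
    row(2, "b")
).as("t", "a", "b");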
jOOQ tries to stay close to actual SQL. In detail, however, Java's expressiveness is limited. That's why the
values() clause is repeated for every record in multi-record inserts.
Some RDBMS do not support inserting several records in a single statement. In those cases, jOOQ
emulates multi-record INSERTs using the following SQL:
This can make a lot of sense in situations where you want to "reserve" a row in the database for
a subsequent UPDATE statement within the same transaction. Or if you just want to send an event
containing trigger-generated default values, such as IDs or timestamps.
The DEFAULT VALUES clause is not supported in all databases, but jOOQ can emulate it using the
equivalent statement:
The DEFAULT keyword (or DSL#defaultValue() method) can also be used for individual columns only,
although that will have the same effect as leaving the column away entirely.
create.insertInto(AUTHOR)
.set(AUTHOR.ID, 100)
.set(AUTHOR.FIRST_NAME, "Hermann")
.set(AUTHOR.LAST_NAME, "Hesse")
.newRecord()
.set(AUTHOR.ID, 101)
.set(AUTHOR.FIRST_NAME, "Alfred")
.set(AUTHOR.LAST_NAME, "Döblin")
.execute();
As you can see, this syntax is a bit more verbose, but also more readable, as every field can be matched
with its value. Internally, the two syntaxes are strictly equivalent.
create.insertInto(AUTHOR_ARCHIVE)
.select(selectFrom(AUTHOR).where(AUTHOR.DECEASED.isTrue()))
.execute();
If the underlying database doesn't have any way to "ignore" failing INSERT statements (as MySQL does
via INSERT IGNORE), jOOQ can emulate the statement using a MERGE statement, or using INSERT .. SELECT
WHERE NOT EXISTS:
System.out.println(record.getValue(AUTHOR.ID));
// For some RDBMS, this also works when inserting several values
// The following should return a 2x2 table
Result<?> result =
create.insertInto(AUTHOR, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
.values("Johann Wolfgang", "von Goethe")
.values("Friedrich", "Schiller")
// You can request any field. Also trigger-generated values
.returning(AUTHOR.ID, AUTHOR.CREATION_DATE)
.fetch();
Some databases have poor support for returning generated keys after INSERTs. In those cases, jOOQ
might need to issue another SELECT statement in order to fetch an @@identity value. Be aware, that
this can lead to race-conditions in those databases that cannot properly return generated ID values.
For more information, please consider the jOOQ Javadoc for the returning() clause.
Most databases allow for using scalar subselects in UPDATE statements in one way or another. jOOQ
models this through a set(Field<T>, Select<? extends Record1<T>>) method in the UPDATE DSL API:
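A hedged sketch of such an UPDATE, assigning a value produced by a scalar subselect (the concrete columns are chosen only for illustration):

create.update(AUTHOR)
      .set(AUTHOR.FIRST_NAME,
           select(BOOK.TITLE)
           .from(BOOK)
           .where(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
           .limit(1))
      .where(AUTHOR.ID.eq(3))
      .execute();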
UPDATE .. FROM
Some databases, including PostgreSQL and SQL Server, support joining additional tables to an UPDATE
statement using a vendor-specific FROM clause. This is supported as well by jOOQ:
In many cases, such a joined update statement can be emulated using a correlated subquery, or using
updatable views.
UPDATE .. RETURNING
The Firebird and Postgres databases support a RETURNING clause on their UPDATE statements, similar
to the RETURNING clause in INSERT statements. This is useful to fetch trigger-generated values in one
go. An example is given here:
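A hedged sketch of UPDATE .. RETURNING with jOOQ, fetching the updated record's values in one go:

AuthorRecord author =
create.update(AUTHOR)
      .set(AUTHOR.LAST_NAME, "Smith")
      .where(AUTHOR.ID.eq(3))
      .returning(AUTHOR.ID, AUTHOR.LAST_NAME)
      .fetchOne();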
The UPDATE .. RETURNING clause is emulated for DB2 using the SQL standard SELECT .. FROM FINAL
TABLE(UPDATE ..) construct, and in Oracle, using the PL/SQL UPDATE .. RETURNING statement.
This syntax can be fully emulated by jOOQ for all other databases that support the SQL standard MERGE
statement. For more information about the H2 MERGE syntax, see the documentation here:
http://www.h2database.com/html/grammar.html#merge
Depending on whether your database supports catalogs and schemas, the above SET statements may
be supported in your database.
In MariaDB, MySQL, SQL Server, the SET CATALOG statement is emulated using:
USE catalogname;
Indexes
// Renaming the index only if it exists (not all databases support this)
create.alterIndexIfExists("old_index").renameTo("new_index").execute();
Schemas
// Renaming the schema only if it exists (not all databases support this)
create.alterSchemaIfExists("old_schema").renameTo("new_schema").execute();
Sequences
// Renaming the sequence only if it exists (not all databases support this)
create.alterSequenceIfExists("old_sequence").renameTo("new_sequence").execute();
Tables
These statements alter the table itself:
// Renaming the table only if it exists (not all databases support this)
create.alterTableIfExists("old_table").renameTo("new_table").execute();
// Adding columns
create.alterTable(AUTHOR).add(AUTHOR.TITLE, VARCHAR.length(5)).execute();
create.alterTable(AUTHOR).add(AUTHOR.TITLE, VARCHAR.length(5).nullable(false)).execute();
// Altering columns
create.alterTable(AUTHOR).alter(TITLE).defaultValue("no title").execute();
create.alterTable(AUTHOR).alter(TITLE).set(VARCHAR.length(5)).execute();
create.alterTable(AUTHOR).alter(TITLE).set(VARCHAR.length(5).nullable(false)).execute();
create.alterTable(AUTHOR).renameColumn("old_column").to("new_column").execute();
// Dropping columns
create.alterTable(AUTHOR).drop(TITLE).execute();
// Adding constraints
create.alterTable(BOOK).add(constraint("PK_BOOK").primaryKey(BOOK.ID)).execute();
create.alterTable(BOOK).add(constraint("UK_TITLE").unique(BOOK.TITLE)).execute();
create.alterTable(BOOK).add(
constraint("FK_AUTHOR_ID")
.foreignKey(BOOK.AUTHOR_ID)
.references(AUTHOR, AUTHOR.ID)).execute();
create.alterTable(BOOK).add(
constraint("CHECK_PUBLISHED_IN")
.check(BOOK.PUBLISHED_IN.between(1900).and(2000))).execute();
// Altering constraints
create.alterTable(BOOK).renameConstraint("old_constraint").to("new_constraint").execute();
// Dropping constraints
create.alterTable(AUTHOR).dropConstraint("UK_TITLE").execute();
Views
// Renaming the view only if it exists (not all databases support this)
create.alterViewIfExists("old_view").renameTo("new_view").execute();
Indexes
// Create an index only if it doesn't exist (not all databases support this)
create.createIndexIfNotExists("I_AUTHOR_LAST_NAME").on(AUTHOR, AUTHOR.LAST_NAME).execute();
Schemas
// Create a schema
create.createSchema("new_schema").execute();
// Create a schema only if it doesn't exist (not all databases support this)
create.createSchemaIfNotExists("new_schema").execute();
Sequences
// Create a sequence
create.createSequence(S_AUTHOR_ID).execute();
// Create a sequence only if it doesn't exist (not all databases support this)
create.createSequenceIfNotExists(S_AUTHOR_ID).execute();
Tables
// Create a table only if it doesn't exist (not all databases support this)
create.createTableIfNotExists("TOP_AUTHORS")
...
Views
// Create a view
create.createView("V_TOP_AUTHORS").as(
select(
AUTHOR.ID,
AUTHOR.FIRST_NAME,
AUTHOR.LAST_NAME)
.from(AUTHOR)
.where(val(50).lt(
selectCount().from(BOOK)
.where(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
))).execute();
// Create a view only if it doesn't exist (not all databases support this)
create.createViewIfNotExists("V_TOP_AUTHORS")
...
Indexes
// Drop an index
create.dropIndex("I_AUTHOR_LAST_NAME").execute();
Schemas
// Drop a schema
create.dropSchema("schema").execute();
Sequences
// Drop a sequence
create.dropSequence(S_AUTHOR_ID).execute();
Tables
// Drop a table
create.dropTable(AUTHOR).execute();
Views
// Drop a view
create.dropView(V_AUTHOR).execute();
// Drop a view only if it exists (not all databases support this)
create.dropViewIfExists(V_AUTHOR).execute();
create.truncate(AUTHOR).execute();
TRUNCATE is not supported by Ingres and SQLite. jOOQ will execute a DELETE FROM AUTHOR
statement instead.
// SCHEMA is the generated schema that contains a reference to all generated tables
Queries ddl =
DSL.using(configuration)
.ddl(SCHEMA);
When executing the above, you should see something like the following:
Do note that these features only restore parts of the original schema. For instance, vendor-specific
storage clauses that are not available to jOOQ's generated meta data cannot be reproduced this way.
The catalog
A catalog is a collection of schemas. In many databases, the catalog corresponds to the database, or
the database instance. Most often, catalogs are completely independent and their tables cannot be
joined or combined in any way in a single query. The exception here is SQL Server, which allows for fully
referencing tables from multiple catalogs:
SELECT *
FROM [Catalog1].[Schema1].[Table1] AS [t1]
JOIN [Catalog2].[Schema2].[Table2] AS [t2] ON [t1].[ID] = [t2].[ID]
By default, the Settings.renderCatalog flag is turned on. In case a database supports querying multiple
catalogs, jOOQ will generate fully qualified object names, including catalog name. For more information
about this setting, see the manual's section about settings
jOOQ's code generator generates subpackages for each catalog.
The schema
A schema is a collection of objects, such as tables. Most databases support some sort of schema
(except for some embedded databases like Access, Firebird, SQLite). In most databases, the schema is
an independent structural entity. In Oracle, the schema and the user / owner is mostly treated as the
same thing. An example of a query that uses fully qualified tables including schema names is:
SELECT *
FROM "Schema1"."Table1" AS "t1"
JOIN "Schema2"."Table2" AS "t2" ON "t1"."ID" = "t2"."ID"
By default, the Settings.renderSchema flag is turned on. jOOQ will thus generate fully qualified object
names, including the schema name. For more information about this setting, see the manual's section
about settings.
-- SQL
SELECT *
FROM AUTHOR                       -- Table expression AUTHOR
JOIN BOOK                         -- Table expression BOOK
ON (AUTHOR.ID = BOOK.AUTHOR_ID)

// Java
create.select()
      .from(AUTHOR)                             // Table expression AUTHOR
      .join(BOOK)                               // Table expression BOOK
      .on(AUTHOR.ID.eq(BOOK.AUTHOR_ID))
      .fetch();
The above example shows how AUTHOR and BOOK tables are joined in a SELECT statement. It also
shows how you can access table columns by dereferencing the relevant Java attributes of their tables.
See the manual's section about generated tables for more information about what is really generated
by the code generator
-- Select all books by authors born after 1920,
-- named "Paulo" from a catalogue:

// Declare your aliases before using them in SQL:
Author a = AUTHOR.as("a");
Book b = BOOK.as("b");
As you can see in the above example, calling as() on generated tables returns an object of the same
type as the table. This means that the resulting object can be used to dereference fields from the
aliased table. This is quite powerful in terms of having your Java compiler check the syntax of your SQL
statements. If you remove a column from a table, dereferencing that column from that table alias will
cause compilation errors.
This feature is useful in various use-cases where column names are not known in advance (but the
table's degree is!). Examples of this are unnested tables, or the VALUES() table constructor:
-- Unnested tables
SELECT t.a, t.b
FROM unnest(my_table_function()) t(a, b)
-- VALUES() constructor
SELECT t.a, t.b
FROM VALUES(1, 2),(3, 4) t(a, b)
Only few databases really support such a syntax, but fortunately, jOOQ can emulate it easily using
UNION ALL and an empty dummy record specifying the new column names. The two statements are
equivalent:
In jOOQ, you would simply specify a varargs list of column aliases as such:
// Unnested tables
create.select().from(unnest(myTableFunction()).as("t", "a", "b")).fetch();
// VALUES() constructor
create.select().from(values(
row(1, 2),
row(3, 4)
).as("t", "a", "b"))
.fetch();
Most databases do not support unnamed derived tables, they require an explicit alias. If you do not
provide jOOQ with such an explicit alias, an alias will be generated based on the derived table's content,
to make sure the generated SQL will be syntactically correct. The generated alias is not specified and
should not be referenced explicitly.
A(colA1, ..., colAn) "join" B(colB1, ..., colBm) "produces" C(colA1, ..., colAn, colB1, ..., colBm)
SQL and relational algebra distinguish between at least the following JOIN types (upper-case: SQL, lower-
case: relational algebra):
- CROSS JOIN or cartesian product: The basic JOIN in SQL, producing a relational cross product,
combining every record of table A with every record of table B. Note that cartesian products can
also be produced by listing comma-separated table expressions in the FROM clause of a SELECT
statement
- NATURAL JOIN: The basic JOIN in relational algebra, yet a rarely used JOIN in databases with
everyday degree of normalisation. This JOIN type unconditionally equi-joins two tables by all
columns with the same name (requiring foreign keys and primary keys to share the same name).
Note that the JOIN columns will only figure once in the resulting table expression.
- INNER JOIN or equi-join: This JOIN operation performs a cartesian product (CROSS JOIN)
with a filtering predicate being applied to the resulting table expression. Most often, an equality
comparison predicate comparing foreign keys and primary keys will be applied as a filter, but any
other predicate will work, too.
- OUTER JOIN: This JOIN operation performs a cartesian product (CROSS JOIN) with a filtering
predicate being applied to the resulting table expression. Most often, an equality comparison
predicate comparing foreign keys and primary keys will be applied as a filter, but any other
predicate will work, too. Unlike the INNER JOIN, an OUTER JOIN will add "empty records" to the
left (table A), the right (table B), or both tables, in case the conditional expression fails to produce
a match.
- semi-join: In SQL, this JOIN operation can only be expressed implicitly using IN predicates or
EXISTS predicates. The table expression resulting from a semi-join will only contain the left-hand
side table A
- anti-join: In SQL, this JOIN operation can only be expressed implicitly using NOT IN predicates or
NOT EXISTS predicates. The table expression resulting from an anti-join will only contain the left-
hand side table A
- division: This JOIN operation is hard to express at all, in SQL. See the manual's chapter about
relational division for details on how jOOQ emulates this operation.
jOOQ supports all of these JOIN types (including semi-join and anti-join) directly on any table expression:
// INNER JOIN
TableOnStep join(TableLike<?>)
TableOnStep innerJoin(TableLike<?>)
// OUTER JOIN
TablePartitionByStep leftJoin(TableLike<?>)
TablePartitionByStep leftOuterJoin(TableLike<?>)
TablePartitionByStep rightJoin(TableLike<?>)
TablePartitionByStep rightOuterJoin(TableLike<?>)
// SEMI JOIN
TableOnStep<R> leftSemiJoin(TableLike<?>);
// ANTI JOIN
TableOnStep<R> leftAntiJoin(TableLike<?>);
// CROSS JOIN
Table<Record> crossJoin(TableLike<?>)
// NATURAL JOIN
Table<Record> naturalJoin(TableLike<?>)
Table<Record> naturalLeftOuterJoin(TableLike<?>)
Table<Record> naturalRightOuterJoin(TableLike<?>)
Most of the above JOIN types are overloaded also to accommodate plain SQL use-cases for
convenience:
// Overloaded versions taking SQL template strings with bind variables, or other forms of
// "plain SQL" QueryParts:
TableOnStep join(String)
TableOnStep join(String, Object...)
TableOnStep join(String, QueryPart...)
TableOnStep join(SQL)
TableOnStep join(Name)
Note that most of jOOQ's JOIN operations give way to a similar DSL API hierarchy as previously seen in
the manual's section about the JOIN clause
-- SQL
SELECT a, b
FROM VALUES(1, 'a'),
           (2, 'b') t(a, b)

// Java
create.select()
      .from(values(row(1, "a"),
                   row(2, "b")).as("t", "a", "b"))
      .fetch();
Note, that it is usually quite useful to provide column aliases ("derived column lists") along with the table
alias for the VALUES() constructor.
The above statement is emulated by jOOQ for those databases that do not support the VALUES()
constructor natively (actual emulations may vary):
-- An empty dummy record is added to provide column names for the emulated derived column expression
SELECT NULL a, NULL b FROM DUAL WHERE 1 = 0 UNION ALL
-- SQL
SELECT *
FROM BOOK
WHERE BOOK.AUTHOR_ID = (
  SELECT ID
  FROM AUTHOR
  WHERE LAST_NAME = 'Orwell')

// Java
create.select()
      .from(BOOK)
      .where(BOOK.AUTHOR_ID.eq(create
          .select(AUTHOR.ID)
          .from(AUTHOR)
          .where(AUTHOR.LAST_NAME.eq("Orwell"))))
      .fetch();
-- SQL
SELECT nested.* FROM (
  SELECT AUTHOR_ID, count(*) books
  FROM BOOK
  GROUP BY AUTHOR_ID
) nested
ORDER BY nested.books DESC

// Java
Table<Record> nested =
    create.select(BOOK.AUTHOR_ID, count().as("books"))
          .from(BOOK)
          .groupBy(BOOK.AUTHOR_ID).asTable("nested");

create.select(nested.fields())
      .from(nested)
      .orderBy(nested.field("books"))
      .fetch();
-- SELECT ..
FROM table PIVOT (aggregateFunction [, aggregateFunction] FOR column IN (expression [, expression]))
-- WHERE ..
The PIVOT clause is available from the org.jooq.Table type, as pivoting is done directly on a table.
Currently, only Oracle's PIVOT clause is supported. Support for SQL Server's slightly different PIVOT
clause will be added later. Also, jOOQ may emulate PIVOT for other dialects in the future.
With jOOQ, you can simplify using relational divisions by using the following syntax:
C.divideBy(B).on(C.ID.eq(B.C_ID)).returning(C.TEXT)
Or in plain text: Find those TEXT values in C whose IDs correspond to all IDs in B. Note that from the
above SQL statement, it is immediately clear that proper indexing is of the essence. Be sure to have
indexes on all columns referenced from the on(...) and returning(...) clauses.
For more information about relational division and some nice, real-life examples, see
- http://en.wikipedia.org/wiki/Relational_algebra#Division
- http://www.simple-talk.com/sql/t-sql-programming/divided-we-stand-the-sql-of-relational-division/
-- SQL
SELECT *
FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR(null, null, 'ALLSTATS'));

// Java
create.select()
      .from(table(DbmsXplan.displayCursor(null, null, "ALLSTATS")))
      .fetch();
Note, in order to access the DbmsXplan package, you can use the code generator to generate Oracle's
SYS schema.
The jOOQ code generator will now produce a generated table from the above, which can be used as
a SQL function:
// Lateral joining the table-valued function to another table using CROSS APPLY:
create.select(BOOK.ID, F_BOOKS.TITLE)
.from(BOOK.crossApply(fBooks(BOOK.ID)))
.fetch();
- The ones that always require a FROM clause (as required by the SQL standard)
- The ones that never require a FROM clause (and still allow a WHERE clause)
- The ones that require a FROM clause only with a WHERE clause, GROUP BY clause, or HAVING
clause
With jOOQ, you don't have to worry about the above distinction of SQL dialects. jOOQ never requires
a FROM clause, but renders the necessary "DUAL" table, if needed. The following program shows how
jOOQ renders "DUAL" tables
Note, that some databases (H2, MySQL) can normally do without "DUAL". However, there exist some
corner-cases with complex nested SELECT statements, where this will cause syntax errors (or parser
bugs). To stay on the safe side, jOOQ will always render "dual" in those dialects.
// The same function created from a pre-existing Field using "postfix" notation
Field<String> field3 = BOOK.TITLE.trim();
In general, it is up to you whether you want to use the "prefix" notation or the "postfix" notation to
create new column expressions based on existing ones. The "SQL way" would be to use the "prefix
notation", with functions created from the DSL. The "Java way" or "object-oriented way" would be to use
the "postfix" notation with functions created from org.jooq.Field objects. Both ways ultimately create
the same query part, though.
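For example, both of the following produce the same TRIM(BOOK.TITLE) column expression (a sketch):
// "Prefix" notation, using a static method from the DSL:
Field<String> prefix = DSL.trim(BOOK.TITLE);

// "Postfix" notation, using a method on org.jooq.Field:
Field<String> postfix = BOOK.TITLE.trim();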
Table columns implement a more specific interface called org.jooq.TableField, which is parameterised
with its associated <R extends Record> record type.
See the manual's section about generated tables for more information about what is really generated
by the code generator.
When you alias Fields like above, you can access those Fields' values using the alias name:
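A minimal sketch, assuming the alias "first_name" was given to a column:
Record record = create.select(AUTHOR.FIRST_NAME.as("first_name")).from(AUTHOR).fetchAny();
String firstName = record.get("first_name", String.class);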
These unnamed expressions can be used both in SQL as well as with jOOQ. However, do note that
jOOQ will use Field.getName() to extract this column name from the field, when referencing the field or
when nesting it in derived tables. In order to stay in full control of any such column names, it is always
a good idea to provide explicit aliasing for column expressions, both in SQL as well as in jOOQ.
create.select(TAuthor.LAST_NAME.cast(PostgresDataType.TEXT)).fetch();
The same thing can be achieved by casting a Field directly to String.class, as TEXT is the default data
type in Postgres mapping to Java's String:
create.select(TAuthor.LAST_NAME.cast(String.class)).fetch();
In the above example, field1 will be treated by jOOQ as a Field<String>, binding the numeric literal 1 as
a VARCHAR value. The same applies to field2, whose string literal "1" will be bound as an INTEGER value.
This technique is better than performing unsafe or rawtype casting in Java, if you cannot access the
"right" field type from any given expression.
4.7.5. Collations
Many databases support "collations", which define the sort order on character data types, such as
VARCHAR.
Such databases usually allow for specifying:
The actual implementation is vendor-specific, including the way the above defaults override each other.
To accommodate most use-cases, jOOQ 3.11 introduced the org.jooq.Collation type, which can be
attached to an org.jooq.DataType through DataType.collate(Collation), or to an org.jooq.Field through
Field.collate(Collation), for example:
-- SQL
SELECT *
FROM book
ORDER BY title COLLATE utf8_bin

// Java
create.selectFrom(BOOK)
      .orderBy(BOOK.TITLE.collate("utf8_bin"))
      .fetch();
+ - * / %
create.select(val(1).add(2).mul(val(5).sub(3)).div(2).mod(10)).fetch();
Operator precedence
jOOQ does not know any operator precedence (see also boolean operator precedence). All operations
are evaluated from left to right, as with any object-oriented API. The two following expressions are the
same:
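A sketch illustrating this (the method chain is evaluated left to right, regardless of SQL operator precedence):
// Both of these build ((1 + 2) * 3), whereas "1 + 2 * 3" in SQL would be 1 + (2 * 3):
Field<Integer> f1 = val(1).add(2).mul(3);
Field<Integer> f2 = val(1).add(val(2)).mul(val(3));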
For more advanced datetime arithmetic, use the DSL's timestampDiff() and dateDiff() functions, as well
as jOOQ's built-in SQL standard INTERVAL data type support:
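A hedged sketch (the timestamp literal is illustrative only):
// Difference between two timestamps as a SQL standard INTERVAL DAY TO SECOND value:
Field<DayToSecond> age = DSL.timestampDiff(
    DSL.currentTimestamp(),
    DSL.val(java.sql.Timestamp.valueOf("2010-01-01 00:00:00")));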
create.selectFrom(BOOK).where(TITLE.likeRegex("^.*SQL.*$")).fetch();
Note that the SQL standard specifies that patterns should follow the XQuery standards. In the real
world, the POSIX regular expression standard is the most used one, some use Java regular expressions,
and only a few use Perl regular expressions. jOOQ does not make any assumptions about
regular expression syntax. For cross-database compatibility, please read the relevant database manuals
carefully, to learn about the appropriate syntax. Please refer to the DSL Javadoc for more details.
Intervals in jOOQ
jOOQ fills a gap opened by JDBC, which neglects an important SQL data type as defined by the SQL
standards: INTERVAL types. See the manual's section about INTERVAL data types for more details.
// Statistical functions
AggregateFunction<BigDecimal> median (Field<? extends Number> field);
AggregateFunction<BigDecimal> stddevPop (Field<? extends Number> field);
AggregateFunction<BigDecimal> stddevSamp(Field<? extends Number> field);
AggregateFunction<BigDecimal> varPop (Field<? extends Number> field);
AggregateFunction<BigDecimal> varSamp (Field<? extends Number> field);
Here's an example, counting the number of books any author has written:
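A sketch of that query:
// Number of books per author:
create.select(BOOK.AUTHOR_ID, count())
      .from(BOOK)
      .groupBy(BOOK.AUTHOR_ID)
      .fetch();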
Aggregate functions have strong limitations about when they may be used and when not. For instance,
you can use aggregate functions in scalar queries. Typically, this means you only select aggregate
functions, no regular columns or other column expressions. Another use case is to use them along with
a GROUP BY clause as seen in the previous example. Note, that jOOQ does not check whether your
use of aggregate functions is correct according to the SQL standards, or according to your database's
behaviour.
-- SQL
SELECT
  count(*),
  count(*) FILTER (WHERE TITLE LIKE 'A%')
FROM BOOK

// Java
create.select(
       count(),
       count().filterWhere(BOOK.TITLE.like("A%")))
      .from(BOOK)
It is usually a good idea to calculate multiple aggregate functions in a single query, if this is possible.
Only few databases (e.g. HSQLDB, PostgreSQL) implement native support for the FILTER clause. In all
other databases, jOOQ emulates the clause using a CASE expression:
SELECT
count(*),
count(CASE WHEN TITLE LIKE 'A%' THEN 1 END)
FROM BOOK
Aggregate functions exclude NULL values from aggregation, so the above query is equivalent to the
one using FILTER.
+---------------------+
| LISTAGG |
+---------------------+
| 1984, Animal Farm |
| O Alquimista, Brida |
+---------------------+
-- SQL
SUM(BOOK.AMOUNT_SOLD) KEEP(DENSE_RANK FIRST ORDER BY BOOK.AUTHOR_ID)

// Java
sum(BOOK.AMOUNT_SOLD).keepDenseRankFirstOrderBy(BOOK.AUTHOR_ID)
// Ranking functions
WindowOverStep<Integer> rowNumber();
WindowOverStep<Integer> rank();
WindowOverStep<Integer> denseRank();
WindowOverStep<BigDecimal> percentRank();
// Windowing functions
<T> WindowIgnoreNullsStep<T> firstValue(Field<T> field);
<T> WindowIgnoreNullsStep<T> lastValue(Field<T> field);
<T> WindowIgnoreNullsStep<T> nthValue(Field<T> field, int nth);
<T> WindowIgnoreNullsStep<T> nthValue(Field<T> field, Field<Integer> nth);
<T> WindowIgnoreNullsStep<T> lead(Field<T> field);
<T> WindowIgnoreNullsStep<T> lead(Field<T> field, int offset);
<T> WindowIgnoreNullsStep<T> lead(Field<T> field, int offset, T defaultValue);
<T> WindowIgnoreNullsStep<T> lead(Field<T> field, int offset, Field<T> defaultValue);
<T> WindowIgnoreNullsStep<T> lag(Field<T> field);
<T> WindowIgnoreNullsStep<T> lag(Field<T> field, int offset);
<T> WindowIgnoreNullsStep<T> lag(Field<T> field, int offset, T defaultValue);
<T> WindowIgnoreNullsStep<T> lag(Field<T> field, int offset, Field<T> defaultValue);
// Statistical functions
WindowOverStep<BigDecimal> cumeDist();
WindowOverStep<Integer> ntile(int number);
SQL distinguishes between various window function types (e.g. "ranking functions"). Depending on the
function, SQL expects mandatory PARTITION BY or ORDER BY clauses within the OVER() clause. jOOQ
does not enforce those rules for two reasons:
If possible, however, jOOQ tries to render missing clauses for you, if a given SQL dialect is more
restrictive.
Some examples
Here are some simple examples of window functions with jOOQ:
-- SQL
SUM(BOOK.AMOUNT_SOLD) KEEP(DENSE_RANK FIRST ORDER BY BOOK.AUTHOR_ID) OVER(PARTITION BY 1)

// Java
sum(BOOK.AMOUNT_SOLD).keepDenseRankFirstOrderBy(BOOK.AUTHOR_ID).over().partitionByOne();
-- ROLLUP() with one argument
SELECT AUTHOR_ID, COUNT(*)
FROM BOOK
GROUP BY ROLLUP(AUTHOR_ID)

-- The same query using UNION ALL:
SELECT AUTHOR_ID, COUNT(*) FROM BOOK GROUP BY (AUTHOR_ID)
UNION ALL
SELECT NULL, COUNT(*) FROM BOOK GROUP BY ()
ORDER BY 1 NULLS LAST
-- ROLLUP() with two arguments
SELECT AUTHOR_ID, PUBLISHED_IN, COUNT(*)
FROM BOOK
GROUP BY ROLLUP(AUTHOR_ID, PUBLISHED_IN)

-- The same query using UNION ALL:
SELECT AUTHOR_ID, PUBLISHED_IN, COUNT(*)
FROM BOOK GROUP BY (AUTHOR_ID, PUBLISHED_IN)
UNION ALL
SELECT AUTHOR_ID, NULL, COUNT(*)
FROM BOOK GROUP BY (AUTHOR_ID)
UNION ALL
SELECT NULL, NULL, COUNT(*)
FROM BOOK GROUP BY ()
ORDER BY 1 NULLS LAST, 2 NULLS LAST
In English, the ROLLUP() grouping function provides N+1 groupings, where N is the number of arguments
to the ROLLUP() function. Each grouping has an additional group field from the ROLLUP() argument
field list. The results of the second query might look something like this:
+-----------+--------------+----------+
| AUTHOR_ID | PUBLISHED_IN | COUNT(*) |
+-----------+--------------+----------+
| 1         | 1945         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 1         | 1948         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 1         | NULL         | 2        | <- GROUP BY (AUTHOR_ID)
| 2         | 1988         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 2         | 1990         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 2         | NULL         | 2        | <- GROUP BY (AUTHOR_ID)
| NULL      | NULL         | 4        | <- GROUP BY ()
+-----------+--------------+----------+
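In jOOQ, such a grouping can be written with DSL.rollup(), for example (a sketch of the two-argument query above):
create.select(BOOK.AUTHOR_ID, BOOK.PUBLISHED_IN, count())
      .from(BOOK)
      .groupBy(rollup(BOOK.AUTHOR_ID, BOOK.PUBLISHED_IN))
      .fetch();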
-- CUBE() with two arguments
SELECT AUTHOR_ID, PUBLISHED_IN, COUNT(*)
FROM BOOK
GROUP BY CUBE(AUTHOR_ID, PUBLISHED_IN)

-- The same query using UNION ALL:
SELECT AUTHOR_ID, PUBLISHED_IN, COUNT(*)
FROM BOOK GROUP BY (AUTHOR_ID, PUBLISHED_IN)
UNION ALL
SELECT AUTHOR_ID, NULL, COUNT(*)
FROM BOOK GROUP BY (AUTHOR_ID)
UNION ALL
SELECT NULL, PUBLISHED_IN, COUNT(*)
FROM BOOK GROUP BY (PUBLISHED_IN)
UNION ALL
SELECT NULL, NULL, COUNT(*)
FROM BOOK GROUP BY ()
ORDER BY 1 NULLS FIRST, 2 NULLS FIRST
+-----------+--------------+----------+
| AUTHOR_ID | PUBLISHED_IN | COUNT(*) |
+-----------+--------------+----------+
| NULL      | NULL         | 4        | <- GROUP BY ()
| NULL      | 1945         | 1        | <- GROUP BY (PUBLISHED_IN)
| NULL      | 1948         | 1        | <- GROUP BY (PUBLISHED_IN)
| NULL      | 1988         | 1        | <- GROUP BY (PUBLISHED_IN)
| NULL      | 1990         | 1        | <- GROUP BY (PUBLISHED_IN)
| 1         | NULL         | 2        | <- GROUP BY (AUTHOR_ID)
| 1         | 1945         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 1         | 1948         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 2         | NULL         | 2        | <- GROUP BY (AUTHOR_ID)
| 2         | 1988         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
| 2         | 1990         | 1        | <- GROUP BY (AUTHOR_ID, PUBLISHED_IN)
+-----------+--------------+----------+
GROUPING SETS()
GROUPING SETS() are the generalised way to create multiple groupings; ROLLUP() and CUBE() from our previous examples are just convenience shorthands for specific sets of groupings.
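In jOOQ, explicit grouping sets can be expressed with DSL.groupingSets(), e.g. a sketch reproducing the two-argument ROLLUP() from above:
create.select(BOOK.AUTHOR_ID, BOOK.PUBLISHED_IN, count())
      .from(BOOK)
      .groupBy(groupingSets(
          new Field<?>[] { BOOK.AUTHOR_ID, BOOK.PUBLISHED_IN },
          new Field<?>[] { BOOK.AUTHOR_ID },
          new Field<?>[] {}))
      .fetch();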
This is nicely explained in the SQL Server manual pages about GROUPING SETS() and other grouping
functions:
http://msdn.microsoft.com/en-us/library/bb510427(v=sql.105)
The above function will be made available from a generated Routines class. You can use it like any other
column expression:
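A hedged sketch, assuming a generated scalar function F_AUTHOR_EXISTS (the generated method name depends on your own schema):
create.select(AUTHOR.LAST_NAME, Routines.fAuthorExists(AUTHOR.LAST_NAME))
      .from(AUTHOR)
      .fetch();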
Note that user-defined functions returning CURSOR or ARRAY data types can also be used wherever
table expressions can be used, if they are unnested.
MEMBER FUNCTION ODCIAggregateTerminate(self IN U_SECOND_MAX, returnValue OUT NUMBER, flags IN NUMBER) RETURN NUMBER IS
BEGIN
RETURNVALUE := SELF.SECMAX;
RETURN ODCIConst.Success;
END;
In jOOQ, both syntaxes are supported (the second one is emulated in Derby, which only knows the first
one). Unfortunately, both case and else are reserved words in Java. jOOQ chose decode(), named after
the Oracle DECODE() function, as well as choose() and otherwise(), the latter meaning the same as ELSE.
A CASE expression can be used anywhere where you can place a column expression (or Field). For
instance, you can SELECT the above expression, if you're selecting from AUTHOR:
-- Oracle:
DECODE(FIRST_NAME, 'Paulo', 'brazilian',
                   'George', 'english',
                             'unknown');

-- Other SQL dialects:
CASE AUTHOR.FIRST_NAME WHEN 'Paulo'  THEN 'brazilian'
                       WHEN 'George' THEN 'english'
                       ELSE 'unknown'
END

// Use the Oracle-style DECODE() function with jOOQ.
// Note, that you will not be able to rely on type-safety
DSL.decode(AUTHOR.FIRST_NAME,
    "Paulo", "brazilian",
    "George", "english",
    "unknown");
You can then use your generated sequence object directly in a SQL statement as such:
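A sketch, assuming a generated sequence S_AUTHOR_ID exposed via the Sequences class:
// Use the sequence's NEXTVAL as a column expression:
create.select(Sequences.S_AUTHOR_ID.nextval()).fetch();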
- For more information about generated sequences, refer to the manual's section about
generated sequences
- For more information about executing standalone calls to sequences, refer to the manual's
section about sequence execution
- comparison predicates
- NULL predicates
- BETWEEN predicates
- IN predicates
- OVERLAPS predicate (for degree 2 row value expressions only)
See the relevant sections for more details about how to use row value expressions in predicates.
- 1 or TRUE
- 0 or FALSE
- NULL or UNKNOWN
It is important to know that SQL differs from many other languages in the way it interprets the NULL
boolean value. Most importantly, the following facts are to be remembered:
For simplified NULL handling, please refer to the section about the DISTINCT predicate.
Note that jOOQ does not model these truth values as actual column-expression-compatible query parts.
- plain SQL conditions, that allow you to phrase your own SQL string conditional expression
- The EXISTS predicate, a standalone predicate that creates a conditional expression
- Constant TRUE and FALSE conditional expressions
The above example shows that the number of parentheses in Java can quickly explode. Proper
indentation may become crucial in making such code readable. In order to understand how jOOQ
composes combined conditional expressions, let's assign component expressions first:
// Assuming three pre-existing Condition objects a, b, c:
Condition combined1 = a.or(b);             // These OR-connected conditions form a new condition, wrapped in parentheses
Condition combined2 = combined1.andNot(c); // The left-hand side of the AND NOT () operator is already wrapped in parentheses
Unfortunately, Java does not support operator overloading, hence these operators are also
implemented as methods in jOOQ, like any other SQL syntax elements. The relevant parts of the
org.jooq.Field interface are these:
Note that every operator is represented by two methods. A verbose one (such as equal()) and a two-
character one (such as eq()). Both methods are the same. You may choose either one, depending on
your taste. The manual will always use the more verbose one.
In SQL, the two expressions wouldn't be the same, as SQL natively knows operator precedence.
jOOQ supports all of the above row value expression comparison predicates, both with column
expression lists and scalar subselects at the right-hand side:
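A sketch of both variants:
// Comparison with a column expression list / row value expression:
create.select().from(BOOK)
      .where(row(BOOK.ID, BOOK.TITLE).eq(1, "1984"))
      .fetch();

// Comparison with a scalar subselect at the right-hand side:
create.select().from(BOOK)
      .where(row(BOOK.ID, BOOK.TITLE).eq(
           select(val(1), val("1984"))))
      .fetch();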
For the example, the right-hand side of the quantified comparison predicates were filled with argument
lists. But it is easy to imagine that the source of values results from a subselect.
[ROW VALUE EXPRESSION] IN [IN PREDICATE VALUE]
-- ... is equivalent to:
[ROW VALUE EXPRESSION] = ANY [IN PREDICATE VALUE]
Typically, the IN predicate is more readable than the quantified comparison predicate.
The SQL standard contains a nice truth table for the above rules:
+-----------------------+-----------+---------------+---------------+-------------------+
| Expression | R IS NULL | R IS NOT NULL | NOT R IS NULL | NOT R IS NOT NULL |
+-----------------------+-----------+---------------+---------------+-------------------+
| degree 1: null | true | false | false | true |
| degree 1: not null | false | true | true | false |
| degree > 1: all null | true | false | false | true |
| degree > 1: some null | false | false | true | true |
| degree > 1: none null | false | true | true | false |
+-----------------------+-----------+---------------+---------------+-------------------+
In jOOQ, you would simply use the isNull() and isNotNull() methods on row value expressions. Again,
as with the row value expression comparison predicate, the row value expression NULL predicate is
emulated by jOOQ, if your database does not natively support it:
row(BOOK.ID, BOOK.TITLE).isNull();
row(BOOK.ID, BOOK.TITLE).isNotNull();
For instance, you can compare two fields for distinctness, ignoring the fact that any of the two could be
NULL, which would lead to funny results. This is supported by jOOQ as such:
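A sketch:
// TRUE if the two columns are distinct, treating two NULLs as "not distinct":
create.select()
      .from(AUTHOR)
      .where(AUTHOR.FIRST_NAME.isDistinctFrom(AUTHOR.LAST_NAME))
      .fetch();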
If your database does not natively support the DISTINCT predicate, jOOQ emulates it with an equivalent
CASE expression, modelling the above truth table:
[A] BETWEEN [B] AND [C]
-- ... is equivalent to:
[A] >= [B] AND [A] <= [C]
Note the inclusiveness of range boundaries in the definition of the BETWEEN predicate. Intuitively, this
is supported in jOOQ as such:
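A sketch:
// Both of these express PUBLISHED_IN BETWEEN 1920 AND 1950:
BOOK.PUBLISHED_IN.between(1920).and(1950);
BOOK.PUBLISHED_IN.between(1920, 1950);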
BETWEEN SYMMETRIC
The SQL standard defines the SYMMETRIC keyword to be used along with BETWEEN to indicate that you
do not care which bound of the range is larger than the other. A database system should simply swap
range bounds, in case the first bound is greater than the second one. jOOQ supports this keyword as
well, emulating it if necessary.
[A] BETWEEN SYMMETRIC [B] AND [C]   -- is equivalent to: ([A] BETWEEN [B] AND [C]) OR ([A] BETWEEN [C] AND [B])
[A] BETWEEN [B] AND [C]             -- is equivalent to: [A] >= [B] AND [A] <= [C]
[A] BETWEEN SYMMETRIC [B] AND [C]   -- is equivalent to: ([A] >= [B] AND [A] <= [C]) OR ([A] >= [C] AND [A] <= [B])
The above can be factored out according to the rules listed in the manual's section about row value
expression comparison predicates.
jOOQ supports the BETWEEN [SYMMETRIC] predicate and emulates it in all SQL dialects where
necessary. An example is given here:
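A sketch:
// The bounds may be supplied in either order:
BOOK.PUBLISHED_IN.betweenSymmetric(1950, 1920);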
- _: (single-character wildcard)
- %: (multi-character wildcard)
With jOOQ, the LIKE predicate can be created from any column expression as such:
-- SQL
TITLE LIKE '%The !%-Sign Book%' ESCAPE '!'
TITLE NOT LIKE '%The !%-Sign Book%' ESCAPE '!'

// Java
BOOK.TITLE.like("%The !%-Sign Book%", '!')
BOOK.TITLE.notLike("%The !%-Sign Book%", '!')
In the above predicate expressions, the exclamation mark character is passed as the escape character
to escape wildcard characters "!_" and "!%", as well as to escape the escape character itself: "!!"
Please refer to your database manual for more details about escaping patterns with the LIKE predicate.
Note, that jOOQ escapes % and _ characters in the value argument in some of the above predicate
implementations. For simplicity, this has been omitted in this manual.
4.8.13. IN predicate
In SQL, apart from comparing a value against several values, the IN predicate can be used to create
semi-joins or anti-joins. jOOQ knows the following methods on the org.jooq.Field interface, to construct
such IN predicates:
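The most relevant ones are sketched here (signatures abbreviated):
// IN predicates on the org.jooq.Field interface:
Condition in(Collection<?> values);
Condition in(T... values);
Condition in(Select<? extends Record1<T>> select);
Condition notIn(Collection<?> values);
Condition notIn(T... values);
Condition notIn(Select<? extends Record1<T>> select);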
A good way to prevent this from happening is to use the EXISTS predicate for anti-joins, which is NULL-
value insensitive. See the manual's section about conditional expressions to see a boolean truth table.
jOOQ supports the IN predicate with row value expressions. An example is given here:
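A sketch:
// Row value expressions in an IN predicate:
create.select()
      .from(BOOK)
      .where(row(BOOK.ID, BOOK.AUTHOR_ID).in(row(1, 1), row(2, 1)))
      .fetch();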
In both cases, i.e. when using an IN list or when using a subselect, the type of the predicate is checked.
Both sides of the predicate must be of equal degree and row type.
Emulation of the IN predicate where row value expressions aren't well supported is currently only
available for IN predicates that do not take a subselect as an IN predicate value.
- From the DSL, using static methods. This is probably the most used case
- From a conditional expression using convenience methods attached to boolean operators
- From a SELECT statement using convenience methods attached to the where clause, and from
other clauses
Note that in SQL, the projection of a subselect in an EXISTS predicate is irrelevant. To help you write
queries like the above, you can use jOOQ's selectZero() or selectOne() DSL methods:
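A sketch of such a semi-join:
create.select()
      .from(AUTHOR)
      .whereExists(
           selectOne()
          .from(BOOK)
          .where(BOOK.AUTHOR_ID.eq(AUTHOR.ID)))
      .fetch();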
http://blog.jooq.org/2012/07/27/not-in-vs-not-exists-vs-left-join-is-null-mysql/
-- INTERVAL data types are also supported. This is equivalent to the above
(DATE '2010-01-01', CAST('+2 00:00:00' AS INTERVAL DAY TO SECOND)) OVERLAPS
(DATE '2010-01-02', CAST('+2 00:00:00' AS INTERVAL DAY TO SECOND))
-- This predicate
(A, B) OVERLAPS (C, D)
- If a record attribute is set to a value, then that value is used for an equality predicate
- If a record attribute is not set, then that attribute is not used for any predicates
The latter API call makes use of the convenience API DSLContext.fetchByExample(TableRecord).
create.select(
          AUTHOR.FIRST_NAME.concat(AUTHOR.LAST_NAME),
          count())
      .from(AUTHOR)
      .join(BOOK).on(AUTHOR.ID.eq(BOOK.AUTHOR_ID))
      .groupBy(AUTHOR.ID, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
      .orderBy(count().desc())
      .fetch();
It is, however, interesting to think of all of the above expressions as what they are: expressions. And
as such, nothing keeps users from extracting expressions and referencing them from outside the
statement. The following statement is exactly equivalent:
SelectField<?>[] select = {
AUTHOR.FIRST_NAME.concat(AUTHOR.LAST_NAME),
count()
};
Table<?> from = AUTHOR.join(BOOK).on(AUTHOR.ID.eq(BOOK.AUTHOR_ID));
GroupField[] groupBy = { AUTHOR.ID, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME };
SortField<?>[] orderBy = { count().desc() };
create.select(select)
      .from(from)
      .groupBy(groupBy)
      .orderBy(orderBy)
      .fetch();
Each individual expression, and each collection of expressions, can be seen as an independent entity
that can be
- Constructed dynamically
- Reused across queries
Dynamic construction is particularly useful in the case of the WHERE clause, for dynamic predicate
building. For instance:
// The helper method's signature and initial condition are reconstructed from the usage below:
Condition condition(HttpServletRequest request) {
    Condition result = DSL.trueCondition();

    if (request.getParameter("title") != null)
        result = result.and(BOOK.TITLE.like("%" + request.getParameter("title") + "%"));

    if (request.getParameter("author") != null)
        result = result.and(BOOK.AUTHOR_ID.in(
            selectOne().from(AUTHOR).where(
                  AUTHOR.FIRST_NAME.like("%" + request.getParameter("author") + "%")
              .or(AUTHOR.LAST_NAME .like("%" + request.getParameter("author") + "%"))
            )
        ));

    return result;
}
// And then:
create.select()
.from(BOOK)
.where(condition(httpRequest))
.fetch();
The dynamic SQL building power may be one of the biggest advantages of using a runtime query model
like the one offered by jOOQ. Queries can be created dynamically, of arbitrary complexity. In the above
example, we've just constructed a dynamic WHERE clause. The same can be done for any other clauses,
including dynamic FROM clauses (dynamic JOINs), or adding additional WITH clauses as needed.
- aliasing
- nested selects
- arithmetic expressions
- casting
You'll probably find other examples. If verbosity scares you off, don't worry. The verbose use-cases for
jOOQ are rather rare, and when they come up, you do have an option. Just write SQL the way you're
used to!
jOOQ allows you to embed SQL as a String into any supported statement in these contexts:
Both the bind value and the query part argument overloads make use of jOOQ's plain SQL templating
language.
Please refer to the org.jooq.impl.DSL Javadoc for more details. The following is a more complete listing
of plain SQL construction methods from the DSL:
// A condition
Condition condition(String sql);
Condition condition(String sql, Object... bindings);
Condition condition(String sql, QueryPart... parts);
// A function
<T> Field<T> function(String name, Class<T> type, Field<?>... arguments);
<T> Field<T> function(String name, DataType<T> type, Field<?>... arguments);
// A table
Table<?> table(String sql);
Table<?> table(String sql, Object... bindings);
Table<?> table(String sql, QueryPart... parts);
Apart from the general factory methods, plain SQL is also available in various other contexts. For
instance, when adding a .where("a = b") clause to a query. Hence, there exist several convenience
methods where plain SQL can be inserted usefully. This is an example displaying all various use-cases
in one single query:
// Use plain SQL for conditions both in JOIN and WHERE clauses
.on("a.id = b.author_id")
- jOOQ doesn't know what you're doing. You're on your own again!
- You have to provide something that will be syntactically correct. If it's not, then jOOQ won't know.
Only your JDBC driver or your RDBMS will detect the syntax error.
- You have to provide consistency when you use variable binding. The number of ? must match
the number of variables
- Your SQL is inserted into jOOQ queries without further checks. Hence, jOOQ can't prevent SQL
injection.
The SQL string may reference the arguments by 0-based indexing. Each argument may be referenced
several times. For instance, SQLite's emulation of the REPEAT(string, count) function may look like this:
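A hedged sketch of such a template, referencing argument {0} (the string) once and {1} (the count) twice:
Field<String>  str = val("abc");
Field<Integer> cnt = val(3);

// Each template argument may be referenced several times by its 0-based index:
Field<String> repeated = DSL.field(
    "replace(substr(quote(zeroblob(({1} + 1) / 2)), 3, {1}), '0', {0})",
    String.class, str, cnt);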
For convenience, there is also a DSL.list(QueryPart...) API that allows for wrapping a comma-separated
list of query parts in a single template argument:
Field<String> a = val("a");
Field<String> b = val("b");
Field<String> c = val("c");
Parsing rules
When processing these plain SQL templates, a mini parser is run that handles things like
- String literals
- Quoted names
- Comments
- JDBC escape sequences
The above are recognised by the templating engine and contents inside of them are ignored when
replacing numbered placeholders and/or bind variables. For instance:
query(
"SELECT /* In a comment, this is not a placeholder: {0}. And this is not a bind variable: ? */ title AS `title {1} ?` " +
"-- Another comment without placeholders: {2} nor bind variables: ?" +
"FROM book " +
"WHERE title = 'In a string literal, this is not a placeholder: {3}. And this is not a bind variable: ?'"
);
The above query does not contain any numbered placeholders nor bind variables, because the tokens
that would otherwise be searched for are contained inside of comments, string literals, or quoted
names.
Goal
Historically, jOOQ implements an internal domain-specific language in Java, which generates SQL (an
external domain-specific language) for use with JDBC. The jOOQ API is built from two parts: The DSL
and the model API where the DSL API adds lexical convenience for programmers on top of the model
API, which is really just a SQL expression tree, similar to what a SQL parser does inside of any database.
With this parser, the whole set of jOOQ functionality will now also be made available to anyone who
is not using jOOQ directly, including JDBC and/or JPA users, e.g. through the parsing connection, which
proxies all JDBC Connection calls to the jOOQ parser before forwarding them to the database, or
through the DSLContext.parser() API, which allows for more low-level access to the parser directly,
e.g. for tool building on top of jOOQ.
The possibilities are endless, including standardised, SQL string based database migrations that work
on any SQLDialect that is supported by jOOQ.
Example
This parser API allows for parsing an arbitrary SQL string fragment into a variety of jOOQ API elements:
The parser is able to parse any unspecified dialect to produce a jOOQ representation of the SQL
expression, for instance:
ResultQuery<?> query =
DSL.using(configuration)
.parser()
.parseResultQuery("SELECT * FROM (VALUES (1, 'a'), (2, 'b')) t(a, b)");
The above SQL query is valid standard SQL and runs out of the box on PostgreSQL and SQL Server,
among others. The jOOQ ResultQuery that is generated from this SQL string, however, will also work
on any other database, as jOOQ can emulate the two interesting SQL features being used here:
- The VALUES() table constructor
- The derived column list (aliasing the table and its columns in one go: t(a, b))
The query might be rendered as follows on the H2 database, which supports VALUES(), but not derived
column lists:
select
t.a,
t.b
from (
(
select
null a,
null b
where 1 = 0
)
union all (
select *
from (values
(1, 'a'),
(2, 'b')
) t
)
) t;
On a dialect that supports neither VALUES() nor derived column lists, and that also requires a FROM clause (such as Oracle), it might be rendered like this:
select
t.a,
t.b
from (
(
select
null a,
null b
from dual
where 1 = 0
)
union all (
select *
from (
(
select
1,
'a'
from dual
)
union all (
select
2,
'b'
from dual
)
) t
)
) t;
Nevertheless, there is a grammar available for documentation purposes and it is included in the manual
here:
(The layout of the grammar and the grammar itself are still work in progress)
The diagrams have been created with the neat RRDiagram library by Christopher Deckers.
For the above reasons, and also to prevent an additional SQL injection risk where names might contain
SQL code, jOOQ by default quotes all names in generated SQL to be sure they match what is really
contained in your database. This means that names will be rendered as follows:
-- Unquoted name
AUTHOR.TITLE
-- MariaDB, MySQL
`AUTHOR`.`TITLE`
Note that you can influence jOOQ's name rendering behaviour through custom settings, if you prefer
another name style to be applied.
// Unqualified name
Name name = name("TITLE");
// Qualified name
Name name = name("AUTHOR", "TITLE");
Such names can be used as standalone QueryParts, or as DSL entry points for SQL expressions, for example:
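A sketch:
// A Name used as a DSL entry point for a column and a table expression:
Field<Object> field = DSL.field(name("AUTHOR", "TITLE"));
Table<?>      table = DSL.table(name("AUTHOR"));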
More details about how to use names / identifiers to construct such expressions can be found in the
relevant sections of the manual.
- Protection against SQL injection. Instead of inlining values possibly originating from user input,
you bind those values to your prepared statement and let the JDBC driver / database take care
of handling security aspects.
- Increased speed. Advanced databases such as Oracle can keep execution plans of similar
queries in a dedicated cache to prevent hard-parsing your query again and again. In many cases,
the actual value of a bind variable does not influence the execution plan, hence it can be reused.
Preparing a statement will thus be faster
- On a JDBC level, you can also reuse the SQL string and prepared statement object instead of
constructing it again, as you can bind new values to the prepared statement. jOOQ currently
does not cache prepared statements, internally.
The following sections explain how you can introduce bind values in jOOQ, and how you can control
the way they are rendered and bound to SQL.
try (PreparedStatement stmt = connection.prepareStatement("SELECT * FROM BOOK WHERE ID = ? AND TITLE = ?")) {
With dynamic SQL, keeping track of the number of question marks and their corresponding index may
turn out to be hard. jOOQ abstracts this and lets you provide the bind value right where it is needed.
A trivial example is this:
create.select().from(BOOK).where(BOOK.ID.eq(5)).and(BOOK.TITLE.eq("Animal Farm")).fetch();
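The same query can also be written with explicit DSL.val() calls (a sketch):
create.select().from(BOOK).where(BOOK.ID.eq(val(5))).and(BOOK.TITLE.eq(val("Animal Farm"))).fetch();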
Note the use of DSL.val() to explicitly create an indexed bind value. You don't have to worry about that
index. When the query is rendered, each bind value will render a question mark. When the query binds
its variables, each bind value will generate the appropriate bind value index.
You can also extract specific bind values by index from a query, if you wish to modify their underlying
value after creating a query. This can be achieved as such:
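A sketch; indexed parameters are addressed by their 1-based index, passed as a name:
Query query = create.select().from(BOOK).where(BOOK.ID.eq(5));

// Access the first bind value of the query:
Param<?> param = query.getParam("1");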
For more details about jOOQ's internals, see the manual's section about QueryParts.
// Create a query with a named parameter. You can then use that name for accessing the parameter again
Query query1 = create.select().from(AUTHOR).where(LAST_NAME.eq(param("lastName", "Poe")));
Param<?> param1 = query1.getParam("lastName");
// Or, keep a reference to the typed parameter in order not to lose the <T> type information:
Param<String> param2 = param("lastName", "Poe");
Query query2 = create.select().from(AUTHOR).where(LAST_NAME.eq(param2));
// You can now change the bind value directly on the Param reference:
param2.setValue("Orwell");
The org.jooq.Query interface also allows for setting new bind values directly, without accessing the
Param type:
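A sketch:
Query query = create.select().from(AUTHOR).where(LAST_NAME.eq("Poe"));

// Set a new bind value on the Query, by its 1-based index:
query.bind(1, "Orwell");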
In order to actually render named parameter names in generated SQL, use the
DSLContext.renderNamedParams() method:
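A sketch:
// Renders :lastName instead of ? for the named parameter:
String sql = create.renderNamedParams(
    create.select().from(AUTHOR).where(LAST_NAME.eq(param("lastName", "Poe"))));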
In all cases, your inlined bind values will be properly escaped to avoid SQL syntax errors and SQL
injection. Some examples:
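A sketch:
// DSL.inline() creates an inlined value; the quote character is escaped by doubling:
create.select().from(BOOK).where(BOOK.TITLE.eq(inline("The Hitchhiker's Guide"))).fetch();
// ... renders: ... WHERE BOOK.TITLE = 'The Hitchhiker''s Guide'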
All methods in the jOOQ API that allow for plain (unescaped, untreated) SQL contain a warning message
in their relevant Javadoc, to remind you of the risk of SQL injection in what is otherwise a SQL-injection-
safe API.
4.15. QueryParts
An org.jooq.Query and all of its contained objects are org.jooq.QueryParts. QueryParts essentially
provide two kinds of functionality:
- They can render SQL
- They can bind variables
Both of these methods are contained in jOOQ's internal API's org.jooq.QueryPartInternal, which is
internally implemented by every QueryPart.
The following sections explain some more details about SQL rendering and variable binding, as well as
other implementation details about QueryParts in general.
// These methods are useful for generating unique aliases within a RenderContext (and thus within a Query)
String peekAlias();
String nextAlias();
// These methods allow for fluent appending of SQL to the RenderContext's internal StringBuilder
RenderContext keyword(String keyword);
RenderContext literal(String literal);
RenderContext sql(String sql);
RenderContext sql(char sql);
RenderContext sql(int sql);
RenderContext sql(QueryPart part);
// These methods allow for controlling formatting of SQL, if the relevant Setting is active
RenderContext formatNewLine();
RenderContext formatSeparator();
RenderContext formatIndentStart();
RenderContext formatIndentStart(int indent);
RenderContext formatIndentLockStart();
RenderContext formatIndentEnd();
RenderContext formatIndentEnd(int indent);
RenderContext formatIndentLockEnd();
The following additional methods are inherited from a common org.jooq.Context, which is shared
among org.jooq.RenderContext and org.jooq.BindContext:
// These methods indicate whether fields or tables are being declared (MY_TABLE AS MY_ALIAS) or referenced (MY_ALIAS)
boolean declareFields();
Context declareFields(boolean declareFields);
boolean declareTables();
Context declareTables(boolean declareTables);
// These methods provide the bind value indices within the scope of the whole Context (and thus of the whole Query)
int nextIndex();
int peekIndex();
-- [...]
FROM AUTHOR
JOIN BOOK ON AUTHOR.ID = BOOK.AUTHOR_ID
-- [...]
@Override
public final void accept(Context<?> context) {
// The CompareCondition delegates rendering of the Fields to the Fields
// themselves and connects them using the Condition's comparator operator:
context.visit(field1)
.sql(" ")
.keyword(comparator.toSQL())
.sql(" ")
.visit(field2);
}
See the manual's sections about custom QueryParts and plain SQL QueryParts to learn about how to
write your own query parts in order to extend jOOQ.
The section about ExecuteListeners shows an example of how such pretty printing can be used to log
readable SQL to stdout.
- It provides some information about the "state" of the variable binding in process.
- It provides a common API for binding values to the context's internal java.sql.PreparedStatement
// This method provides access to the PreparedStatement to which bind values are bound
PreparedStatement statement();
Some additional methods are inherited from a common org.jooq.Context, which is shared among
org.jooq.RenderContext and org.jooq.BindContext. Details are documented in the previous chapter
about SQL rendering.
-- [...]
WHERE AUTHOR.ID = ?
-- [...]
@Override
public final void bind(BindContext context) throws DataAccessException {
// The CompareCondition itself does not bind any variables.
// But the two fields involved in the condition might do so...
context.bind(field1).bind(field2);
}
See the manual's sections about custom QueryParts and plain SQL QueryParts to learn about how to
write your own query parts in order to extend jOOQ.
Converters
The simplest use-case of injecting custom data types is by using org.jooq.Converter. A Converter can
convert from a database type <T> to a user-defined type <U> and vice versa. You'll be implementing
this SPI:
// Your conversion logic goes into these two methods, that can convert
// between the database type T and the user type U:
U from(T databaseObject);
T to(U userObject);
If, for instance, you want to use Java 8's java.time.LocalDate for SQL DATE and java.time.LocalDateTime
for SQL TIMESTAMP, you write a converter like this:
import java.sql.Date;
import java.time.LocalDate;

import org.jooq.Converter;

// The class declaration is reconstructed; the conversion methods are as shown in the original example:
public class LocalDateConverter implements Converter<Date, LocalDate> {

    @Override
    public LocalDate from(Date t) {
        return t == null ? null : LocalDate.parse(t.toString());
    }

    @Override
    public Date to(LocalDate u) {
        return u == null ? null : Date.valueOf(u.toString());
    }

    @Override
    public Class<Date> fromType() {
        return Date.class;
    }

    @Override
    public Class<LocalDate> toType() {
        return LocalDate.class;
    }
}
This converter can now be used in a variety of jOOQ APIs, most importantly to create a new data type:
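A sketch, using the converter class shown above:
// Create a DataType<LocalDate> based on SQLDataType.DATE and the converter:
DataType<LocalDate> type = SQLDataType.DATE.asConvertedDataType(new LocalDateConverter());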
And data types, in turn, can be used with any org.jooq.Field, i.e. with any column expression, including
plain SQL or name based ones:
// Name based
Field<LocalDate> date2 = DSL.field(name("my_table", "my_column"), type);
Bindings
While converters are very useful for simple use-cases, org.jooq.Binding is useful when you need to
customise data type interactions at a JDBC level, e.g. when you want to bind a PostgreSQL JSON data
type. Custom bindings implement the following SPI:
// A callback that generates the SQL string for bind values of this
// binding type. Typically, just ?, but also ?::json, etc.
void sql(BindingSQLContext<U> ctx) throws SQLException;
Below is a full-fledged example implementation that uses Google Gson to model JSON documents in Java:
// We're binding <T> = Object (unknown database type), and <U> = JsonElement (user type)
public class PostgresJSONGsonBinding implements Binding<Object, JsonElement> {

    // The converter does all the work. The converter() wrapper and its from() implementation
    // are reconstructed here; to(), fromType() and toType() are as in the original example.
    @Override
    public Converter<Object, JsonElement> converter() {
        return new Converter<Object, JsonElement>() {
            @Override
            public JsonElement from(Object t) {
                // Parse the JSON string coming from the database into a JsonElement
                return t == null ? JsonNull.INSTANCE : new Gson().fromJson("" + t, JsonElement.class);
            }

            @Override
            public Object to(JsonElement u) {
                return u == null || u == JsonNull.INSTANCE ? null : new Gson().toJson(u);
            }

            @Override
            public Class<Object> fromType() {
                return Object.class;
            }

            @Override
            public Class<JsonElement> toType() {
                return JsonElement.class;
            }
        };
    }
// Rendering a bind variable for the binding context's value and casting it to the json type
@Override
public void sql(BindingSQLContext<JsonElement> ctx) throws SQLException {
// Depending on how you generate your SQL, you may need to explicitly distinguish
// between jOOQ generating bind variables or inlined literals.
if (ctx.render().paramType() == ParamType.INLINED)
ctx.render().visit(DSL.inline(ctx.convert(converter()).value())).sql("::json");
else
ctx.render().sql("?::json");
}
// Converting the JsonElement to a String value and setting that on a JDBC PreparedStatement
@Override
public void set(BindingSetStatementContext<JsonElement> ctx) throws SQLException {
ctx.statement().setString(ctx.index(), Objects.toString(ctx.convert(converter()).value(), null));
}
// Getting a String value from a JDBC ResultSet and converting that to a JsonElement
@Override
public void get(BindingGetResultSetContext<JsonElement> ctx) throws SQLException {
ctx.convert(converter()).value(ctx.resultSet().getString(ctx.index()));
}
// Getting a String value from a JDBC CallableStatement and converting that to a JsonElement
@Override
public void get(BindingGetStatementContext<JsonElement> ctx) throws SQLException {
ctx.convert(converter()).value(ctx.statement().getString(ctx.index()));
}
// Getting a value from a JDBC SQLInput (useful for Oracle OBJECT types)
@Override
public void get(BindingGetSQLInputContext<JsonElement> ctx) throws SQLException {
throw new SQLFeatureNotSupportedException();
}
}
Code generation
There is a special section in the manual explaining how to automatically tie your Converters and Bindings
to your generated code. The relevant sections are:
this.arg0 = arg0;
this.arg1 = arg1;
}
@Override
public void accept(RenderContext context) {
context.visit(delegate(context.configuration()));
}
case SQLSERVER:
return DSL.field("CONVERT(VARCHAR(8), {0}, {1})", String.class, arg0, arg1);
default:
throw new UnsupportedOperationException("Dialect not supported");
}
}
}
The above CustomField implementation can be exposed from your own custom DSL class:
- method(String, Object...): This is a method that accepts a SQL string and a list of bind values that
are to be bound to the variables contained in the SQL string
- method(String, QueryPart...): This is a method that accepts a SQL string and a list of QueryParts
that are "injected" at the position of their respective placeholders in the SQL string
// Plain SQL using bind values. The value 5 is bound to the first variable, "Animal Farm" to the second variable:
create.selectFrom(BOOK).where(
"BOOK.ID = ? AND TITLE = ?", // The SQL string containing bind value placeholders ("?")
5, // The bind value at index 1
"Animal Farm" // The bind value at index 2
).fetch();
Note that for historic reasons the two API usages can also be mixed, although this is not recommended
and the exact behaviour is unspecified.
- Single-line comments (starting with -- in all databases (or #) in MySQL) are rendered without
modification. Any bind variable or QueryPart placeholders in such comments are ignored.
- Multi-line comments (starting with /* and ending with */ in all databases) are rendered without
modification. Any bind variable or QueryPart placeholders in such comments are ignored.
- String literals (starting and ending with ' in all databases, where all databases support escaping
of the quote character by duplication as such: '', or in MySQL by escaping as such: \' (if
Settings.backslashEscaping is turned on)) are rendered without modification. Any bind variable
or QueryPart placeholders in such literals are ignored.
- Quoted names (starting and ending with " in most databases, with ` in MySQL, or with [ and ]
in T-SQL databases) are rendered without modification. Any bind variable or QueryPart
placeholders in such names are ignored.
- JDBC escape syntax ({fn ...}, {d ...}, {t ...}, {ts ...}) is rendered without modification. Any bind
variable or QueryPart placeholders in such escape sequences are ignored.
- Bind variable placeholders (? or :name for named bind variables) are replaced by the
matching bind value in case inlining is activated, e.g. through Settings.statementType ==
STATIC_STATEMENT.
- QueryPart placeholders ({number}) are replaced by the matching QueryPart.
- Keywords ({identifier}) are treated like keywords and rendered in the correct case according to
Settings.renderKeywordStyle.
// Identifiers / names (which are rendered according to Settings.renderNameStyle) can be specified as such:
public static Name name(String... qualifiedName) { ... }
// QueryPart lists (e.g. IN-lists for the IN predicate) can be generated via these methods:
public static QueryPart list(QueryPart... parts) { ... }
public static QueryPart list(Collection<? extends QueryPart> parts) { ... }
4.15.7. Serializability
A lot of jOOQ types extend and implement the java.io.Serializable interface for your convenience.
Beware, however, that jOOQ will make no guarantees related to the serialisation format, and its
backwards compatible evolution. This means that while it is generally safe to rely on jOOQ types being
serialisable when two processes using the exact same jOOQ version transfer jOOQ state over some
network, it is not safe to rely on persisting serialised jOOQ state to be deserialised again at a later time
- even after a patch release upgrade!
As always with Java's serialisation, if you want reliable serialisation of Java objects, please use your own
serialisation protocol, or use one of the official export formats.
But textual or binary bind values can get quite long, quickly filling your log files with irrelevant
information. It would be good to be able to abbreviate such long values (and possibly add a remark to
the logged statement). Instead of patching jOOQ's internals, we can just transform the SQL statements
in the logger implementation, cleanly separating concerns. This can be done with the following
VisitListener:
@Override
public void visitStart(VisitContext context) {
// ... and replace it in the current rendering context (not in the Query)
context.queryPart(val(abbreviate((String) value, maxLength)));
}
// ... and replace it in the current rendering context (not in the Query)
context.queryPart(val(Arrays.copyOf((byte[]) value, maxLength)));
}
}
}
}
@Override
public void visitEnd(VisitContext context) {
// ... and if this is the top-level QueryPart, then append a SQL comment to indicate the abbreviation
if (context.queryPartsLength() == 1) {
context.renderContext().sql(" -- Bind values may have been abbreviated");
}
}
}
}
If maxLength were set to 5, the above listener would produce the following log output:
In the above example, we're looking for the 3rd value of X in T ordered by Y. Clearly, this window function
uses one-based indexing. The same is true for the ORDER BY clause, which orders the result by the 1st
column - again one-based counting. There is no column zero in SQL.
Unlike in JDBC, where java.sql.ResultSet#absolute(int) positions the underlying cursor at the one-based
index, we Java developers really don't like that way of thinking. As can be seen in the above loop, we
iterate over this result as we do over any other Java collection.
/**
* A Scala-esque representation of {@link org.jooq.Field}, adding overloaded
* operators for common jOOQ operations to arbitrary fields
*/
trait SAnyField[T] extends Field[T] {
// String operations
// -----------------
// Comparison predicates
// ---------------------
/**
* A Scala-esque representation of {@link org.jooq.Field}, adding overloaded
* operators for common jOOQ operations to numeric fields
*/
trait SNumberField[T <: Number] extends SAnyField[T] {
// Arithmetic operations
// ---------------------
// Bitwise operations
// ------------------
An example query using such overloaded operators would then look like this:
select (
BOOK.ID * BOOK.AUTHOR_ID,
BOOK.ID + BOOK.AUTHOR_ID * 3 + 4,
BOOK.TITLE || " abc" || " xy")
from BOOK
leftOuterJoin (
select (x.ID, x.YEAR_OF_BIRTH)
from x
limit 1
asTable x.getName()
)
on BOOK.AUTHOR_ID === x.ID
where (BOOK.ID <> 2)
or (BOOK.TITLE in ("O Alquimista", "Brida"))
fetch
5. SQL execution
In a previous section of the manual, we've seen how jOOQ can be used to build SQL that can be executed
with any API including JDBC or ... jOOQ. This section of the manual deals with various means of actually
executing SQL with jOOQ.
- java.sql.Statement, or "static statement": This statement type is used for any arbitrary type of
SQL statement. It is particularly useful with inlined parameters
- java.sql.PreparedStatement: This statement type is used for any arbitrary type of SQL statement.
It is particularly useful with indexed parameters (note that JDBC does not support named
parameters)
- java.sql.CallableStatement: This statement type is used for SQL statements that are "called"
rather than "executed". In particular, this includes calls to stored procedures. Callable
statements can register OUT parameters
Today, the JDBC API may look weird to users accustomed to object-oriented design. While statements
hide a lot of SQL dialect-specific implementation details quite well, they assume a lot of knowledge
about the internal state of a statement. For instance, you can use the PreparedStatement.addBatch()
method to add the prepared statement being created to an "internal list" of batch statements. Instead
of returning a new type, this method forces users to reflect on the prepared statement's internal state
or "mode".
- Both APIs return the number of affected records in non-result queries. JDBC:
Statement.executeUpdate(), jOOQ: Query.execute()
- Both APIs return a scrollable result set type from result queries. JDBC: java.sql.ResultSet, jOOQ:
org.jooq.Result
Differences to JDBC
Some of the most important differences between JDBC and jOOQ are listed here:
- Query vs. ResultQuery: JDBC does not formally distinguish between queries that can return
results, and queries that cannot. The same API is used for both. This greatly limits the
possibility of offering fetching convenience methods.
- Exception handling: While JDBC uses the checked java.sql.SQLException, jOOQ wraps all
exceptions in an unchecked org.jooq.exception.DataAccessException
- org.jooq.Result: Unlike its JDBC counter-part, this type implements java.util.List and is fully
loaded into Java memory, freeing resources as early as possible. Just like statements, this means
that users don't have to deal with a "weird" internal result set state.
- org.jooq.Cursor: If you want more fine-grained control over how many records are fetched into
memory at once, you can still do that using jOOQ's lazy fetching feature
- Statement type: jOOQ does not formally distinguish between static statements and prepared
statements. By default, all statements are prepared statements in jOOQ, internally. Executing a
statement as a static statement can be done simply using a custom settings flag
- Closing Statements: JDBC keeps open resources even if they are already consumed. With
JDBC, there is a lot of verbosity around safely closing resources. In jOOQ, resources are closed
after consumption, by default. If you want to keep them open after consumption, you have to
explicitly say so.
- JDBC flags: JDBC execution flags and modes are not modified. They can be set fluently on a
Query
- Zero-based vs one-based APIs: JDBC is a one-based API, jOOQ is a zero-based API. While this
makes sense intuitively (JDBC is the less intuitive API from a Java perspective), it can lead to
confusion in certain cases.
the manual's section about fetching to learn more about fetching results). With plain SQL, the distinction
can be made clear most easily:
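A sketch of that distinction:
// A Query that can be executed, but does not return results:
Query query = create.query("UPDATE AUTHOR SET LAST_NAME = 'Smith' WHERE ID = 3");

// A ResultQuery that can return results:
ResultQuery<Record> resultQuery = create.resultQuery("SELECT * FROM AUTHOR WHERE ID = 3");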
5.3. Fetching
Fetching is something that has been completely neglected by JDBC and also by various other database
abstraction libraries. Fetching is much more than just looping or listing records or mapped objects.
There are so many ways you may want to fetch data from a database, it should be considered a first-
class feature of any database abstraction API. Just to name a few, here are some of jOOQ's fetching
modes:
- Untyped vs. typed fetching: Sometimes you care about the returned type of your records,
sometimes (with arbitrary projections) you don't.
- Fetching arrays, maps, or lists: Instead of letting you transform your result sets into any more
suitable data type, a library should do that work for you.
- Fetching through handler callbacks: This is an entirely different fetching paradigm. With Java 8's
lambda expressions, this will become even more powerful.
- Fetching through mapper callbacks: This is an entirely different fetching paradigm. With Java 8's
lambda expressions, this will become even more powerful.
- Fetching custom POJOs: This is what made Hibernate and JPA so strong. Automatic mapping of
tables to custom POJOs.
- Lazy vs. eager fetching: It should be easy to distinguish these two fetch modes.
- Fetching many results: Some databases allow for returning many result sets from a single query.
JDBC can handle this but it's very verbose. A list of results should be returned instead.
- Fetching data asynchronously: Some queries take too long to execute to wait for their results.
You should be able to spawn query execution in a separate process.
// The "standard" fetch when you know your query returns only one record. This may return null.
R fetchOne();
// The "standard" fetch when you know your query returns only one record.
Optional<R> fetchOptional();
// The "standard" fetch when you only want to fetch the first record
R fetchAny();
// Execute a ResultQuery with jOOQ, but return a JDBC ResultSet, not a jOOQ object
ResultSet fetchResultSet();
Fetch convenience
These means of fetching are also available from org.jooq.Result and org.jooq.Record APIs
// These methods are convenience for fetching only a single field, possibly converting results to another type
// Instead of returning lists, these return arrays
<T> T[] fetchArray(Field<T> field);
<T> T[] fetchArray(Field<?> field, Class<? extends T> type);
<T, U> U[] fetchArray(Field<T> field, Converter<? super T, U> converter);
Object[] fetchArray(int fieldIndex);
<T> T[] fetchArray(int fieldIndex, Class<? extends T> type);
<U> U[] fetchArray(int fieldIndex, Converter<?, U> converter);
Object[] fetchArray(String fieldName);
<T> T[] fetchArray(String fieldName, Class<? extends T> type);
<U> U[] fetchArray(String fieldName, Converter<?, U> converter);
// These methods are convenience for fetching only a single field from a single record,
// possibly converting results to another type
<T> T fetchOne(Field<T> field);
<T> T fetchOne(Field<?> field, Class<? extends T> type);
<T, U> U fetchOne(Field<T> field, Converter<? super T, U> converter);
Object fetchOne(int fieldIndex);
<T> T fetchOne(int fieldIndex, Class<? extends T> type);
<U> U fetchOne(int fieldIndex, Converter<?, U> converter);
Object fetchOne(String fieldName);
<T> T fetchOne(String fieldName, Class<? extends T> type);
<U> U fetchOne(String fieldName, Converter<?, U> converter);
Fetch transformations
These means of fetching are also available from org.jooq.Result and org.jooq.Record APIs
Note, that apart from the fetchLazy() methods, all fetch() methods will immediately close underlying
JDBC result sets.
When you use the DSLContext.selectFrom() method, jOOQ will return the record type supplied with the
argument table. Beware though, that you will no longer be able to use any clause that modifies the type
of your table expression. This includes:
// "extract" the two individual strongly typed TableRecord types from the denormalised Record:
BookRecord book = record.into(BOOK);
AuthorRecord author = record.into(AUTHOR);
Higher-degree records
jOOQ chose to explicitly support degrees up to 22 to match Scala's typesafe tuple, function and product
support. Unlike Scala, however, jOOQ also supports higher degrees without the additional typesafety.
your specific needs. Or you just want to list all values of one specific column. Here are some examples
to illustrate those use cases:
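For instance, a sketch of listing all values of one specific column:
// Fetch all BOOK.TITLE values into a List<String>:
List<String> titles = create.select(BOOK.TITLE).from(BOOK).fetch(BOOK.TITLE);

// Or fetch them into a Map keyed by ID (assuming ID is unique):
Map<Integer, String> map = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetchMap(BOOK.ID, BOOK.TITLE);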
Note that most of these convenience methods are available both through org.jooq.ResultQuery and
org.jooq.Result, some are even available through org.jooq.Record as well.
5.3.4. RecordHandler
In a more functional operating mode, you might want to write callbacks that receive records from
your select statement results in order to do some processing. This is a common data access pattern
in Spring's JdbcTemplate, and it is also available in jOOQ. With jOOQ, you can implement your own
org.jooq.RecordHandler classes and plug them into jOOQ's org.jooq.ResultQuery:
// Or more concisely
create.selectFrom(BOOK)
.orderBy(BOOK.ID)
.fetchInto(new RecordHandler<BookRecord>() {...});
See also the manual's section about the RecordMapper, which provides similar features
5.3.5. RecordMapper
In a more functional operating mode, you might want to write callbacks that map records from your
select statement results in order to do some processing. This is a common data access pattern in
Spring's JdbcTemplate, and it is also available in jOOQ. With jOOQ, you can implement your own
org.jooq.RecordMapper classes and plug them into jOOQ's org.jooq.ResultQuery:
// Of course, the lambda could be expanded into the following anonymous RecordMapper:
create.selectFrom(BOOK)
.orderBy(BOOK.ID)
.fetch(new RecordMapper<BookRecord, Integer>() {
@Override
public Integer map(BookRecord book) {
return book.getId();
}
});
Your custom RecordMapper types can be used automatically through jOOQ's POJO mapping APIs, by
injecting a RecordMapperProvider into your Configuration.
See also the manual's section about the RecordHandler, which provides similar features
5.3.6. POJOs
Fetching data in records is fine as long as your application is not really layered, or as long as you're
still writing code in the DAO layer. But if you have a more advanced application architecture, you may
not want to allow for jOOQ artefacts to leak into other layers. You may choose to write POJOs (Plain
Old Java Objects) as your primary DTOs (Data Transfer Objects), without any dependencies on jOOQ's
org.jooq.Record types, which may even potentially hold a reference to a Configuration, and thus a JDBC
java.sql.Connection. Like Hibernate/JPA, jOOQ allows you to operate with POJOs. Unlike Hibernate/JPA,
jOOQ does not "attach" those POJOs or create proxies with any magic in them.
If you're using jOOQ's code generator, you can configure it to generate POJOs for you, but you're not
required to use those generated POJOs. You can use your own. See the manual's section about POJOs
with custom RecordMappers to see how to modify jOOQ's standard POJO mapping behaviour.
// The class header is reconstructed from its usage below:
public class MyBook {

    @Column(name = "TITLE")
    public String myTitle;
}
// The various "into()" methods allow for fetching records into your custom POJOs:
MyBook myBook = create.select().from(BOOK).fetchAny().into(MyBook.class);
List<MyBook> myBooks = create.select().from(BOOK).fetch().into(MyBook.class);
List<MyBook> myBooks = create.select().from(BOOK).fetchInto(MyBook.class);
Just as with any other JPA implementation, you can put the javax.persistence.Column annotation on
any class member, including attributes, setters and getters. Please refer to the Record.into() Javadoc
for more details.
// The various "into()" methods allow for fetching records into your custom POJOs:
MyBook1 myBook = create.select().from(BOOK).fetchAny().into(MyBook1.class);
List<MyBook1> myBooks = create.select().from(BOOK).fetch().into(MyBook1.class);
List<MyBook1> myBooks = create.select().from(BOOK).fetchInto(MyBook1.class);
// With "immutable" POJO classes, there must be an exact match between projected fields and available constructors:
MyBook2 myBook = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetchAny().into(MyBook2.class);
List<MyBook2> myBooks = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetch().into(MyBook2.class);
List<MyBook2> myBooks = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetchInto(MyBook2.class);
// With annotated "immutable" POJO classes, there doesn't need to be an exact match between fields and constructor arguments.
// In the below cases, only BOOK.ID is really set onto the POJO, BOOK.TITLE remains null and BOOK.AUTHOR_ID is ignored
MyBook3 myBook = create.select(BOOK.ID, BOOK.AUTHOR_ID).from(BOOK).fetchAny().into(MyBook3.class);
List<MyBook3> myBooks = create.select(BOOK.ID, BOOK.AUTHOR_ID).from(BOOK).fetch().into(MyBook3.class);
List<MyBook3> myBooks = create.select(BOOK.ID, BOOK.AUTHOR_ID).from(BOOK).fetchInto(MyBook3.class);
// A "proxyable" type
public interface MyBook3 {
int getId();
void setId(int id);
String getTitle();
void setTitle(String title);
}
// The various "into()" methods allow for fetching records into your custom POJOs:
MyBook3 myBook = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetchAny().into(MyBook3.class);
List<MyBook3> myBooks = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetch().into(MyBook3.class);
List<MyBook3> myBooks = create.select(BOOK.ID, BOOK.TITLE).from(BOOK).fetchInto(MyBook3.class);
// Insert it (implicitly)
book.store();
// Insert it (explicitly)
create.executeInsert(book);
Note: Because of your manual setting of ID = 10, jOOQ's store() method will assume that you want to
insert a new record. See the manual's section about CRUD with UpdatableRecords for more details
on this.
// Initialise a Configuration
Configuration configuration = new DefaultConfiguration().set(connection).set(SQLDialect.ORACLE);
// Delete it again
bookDao.delete(book);
DSL.using(new DefaultConfiguration()
.set(connection)
.set(SQLDialect.ORACLE)
.set(
new RecordMapperProvider() {
@Override
public <R extends Record, E> RecordMapper<R, E> provide(RecordType<R> recordType, Class<? extends E> type) {
    // Return your custom RecordMapper for the given record type and target type
    // [...]
}
}));
The above is a very simple example showing that you will have complete flexibility in how to override
jOOQ's record to POJO mapping mechanisms.
Util.doThingsWithBook(book);
}
}
- Strongly or weakly typed records: Cursors are also typed with the <R> type, allowing you to fetch
custom, generated org.jooq.TableRecord or plain org.jooq.Record types.
- RecordHandler callbacks: You can use your own org.jooq.RecordHandler callbacks to receive
lazily fetched records.
- RecordMapper callbacks: You can use your own org.jooq.RecordMapper callbacks to map lazily
fetched records.
- POJOs: You can fetch data into your own custom POJO types.
A more sophisticated example would be using streams to transform the results and add business
logic to it. For instance, to generate a DDL script with CREATE TABLE statements from the
INFORMATION_SCHEMA of an H2 database:
create.select(
COLUMNS.TABLE_NAME,
COLUMNS.COLUMN_NAME,
COLUMNS.TYPE_NAME)
.from(COLUMNS)
.orderBy(
COLUMNS.TABLE_CATALOG,
COLUMNS.TABLE_SCHEMA,
COLUMNS.TABLE_NAME,
COLUMNS.ORDINAL_POSITION)
.fetch() // Eagerly load the whole ResultSet into memory first
.stream()
.collect(groupingBy(
r -> r.getValue(COLUMNS.TABLE_NAME),
LinkedHashMap::new,
mapping(
r -> new SimpleEntry(
r.getValue(COLUMNS.COLUMN_NAME),
r.getValue(COLUMNS.TYPE_NAME)
),
toList()
)))
.forEach(
(table, columns) -> {
// Just emit a CREATE TABLE statement
System.out.println("CREATE TABLE " + table + " (");
// Map each "Column" type into a String containing the column specification,
// and join them using comma and newline. Done!
System.out.println(
columns.stream()
.map(col -> " " + col.getKey() +
" " + col.getValue())
.collect(Collectors.joining(",\n"))
);
System.out.println(");");
});
The above combination of SQL and functional programming will emit one CREATE TABLE statement per
table of the INFORMATION_SCHEMA.
Some databases can return several result sets from a single statement. A prominent example is the
sp_help procedure in Sybase and SQL Server, whose output looks like this:
+--------+-----+-----------+-------------+-------------------+
|Name |Owner|Object_type|Object_status|Create_date |
+--------+-----+-----------+-------------+-------------------+
| author|dbo |user table | -- none -- |Sep 22 2011 11:20PM|
+--------+-----+-----------+-------------+-------------------+
+-------------+-------+------+----+-----+-----+
|Column_name |Type |Length|Prec|Scale|... |
+-------------+-------+------+----+-----+-----+
|id |int | 4|NULL| NULL| 0|
|first_name |varchar| 50|NULL| NULL| 1|
|last_name |varchar| 50|NULL| NULL| 0|
|date_of_birth|date | 4|NULL| NULL| 1|
|year_of_birth|int | 4|NULL| NULL| 1|
+-------------+-------+------+----+-----+-----+
ResultSet rs = statement.executeQuery();
As previously discussed in the chapter about differences between jOOQ and JDBC, jOOQ does not rely
on an internal state of any JDBC object, which is "externalised" by Javadoc. Instead, it has a
straightforward API allowing you to do the above in a one-liner:
// Get some information about the author table, its columns, keys, indexes, etc
Results results = create.fetchMany("sp_help 'author'");
The returned org.jooq.Results type extends the List<Result<Record>> type for backwards-compatibility
reasons, but it also allows access to individual update counts that may have been returned by the
database in between result sets.
CompletableFuture

// This lambda will supply an int value indicating the number of inserted rows
.supplyAsync(() ->
DSL.using(configuration)
.insertInto(AUTHOR, AUTHOR.ID, AUTHOR.LAST_NAME)
.values(3, "Hitchcock")
.execute()
)
// This will supply an AuthorRecord value for the newly inserted author
.handleAsync((rows, throwable) ->
DSL.using(configuration)
.fetchOne(AUTHOR, AUTHOR.ID.eq(3))
)
// This will supply an int value indicating the number of deleted rows
.handleAsync((rows, throwable) ->
DSL.using(configuration)
.delete(AUTHOR)
.where(AUTHOR.ID.eq(3))
.execute()
)
.join();
The above example will execute these actions one after the other, but asynchronously, in the JDK's default
or common java.util.concurrent.ForkJoinPool.
For more information, please refer to the java.util.concurrent.CompletableFuture Javadoc and official
documentation.
Note that instead of letting jOOQ spawn a new thread, you can also provide jOOQ with your own
java.util.concurrent.ExecutorService:
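A minimal sketch, assuming a generated BOOK table, a DSLContext called create, and jOOQ's ResultQuery.fetchAsync(Executor) method:
ExecutorService executor = Executors.newFixedThreadPool(2);

// The asynchronous fetch runs on the given ExecutorService instead of the common ForkJoinPool
CompletionStage<Result<BookRecord>> stage =
create.selectFrom(BOOK)
      .fetchAsync(executor);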
try (
// But you can also directly access that ResultSet from ResultQuery:
ResultSet rs2 = create.selectFrom(BOOK).fetchResultSet()) {
// ...
}
// As a Result:
Result<Record> result = create.fetch(rs);
// As a Cursor
Cursor<Record> cursor = create.fetchLazy(rs);
You can also tighten the interaction with jOOQ's data type system and data type conversion features,
by passing the record type to the above fetch methods:
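For example, the following sketch (assuming a JDBC ResultSet rs whose columns correspond to BOOK.ID and BOOK.TITLE) passes jOOQ field or class information along with the ResultSet:
// Pass jOOQ Field references to apply their data types and converters:
Result<Record> result1 = create.fetch(rs, BOOK.ID, BOOK.TITLE);

// Or pass plain Class references for ad-hoc conversion:
Result<Record> result2 = create.fetch(rs, Integer.class, String.class);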
If supplied, the additional information is used to override the information obtained from the ResultSet's
java.sql.ResultSetMetaData information.
- null is always converted to null, or the primitive default value, or Optional.empty(), regardless of
the target type.
- Identity conversion (converting a value to its own type) is always possible.
- Primitive types can be converted to their wrapper types and vice versa
- All types can be converted to String
- All types can be converted to Object
- All Number types can be converted to other Number types
- All Number or String types can be converted to Boolean. Possible (case-insensitive) values for
true:
* 1
* 1.0
* y
* yes
* true
* on
* enabled
Possible (case-insensitive) values for false:
* 0
* 0.0
* n
* no
* false
* off
* disabled
This auto conversion can be applied explicitly, but it is also available through a variety of APIs, in particular
anywhere a java.lang.Class reference can be provided, such as:
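As a minimal illustration, the explicit conversion API in org.jooq.tools.Convert applies the above rules directly:
Integer i = Convert.convert("1", Integer.class);    // Yields 1
Boolean b = Convert.convert("yes", Boolean.class);  // Yields true
Boolean c = Convert.convert("off", Boolean.class);  // Yields false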
public interface Converter<T, U> extends Serializable {

    /**
* Convert a database object to a user object
*/
U from(T databaseObject);
/**
* Convert a user object to a database object
*/
T to(U userObject);
/**
* The database type
*/
Class<T> fromType();
/**
* The user type
*/
Class<U> toType();
}
Such a converter can be used in many parts of the jOOQ API. Some examples have been illustrated in
the manual's section about fetching.
public class GregorianCalendarConverter implements Converter<Timestamp, GregorianCalendar> {

    @Override
public GregorianCalendar from(Timestamp databaseObject) {
GregorianCalendar calendar = (GregorianCalendar) Calendar.getInstance();
calendar.setTimeInMillis(databaseObject.getTime());
return calendar;
}
@Override
public Timestamp to(GregorianCalendar userObject) {
return new Timestamp(userObject.getTime().getTime());
}
@Override
public Class<Timestamp> fromType() {
return Timestamp.class;
}
@Override
public Class<GregorianCalendar> toType() {
return GregorianCalendar.class;
}
}
Enum Converters
jOOQ ships with a built-in default org.jooq.impl.EnumConverter, that you can use to map VARCHAR
values to enum literals or NUMBER values to enum ordinals (both modes are supported). Let's say you
want to map a YES / NO / MAYBE column to a custom Enum:
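The enum and its converter might look like this minimal sketch (the YNM and YNMConverter names are just examples), based on the built-in org.jooq.impl.EnumConverter:
public enum YNM {
    YES, NO, MAYBE
}

public class YNMConverter extends EnumConverter<String, YNM> {
    public YNMConverter() {
        // Map the VARCHAR column values YES / NO / MAYBE onto the enum literals
        super(String.class, YNM.class);
    }
}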
// And you're all set for converting records to your custom Enum:
for (BookRecord book : create.selectFrom(BOOK).fetch()) {
switch (book.getValue(BOOK.I_LIKE, new YNMConverter())) {
case YES: System.out.println("I like this book : " + book.getTitle()); break;
case NO: System.out.println("I didn't like this book : " + book.getTitle()); break;
case MAYBE: System.out.println("I'm not sure about this book : " + book.getTitle()); break;
}
}
If you're using forcedTypes in your code generation configuration, you can configure the application
of an EnumConverter by adding <enumConverter>true</enumConverter> to your <forcedType/>
configuration.
+----+-----------+--------------+
| ID | AUTHOR_ID | TITLE |
+----+-----------+--------------+
| 1 | 1 | 1984 |
| 2 | 1 | Animal Farm |
| 3 | 2 | O Alquimista |
| 4 | 2 | Brida |
+----+-----------+--------------+
Now, if you have millions of records with only few distinct values for AUTHOR_ID, you may not want to
hold references to distinct (but equal) java.lang.Integer objects. This is specifically true for IDs of type
java.util.UUID or string representations thereof. jOOQ allows you to "intern" those values:
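A minimal sketch of such an interning call, assuming the generated BOOK table and the ResultQuery.intern() method:
// String values of the TITLE column will be interned via String.intern() while fetching
Result<BookRecord> books =
create.selectFrom(BOOK)
      .intern(BOOK.TITLE)
      .fetch();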
You can specify as many fields as you want for interning. The above has the following effect:
- If the interned Field is of type java.lang.String, then String.intern() is called upon each string
- If the interned Field is of any other type, then the call is ignored
Future versions of jOOQ will implement interning of data for non-String data types by collecting values
in java.util.Set, removing duplicate instances.
Note that jOOQ will not use interned data for identity comparisons (string1 == string2). Interning is used
only to reduce the memory footprint of org.jooq.Result objects.
The above technique can be quite useful when you want to reuse expensive database resources. This
can be the case when your statement is executed very frequently and your database would take non-
negligible time to soft-parse the prepared statement and generate a new statement / cursor resource.
The above example shows how a query can be executed twice against the same underlying
PreparedStatement. Notice how the Query must now be treated like a resource, i.e. it must be managed
in a try-with-resources statement, or Query.close() must be called explicitly.
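A minimal sketch of this technique, assuming the generated BOOK table, a DSLContext called create, and the Query.keepStatement() flag:
try (ResultQuery<BookRecord> query =
        create.selectFrom(BOOK)
              .where(BOOK.ID.eq(1))
              .keepStatement(true)) {

    // The first execution prepares and runs the underlying PreparedStatement
    Result<BookRecord> books1 = query.fetch();

    // Subsequent executions reuse that PreparedStatement with new bind values
    query.bind(1, 2);
    Result<BookRecord> books2 = query.fetch();
}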
// [...]
// [...]
DSL.using(new DefaultConfiguration()
.set(connection)
.set(SQLDialect.ORACLE)
.set(DefaultExecuteListenerProvider.providers(
new DefaultExecuteListener() {
@Override
public void recordStart(ExecuteContext ctx) {
try {
In the above example, your custom ExecuteListener callback is triggered before jOOQ loads a new
Record from the JDBC ResultSet. With the concurrency being set to ResultSet.CONCUR_UPDATABLE,
you can now modify the database cursor through the standard JDBC ResultSet API.
Using JDBC
In code, this looks like the following snippet:
// 1. several queries
// ------------------
try (Statement stmt = connection.createStatement()) {
stmt.addBatch("INSERT INTO author(id, first_name, last_name) VALUES (1, 'Erich', 'Gamma')");
stmt.addBatch("INSERT INTO author(id, first_name, last_name) VALUES (2, 'Richard', 'Helm')");
stmt.addBatch("INSERT INTO author(id, first_name, last_name) VALUES (3, 'Ralph', 'Johnson')");
stmt.addBatch("INSERT INTO author(id, first_name, last_name) VALUES (4, 'John', 'Vlissides')");
int[] result = stmt.executeBatch();
}
// 2. a single query
// -----------------
try (PreparedStatement stmt = connection.prepareStatement("INSERT INTO author(id, first_name, last_name) VALUES (?, ?, ?)")) {
stmt.setInt(1, 1);
stmt.setString(2, "Erich");
stmt.setString(3, "Gamma");
stmt.addBatch();
stmt.setInt(1, 2);
stmt.setString(2, "Richard");
stmt.setString(3, "Helm");
stmt.addBatch();
stmt.setInt(1, 3);
stmt.setString(2, "Ralph");
stmt.setString(3, "Johnson");
stmt.addBatch();
stmt.setInt(1, 4);
stmt.setString(2, "John");
stmt.setString(3, "Vlissides");
stmt.addBatch();

int[] result = stmt.executeBatch();
}
Using jOOQ
jOOQ supports executing queries in batch mode as follows:
// 1. several queries
// ------------------
create.batch(
create.insertInto(AUTHOR, ID, FIRST_NAME, LAST_NAME).values(1, "Erich" , "Gamma" ),
create.insertInto(AUTHOR, ID, FIRST_NAME, LAST_NAME).values(2, "Richard", "Helm" ),
create.insertInto(AUTHOR, ID, FIRST_NAME, LAST_NAME).values(3, "Ralph" , "Johnson" ),
create.insertInto(AUTHOR, ID, FIRST_NAME, LAST_NAME).values(4, "John" , "Vlissides"))
.execute();
// 2. a single query
// -----------------
create.batch(create.insertInto(AUTHOR, ID, FIRST_NAME, LAST_NAME ).values((Integer) null, null, null))
.bind( 1 , "Erich" , "Gamma" )
.bind( 2 , "Richard" , "Helm" )
.bind( 3 , "Ralph" , "Johnson" )
.bind( 4 , "John" , "Vlissides")
.execute();
When creating a batch execution with a single query and multiple bind values, you will still have to
provide jOOQ with dummy bind values for the original query. In the above example, these are set to
null. For subsequent calls to bind(), there will be no type safety provided by jOOQ.
For more info about inlining sequence references in SQL statements, please refer to the manual's
section about sequences and serials.
- Ada
- BASIC
- Pascal
- etc...
The general distinction between (stored) procedures and (stored) functions can be summarised like this:
Procedures
Functions
- DB2, H2, and HSQLDB don't allow for JDBC escape syntax when calling functions. Functions must
be used in a SELECT statement
- H2 only knows functions (without OUT parameters)
- Oracle functions may have OUT parameters
- Oracle knows functions that must not be used in SQL statements for transactional reasons
- Postgres only knows functions (with all features combined). OUT parameters can also be
interpreted as return values, which is quite elegant/surprising, depending on your taste
- The Sybase jconn3 JDBC driver doesn't handle null values correctly when using the JDBC escape
syntax on functions
In general, it can be said that the field of routines (procedures / functions) is far from being standardised
in modern RDBMS even if the SQL:2008 standard specifies things quite well. Every database has
its ways and JDBC only provides little abstraction over the great variety of procedures / functions
implementations, especially when advanced data types such as cursors / UDT's / arrays are involved.
To simplify things a little bit, jOOQ handles both procedures and functions the same way, using a more
general org.jooq.Routine type.
-- Check whether there is an author in AUTHOR by that name and get his ID
CREATE OR REPLACE PROCEDURE author_exists (author_name VARCHAR2, result OUT NUMBER, id OUT NUMBER);
But you can also call the procedure using a generated convenience method in a global Routines class:
// The generated Routines class contains static methods for every procedure.
// Results are also returned in a generated object, holding getters for every OUT or IN OUT parameter.
AuthorExists procedure = Routines.authorExists(configuration, "Paulo");
For more details about code generation for procedures, see the manual's section about procedures
and code generation.
-- Check whether there is an author in AUTHOR by that name and get his ID
CREATE OR REPLACE FUNCTION author_exists (author_name VARCHAR2) RETURN NUMBER;
-- This is the rendered SQL
SELECT AUTHOR_EXISTS('Paulo') FROM DUAL

// Use the static-imported method from Routines:
boolean exists = create.select(authorExists("Paulo")).fetchOne(0, boolean.class);
For more info about inlining stored function references in SQL statements, please refer to the manual's
section about user-defined functions.
- A Java package holding classes for formal Java representations of the procedure/function in that
package
- A Java class holding convenience methods to facilitate calling those procedures/functions
Apart from this, the generated source code looks exactly like the one for standalone procedures/
functions.
For more details about code generation for procedures and packages see the manual's section about
procedures and code generation.
These member functions and procedures can simply be mapped to Java methods:
// Set the author ID and load the record using the LOAD procedure
author.setId(1);
author.load();
For more details about code generation for UDTs see the manual's section about user-defined types
and code generation.
The above query will result in an XML document looking like the following one:
<result xmlns="http://www.jooq.org/xsd/jooq-export-3.10.0.xsd">
<fields>
<field schema="TEST" table="BOOK" name="ID" type="INTEGER"/>
<field schema="TEST" table="BOOK" name="AUTHOR_ID" type="INTEGER"/>
<field schema="TEST" table="BOOK" name="TITLE" type="VARCHAR"/>
</fields>
<records>
<record>
<value field="ID">1</value>
<value field="AUTHOR_ID">1</value>
<value field="TITLE">1984</value>
</record>
<record>
<value field="ID">2</value>
<value field="AUTHOR_ID">1</value>
<value field="TITLE">Animal Farm</value>
</record>
</records>
</result>
The same result as an org.w3c.dom.Document can be obtained using the Result.intoXML() method:
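For instance, a minimal sketch, assuming the Result was fetched from the BOOK table:
org.w3c.dom.Document document =
create.selectFrom(BOOK)
      .fetch()
      .intoXML();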
See the XSD schema definition here, for a formal definition of the XML export format:
http://www.jooq.org/xsd/jooq-export-3.10.0.xsd
The above query will result in a CSV document looking like the following one:
ID,AUTHOR_ID,TITLE
1,1,1984
2,1,Animal Farm
In addition to the standard behaviour, you can also specify a separator character, as well as a special
string to represent NULL values (which cannot be represented in standard CSV):
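For example, a sketch using Result.formatCSV() with a semicolon separator and a custom NULL representation:
String csv = create.selectFrom(BOOK)
                   .fetch()
                   .formatCSV(';', "{null}");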
The above query will result in a JSON document looking like the following one:
{"fields":[{"schema":"schema-1","table":"table-1","name":"field-1","type":"type-1"},
{"schema":"schema-2","table":"table-2","name":"field-2","type":"type-2"},
...,
{"schema":"schema-n","table":"table-n","name":"field-n","type":"type-n"}],
"records":[[value-1-1,value-1-2,...,value-1-n],
[value-2-1,value-2-2,...,value-2-n]]}
Note: This format has been modified in jOOQ 2.6.0 and 3.7.0
The above query will result in an HTML document looking like the following one
<table>
<thead>
<tr>
<th>ID</th>
<th>AUTHOR_ID</th>
<th>TITLE</th>
</tr>
</thead>
<tbody>
<tr>
<td>1</td>
<td>1</td>
<td>1984</td>
</tr>
<tr>
<td>2</td>
<td>1</td>
<td>Animal Farm</td>
</tr>
</tbody>
</table>
The above query will result in a text document looking like the following one
+---+---------+-----------+
| ID|AUTHOR_ID|TITLE |
+---+---------+-----------+
| 1| 1|1984 |
| 2| 1|Animal Farm|
+---+---------+-----------+
A simple text representation can also be obtained by calling toString() on a Result object. See also the
manual's section about DEBUG logging
ID,AUTHOR_ID,TITLE <-- Note the CSV header. By default, the first line is ignored
1,1,1984
2,1,Animal Farm
With jOOQ, you can load this data using various parameters from the loader API. A simple load may
look like this:
// Ignore the AUTHOR_ID column from the CSV file when inserting
create.loadInto(BOOK)
.loadCSV(inputstream, encoding)
.fields(BOOK.ID, null, BOOK.TITLE)
.execute();
Any of the above configuration methods can be combined to achieve the type of load you need. Please
refer to the API's Javadoc to learn about more details. Errors that occur during the load are reported
by the execute method's result:
{"fields" :[{"name":"ID","type":"INTEGER"},
{"name":"AUTHOR_ID","type":"INTEGER"},
{"name":"TITLE","type":"VARCHAR"}],
"records":[[1,1,"1984"],
[2,1,"Animal Farm"]]}
With jOOQ, you can load this data using various parameters from the loader API. A simple load may
look like this:
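A minimal sketch, analogous to the CSV loader, assuming the JSON document is available from an InputStream:
create.loadInto(BOOK)
      .loadJSON(inputstream)
      .fields(BOOK.ID, BOOK.AUTHOR_ID, BOOK.TITLE)
      .execute();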
No other, JSON-specific options are currently available. For additional Loader API options, please refer
to the manual's section about importing CSV.
DSL.using(configuration2)
.loadInto(BOOK)
.loadRecords(result)
.fields(BOOK.ID, BOOK.AUTHOR_ID, BOOK.TITLE)
.execute();
No other, Record-specific options are currently available. For additional Loader API options, please refer
to the manual's section about importing CSV.
DSL.using(configuration)
.loadInto(BOOK)
.loadArrays(
new Object[] { 1, 1, "1984" },
new Object[] { 2, 1, "Animal Farm" })
.fields(BOOK.ID, BOOK.AUTHOR_ID, BOOK.TITLE)
.execute();
No other, Array-specific options are currently available. For additional Loader API options, please refer
to the manual's section about importing CSV.
- Create (INSERT)
- Read (SELECT)
- Update (UPDATE)
- Delete (DELETE)
CRUD always uses the same patterns, regardless of the nature of underlying tables. This again, leads to
a lot of boilerplate code, if you have to issue your statements yourself. Like Hibernate / JPA and other
ORMs, jOOQ facilitates CRUD using a specific API involving org.jooq.UpdatableRecord types.
Normalised databases assume that a primary key is unique "forever", i.e. that a key, once inserted into
a table, will never be changed or re-inserted after deletion. In order to use jOOQ's CRUD operations
correctly, you should design your database accordingly.
See the manual's section about serializability for some more insight on "attached" objects.
Storing
Storing a record will perform an INSERT statement or an UPDATE statement. In general, new records are
always inserted, whereas records loaded from the database are always updated. This is best visualised
in code:
// Update the record: UPDATE BOOK SET PUBLISHED_IN = 1984 WHERE ID = [id]
book1.setPublishedIn(1948);
book1.store();
// Update the record: UPDATE BOOK SET TITLE = 'Animal Farm' WHERE ID = [id]
book2.setTitle("Animal Farm");
book2.store();
- jOOQ sets only modified values in INSERT statements or UPDATE statements. This allows for
default values to be applied to inserted records, as specified in CREATE TABLE DDL statements.
- When store() performs an INSERT statement, jOOQ attempts to load any generated keys from
the database back into the record. For more details, see the manual's section about IDENTITY
values.
- When loading records from POJOs, jOOQ will assume the record is a new record. It will hence
attempt to INSERT it.
- When you activate optimistic locking, storing a record may fail, if the underlying database record
has been changed in the mean time.
Deleting
Deleting a record will remove it from the database. Here's how you delete records:
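A minimal sketch, assuming a previously fetched BookRecord:
BookRecord book = create.fetchOne(BOOK, BOOK.ID.eq(1));

// Delete the record: DELETE FROM BOOK WHERE ID = 1
book.delete();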
Refreshing
Refreshing a record from the database means that jOOQ will issue a SELECT statement to refresh all
record values that are not the primary key. This is particularly useful when you use jOOQ's optimistic
locking feature, in case a modified record is "stale" and cannot be stored to the database, because the
underlying database record has changed in the mean time.
In order to perform a refresh, use the following Java code:
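A minimal sketch, assuming an attached BookRecord:
// Re-read all non-primary-key values from the database into the record
book.refresh();

// Or refresh only selected fields
book.refresh(BOOK.TITLE);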
The purpose of the above information is for jOOQ's CRUD operations to know which values need to be
stored to the database, and which values have been left untouched.
-- [...]
If you're using jOOQ's code generator, the above table will generate a org.jooq.UpdatableRecord with
an IDENTITY column. This information is used by jOOQ internally, to update IDs after calling store():
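A minimal sketch, assuming BOOK.ID is such an IDENTITY column:
BookRecord book = create.newRecord(BOOK);
book.setAuthorId(1);
book.setTitle("1984");

// INSERT INTO BOOK (AUTHOR_ID, TITLE) VALUES (1, '1984')
book.store();

// The generated ID value has been fetched back into the record
System.out.println(book.getId());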
Database compatibility
DB2, Derby, HSQLDB, Ingres
These SQL dialects implement the standard very neatly.
H2, MySQL, Postgres, SQL Server, Sybase ASE, Sybase SQL Anywhere
These SQL dialects implement identities, but the DDL syntax doesn't follow the standard:
-- SQL Server
ID INTEGER IDENTITY(1,1) NOT NULL
-- Sybase ASE
id INTEGER IDENTITY NOT NULL
-- Sybase SQL Anywhere
id INTEGER NOT NULL IDENTITY
-- [...]
FOREIGN KEY (AUTHOR_ID) REFERENCES author(ID)
)

// Find other books by that author
Result<BookRecord> books = author.fetchChildren(FK_BOOK_AUTHOR);
Note that, unlike in Hibernate, jOOQ's navigation methods will always lazy-fetch relevant records,
without caching any results. In other words, every time you run such a fetch method, a new query will
be issued.
These fetch methods only work on "attached" records. See the manual's section about serializability for
some more insight on "attached" objects.
The above changes to jOOQ's behaviour are transparent to the API, the only thing you need to do for
it to be activated is to set the Settings flag. Here is an example illustrating optimistic locking:
// Change the title and store this book. The underlying database record has not been modified, it can be safely updated.
book1.setTitle("Animal Farm");
book1.store();
// Book2 still references the original TITLE value, but the database holds a new value from book1.store().
// This store() will thus fail:
book2.setTitle("1984");
book2.store();
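As a minimal sketch, the Settings flag mentioned above can be activated like this (assuming a plain JDBC connection):
DSLContext create = DSL.using(connection, SQLDialect.ORACLE,
    new Settings().withExecuteWithOptimisticLocking(true));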
-- This column indicates when each book record was modified for the last time
MODIFIED TIMESTAMP NOT NULL,
-- [...]
)
The MODIFIED column will contain a timestamp indicating the last modification timestamp for any
book in the BOOK table. If you're using jOOQ and its store() methods on UpdatableRecords, jOOQ will
then generate this TIMESTAMP value for you, automatically. However, instead of running an additional
SELECT .. FOR UPDATE statement prior to an UPDATE or DELETE statement, jOOQ adds a WHERE-clause
to the UPDATE or DELETE statement, checking for TIMESTAMP's integrity. This can be best illustrated
with an example:
// Change the title and store this book. The MODIFIED value has not been changed since the book was fetched.
// It can be safely updated
book1.setTitle("Animal Farm");
book1.store();
// Book2 still references the original MODIFIED value, but the database holds a new value from book1.store().
// This store() will thus fail:
book2.setTitle("1984");
book2.store();
As before, without the added TIMESTAMP column, optimistic locking is transparent to the API.
Internally, jOOQ will render all the required SQL statements and execute them as a regular JDBC batch
execution.
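A minimal sketch of such a batch execution, assuming several attached BookRecords:
// Stores (inserts or updates) all records in a single JDBC batch
create.batchStore(book1, book2, book3).execute();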
- Adding a central ID generation algorithm, generating UUIDs for all of your records.
- Adding a central record initialisation mechanism, preparing the database prior to inserting a new
record.
@Override
public void insertStart(RecordContext ctx) {
    // e.g. generate and set a UUID value on the record before it is inserted
}
For a full documentation of what RecordListener can do, please consider the RecordListener
Javadoc. Note that RecordListener instances can be registered with a Configuration independently of
ExecuteListeners.
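A minimal registration sketch, assuming a RecordListener implementation called MyRecordListener (e.g. extending org.jooq.impl.DefaultRecordListener):
Configuration configuration = new DefaultConfiguration()
    .set(connection)
    .set(SQLDialect.ORACLE)
    .set(new DefaultRecordListenerProvider(new MyRecordListener()));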
5.13. DAOs
If you're using jOOQ's code generator, you can configure it to generate POJOs and DAOs for you.
jOOQ then generates one DAO per UpdatableRecord, i.e. per table with a single-column primary key.
Generated DAOs implement a common jOOQ type called org.jooq.DAO. This type contains the following
methods:
// These methods allow for updating POJOs based on their primary key
void update(P object) throws DataAccessException;
void update(P... objects) throws DataAccessException;
void update(Collection<P> objects) throws DataAccessException;
// These methods allow for deleting POJOs based on their primary key
void delete(P... objects) throws DataAccessException;
void delete(Collection<P> objects) throws DataAccessException;
void deleteById(T... ids) throws DataAccessException;
void deleteById(Collection<T> ids) throws DataAccessException;
// These methods allow for retrieving POJOs by primary key or by some other field
List<P> findAll() throws DataAccessException;
P findById(T id) throws DataAccessException;
<Z> List<P> fetch(Field<Z> field, Z... values) throws DataAccessException;
<Z> P fetchOne(Field<Z> field, Z value) throws DataAccessException;
Besides these base methods, generated DAO classes implement various useful fetch methods. An
incomplete example is given here, for the BOOK table:
Note that you can further subtype those pre-generated DAO classes, to add more useful DAO methods
to them. Using such a DAO is simple:
// Initialise a Configuration
Configuration configuration = new DefaultConfiguration().set(connection).set(SQLDialect.ORACLE);

// Initialise the generated DAO with that Configuration and fetch a POJO by its primary key
BookDao bookDao = new BookDao(configuration);
Book book = bookDao.findById(5);

// Delete it again
bookDao.delete(book);
- You can issue vendor-specific COMMIT, ROLLBACK and other statements directly in your
database.
- You can call JDBC's Connection.commit(), Connection.rollback() and other methods on your JDBC
driver.
- You can use third-party transaction management libraries like Spring TX. Examples shown in the
jOOQ with Spring examples section.
- You can use a JTA-compliant Java EE transaction manager from your container.
While jOOQ does not aim to replace any of the above, it offers a simple API (and a corresponding SPI) to
provide you with jOOQ-style programmatic fluency to express your transactions. Below are some Java
examples showing how to implement (nested) transactions with jOOQ. For these examples, we're using
Java 8 syntax. Java 8 is not a requirement, though.
create.transaction(configuration -> {
AuthorRecord author =
DSL.using(configuration)
.insertInto(AUTHOR, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
.values("George", "Orwell")
.returning()
.fetchOne();
DSL.using(configuration)
.insertInto(BOOK, BOOK.AUTHOR_ID, BOOK.TITLE)
.values(author.getId(), "1984")
.values(author.getId(), "Animal Farm")
.execute();
});
Note how the lambda expression receives a new, derived configuration that should be used within the
local scope:
create.transaction(configuration -> {
// Refer to the derived configuration within the transaction scope:
DSL.using(configuration).insertInto(...);

// ... but avoid using the scope from outside the transaction:
create.insertInto(...);
create.insertInto(...);
});
Rollbacks
Any uncaught checked or unchecked exception thrown from your transactional code will roll back the
transaction to the beginning of the block. This behaviour will allow for nesting transactions, if your
configured org.jooq.TransactionProvider supports nesting of transactions. An example can be seen
here:
create.transaction(outer -> {
final AuthorRecord author =
DSL.using(outer)
.insertInto(AUTHOR, AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME)
.values("George", "Orwell")
.returning()
.fetchOne();
try {
    // A nested transaction scope. If it fails, only this inner scope is rolled back...
    DSL.using(outer).transaction(inner -> {
        // [...]
    });
}
catch (Exception e) {

    // ... and we can decide whether an exception is "fatal enough" to roll back also the outer transaction
    if (isFatal(e))
        throw e;
}
});
TransactionProvider implementations
By default, jOOQ ships with the org.jooq.impl.DefaultTransactionProvider, which implements
nested transactions using JDBC java.sql.Savepoint. You can, however, implement your own
org.jooq.TransactionProvider and supply that to your Configuration to override jOOQ's default
behaviour. A simple example implementation using Spring's DataSourceTransactionManager can be
seen here:
import org.jooq.Transaction;
import org.jooq.TransactionContext;
import org.jooq.TransactionProvider;
import org.jooq.tools.JooqLogger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.DefaultTransactionDefinition;
@Autowired
DataSourceTransactionManager txMgr;
@Override
public void begin(TransactionContext ctx) {
    log.info("Begin transaction");

    // This TransactionProvider behaves like jOOQ's DefaultTransactionProvider,
    // which supports nested transactions via Savepoints
    TransactionStatus tx = txMgr.getTransaction(new DefaultTransactionDefinition(PROPAGATION_NESTED));
    ctx.transaction(new SpringTransaction(tx));
}
@Override
public void commit(TransactionContext ctx) {
log.info("commit transaction");
txMgr.commit(((SpringTransaction) ctx.transaction()).tx);
}
@Override
public void rollback(TransactionContext ctx) {
log.info("rollback transaction");
txMgr.rollback(((SpringTransaction) ctx.transaction()).tx);
}
}
class SpringTransaction implements Transaction {
    final TransactionStatus tx;

    SpringTransaction(TransactionStatus tx) {
this.tx = tx;
}
}
More information about how to use jOOQ with Spring can be found in the tutorials about jOOQ and
Spring
- All "system exceptions" are unchecked. If in the middle of a transaction involving business logic,
there is no way that you can recover sensibly from a lost database connection, or a constraint
violation that indicates a bug in your understanding of your database model.
- All "business exceptions" are checked. Business exceptions are true exceptions that you should
handle (e.g. not enough funds to complete a transaction).
With jOOQ, it's simple. All of jOOQ's exceptions are "system exceptions", hence they are all unchecked.
jOOQ's DataAccessException
jOOQ uses its own org.jooq.exception.DataAccessException to wrap any underlying
java.sql.SQLException that might have occurred. Note that all methods in jOOQ that may cause such a
DataAccessException document this both in the Javadoc as well as in their method signature.
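For example, a minimal sketch of handling such an unchecked exception:
try {
    create.insertInto(BOOK, BOOK.ID, BOOK.AUTHOR_ID, BOOK.TITLE)
          .values(5, 1, "The Great Gatsby")
          .execute();
}
catch (DataAccessException e) {

    // The wrapped java.sql.SQLException (if any) is available from the cause
    Throwable cause = e.getCause();
}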
DataAccessException is subtyped several times as follows:
5.16. ExecuteListeners
The Configuration lets you specify a list of org.jooq.ExecuteListener instances. The ExecuteListener
is essentially an event listener for Query, Routine, or ResultSet render, prepare, bind, execute, fetch
steps. It is a base type for loggers, debuggers, profilers, data collectors, triggers, etc. Advanced
ExecuteListeners can also provide custom implementations of Connection, PreparedStatement and
ResultSet to jOOQ in appropriate methods.
For convenience and better backwards-compatibility, consider extending
org.jooq.impl.DefaultExecuteListener instead of implementing this interface.
package com.example;

public class StatisticsListener extends DefaultExecuteListener {

    // Statistics counters, per org.jooq.ExecuteType
    public static final Map<ExecuteType, Integer> STATISTICS = new ConcurrentHashMap<>();
/**
* Generated UID
*/
private static final long serialVersionUID = 7399239846062763212L;
@Override
public void start(ExecuteContext ctx) {
STATISTICS.compute(ctx.type(), (k, v) -> v == null ? 1 : v + 1);
}
}
log.info("STATISTICS");
log.info("----------");
import org.jooq.DSLContext;
import org.jooq.ExecuteContext;
import org.jooq.conf.Settings;
import org.jooq.impl.DefaultExecuteListener;
import org.jooq.tools.StringUtils;
/**
* Hook into the query execution lifecycle before executing queries
*/
@Override
public void executeStart(ExecuteContext ctx) {
See also the manual's sections about logging for more sample implementations of actual
ExecuteListeners.
public class DeleteOrUpdateWithoutWhereListener extends DefaultExecuteListener {

    @Override
public void renderEnd(ExecuteContext ctx) {
if (ctx.sql().matches("^(?i:(UPDATE|DELETE)(?!.* WHERE ).*)$")) {
throw new DeleteOrUpdateWithoutWhereException();
}
}
}
You might want to replace the above implementation with a more efficient and more reliable one, of
course.
- java.sql.Connection
- java.sql.Statement
- java.sql.PreparedStatement
- java.sql.CallableStatement
- java.sql.ResultSet
- java.sql.ResultSetMetaData
Optionally, you may even want to implement interfaces, such as java.sql.Array, java.sql.Blob,
java.sql.Clob, and many others. In addition to the above, you might need to find a way to simultaneously
support incompatible JDBC minor versions, such as 4.0, 4.1
As you can see, the configuration setup is simple. Now, the MockDataProvider acts as your single point
of contact with JDBC / jOOQ. It unifies any of these execution modes, transparently:
The above are the execution modes supported by jOOQ. Whether you're using any of jOOQ's various
fetching modes (e.g. pojo fetching, lazy fetching, many fetching, later fetching) is irrelevant, as those
modes are all built on top of the standard JDBC API.
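For reference, a minimal mocking setup might look like this (MyProvider being your MockDataProvider implementation, as sketched below):
// Initialise your data provider
MockDataProvider provider = new MyProvider();
MockConnection connection = new MockConnection(provider);

// Pass the mock connection to a jOOQ DSLContext:
DSLContext create = DSL.using(connection, SQLDialect.ORACLE);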
Implementing MockDataProvider
Now, here's how to implement MockDataProvider:
@Override
public MockResult[] execute(MockExecuteContext ctx) throws SQLException {

    // A DSLContext can be used to create org.jooq.Result and org.jooq.Record objects for mocking
    DSLContext create = DSL.using(SQLDialect.ORACLE);
    MockResult[] mock = new MockResult[1];

    // The execute context contains SQL string(s), bind values, and other meta-data
    String sql = ctx.sql();

    // You decide, whether any given statement returns results, and how many
    if (sql.toUpperCase().startsWith("SELECT")) {
        mock[0] = new MockResult(1, create.newResult(AUTHOR.ID, AUTHOR.LAST_NAME));
    }

    return mock;
}
Essentially, the MockExecuteContext contains all the necessary information for you to decide, what kind
of data you should return. The MockResult wraps up two pieces of information:
You should return as many MockResult objects as there were query executions (in batch mode) or
results (in fetch-many mode). Instead of an awkward JDBC ResultSet, however, you can construct a
"friendlier" org.jooq.Result with your own record types. The jOOQ mock API will use meta data provided
with this Result in order to create the necessary JDBC java.sql.ResultSetMetaData
See the MockDataProvider Javadoc for a list of rules that you should follow.
# All lines with a leading hash are ignored. This is the MockFileDatabase comment syntax
-- SQL comments are parsed and passed to the SQL statement
/* The same is true for multi-line SQL comments */
select 'A';
> A
> -
> A
@ rows: 1
The above syntax consists of the following elements to define an individual statement:
- MockFileDatabase comments are any line with a leading hash ("#") symbol. They are ignored
when reading the file
- SQL comments are part of the SQL statement
- A SQL statement always starts on a new line and ends with a semi colon (;), which is the last
symbol on the line (apart from whitespace)
- If the statement has a result set, it immediately succeeds the SQL statement and is
prefixed by angle brackets and a whitespace ("> "). Any format that is accepted by
DSLContext.fetchFromTXT() is accepted.
- The statement is always terminated by the row count, which is prefixed by an at symbol, the
"rows" keyword, and a colon ("@ rows:").
The above database supports exactly two statements in total, and is completely stateless (e.g. an INSERT
statement cannot be made to affect the results of a subsequent SELECT statement on the same table).
It can be loaded through the MockFileDatabase, which can then be used as follows:
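A minimal sketch of loading such a file (the file path is just an example):
MockFileDatabase db = new MockFileDatabase(new File("/path/to/mock-data.txt"));
DSLContext create = DSL.using(new MockConnection(db), SQLDialect.H2);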
// Queries that are not listed in the MockFileDatabase will simply fail
Result<?> result = create.select(inline("C")).fetch();
In situations where the expected set of queries are well-defined, the MockFileDatabase can offer a very
effective way of mocking parts of the database engine, without offering the complete functionality of
the programmatic mocking connection.
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration).parsingConnection();
Statement s = c.createStatement();
// This syntax is not supported in Oracle, but thanks to the parser and jOOQ,
// it will run on Oracle and produce the expected result
ResultSet rs = s.executeQuery("SELECT * FROM (VALUES (1, 'a'), (2, 'b')) t(x, y)")) {
while (rs.next())
System.out.println("x: " + rs.getInt(1) + ", y: " + rs.getString());
}
x: 1, y: a
x: 2, y: b
5.21. Diagnostics
jOOQ includes a powerful diagnostics SPI, which can be used to detect problems and inefficiencies on
different levels of your database interaction:
Just like the parsing connection, which was documented in the previous section, this functionality
does not depend on using the jOOQ API in a client application, but can expose itself through a JDBC
java.sql.Connection that proxies your real database connection.
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration.derive(new MyDiagnosticsListener()))
.diagnosticsConnection();
Statement s = c.createStatement()) {
The following sections describe each individual event, how it can happen, how and why it should be
remedied.
Why is it bad?
While it is definitely good not to fetch too many rows from a JDBC ResultSet, it would be even better to
communicate to the database that only a limited number of rows are going to be needed in the client,
by using the LIMIT clause. Not only will this prevent the pre-allocation of some resources both in the
client and in the server, but it opens up the possibility of much better execution plans. For instance,
the optimiser may prefer to choose nested loop joins over hash joins if it knows that the loops can be
aborted early.
An example is given here:
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration.derive(new TooManyRows()))
.diagnosticsConnection();
Statement s = c.createStatement()) {
Why is it bad?
The drawbacks of projecting too many columns are manifold:
- Too much data is loaded, cached, transferred between server and client. The overall resource
consumption of a system is too high if too many columns are projected. This can cause orders of
magnitude of overhead in extreme cases!
- Locking could occur in cases where it otherwise wouldn't happen, because two conflicting
queries actually don't really need to touch the same columns.
- The probability of using "covering indexes" (or "index only scans") on some tables decreases
because of the unnecessary projection. This can have drastic effects!
- The probability of applying JOIN elimination decreases, because of the unnecessary projection.
This is particularly true if you're querying views.
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration.derive(new TooManyColumns()))
.diagnosticsConnection();
Statement s = c.createStatement()) {
// On none of the rows, we retrieve the TITLE column, so selecting it would not have been necessary.
while (rs.next())
System.out.println("ID: " + rs.getInt(1));
}
}
Why is it bad?
There are two main problems:
- If the duplicate SQL appears in dynamic SQL (i.e. in generated SQL), then there is an indication
that the database's parser and optimiser may have to do too much work parsing the various
similar (but not identical) SQL queries and finding an execution plan for them, each time. In
fact, it will find the same execution plan most of the time, but with some significant overhead.
Depending on the query complexity, this overhead can easily go from milliseconds into several
seconds, blocking important resources in the database. If duplicate SQL happens at peak load
times, this problem can have a significant impact in production. It never affects your (single user)
development environments, where the overhead of parsing duplicate SQL is manageable.
- If the duplicate SQL appears in static SQL, this can simply indicate that the query was copy
pasted, and you might be able to refactor it. There's probably not any performance issue arising
from duplicate static SQL
// All the duplicate actual statements that have produced the same normalised
// statement in the recent past.
System.out.println("Duplicate statements: " + ctx.duplicateStatements());
}
}
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration.derive(new DuplicateStatements()))
.diagnosticsConnection();
Statement s = c.createStatement();
ResultSet rs = s.executeQuery(sql)) {
while (rs.next()) {
// Consume result set
}
}
}
// Everything is fine with the first execution
run("SELECT title FROM book WHERE id = 1");
// This query is identical to the previous one, differing only in irrelevant white space
run("SELECT title FROM book WHERE id = 1");
// This query is identical to the previous one, differing only in irrelevant additional parentheses
run("SELECT title FROM book WHERE (id = 1)");
// This query is identical to the previous one, differing only in what should be a bind variable
run("SELECT title FROM book WHERE id = 2");
// Everything is fine with the first execution of a new query that has never been seen
run("SELECT title FROM book WHERE id IN (1, 2, 3, 4, 5)");
// This query is identical to the previous one, differing only in what should be bind variables
run("SELECT title FROM book WHERE id IN (1, 2, 3, 4, 5, 6)");
}
Unlike when detecting repeated statements, duplicate statement statistics are performed globally over
all JDBC connections and data sources.
Why is it bad?
This problem is usually referred to as the N+1 problem. A parent entity is loaded (often by an ORM), and
its child entities are loaded lazily. Unfortunately, there were several parent instances, so for each parent
instance, we're now loading a set of child instances, resulting in many many queries. This diagnostic
detects if on the same connection, there is repeated execution of the same statement, even if it is not
exactly identical.
An example is given here:
// All the duplicate actual statements that have produced the same normalised
// statement in the recent past.
System.out.println("Repeated statements: " + ctx.repeatedStatements());
}
}
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration.derive(new RepeatedStatement()))
.diagnosticsConnection();
Statement s1 = c.createStatement();
ResultSet a = s1.executeQuery("SELECT id FROM author WHERE first_name LIKE 'A%'")) {
while (a.next()) {
int id = a.getInt(1);
// This query is run once for every author, when we could have joined the author table
try (PreparedStatement s2 = c.prepareStatement("SELECT title FROM book WHERE author_id = ?")) {
s2.setInt(1, id);
Unlike when detecting repeated statements, repeated statement statistics are performed locally only,
for a single JDBC Connection, or, if possible, for a transaction. Repeated statements in different
transactions are usually not an indication of a problem.
Why is it bad?
There are two misuses that can arise in this area:
- The call to wasNull() wasn't made when it should have been (nullable type, fetched as a primitive
type), possibly resulting in wrong results in the client.
- The call to wasNull() was made too often, or when it did not need to have been made (non-
nullable type, or types fetched as reference types), possibly resulting in a very slight performance
overhead, depending on the driver.
// Configuration is configured with the target DataSource, SQLDialect, etc. for instance Oracle.
try (Connection c = DSL.using(configuration.derive(new WasNull()))
.diagnosticsConnection();
Statement s = c.createStatement()) {
5.22. Logging
jOOQ logs all SQL queries and fetched result sets to its internal DEBUG logger, which is implemented
as an execute listener. By default, execute logging is activated in the jOOQ Settings. In order to see any
DEBUG log output, put either log4j or slf4j on jOOQ's classpath along with their respective configuration.
A sample log4j configuration can be seen here:
<root>
<priority value="debug" />
<appender-ref ref="stdout" />
</root>
</log4j:configuration>
With the above configuration, let's fetch some data with jOOQ
Executing query : select "BOOK"."ID", "BOOK"."TITLE" from "BOOK" order by "BOOK"."ID" asc limit ? offset ?
-> with bind values : select "BOOK"."ID", "BOOK"."TITLE" from "BOOK" order by "BOOK"."ID" asc limit 2 offset 1
Query executed : Total: 1.439ms
Fetched result : +----+------------+
: | ID|TITLE |
: +----+------------+
: | 2|Animal Farm |
: | 3|O Alquimista|
: +----+------------+
Finishing : Total: 4.814ms, +3.375ms
If you wish to use your own logger (e.g. avoiding printing out sensitive data), you can deactivate jOOQ's
logger using your custom settings and implement your own execute listener logger.
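A minimal sketch of deactivating the built-in logger via Settings:
Settings settings = new Settings().withExecuteLogging(false);
DSLContext create = DSL.using(connection, SQLDialect.ORACLE, settings);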
- It takes some time to construct jOOQ queries. If you can reuse the same queries, you might
cache them. Beware of thread-safety issues, though, as jOOQ's Configuration is not necessarily
threadsafe, and queries are "attached" to their creating DSLContext
- It takes some time to render SQL strings. Internally, jOOQ reuses the same
java.lang.StringBuilder for the complete query, but some rendering elements may take
their time. You could, of course, cache SQL generated by jOOQ and prepare your own
java.sql.PreparedStatement objects
- It takes some time to bind values to prepared statements. jOOQ does not keep any open
prepared statements, internally. Use a sophisticated connection pool, that will cache prepared
statements and inject them into jOOQ through the standard JDBC API
- It takes some time to fetch results. By default, jOOQ will always fetch the complete
java.sql.ResultSet into memory. Use lazy fetching to prevent that, and scroll over an open
underlying database cursor
Optimise wisely
Don't be put off by the above paragraphs. You should optimise wisely, i.e. only in places where you really
need very high throughput to your database. jOOQ's overhead compared to plain JDBC is typically less
than 1ms per query.
- Variable binding
- Result mapping
- Exception handling
When adding jOOQ to a project that is using JdbcTemplate extensively, a pragmatic first step is to use
jOOQ as a SQL builder and pass the query string and bind variables to JdbcTemplate for execution. For
instance, you may have the following class to store authors and their number of books in our stores:
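Such a class might be a simple holder like this sketch (the name and fields follow the usage below):
public class AuthorAndBooks {
    public final String firstName;
    public final String lastName;
    public final int books;

    public AuthorAndBooks(String firstName, String lastName, int books) {
        this.firstName = firstName;
        this.lastName = lastName;
        this.books = books;
    }
}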
// But instead of executing the above query, we'll send the SQL string and the bind values to JdbcTemplate:
JdbcTemplate template = new JdbcTemplate(dataSource);
List<AuthorAndBooks> result = template.query(
query.getSQL(),
query.getBindValues().toArray(),
(r, i) -> new AuthorAndBooks(
r.getString(1),
r.getString(2),
r.getInt(3)
));
This approach helps you gradually migrate from using JdbcTemplate to a jOOQ-only execution model.
@Entity
@Table(name = "book")
public class JPABook {
@Id
public int id;
@Column(name = "title")
public String title;
@ManyToOne
public JPAAuthor author;
@Override
public String toString() {
return "JPABook [id=" + id + ", title=" + title + ", author=" + author + "]";
}
}
@Entity
@Table(name = "author")
public class JPAAuthor {
@Id
public int id;
@Column(name = "first_name")
public String firstName;
@Column(name = "last_name")
public String lastName;
@OneToMany(mappedBy = "author")
public Set<JPABook> books;
@Override
public String toString() {
return "JPAAuthor [id=" + id + ", firstName=" + firstName +
", lastName=" + lastName + ", book size=" + books.size() + "]";
}
}
return result.getResultList();
}
Note, if you're using custom data types or bindings, make sure to take those into account as well. E.g.
as follows:
return result.getResultList();
}
This way, you can construct complex, type safe queries using the jOOQ API and have your
javax.persistence.EntityManager execute it with all the transaction semantics attached:
List<Object[]> books =
nativeQuery(em, DSL.using(configuration)
.select(AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME, BOOK.TITLE)
.from(AUTHOR)
.join(BOOK).on(AUTHOR.ID.eq(BOOK.AUTHOR_ID))
.orderBy(BOOK.ID));
books.forEach((Object[] book) -> System.out.println(book[0] + " " + book[1] + " wrote " + book[2]));
public static <E> List<E> nativeQuery(EntityManager em, org.jooq.Query query, Class<E> type) {
    // Create a native JPA query from the SQL rendered by jOOQ, and bind all values to it
    Query result = em.createNativeQuery(query.getSQL(), type);

    List<Object> values = query.getBindValues();
    for (int i = 0; i < values.size(); i++)
        result.setParameter(i + 1, values.get(i));

    // There's an unsafe cast here, but we can be sure that we'll get the right type from JPA
    return result.getResultList();
}
Note, if you're using custom data types or bindings, make sure to take those into account as well. E.g.
as follows:
return result.getResultList();
}
With the above simple API, we're ready to write complex jOOQ queries and map their results to JPA
entities:
List<JPAAuthor> authors =
nativeQuery(em,
DSL.using(configuration)
.select()
.from(AUTHOR)
.orderBy(AUTHOR.ID)
, JPAAuthor.class);
authors.forEach(author -> {
System.out.println(author.firstName + " " + author.lastName + " wrote");
author.books.forEach(book -> {
System.out.println(" " + book.title);
});
});
@SqlResultSetMapping(
name = "bookmapping",
entities = {
@EntityResult(
entityClass = JPABook.class,
fields = {
@FieldResult(name = "id", column = "b_id"),
@FieldResult(name = "title", column = "b_title"),
@FieldResult(name = "author", column = "b_author_id")
}
),
@EntityResult(
entityClass = JPAAuthor.class,
fields = {
@FieldResult(name = "id", column = "a_id"),
@FieldResult(name = "firstName", column = "a_first_name"),
@FieldResult(name = "lastName", column = "a_last_name")
}
)
}
)
With the above boilerplate in place, we can now fetch entities using jOOQ and JPA:
public static <E> List<E> nativeQuery(EntityManager em, org.jooq.Query query, String resultSetMapping) {
return result.getResultList();
}
Note, if you're using custom data types or bindings, make sure to take those into account as well. E.g.
as follows:
public static <E> List<E> nativeQuery(EntityManager em, org.jooq.Query query, String resultSetMapping) {
return result.getResultList();
}
List<Object[]> result =
nativeQuery(em,
DSL.using(configuration)
.select(
AUTHOR.ID.as("a_id"),
AUTHOR.FIRST_NAME.as("a_first_name"),
AUTHOR.LAST_NAME.as("a_last_name"),
BOOK.ID.as("b_id"),
BOOK.AUTHOR_ID.as("b_author_id"),
BOOK.TITLE.as("b_title")
)
.from(AUTHOR)
.join(BOOK).on(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
.orderBy(BOOK.ID)),
"bookmapping" // The name of the SqlResultSetMapping
);
- We have to reference the result set mapping by name (a String) - there is no type safety involved
here
- We don't know the type contained in the resulting List - there is a potential for
ClassCastException
- The results are in fact a list of Object[], with the individual entities listed in the array, which need
explicit casting
6. Code generation
While optional, source code generation is one of jOOQ's main assets if you wish to increase developer
productivity. jOOQ's code generator takes your database schema and reverse-engineers it into a set of
Java classes modelling tables, records, sequences, POJOs, DAOs, stored procedures, user-defined types
and many more.
The essential ideas behind source code generation are these:
- Increased IDE support: Type your Java code directly against your database schema, with all type
information available
- Type-safety: When your database schema changes, your generated code will change as well.
Removing columns will lead to compilation errors, which you can detect early.
The following chapters will show how to configure the code generator and how to generate various
artefacts.
- jooq-3.11.9.jar
The main library that you will include in your application to run jOOQ
- jooq-meta-3.11.9.jar
The utility that you will include in your build to navigate your database schema for code
generation. This can be used as a schema crawler as well.
- jooq-codegen-3.11.9.jar
The utility that you will include in your build to generate Java code from your database schema
<!-- You can also pass user/password and other JDBC properties in the optional properties tag: -->
<properties>
<property><key>user</key><value>[db-user]</value></property>
<property><key>password</key><value>[db-password]</value></property>
</properties>
</jdbc>
<generator>
<database>
<!-- The database dialect from jooq-meta. Available dialects are
named org.jooq.meta.[database].[database]Database.
org.jooq.meta.ase.ASEDatabase
org.jooq.meta.auroramysql.AuroraMySQLDatabase
org.jooq.meta.aurorapostgres.AuroraPostgresDatabase
org.jooq.meta.cubrid.CUBRIDDatabase
org.jooq.meta.db2.DB2Database
org.jooq.meta.derby.DerbyDatabase
org.jooq.meta.firebird.FirebirdDatabase
org.jooq.meta.h2.H2Database
org.jooq.meta.hana.HANADatabase
org.jooq.meta.hsqldb.HSQLDBDatabase
org.jooq.meta.informix.InformixDatabase
org.jooq.meta.ingres.IngresDatabase
org.jooq.meta.mariadb.MariaDBDatabase
org.jooq.meta.mysql.MySQLDatabase
org.jooq.meta.oracle.OracleDatabase
org.jooq.meta.postgres.PostgresDatabase
org.jooq.meta.redshift.RedshiftDatabase
org.jooq.meta.sqldatawarehouse.SQLDataWarehouseDatabase
org.jooq.meta.sqlite.SQLiteDatabase
org.jooq.meta.sqlserver.SQLServerDatabase
org.jooq.meta.sybase.SybaseDatabase
org.jooq.meta.teradata.TeradataDatabase
org.jooq.meta.vertica.VerticaDatabase
This value can be used to reverse-engineer generic JDBC DatabaseMetaData (e.g. for MS Access)
org.jooq.meta.jdbc.JDBCDatabase
org.jooq.meta.xml.XMLDatabase
This value can be used to reverse-engineer schemas defined by SQL files (requires jooq-meta-extensions dependency)
org.jooq.meta.extensions.ddl.DDLDatabase
This value can be used to reverse-engineer schemas defined by JPA annotated entities (requires jooq-meta-extensions
dependency)
org.jooq.meta.extensions.jpa.JPADatabase
<!-- All elements that are generated from your schema (A Java regular expression.
Use the pipe to separate several expressions) Watch out for
case-sensitivity. Depending on your database, this might be
important!
You can create case-insensitive regular expressions using this syntax: (?i:expr) -->
<includes>.*</includes>
<!-- All elements that are excluded from your schema (A Java regular expression.
Use the pipe to separate several expressions). Excludes match before
includes, i.e. excludes have a higher priority -->
<excludes>
UNUSED_TABLE # This table (unqualified name) should not be generated
| PREFIX_.* # Objects with a given prefix should not be generated
| SECRET_SCHEMA\.SECRET_TABLE # This table (qualified name) should not be generated
| SECRET_ROUTINE # This routine (unqualified name) ...
</excludes>
<!-- The schema that is used locally as a source for meta information.
This could be your development schema or the production schema, etc
This cannot be combined with the schemata element.
If left empty, jOOQ will generate all available schemata. See the
manual's next section to learn how to generate several schemata -->
<inputSchema>[your database schema / owner / name]</inputSchema>
</database>
<generate>
<!-- Generation flags: See advanced configuration properties -->
</generate>
<target>
<!-- The destination package of your generated classes (within the
destination directory). jOOQ may append the schema name to this package
if generating multiple schemas -->
<packageName>[your package name]</packageName>

<!-- The destination directory of your generated classes -->
<directory>[your target directory]</directory>
</target>
</generator>
</configuration>
There are also lots of advanced configuration parameters, which will be treated in the manual's
section about advanced code generation features. Note that you can find the official XSD file for a formal
specification at:
http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd
In order to run the code generation standalone with the above configuration:
- Put the XML configuration file, the jooq*.jar artefacts and the JDBC driver into a directory, e.g. C:\temp\jooq
- Go to C:\temp\jooq
- Run java -cp jooq-3.11.9.jar;jooq-meta-3.11.9.jar;jooq-codegen-3.11.9.jar;[JDBC-driver].jar;.
  org.jooq.codegen.GenerationTool /[XML file]
Note:
- this example uses jOOQ's log4j support by adding log4j.xml and log4j.jar to the project classpath.
- the actual jooq-3.11.9.jar, jooq-meta-3.11.9.jar, jooq-codegen-3.11.9.jar artefacts may contain
version numbers in the file names.
Once the project is set up correctly with all required artefacts on the classpath, you can configure an
Eclipse Run Configuration for org.jooq.codegen.GenerationTool, passing the path to the configuration
file (e.g. /jooq-config.xml) as a program argument.
Finally, run the code generation and see your generated artefacts.
<plugin>
<!-- The plugin should hook into the generate goal -->
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>3.11.9</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
<!-- Manage the plugin's dependency. In this example, we'll use a PostgreSQL database -->
<dependencies>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.4.1212</version>
</dependency>
</dependencies>
</plugin>
See a more complete example of a Maven pom.xml file in the jOOQ / Spring tutorial.
6.2.1. Logging
This optional top level configuration element simply allows for overriding the log level of anything that
has been specified by the runtime, e.g. in log4j or slf4j and is helpful for per-code-generation log
configuration. For example, in order to mute everything that is less than WARN level, write:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<logging>WARN</logging>
...
</configuration>
Programmatic configuration
configuration.withLogging(Logging.WARN);
Gradle configuration
myConfigurationName(sourceSets.main) {
logging = 'WARN'
}
The following levels are supported:
- TRACE
- DEBUG
- INFO
- WARN
- ERROR
- FATAL
6.2.2. Jdbc
This optional top level configuration element allows for configuring a JDBC connection. By default, the
jOOQ code generator requires an active JDBC connection to reverse engineer your database schema.
For example, if you want to connect to a MySQL database, write this:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<jdbc>
<driver>com.mysql.jdbc.Driver</driver>
<url>jdbc:mysql://localhost/testdb</url>
<!--
<username/> is a valid synonym for <user/>
-->
<user>root</user>
<password>secret</password>
</jdbc>
...
</configuration>
Programmatic configuration
configuration
.withJdbc(new Jdbc()
.withDriver("com.mysql.jdbc.Driver")
.withUrl("jdbc:mysql://localhost/testdb")
.withUser("root")
.withPassword("secret"));
Note that when using the programmatic configuration API through the GenerationTool, you can also
pass a pre-existing JDBC connection to the GenerationTool and leave this configuration element alone.
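When going this route, a minimal sketch (not taken from the manual; the connection URL, schema, package and directory are example values) could look like this:
import java.sql.Connection;
import java.sql.DriverManager;

import org.jooq.codegen.GenerationTool;
import org.jooq.meta.jaxb.Configuration;
import org.jooq.meta.jaxb.Database;
import org.jooq.meta.jaxb.Generator;
import org.jooq.meta.jaxb.Target;

public class GenerateWithExistingConnection {
    public static void main(String[] args) throws Exception {

        // The connection is managed by the caller, so no <jdbc/> element is needed
        try (Connection connection = DriverManager.getConnection(
                "jdbc:mysql://localhost/testdb", "root", "secret")) {

            Configuration configuration = new Configuration()
                .withGenerator(new Generator()
                    .withDatabase(new Database()
                        .withInputSchema("testdb"))
                    .withTarget(new Target()
                        .withPackageName("org.example.generated")
                        .withDirectory("target/generated-sources/jooq")));

            // Reuse the existing connection instead of letting jOOQ create one
            GenerationTool tool = new GenerationTool();
            tool.setConnection(connection);
            tool.run(configuration);
        }
    }
}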
Gradle configuration
myConfigurationName(sourceSets.main) {
jdbc {
driver = 'com.mysql.jdbc.Driver'
url = 'jdbc:mysql://localhost/testdb'
user = 'root'
password = 'secret'
}
}
Optional JDBC connection properties (such as user and password) can also be passed to the JDBC driver through the <properties/> element:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<jdbc>
<driver>com.mysql.jdbc.Driver</driver>
<url>jdbc:mysql://localhost/testdb</url>
<properties>
<property>
<key>user</key>
<value>root</value>
</property>
<property>
<key>password</key>
<value>secret</value>
</property>
</properties>
</jdbc>
...
</configuration>
Programmatic configuration
configuration
.withJdbc(new Jdbc()
.withDriver("com.mysql.jdbc.Driver")
.withUrl("jdbc:mysql://localhost/testdb")
.withProperties(
new Property().withKey("user").withValue("root"),
new Property().withKey("password").withValue("secret")));
Gradle configuration
myConfigurationName(sourceSets.main) {
jdbc {
driver = 'com.mysql.jdbc.Driver'
url = 'jdbc:mysql://localhost/testdb'
properties {
property {
key = 'user'
value = 'root'
}
property {
key = 'password'
value = 'secret'
}
}
}
}
In a Maven build, these values can also be externalised, for instance as Maven properties:
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<jdbc>
<driver>${db.driver}</driver>
<url>${db.url}</url>
<user>${db.user}</user>
<password>${db.password}</password>
</jdbc>
...
</configuration>
6.2.3. Generator
This mandatory top level configuration element wraps all the remaining configuration elements related
to code generation, including the overridable code generator class.
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<!-- Optional: The fully qualified class name of the code generator. -->
<name>...</name>
<!-- Optional: The fully qualified class name of the generator strategy. -->
<strategy>...</strategy>
<!-- Optional: The jooq-meta configuration, configuring the information schema source. -->
<database>...</database>
<!-- Optional: The jooq-codegen configuration, configuring the generated output content. -->
<generate>...</generate>
<!-- Optional: The generation output target, configuring where generated code is located. -->
<target>...</target>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withName("...")
.withStrategy(new Strategy())
.withDatabase(new Database())
.withGenerate(new Generate())
.withTarget(new Target()));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
name = '...'
strategy {
...
}
database {
...
}
generate {
...
}
target {
...
}
}
}
Specifying a strategy
jOOQ by default applies standard Java naming schemes: PascalCase for classes, camelCase for
members, methods, variables, parameters, UPPER_CASE_WITH_UNDERSCORES for constants and other
literals. This may not be the desired default for your database, e.g. when you strongly rely on case-
sensitive naming and if you wish to be able to search for names both in your Java code and in your
database code (scripts, views, stored procedures) uniformly. For that purpose, you can override the
<strategy/> element with your own implementation, either:
- programmatically
- configuratively
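For example, a custom strategy can be registered through the programmatic equivalent of the <strategy/> element. This is only a sketch; the strategy class name refers to the AsInDatabaseStrategy example shown in the section about custom generator strategies below:
configuration
    .withGenerator(new Generator()
        .withStrategy(new Strategy()
            .withName("com.example.AsInDatabaseStrategy"))
        .withDatabase(new Database()));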
6.2.4. Database
This element wraps all the configuration elements that are used for the jooq-meta module, which reads
the configured database meta data. In its simplest form, it can be left empty, in which case meaningful
defaults apply.
The two main elements in the <database/> element are <name/> and <properties/>, which specify the
class implementing the database meta data source, and an optional list of key/value parameters, which
are described in the next chapter. For example, the XMLDatabase can be configured as follows:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<name>org.jooq.meta.xml.XMLDatabase</name>
<properties>
<property>
<key>dialect</key>
<value>MYSQL</value>
</property>
<property>
<key>xml-file</key>
<value>/path/to/database.xml</value>
</property>
</properties>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withName("org.jooq.meta.xml.XMLDatabase")
.withProperties(
new Property().withKey("dialect").withValue("MYSQL"),
new Property().withKey("xml-file").withValue("/path/to/database.xml"))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
name = 'org.jooq.meta.xml.XMLDatabase'
properties {
property {
key = 'dialect'
value = 'MYSQL'
}
property {
key = 'xml-file'
value = '/path/to/database.xml'
}
}
}
}
}
If no <name/> is supplied, the default will be derived from the JDBC connection. If you want to
explicitly specify your SQL dialect's database name, any of these values will be supported by jOOQ,
out of the box:
- org.jooq.meta.ase.ASEDatabase
- org.jooq.meta.cubrid.CUBRIDDatabase
- org.jooq.meta.db2.DB2Database
- org.jooq.meta.derby.DerbyDatabase
- org.jooq.meta.firebird.FirebirdDatabase
- org.jooq.meta.h2.H2Database
- org.jooq.meta.hana.HANADatabase
- org.jooq.meta.hsqldb.HSQLDBDatabase
- org.jooq.meta.informix.InformixDatabase
- org.jooq.meta.ingres.IngresDatabase
- org.jooq.meta.mariadb.MariaDBDatabase
- org.jooq.meta.mysql.MySQLDatabase
- org.jooq.meta.oracle.OracleDatabase
- org.jooq.meta.postgres.PostgresDatabase
- org.jooq.meta.redshift.RedshiftDatabase
- org.jooq.meta.sqlite.SQLiteDatabase
- org.jooq.meta.sqlserver.SQLServerDatabase
- org.jooq.meta.sybase.SybaseDatabase
- org.jooq.meta.vertica.VerticaDatabase
Alternatively, you can also specify the following database if you want to reverse-engineer a generic JDBC
java.sql.DatabaseMetaData source for an unsupported database version / dialect / etc:
- org.jooq.meta.jdbc.JDBCDatabase
Furthermore, there are two out-of-the-box database meta data sources that do not rely on a JDBC
connection: the JPADatabase (to reverse engineer JPA annotated entities) and the XMLDatabase (to
reverse engineer an XML file). Please refer to the respective sections for more details.
Last, but not least, you can of course implement your own by implementing org.jooq.meta.Database
from the jooq-meta module.
6.2.4.2. RegexFlags
A lot of configuration elements rely on regular expressions. The most prominent examples are the useful
includes and excludes elements. All of these regular expressions use the Java java.util.regex.Pattern API,
with all of its features. The Pattern API allows for specifying flags and for your configuration convenience,
the applied flags are, by default:
- COMMENTS
- CASE_INSENSITIVE
But of course, this default setting may get in your way. For instance, if you rely a lot on case-sensitive
identifiers and whitespace in identifiers, it might be better for you to turn off the above defaults:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<regexFlags>COMMENTS DOTALL</regexFlags>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withRegexFlags("COMMENTS DOTALL")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
regexFlags = 'COMMENTS DOTALL'
}
}
}
All flags available from java.util.regex.Pattern can be specified as a whitespace-separated list.
The <includes/> and <excludes/> elements match all of the following object types:
- Array types
- Domains
- Enums
- Links
- Packages
- Queues
- Routines
- Sequences
- Tables
- UDTs
Excludes match before includes, meaning that something that has been excluded cannot be included
again. Remember, these expressions are regular expressions with default flags, so multiple names need
to be separated with the pipe symbol "|", not with commas, etc. For example:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<includes>.*</includes>
<excludes>
UNUSED_TABLE # This table (unqualified name) should not be generated
| PREFIX_.* # Objects with a given prefix should not be generated
| SECRET_SCHEMA\.SECRET_TABLE # This table (qualified name) should not be generated
| SECRET_ROUTINE # This routine (unqualified name) ...
</excludes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withIncludes(".*")
.withExcludes("UNUSED_TABLE|PREFIX_.*|SECRET_SCHEMA\\.SECRET_TABLE|SECRET_ROUTINE")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
includes = '.*'
excludes = 'UNUSED_TABLE|PREFIX_.*|SECRET_SCHEMA\\.SECRET_TABLE|SECRET_ROUTINE'
}
}
}
A special, additional option allows for specifying whether the above two regular expressions should also
match table columns. The following example will hide an INVISIBLE_COL in any table (and also tables
called this way, of course):
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<includes>.*</includes>
<excludes>INVISIBLE_COL</excludes>
<includeExcludeColumns>true</includeExcludeColumns>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withIncludes(".*")
.withExcludes("INVISIBLE_COL")
.withIncludeExcludeColumns(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
includes = '.*'
excludes = 'INVISIBLE_COL'
includeExcludeColumns = true
}
}
}
In addition to the regular expressions, the generation of specific object types can be turned on and off individually:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<includeTables>true</includeTables>
<includeRoutines>true</includeRoutines>
<includePackages>true</includePackages>
<includePackageRoutines>true</includePackageRoutines>
<includePackageUDTs>true</includePackageUDTs>
<includePackageConstants>true</includePackageConstants>
<includeUDTs>true</includeUDTs>
<includeSequences>false</includeSequences>
<includePrimaryKeys>false</includePrimaryKeys>
<includeUniqueKeys>false</includeUniqueKeys>
<includeForeignKeys>false</includeForeignKeys>
<includeIndexes>false</includeIndexes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withIncludeTables(true)
.withIncludeRoutines(true)
.withIncludePackages(true)
.withIncludePackageRoutines(true)
.withIncludePackageUDTs(true)
.withIncludePackageConstants(true)
.withIncludeUDTs(true)
.withIncludeSequences(false)
.withIncludePrimaryKeys(false)
.withIncludeUniqueKeys(false)
.withIncludeForeignKeys(false)
.withIncludeIndexes(false)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
includeTables = true
includeRoutines = true
includePackages = true
includePackageRoutines = true
includePackageUDTs = true
includePackageConstants = true
includeUDTs = true
includeSequences = false
includePrimaryKeys = false
includeUniqueKeys = false
includeForeignKeys = false
includeIndexes = false
}
}
}
jOOQ's UpdatableRecords support an optimistic locking feature, which can be enabled in the code generator
by specifying regular expressions that match record version and record timestamp fields. These regular
expressions should match at most one column per table, again either by their fully qualified names
(catalog.schema.table.column_name) or by their names only (column_name):
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<recordVersionFields>REC_VERSION</recordVersionFields>
<recordTimestampFields>REC_TIMESTAMP</recordTimestampFields>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withRecordVersionFields("REC_VERSION")
.withRecordTimestampFields("REC_TIMESTAMP")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
recordVersionFields = 'REC_VERSION'
recordTimestampFields = 'REC_TIMESTAMP'
}
}
}
Note again that these expressions are regular expressions with default flags.
Synthetic identities: columns matched by the following regular expression are treated by the code
generator as if they were identity ("auto-increment") columns, even if the database does not declare
them as such:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<syntheticIdentities>SCHEMA\.TABLE\.ID</syntheticIdentities>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withSyntheticIdentities("SCHEMA\\.TABLE\\.ID")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
syntheticIdentities = 'SCHEMA\\.TABLE\\.ID'
}
}
}
Note again that these expressions are regular expressions with default flags.
Synthetic primary keys: columns matched by the following regular expression are treated by the code
generator as if they formed the table's primary key, even if no such primary key exists in the database:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<syntheticPrimaryKeys>SCHEMA\.TABLE\.COLUMN(1|2)</syntheticPrimaryKeys>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withSyntheticPrimaryKeys("SCHEMA\\.TABLE\\.COLUMN(1|2)")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
syntheticPrimaryKeys = 'SCHEMA\\.TABLE\\.COLUMN(1|2)'
}
}
}
If the regular expression matches a column in a table that already has an existing primary key, that existing
primary key will be replaced by the synthetic one. It will still be reported as a unique key, though.
Note again that these expressions are regular expressions with default flags.
Overriding primary keys: a unique key whose name matches the following regular expression will be
promoted by the code generator to act as the table's primary key:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<overridePrimaryKeys>MY_UNIQUE_KEY_NAME</overridePrimaryKeys>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withOverridePrimaryKeys("MY_UNIQUE_KEY_NAME")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
overridePrimaryKeys = 'MY_UNIQUE_KEY_NAME'
}
}
}
If several keys match, a warning is emitted and the first one encountered will be used. This flag will also
replace synthetic primary keys, if it matches.
Note again that these expressions are regular expressions with default flags.
Date as timestamp: in some databases (e.g. Oracle), the SQL DATE type also contains a time part. The
following flag tells the code generator to generate all DATE columns as if they were TIMESTAMP columns:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<dateAsTimestamp>true</dateAsTimestamp>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withDateAsTimestamp(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
dateAsTimestamp = true
}
}
}
This flag will apply before any other data type related flags are applied, including forced types.
Ignore procedure return values: in some dialects (e.g. SQL Server), stored procedures implicitly return an
integer value that is rarely of interest. The following flag tells the code generator to ignore such return values:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<ignoreProcedureReturnValues>true</ignoreProcedureReturnValues>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withIgnoreProcedureReturnValues(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
ignoreProcedureReturnValues = true
}
}
}
Unsigned types: some databases support unsigned integer types, which have no equivalent in the JDK.
jOOQ ships with dedicated wrapper types for these:
- org.jooq.types.UByte
- org.jooq.types.UShort
- org.jooq.types.UInteger
- org.jooq.types.ULong
Those types work just like ordinary java.lang.Number wrapper types, except that there is no primitive
version of them. The configuration looks as follows:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<unsignedTypes>true</unsignedTypes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withUnsignedTypes(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
unsignedTypes = true
}
}
}
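For illustration, these wrapper types are used like any other java.lang.Number (the values below are just examples):
import java.math.BigInteger;

import org.jooq.types.UInteger;
import org.jooq.types.ULong;

public class UnsignedTypesExample {
    public static void main(String[] args) {

        // Values that do not fit into the signed counterparts
        UInteger u = UInteger.valueOf(4000000000L);
        ULong max = ULong.valueOf(new BigInteger("18446744073709551615"));

        System.out.println(u.longValue());       // 4000000000
        System.out.println(max.toBigInteger());  // 2^64 - 1
    }
}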
The <inputCatalog/> and <inputSchema/> elements serve two purposes:
- They allow for specifying one or more catalogs (default: all catalogs) as well as one or more
schemas (default: all schemas) for inclusion in the code generator. This works in a similar fashion
to the includes and excludes elements, but it is applied at an earlier stage.
- Once all "input" catalogs and schemas are specified, they can each be associated with a
matching "output" catalog or schema, in which case the "input" will be mapped to the "output"
by the code generator. For more details about this, please refer to the manual section about
schema mapping.
There are two ways to configure "input" and "output" catalogs and schemas: "top level"
and "nested". Note that catalogs are only supported in very few databases, so usually, users will only
use the "input" and "output" schema feature.
Top level configurations
XML configuration (standalone and Maven)
<!-- Read only a single schema (from all catalogs, but in most databases, there is only one "default catalog") -->
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<inputSchema>my_schema</inputSchema>
</database>
</generator>
</configuration>
<!-- Read only a single catalog and all its schemas -->
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<inputCatalog>my_catalog</inputCatalog>
</database>
</generator>
</configuration>
<!-- Read only a single catalog and only a single schema -->
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<inputCatalog>my_catalog</inputCatalog>
<inputSchema>my_schema</inputSchema>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withInputCatalog("my_catalog")
.withInputSchema("my_schema")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
inputCatalog = 'my_catalog'
inputSchema = 'my_schema'
}
}
}
Nested configurations
This mode is preferable for larger projects where several catalogs and/or schemas need to be included.
The following examples show different possible configurations:
XML configuration (standalone and Maven)
<!-- Read two schemas (from all catalogs, but in most databases, there is only one "default catalog") -->
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<schemata>
<schema>
<inputSchema>schema1</inputSchema>
</schema>
<schema>
<inputSchema>schema2</inputSchema>
</schema>
</schemata>
</database>
</generator>
</configuration>
Programmatic configuration
Gradle configuration
<!-- Map input names to the "default" catalog or schema (i.e. no name): -->
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<inputCatalog>my_input_catalog</inputCatalog>
<outputCatalogToDefault>true</outputCatalogToDefault>
<inputSchema>my_input_schema</inputSchema>
<outputSchemaToDefault>true</outputSchemaToDefault>
</database>
</generator>
</configuration>
Programmatic configuration
Gradle configuration
For more information about the catalog and schema mapping feature, please refer to the relevant
section of the manual.
These schema versions will be generated into the javax.annotation.Generated annotation on generated
artefacts.
Catalog and schema version providers can be used to generate a version for each catalog / schema into
the generated code, e.g. by running a SQL query that receives the :catalog_name or :schema_name bind variable:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<catalogVersionProvider>SELECT :catalog_name || '_' || MAX("version") FROM "schema_version"</catalogVersionProvider>
<schemaVersionProvider>SELECT :schema_name || '_' || MAX("version") FROM "schema_version"</schemaVersionProvider>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withCatalogVersionProvider("SELECT :catalog_name || '_' || MAX(\"version\") FROM \"schema_version\"")
.withSchemaVersionProvider("SELECT :schema_name || '_' || MAX(\"version\") FROM \"schema_version\"")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
catalogVersionProvider = 'SELECT :catalog_name || \'_\' || MAX("version") FROM "schema_version"'
schemaVersionProvider = 'SELECT :schema_name || \'_\' || MAX("version") FROM "schema_version"'
}
}
}
By default, the code generator applies the following ordering to generated code:
- Catalogs, schemas, tables, user-defined types, packages, routines, sequences, constraints are
ordered alphabetically
- Table columns, user-defined type attributes, routine parameters are ordered in their order of
definition
Sometimes, it may be desirable to override this default ordering with a custom one. In particular,
the default ordering is case-sensitive, whereas case-insensitive ordering may be preferable at
times. Users may define an order provider by specifying a fully qualified class on the code generator's
class path, which must implement java.util.Comparator<org.jooq.meta.Definition> as follows:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<orderProvider>com.example.CaseInsensitiveOrderProvider</orderProvider>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withOrderProvider("com.example.CaseInsensitiveOrderProvider")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
orderProvider = 'com.example.CaseInsensitiveOrderProvider'
}
}
}
package com.example;

import java.util.Comparator;

import org.jooq.meta.Definition;

public class CaseInsensitiveOrderProvider implements Comparator<Definition> {
    @Override
    public int compare(Definition o1, Definition o2) {
        // Order all definitions case-insensitively by their qualified names
        return o1.getQualifiedName().compareToIgnoreCase(o2.getQualifiedName());
    }
}
While changing the order of "top level types" (like tables) is irrelevant to the jOOQ runtime, there may
be some side-effects to changing the order of table columns, user-defined type attributes, routine
parameters, as the database might expect the exact same order as is defined in the database. In order
to only change the ordering for tables, the following order provider can be implemented instead:
package com.example;

import java.util.Comparator;

import org.jooq.meta.Definition;
import org.jooq.meta.TableDefinition;

// Example (the class name is hypothetical): reorder tables only, leaving all other definitions untouched
public class CaseInsensitiveTableOrderProvider implements Comparator<Definition> {
    @Override
    public int compare(Definition o1, Definition o2) {
        if (o1 instanceof TableDefinition && o2 instanceof TableDefinition)
            return o1.getQualifiedName().compareToIgnoreCase(o2.getQualifiedName());

        // Keep the existing order for everything else (sorting is stable)
        return 0;
    }
}
Forced types allow for overriding the data types of matching columns in one of three ways:
- By rewriting them to some other data type using the data type rewriting feature.
- By mapping them to some user type using the data type converter feature and a custom
org.jooq.Converter.
- By mapping them to some user type using the data type binding feature and a custom
org.jooq.Binding.
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<!-- The first matching forcedType will be applied to the data type definition. -->
<forcedTypes>
<forcedType>
<!-- Specify any data type that is supported in your database, or if unsupported, a type from org.jooq.impl.SQLDataType -->
<name>BOOLEAN</name>
<!-- Add a Java regular expression matching fully-qualified columns. Use the pipe to separate several expressions.
If provided, both "expressions" and "types" must match. -->
<expression>.*\.IS_VALID</expression>
<!-- Add a Java regular expression matching data types to be forced to have this type. -->
<types>.*</types>
<!-- Force a type on ALL / NULL (nullable) / NOT_NULL (non-nullable) types -->
<nullability>ALL</nullability>
</forcedType>
</forcedTypes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
// The first matching forcedType will be applied to the data type definition.
.withForcedTypes(new ForcedType()
.withName("BOOLEAN")
.withExpression(".*\\.IS_VALID")
.withTypes(".*")
.withNullability(Nullability.ALL))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
// The first matching forcedType will be applied to the data type definition.
forcedTypes {
forcedType {
name = 'BOOLEAN'
expression = '.*\\.IS_VALID'
types = '.*'
nullability = 'ALL'
}
}
}
}
}
Forced types can also apply a custom data type converter to matching columns:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<!-- The first matching forcedType will be applied to the data type definition. -->
<forcedTypes>
<forcedType>
<!-- Specify the Java type of your custom type. This corresponds to the Converter's <U> type. -->
<userType>java.time.Instant</userType>
<!-- Associate that custom type with your converter. -->
<converter>com.example.LongToInstantConverter</converter>
<!-- Add a Java regular expression matching fully-qualified columns. Use the pipe to separate several expressions. -->
<expression>.*\.DATE_OF_.*</expression>
<!-- Add a Java regular expression matching data types to be forced to have this type. -->
<types>.*</types>
</forcedType>
</forcedTypes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
// The first matching forcedType will be applied to the data type definition.
.withForcedTypes(new ForcedType()
.withUserType("java.time.Instant")
.withConverter("com.example.LongToInstantConverter")
.withExpression(".*\\.DATE_OF_.*")
.withTypes(".*"))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
// The first matching forcedType will be applied to the data type definition.
forcedTypes {
forcedType {
userType = 'java.time.Instant'
converter = 'com.example.LongToInstantConverter'
expression = '.*\\.DATE_OF_.*'
types = '.*'
}
}
}
}
}
For more information about using converters, please refer to the manual's section about custom data
type conversion.
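The com.example.LongToInstantConverter class referenced above is not shown in this section. A minimal sketch of such a converter, assuming the database column is a BIGINT holding epoch milliseconds, might look like this:
package com.example;

import java.time.Instant;

import org.jooq.Converter;

// Converts between a BIGINT column (epoch milliseconds) and java.time.Instant
public class LongToInstantConverter implements Converter<Long, Instant> {

    @Override
    public Instant from(Long databaseObject) {
        return databaseObject == null ? null : Instant.ofEpochMilli(databaseObject);
    }

    @Override
    public Long to(Instant userObject) {
        return userObject == null ? null : userObject.toEpochMilli();
    }

    @Override
    public Class<Long> fromType() {
        return Long.class;
    }

    @Override
    public Class<Instant> toType() {
        return Instant.class;
    }
}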
A converter can also be supplied inline, as a Java expression:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<!-- The first matching forcedType will be applied to the data type definition. -->
<forcedTypes>
<forcedType>
<!-- Specify the Java type of your custom type. This corresponds to the Converter's <U> type. -->
<userType>com.example.MyEnum</userType>
<!-- Associate that custom type with your inline converter. -->
<converter>org.jooq.Converter.ofNullable(Integer.class, MyEnum.class, i -> MyEnum.values()[i], MyEnum::ordinal)</converter>
<!-- Add a Java regular expression matching fully-qualified columns. Use the pipe to separate several expressions. -->
<expression>.*\.DATE_OF_.*</expression>
<!-- Add a Java regular expression matching data types to be forced to have this type. -->
<types>.*</types>
</forcedType>
</forcedTypes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
// The first matching forcedType will be applied to the data type definition.
.withForcedTypes(new ForcedType()
.withUserType("com.example.MyEnum")
.withConverter("org.jooq.Converter.ofNullable(Integer.class, MyEnum.class, i -> MyEnum.values()[i], MyEnum::ordinal)")
.withExpression(".*\\.DATE_OF_.*")
.withTypes(".*"))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
// The first matching forcedType will be applied to the data type definition.
forcedTypes {
forcedType {
userType = 'com.example.MyEnum'
converter = 'org.jooq.Converter.ofNullable(Integer.class, MyEnum.class, i -> MyEnum.values()[i], MyEnum::ordinal)'
expression = '.*\\.DATE_OF_.*'
types = '.*'
}
}
}
}
}
If the user type is a Java enum type, the built-in enum converter can be applied by setting the <enumConverter/> flag:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<!-- The first matching forcedType will be applied to the data type definition. -->
<forcedTypes>
<forcedType>
<!-- Specify the Java type of your custom type. This corresponds to the Converter's <U> type. -->
<userType>com.example.MyEnum</userType>
<!-- Apply the built-in enum converter to map the column to that enum type. -->
<enumConverter>true</enumConverter>
<!-- Add a Java regular expression matching fully-qualified columns. -->
<expression>.*\.MY_STATUS</expression>
<!-- Add a Java regular expression matching data types to be forced to have this type. -->
<types>.*</types>
</forcedType>
</forcedTypes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
// The first matching forcedType will be applied to the data type definition.
.withForcedTypes(new ForcedType()
.withUserType("com.example.MyEnum")
.withEnumConverter(true)
.withExpression(".*\\.MY_STATUS")
.withTypes(".*"))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
// The first matching forcedType will be applied to the data type definition.
forcedTypes {
forcedType {
userType = 'com.example.MyEnum'
enumConverter = true
expression = '.*\\.MY_STATUS'
types = '.*'
}
}
}
}
}
Instead of a converter, a custom org.jooq.Binding implementation can also be applied to matching columns:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<!-- The first matching forcedType will be applied to the data type definition. -->
<forcedTypes>
<forcedType>
<!-- Specify the Java type of your custom type. This corresponds to the Binding's <U> type. -->
<userType>java.time.Instant</userType>
<!-- Associate that custom type with your custom org.jooq.Binding implementation. -->
<binding>com.example.LongToInstantBinding</binding>
<!-- Add a Java regular expression matching fully-qualified columns. -->
<expression>.*\.DATE_OF_.*</expression>
<!-- Add a Java regular expression matching data types to be forced to have this type. -->
<types>.*</types>
</forcedType>
</forcedTypes>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
// The first matching forcedType will be applied to the data type definition.
.withForcedTypes(new ForcedType()
.withUserType("java.time.Instant")
.withBinding("com.example.LongToInstantBinding")
.withExpression(".*\\.DATE_OF_.*")
.withTypes(".*"))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
// The first matching forcedType will be applied to the data type definition.
forcedTypes {
forcedType {
userType = 'java.time.Instant'
binding = 'com.example.LongToInstantBinding'
expression = '.*\\.DATE_OF_.*'
types = '.*'
}
}
}
}
}
For more information about using bindings, please refer to the manual's section about custom data
type bindings.
Table valued functions can be reverse engineered either as:
- ordinary tables in most databases including PostgreSQL, SQL Server - because that's what they
are. They're intended for use in FROM clauses of SELECT statements, not as standalone routines.
- ordinary routines in some databases including Oracle - for historic reasons. While Oracle also
allows for embedding (pipelined) table functions in FROM clauses of SELECT statements, it is not
uncommon to call these as standalone routines in Oracle.
The <tableValuedFunctions/> flag is thus set to false by default on Oracle, and true otherwise. Here's
how to explicitly change this behaviour:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<tableValuedFunctions>true</tableValuedFunctions>
</database>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withDatabase(new Database()
.withTableValuedFunctions(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
tableValuedFunctions = true
}
}
}
6.2.5. Generate
This element wraps all the configuration elements that are used for the jooq-codegen module, which
generates Java or Scala code, or XML from your database.
One example is the <globalObjectReferences/> flag, which overrides all the individual global object reference flags:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<generate>
<!-- This overrides all the other individual flags -->
<globalObjectReferences>true</globalObjectReferences>
</generate>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withGenerate(new Generate()
.withGlobalObjectReferences(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
generate {
globalObjectReferences = true
}
}
}
6.2.5.2. Annotations
The code generator supports a set of annotations on generated code, which can be turned on using
the following flags. These annotations include:
- JPA annotations: A minimal set of JPA annotations can be generated on POJOs and other
artefacts to convey type and metadata information that is available to the code generator. These
annotations include:
* javax.persistence.Column
* javax.persistence.Entity
* javax.persistence.GeneratedValue
* javax.persistence.GenerationType
* javax.persistence.Id
* javax.persistence.Index (JPA 2.1 and later)
* javax.persistence.Table
* javax.persistence.UniqueConstraint
While jOOQ generated code cannot really be used as full-fledged entities (use e.g. Hibernate or
EclipseLink to generate such entities), this meta information can still be useful as documentation
on your generated code. Some of the annotations (e.g. @Column) can be used by the
org.jooq.impl.DefaultRecordMapper for mapping records to POJOs.
- Validation annotations: A set of Bean Validation API annotations can be added to the generated
code to convey type information. They include:
* javax.validation.constraints.NotNull
* javax.validation.constraints.Size
jOOQ does not implement the validation spec, nor does it validate your data, but you can use third-
party tools to read the jOOQ-generated validation annotations.
- Spring annotations: Some useful Spring annotations can be generated on DAOs for better
Spring integration. These include:
* org.springframework.beans.factory.annotation.Autowired
* org.springframework.stereotype.Repository
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<generate>
<jpaAnnotations>true</jpaAnnotations>
<jpaVersion>2.2</jpaVersion>
<validationAnnotations>true</validationAnnotations>
<springAnnotations>true</springAnnotations>
</generate>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withGenerate(new Generate()
.withJpaAnnotations(true)
.withJpaVersion("2.2")
.withValidationAnnotations(true)
.withSpringAnnotations(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
generate {
jpaAnnotations = true
jpaVersion = '2.2'
validationAnnotations = true
springAnnotations = true
}
}
}
Java 8 introduced the JSR 310 java.time types, which jOOQ can generate instead of the old JDBC types
(e.g. java.time.LocalDate instead of java.sql.Date, java.time.LocalTime instead of java.sql.Time, and
java.time.LocalDateTime instead of java.sql.Timestamp).
Semantically, these types are exactly equivalent, although the new types do away with the many
flaws of the JDBC types. If there is no JDBC type for an equivalent JSR 310 type, then the JSR 310 type
is generated by default. This includes, for instance, java.time.OffsetDateTime and java.time.OffsetTime.
To get more fine-grained control of the above, you may wish to consider applying data type rewriting.
In order to activate the generation of these types, use:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<generate>
<javaTimeTypes>true</javaTimeTypes>
</generate>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withGenerate(new Generate()
.withJavaTimeTypes(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
generate {
javaTimeTypes = true
}
}
}
By default, zero-scale decimal types (e.g. DECIMAL(5, 0) or Oracle's NUMBER(10, 0)) are generated using the
most appropriate integer wrapper type rather than java.math.BigDecimal or java.math.BigInteger.
If this is not a desirable default, it can be deactivated either explicitly on a per-column basis using
forced types, or globally using the following flag:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<generate>
<forceIntegerTypesOnZeroScaleDecimals>true</forceIntegerTypesOnZeroScaleDecimals>
</generate>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withGenerate(new Generate()
.withForceIntegerTypesOnZeroScaleDecimals(true)));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
generate {
forceIntegerTypesOnZeroScaleDecimals = true
}
}
}
A regular expression can specify which objects should always be referenced by their fully qualified type
names in generated code, e.g. to resolve name clashes between generated artefacts:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<generate>
<fullyQualifiedTypes>.*\.MY_TABLE</fullyQualifiedTypes>
</generate>
</generator>
</configuration>
Programmatic configuration
configuration
.withGenerator(new Generator()
.withGenerate(new Generate()
.withFullyQualifiedTypes(".*\\.MY_TABLE")));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
generate {
fullyQualifiedTypes = '.*\\.MY_TABLE'
}
}
}
The code generation target is configured with the following elements:
- packageName: Specifies the root package name inside of which all generated code is located.
This package is located inside of the <directory/>. The package name is part of the generator
strategy and can be modified by a custom implementation, if so desired.
- directory: Specifies the root directory inside of which all generated code is located.
- encoding: The encoding that should be used for generated classes.
- clean: Whether the target package (<packageName/>) should be cleaned to contain only
generated code after a generation run. Defaults to true.
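A programmatic equivalent of such a target configuration might look like this (the package name and directory are example values):
configuration
    .withGenerator(new Generator()
        .withTarget(new Target()
            .withPackageName("org.example.generated")
            .withDirectory("target/generated-sources/jooq")
            .withEncoding("UTF-8")
            .withClean(true)));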
// [...]
GenerationTool.generate(configuration);
For the above example, you will need all of jooq-3.11.9.jar, jooq-meta-3.11.9.jar, and
jooq-codegen-3.11.9.jar on your classpath.
import java.io.File;
import javax.xml.bind.JAXB;
import org.jooq.meta.jaxb.Configuration;

// [...]
// Load a Configuration from your XML file using JAXB
Configuration configuration = JAXB.unmarshal(new File("jooq-config.xml"), Configuration.class);

// and then
GenerationTool.generate(configuration);
... and then, modify parts of your configuration programmatically, for instance the JDBC user / password:
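A minimal sketch of such a programmatic override (reading the credentials from system properties is just one option) might look like this:
// Override the JDBC credentials loaded from the XML file, assuming it contains a <jdbc/> element
configuration.getJdbc()
    .withUser(System.getProperty("db.user"))
    .withPassword(System.getProperty("db.password"));

// ... and then run the code generation
GenerationTool.generate(configuration);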
<!-- These properties can be added directly to the generator element: -->
<generator>
<!-- The default code generator. You can override this one, to generate your own code style
Defaults to org.jooq.codegen.JavaGenerator -->
<name>org.jooq.codegen.JavaGenerator</name>
<!-- The naming strategy used for class and field names.
You may override this with your custom naming strategy. Some examples follow
Defaults to org.jooq.codegen.DefaultGeneratorStrategy -->
<strategy>
<name>org.jooq.codegen.DefaultGeneratorStrategy</name>
</strategy>
</generator>
The following example shows how you can override the DefaultGeneratorStrategy to render table and
column names the way they are defined in the database, rather than switching them to camel case:
/**
* It is recommended that you extend the DefaultGeneratorStrategy. Most of the
* GeneratorStrategy API is already declared final. You only need to override any
* of the following methods, for whatever generation behaviour you'd like to achieve.
*
* Also, the DefaultGeneratorStrategy takes care of disambiguating quite a few object
* names in case of conflict. For example, MySQL indexes do not really have a name, so
* a synthetic, non-ambiguous name is generated based on the table. If you override
* the default behaviour, you must ensure that this disambiguation still takes place
* for generated code to be compilable.
*
* Beware that most methods also receive a "Mode" object, to tell you whether a
* TableDefinition is being rendered as a Table, Record, POJO, etc. Depending on
* that information, you can add a suffix only for TableRecords, not for Tables
*/
public class AsInDatabaseStrategy extends DefaultGeneratorStrategy {
/**
* Override this to specifiy what identifiers in Java should look like.
* This will just take the identifier as defined in the database.
*/
@Override
public String getJavaIdentifier(Definition definition) {
// The DefaultGeneratorStrategy disambiguates some synthetic object names,
// such as the MySQL PRIMARY key names, which do not really have a name
// Uncomment the below code if you want to reuse that logic.
// if (definition instanceof IndexDefinition)
// return super.getJavaIdentifier(definition);
return definition.getOutputName();
}
/**
* Override these to specify what a setter in Java should look like. Setters
* are used in TableRecords, UDTRecords, and POJOs. This example will name
* setters "set[NAME_IN_DATABASE]"
*/
@Override
public String getJavaSetterName(Definition definition, Mode mode) {
return "set" + definition.getOutputName();
}
/**
* Just like setters...
*/
@Override
public String getJavaGetterName(Definition definition, Mode mode) {
return "get" + definition.getOutputName();
}
/**
* Override this method to define what a Java method generated from a database
* Definition should look like. This is used mostly for convenience methods
* when calling stored procedures and functions. This example shows how to
* set a prefix to a CamelCase version of your procedure
*/
@Override
public String getJavaMethodName(Definition definition, Mode mode) {
return "call" + org.jooq.tools.StringUtils.toCamelCase(definition.getOutputName());
}
/**
* Override this method to define how your Java classes and Java files should
* be named. This example applies no custom setting and uses CamelCase versions
* instead
*/
@Override
public String getJavaClassName(Definition definition, Mode mode) {
return super.getJavaClassName(definition, mode);
}
/**
* Override this method to re-define the package names of your generated
* artefacts.
*/
@Override
public String getJavaPackageName(Definition definition, Mode mode) {
return super.getJavaPackageName(definition, mode);
}
/**
* Override this method to define how Java members should be named. This is
* used for POJOs and method arguments
*/
@Override
public String getJavaMemberName(Definition definition, Mode mode) {
return definition.getOutputName();
}
/**
* Override this method to define the base class for those artefacts that
* allow for custom base classes
*/
@Override
public String getJavaClassExtends(Definition definition, Mode mode) {
return Object.class.getName();
}
/**
* Override this method to define the interfaces to be implemented by those
* artefacts that allow for custom interface implementation
*/
@Override
public List<String> getJavaClassImplements(Definition definition, Mode mode) {
return Arrays.asList(Serializable.class.getName(), Cloneable.class.getName());
}
}
An org.jooq.Table example:
This is an example showing which generator strategy method will be called in what place when
generating tables. For improved readability, full qualification is omitted:
package com.example.tables;
// 1: ^^^^^^^^^^^^^^^^^^
public class Book extends TableImpl<com.example.tables.records.BookRecord> {
// 2: ^^^^ 3: ^^^^^^^^^^
public static final Book BOOK = new Book();
// 2: ^^^^ 4: ^^^^
public final TableField<BookRecord, Integer> ID = /* ... */
// 3: ^^^^^^^^^^ 5: ^^
}
// 1: strategy.getJavaPackageName(table)
// 2: strategy.getJavaClassName(table)
// 3: strategy.getJavaClassName(table, Mode.RECORD)
// 4: strategy.getJavaIdentifier(table)
// 5: strategy.getJavaIdentifier(column)
An org.jooq.Record example:
This is an example showing which generator strategy method will be called in what place when
generating records. For improved readability, full qualification is omitted:
package com.example.tables.records;
// 1: ^^^^^^^^^^^^^^^^^^^^^^^^^^
public class BookRecord extends UpdatableRecordImpl<BookRecord> {
// 2: ^^^^^^^^^^ 2: ^^^^^^^^^^
public void setId(Integer value) { /* ... */ }
// 3: ^^^^^
public Integer getId() { /* ... */ }
// 4: ^^^^^
}
// 1: strategy.getJavaPackageName(table, Mode.RECORD)
// 2: strategy.getJavaClassName(table, Mode.RECORD)
// 3: strategy.getJavaSetterName(column, Mode.RECORD)
// 4: strategy.getJavaGetterName(column, Mode.RECORD)
A POJO example:
This is an example showing which generator strategy method will be called in what place when
generating pojos. For improved readability, full qualification is omitted:
package com.example.tables.pojos;
// 1: ^^^^^^^^^^^^^^^^^^^^^^^^
public class Book implements java.io.Serializable {
// 2: ^^^^
private Integer id;
// 3: ^^
public void setId(Integer value) { /* ... */ }
// 4: ^^^^^
public Integer getId() { /* ... */ }
// 5: ^^^^^
}
// 1: strategy.getJavaPackageName(table, Mode.POJO)
// 2: strategy.getJavaClassName(table, Mode.POJO)
// 3: strategy.getJavaMemberName(column, Mode.POJO)
// 4: strategy.getJavaSetterName(column, Mode.POJO)
// 5: strategy.getJavaGetterName(column, Mode.POJO)
Further example generator strategy implementations:
- org.jooq.codegen.example.JPrefixGeneratorStrategy
- org.jooq.codegen.example.JVMArgsGeneratorStrategy
Naming can also be configured declaratively through the MatcherStrategy, using the <matchers/> element shown below.
- NOTE: All regular expressions that match object identifiers try to match identifiers
first by unqualified name (org.jooq.meta.Definition.getName()), then by qualified name
(org.jooq.meta.Definition.getQualifiedName()).
- NOTE: There had been an incompatible change between jOOQ 3.2 and jOOQ 3.3 in the
configuration of these matcher strategies. See Issue #3217 for details.
<!-- These properties can be added directly to the generator element: -->
<generator>
<strategy>
<matchers>
<!-- Specify 0..n schema matchers to provide a strategy for naming objects created from schemas. -->
<schemas>
<schema>
<!-- Match unqualified or qualified schema names. If left empty, this matcher applies to all schemas. -->
<expression>MY_SCHEMA</expression>
<!-- These elements influence the naming of a generated org.jooq.Schema object. -->
<schemaClass> --> MatcherRule </schemaClass>
<schemaIdentifier> --> MatcherRule </schemaIdentifier>
<schemaImplements>com.example.MyOptionalCustomInterface</schemaImplements>
</schema>
</schemas>
<!-- Specify 0..n table matchers to provide a strategy for naming objects created from tables. -->
<tables>
<table>
<!-- Match unqualified or qualified table names. If left empty, this matcher applies to all tables. -->
<expression>MY_TABLE</expression>
<!-- These elements influence the naming of a generated org.jooq.Table object. -->
<tableClass> --> MatcherRule </tableClass>
<tableIdentifier> --> MatcherRule </tableIdentifier>
<tableImplements>com.example.MyOptionalCustomInterface</tableImplements>
<!-- These elements influence the naming of a generated org.jooq.Record object. -->
<recordClass> --> MatcherRule </recordClass>
<recordImplements>com.example.MyOptionalCustomInterface</recordImplements>
<!-- These elements influence the naming of a generated org.jooq.DAO object. -->
<daoClass> --> MatcherRule </daoClass>
<daoImplements>com.example.MyOptionalCustomInterface</daoImplements>
<!-- These elements influence the naming of a generated POJO object. -->
<pojoClass> --> MatcherRule </pojoClass>
<pojoExtends>com.example.MyOptionalCustomBaseClass</pojoExtends>
<pojoImplements>com.example.MyOptionalCustomInterface</pojoImplements>
</table>
</tables>
<!-- Specify 0..n field matchers to provide a strategy for naming objects created from fields. -->
<fields>
<field>
<!-- Match unqualified or qualified field names. If left empty, this matcher applies to all fields. -->
<expression>MY_FIELD</expression>
<!-- These elements influence the naming of a generated org.jooq.Field object. -->
<fieldIdentifier> --> MatcherRule </fieldIdentifier>
<fieldMember> --> MatcherRule </fieldMember>
<fieldSetter> --> MatcherRule </fieldSetter>
<fieldGetter> --> MatcherRule </fieldGetter>
</field>
</fields>
<!-- Specify 0..n routine matchers to provide a strategy for naming objects created from routines. -->
<routines>
<routine>
<!-- Match unqualified or qualified routine names. If left empty, this matcher applies to all routines. -->
<expression>MY_ROUTINE</expression>
<!-- These elements influence the naming of a generated org.jooq.Routine object. -->
<routineClass> --> MatcherRule </routineClass>
<routineMethod> --> MatcherRule </routineMethod>
<routineImplements>com.example.MyOptionalCustomInterface</routineImplements>
</routine>
</routines>
<!-- Specify 0..n sequence matchers to provide a strategy for naming objects created from sequences. -->
<sequences>
<sequence>
<!-- Match unqualified or qualified sequence names. If left empty, this matcher applies to all sequences. -->
<expression>MY_SEQUENCE</expression>
<!-- These elements influence the naming of the generated Sequences class. -->
<sequenceIdentifier> --> MatcherRule </sequenceIdentifier>
</sequence>
</sequences>
<!-- Specify 0..n enum matchers to provide a strategy for naming objects created from enums. -->
<enums>
<enum>
<!-- Match unqualified or qualified enum names. If left empty, this matcher applies to all enums. -->
<expression>MY_ENUM</expression>
<!-- These elements influence the naming of a generated org.jooq.EnumType object. -->
<enumClass> --> MatcherRule </enumClass>
<enumImplements>com.example.MyOptionalCustomInterface</enumImplements>
</enum>
</enums>
</matchers>
</strategy>
</generator>
The above example used references to "MatcherRule", which is an XSD type that looks like this:
<schemaClass>
<!-- The optional transform element lets you apply a name transformation algorithm
to transform the actual database name into a more convenient form. Possible values are:
AS_IS, LOWER, UPPER, CAMEL, PASCAL -->
<transform>PASCAL</transform>
<!-- The mandatory expression element lets you specify a replacement expression to be used when
replacing the matcher's regular expression. You can use indexed variables $0, $1, $2. -->
<expression>PREFIX_$0_SUFFIX</expression>
</schemaClass>
Some examples
The following example shows a matcher strategy that adds a "T_" prefix to all table classes and to table
identifiers:
<generator>
<strategy>
<matchers>
<tables>
<table>
<!-- Expression is omitted. This will make this rule apply to all tables -->
<tableIdentifier>
<transform>UPPER</transform>
<expression>T_$0</expression>
</tableIdentifier>
<tableClass>
<transform>PASCAL</transform>
<expression>T_$0</expression>
</tableClass>
</table>
</tables>
</matchers>
</strategy>
</generator>
The following example shows a matcher strategy that renames BOOK table identifiers (or table
identifiers containing BOOK) into BROCHURE (or tables containing BROCHURE):
<generator>
<strategy>
<matchers>
<tables>
<table>
<expression>^(.*?)_BOOK_(.*)$</expression>
<tableIdentifier>
<transform>UPPER</transform>
<expression>$1_BROCHURE_$2</expression>
</tableIdentifier>
</table>
</tables>
</matchers>
</strategy>
</generator>
For more information about each XML tag, please refer to the http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd XSD file.
The JavaGenerator can be subclassed to add custom code sections to generated artefacts, for example a class footer:
public class MyGenerator extends JavaGenerator {
@Override
protected void generateRecordClassFooter(TableDefinition table, JavaWriter out) {
out.println();
out.tab(1).println("public String toString() {");
out.tab(2).println("return \"MyRecord[\" + valuesRow() + \"]\";");
out.tab(1).println("}");
}
}
The above example simply adds a class footer to generated records, in this case, overriding the default
toString() implementation.
Similarly, the generated Javadoc can be overridden, for example for records:
public class MyGenerator extends JavaGenerator {
@Override
protected void generateRecordClassJavadoc(TableDefinition table, JavaWriter out) {
out.println("/**");
out.println(" * This record belongs to table " + table.getOutputName() + ".");
out.println(" */");
}
}
When you override any of the above, do note that according to jOOQ's understanding of semantic
versioning, incompatible changes may be introduced between minor releases, even if this should be
the exception.
When global object references are generated, they are grouped in the following classes:
- Keys.java: This file contains all of the required primary key, unique key, foreign key and identity
references in the form of static members of type org.jooq.Key.
- Routines.java: This file contains all standalone routines (not in packages) in the form of static
factory methods for org.jooq.Routine types.
- Sequences.java: This file contains all sequence objects in the form of static members of type
org.jooq.Sequence.
- Tables.java: This file contains all table objects in the form of static member references to the
actual singleton org.jooq.Table object
- UDTs.java: This file contains all UDT objects in the form of static member references to the actual
singleton org.jooq.UDT object
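For illustration, a usage sketch of these global artefacts (assuming the generated root package is com.example.generated, which is just an example value):
import static com.example.generated.Tables.BOOK;

import org.jooq.DSLContext;

public class GlobalArtefactsExample {
    public void run(DSLContext create) {

        // The BOOK reference is the static member generated into Tables.java
        create.selectFrom(BOOK).fetch();
    }
}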
Generated tables: every table in your database will generate an org.jooq.Table implementation that looks like this:
public class Book extends TableImpl<BookRecord> {

// Generated columns
public final TableField<BookRecord, Integer> ID = createField("ID", SQLDataType.INTEGER, this);
public final TableField<BookRecord, Integer> AUTHOR_ID = createField("AUTHOR_ID", SQLDataType.INTEGER, this);
public final TableField<BookRecord, String> TITLE = createField("TITLE", SQLDataType.VARCHAR, this);
// [...]
}
The following code generation flags influence generated tables:
- recordVersionFields: Relevant methods from super classes are overridden to return the
VERSION field
- recordTimestampFields: Relevant methods from super classes are overridden to return the
TIMESTAMP field
- syntheticPrimaryKeys: This overrides existing primary key information to allow for "custom"
primary key column sets
- overridePrimaryKeys: This overrides existing primary key information to allow for unique key to
primary key promotion
- dateAsTimestamp: This influences all relevant columns
- unsignedTypes: This influences all relevant columns
- relations: Relevant methods from super classes are overridden to provide primary key, unique
key, foreign key and identity information
- instanceFields: This flag controls the "static" keyword on table columns, as well as aliasing
convenience
- records: The generated record type is referenced from tables allowing for type-safe single-table
record fetching
Generated records: every table in your database will generate an org.jooq.Record implementation that looks like this:
public class BookRecord extends UpdatableRecordImpl<BookRecord> {

// Generated getters and setters, with optional JPA annotations
@Id
@Column(name = "ID", unique = true, nullable = false, precision = 7)
@Override
public Integer getId() {
return getValue(BOOK.ID);
}
// Navigation methods
public AuthorRecord fetchAuthor() {
return create.selectFrom(AUTHOR).where(AUTHOR.ID.eq(getValue(BOOK.AUTHOR_ID))).fetchOne();
}
// [...]
}
The following code generation flags influence generated records:
- syntheticPrimaryKeys: This overrides existing primary key information to allow for "custom"
primary key column sets, possibly promoting a TableRecord to an UpdatableRecord
- overridePrimaryKeys: This overrides existing primary key information to allow for unique key to
primary key promotion, possibly promoting a TableRecord to an UpdatableRecord
- dateAsTimestamp: This influences all relevant getters and setters
- unsignedTypes: This influences all relevant getters and setters
- relations: This is needed as a prerequisite for navigation methods
- daos: Records are a pre-requisite for DAOs. If DAOs are generated, records are generated as well
- interfaces: If interfaces are generated, records will implement them
- jpaAnnotations: JPA annotations are used on generated records (details here)
- jpaVersion: Version of the JPA specification to be used to generate version-specific annotations. If
it is omitted, the latest version is used by default. (details here)
Generated POJOs: every table in your database will generate a POJO that looks like this:
public class Book implements java.io.Serializable {

// Generated members with optional validation annotations
@NotNull
private Integer authorId;
@NotNull
@Size(max = 400)
private String title;
@Override
public void setId(Integer id) {
this.id = id;
}
// [...]
}
Generated DAOs
Every table in your database will generate an org.jooq.DAO implementation that looks like this:
public class BookDao extends DAOImpl<BookRecord, Book, Integer> {

// Generated constructors
public BookDao() {
super(BOOK, Book.class);
}
// [...]
}
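A hypothetical usage sketch of the generated DAO above, assuming a runtime org.jooq.Configuration named configuration is available:
BookDao dao = new BookDao(configuration);

// Fetch a POJO by primary key, modify it, and write it back
Book book = dao.findById(1);
book.setTitle("Animal Farm");
dao.update(book);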
Generated routines: every stored routine in your database will generate an org.jooq.Routine implementation, for example:
// All IN, IN OUT, OUT parameters and function return values generate a static member
public static final Parameter<String> AUTHOR_NAME = createParameter("AUTHOR_NAME", SQLDataType.VARCHAR);
public static final Parameter<BigDecimal> RESULT = createParameter("RESULT", SQLDataType.NUMERIC);
addInParameter(AUTHOR_NAME);
addOutParameter(RESULT);
}
// [...]
}
Use the following configuration elements to specify that you'd like to use GregorianCalendar for all database
fields that start with DATE_OF_:
<database>
<forcedTypes>
<forcedType>
<!-- Specify the Java type of your custom type. This corresponds to the Converter's <U> type. -->
<userType>java.util.GregorianCalendar</userType>
<!-- Associate that custom type with your converter (the class name is an example). -->
<converter>com.example.CalendarConverter</converter>
<!-- Add a Java regular expression matching fully-qualified columns. Use the pipe to separate several expressions. -->
<expression>.*\.DATE_OF_.*</expression>
<!-- Add a Java regular expression matching data types to be forced to have this type. -->
<types>.*</types>
</forcedType>
</forcedTypes>
</database>
See also the section about data type rewrites to learn about an alternative use of <forcedTypes/>.
The above configuration will lead to AUTHOR.DATE_OF_BIRTH being generated like this:
// [...]
public final TableField<TAuthorRecord, GregorianCalendar> DATE_OF_BIRTH = // [...]
// [...]
This means that the bound type of <T> will be GregorianCalendar, wherever you reference
DATE_OF_BIRTH. jOOQ will use your custom converter when binding variables and when fetching data
from java.sql.ResultSet:
// We're binding <T> = Object (unknown JDBC type), and <U> = JsonElement (user type)
public class PostgresJSONGsonBinding implements Binding<Object, JsonElement> {

// The converter does all the work
@Override
public Converter<Object, JsonElement> converter() {
return new Converter<Object, JsonElement>() {
@Override
public JsonElement from(Object t) {
return t == null ? JsonNull.INSTANCE : new Gson().fromJson("" + t, JsonElement.class);
}
@Override
public Object to(JsonElement u) {
return u == null || u == JsonNull.INSTANCE ? null : new Gson().toJson(u);
}
@Override
public Class<Object> fromType() {
return Object.class;
}
@Override
public Class<JsonElement> toType() {
return JsonElement.class;
}
};
}
// Rendering a bind variable for the binding context's value and casting it to the json type
@Override
public void sql(BindingSQLContext<JsonElement> ctx) throws SQLException {
// Depending on how you generate your SQL, you may need to explicitly distinguish
// between jOOQ generating bind variables or inlined literals.
if (ctx.render().paramType() == ParamType.INLINED)
ctx.render().visit(DSL.inline(ctx.convert(converter()).value())).sql("::json");
else
ctx.render().sql("?::json");
}
// Converting the JsonElement to a String value and setting that on a JDBC PreparedStatement
@Override
public void set(BindingSetStatementContext<JsonElement> ctx) throws SQLException {
ctx.statement().setString(ctx.index(), Objects.toString(ctx.convert(converter()).value(), null));
}
// Getting a String value from a JDBC ResultSet and converting that to a JsonElement
@Override
public void get(BindingGetResultSetContext<JsonElement> ctx) throws SQLException {
ctx.convert(converter()).value(ctx.resultSet().getString(ctx.index()));
}
// Getting a String value from a JDBC CallableStatement and converting that to a JsonElement
@Override
public void get(BindingGetStatementContext<JsonElement> ctx) throws SQLException {
ctx.convert(converter()).value(ctx.statement().getString(ctx.index()));
}
// Getting a value from a JDBC SQLInput (useful for Oracle OBJECT types)
@Override
public void get(BindingGetSQLInputContext<JsonElement> ctx) throws SQLException {
throw new SQLFeatureNotSupportedException();
}
}
<database>
<forcedTypes>
<forcedType>
<!-- Specify the Java type of your custom type. This corresponds to the Binding's <U> type. -->
<userType>com.google.gson.JsonElement</userType>
<!-- Add a Java regular expression matching fully-qualified columns. Use the pipe to separate several expressions. -->
<!-- [...] -->

See also the section about data type rewrites to learn about an alternative use of <forcedTypes/>.
The above configuration will lead to AUTHOR.CUSTOM_DATA_JSON being generated like this:
// [...]
public final TableField<TAuthorRecord, JsonElement> CUSTOM_DATA_JSON = // [...]
// [...]
Schema mapping
The following configuration applies mapping only for schemata, not for catalogs. The <schemata/>
element is a standalone element that can be put in the code generator's <database/> configuration
element:
<schemata>
<schema>
<!-- Use this as the developer's schema: -->
<inputSchema>LUKAS_DEV_SCHEMA</inputSchema>
The following configuration applies mapping for catalogs and their schemata. The <catalogs/> element
is a standalone element that can be put in the code generator's <database/> configuration element:
<catalogs>
<catalog>
<!-- Use this as the developer's catalog: -->
<inputCatalog>LUKAS_DEV_CATALOG</inputCatalog>
- Methods (including static / instance initialisers) are allowed to contain only 64kb of bytecode.
- Classes are allowed to contain at most 64k of constant literals
While there exist workarounds for the above two limitations (delegating initialisations to nested classes,
inheriting constant literals from implemented interfaces), the preferred approach is either one of these:
- Distribute your database objects in several schemas. That is probably a good idea anyway for
such large databases
- Configure jOOQ's code generator to exclude excess database objects (see the sketch after this list)
- Configure jOOQ's code generator to avoid generating global objects using
<globalObjectReferences/>
- Remove uncompilable classes after code generation
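For instance, excess objects can be excluded programmatically through the code generator's excludes setting
(a minimal sketch; the regular expression is an illustrative assumption):
// Exclude all objects whose names match these (hypothetical) patterns from code generation
configuration
    .withGenerator(new Generator()
        .withDatabase(new Database()
            .withExcludes("UNUSED_.*|LEGACY_.*")));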
In this section we'll see that both approaches have their merits and that neither of them is clearly better.
This approach is particularly useful when your Java developers are not in full control of or do not have
full access to your database schema, or if you have many developers that work simultaneously on the
same database schema, which changes all the time. It is also useful to be able to track side-effects of
database changes, as your checked-in database schema can be considered when you want to analyse
the history of your schema.
With this approach, you can also keep track of the change of behaviour in the jOOQ code generator,
e.g. when upgrading jOOQ, or when modifying the code generation configuration.
The drawback of this approach is that it is more error-prone as the actual schema may go out of sync
with the generated schema.
Derived artefacts
When you consider generated code to be derived artefacts, you will want to:
This approach is particularly useful when you have a smaller database schema that is under full control
by your Java developers, who want to profit from the increased quality of being able to regenerate all
derived artefacts in every step of your build.
The drawback of this approach is that the build may break in perfectly acceptable situations, when parts
of your database are temporarily unavailable.
Pragmatic combination
In some situations, you may want to choose a pragmatic combination, where you put only some parts
of the generated code under version control. For instance, jOOQ-meta's generated sources are put
under version control as few contributors will be able to run the jOOQ-meta code generator against
all supported databases.
@Entity
@Table(name = "author")
public class Author {
@Id
int id;
@Column(name = "first_name")
String firstName;
@Column(name = "last_name")
String lastName;
@OneToMany(mappedBy = "author")
Set<Book> books;
@Entity
@Table(name = "book")
public class Book {
@Id
public int id;
@Column(name = "title")
public String title;
@ManyToOne
public Author author;
Now, instead of connecting the jOOQ code generator to a database that holds a representation of the
above schema, you can use jOOQ's JPADatabase and feed that to the code generator. The JPADatabase
uses Hibernate internally to generate an in-memory H2 database from your entities, and reverse-
engineers that database back into jOOQ classes.
The easiest way forward is to use Maven in order to include the jooq-meta-extensions library (which
then includes the H2 and Hibernate dependencies):
<dependency>
  <!-- Use org.jooq for the Open Source Edition
       org.jooq.pro for commercial editions,
       org.jooq.pro-java-6 for commercial editions with Java 6 support,
       org.jooq.trial for the free trial edition -->
  <groupId>org.jooq</groupId>
  <artifactId>jooq-meta-extensions</artifactId>
  <version>3.11.9</version>
</dependency>
With that dependency in place, you can now specify the JPADatabase in your code generator
configuration:
<generator>
<database>
<name>org.jooq.meta.extensions.jpa.JPADatabase</name>
<properties>
<!-- A comma-separated list of Java packages that contain your entities -->
<property>
<key>packages</key>
<value>com.example.entities</value>
</property>
The above will generate all jOOQ artefacts for your AUTHOR and BOOK tables.
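Once generated, these artefacts can be queried like any other jOOQ generated code. A minimal sketch (the
generated column names depend on Hibernate's naming strategy and are assumptions here):
Result<?> result =
create.select(AUTHOR.FIRST_NAME, AUTHOR.LAST_NAME, BOOK.TITLE)
      .from(AUTHOR)
      .join(BOOK).on(BOOK.AUTHOR_ID.eq(AUTHOR.ID))
      .fetch();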
+-------------------+
| Your JPA entities |
+-------------------+
^ ^
depends on | | depends on
| |
+---------------------+ +---------------------+
| jOOQ codegen plugin | | Your application |
+---------------------+ +---------------------+
| |
generates | | depends on
v v
+-------------------------+
| jOOQ generated classes |
+-------------------------+
You cannot put your JPA entities in the same module as the one that runs the jOOQ code generator.
<?xml version="1.0"?>
<information_schema xmlns="http://www.jooq.org/xsd/jooq-meta-3.11.0.xsd">
<schemata>
<schema>
<schema_name>TEST</schema_name>
</schema>
</schemata>
<tables>
<table>
<table_schema>TEST</table_schema>
<table_name>AUTHOR</table_name>
</table>
<table>
<table_schema>TEST</table_schema>
<table_name>BOOK</table_name>
</table>
</tables>
<columns>
<column>
<table_schema>TEST</table_schema>
<table_name>AUTHOR</table_name>
<column_name>ID</column_name>
<data_type>NUMBER</data_type>
<numeric_precision>7</numeric_precision>
<ordinal_position>1</ordinal_position>
<is_nullable>false</is_nullable>
</column>
...
</columns>
<table_constraints>
<table_constraint>
<constraint_schema>TEST</constraint_schema>
<constraint_name>PK_AUTHOR</constraint_name>
<constraint_type>PRIMARY KEY</constraint_type>
<table_schema>TEST</table_schema>
<table_name>AUTHOR</table_name>
</table_constraint>
...
</table_constraints>
<key_column_usages>
<key_column_usage>
<constraint_schema>TEST</constraint_schema>
<constraint_name>PK_AUTHOR</constraint_name>
<table_schema>TEST</table_schema>
<table_name>AUTHOR</table_name>
<column_name>ID</column_name>
<ordinal_position>1</ordinal_position>
</key_column_usage>
...
</key_column_usages>
<referential_constraints>
<referential_constraint>
<constraint_schema>TEST</constraint_schema>
<constraint_name>FK_BOOK_AUTHOR_ID</constraint_name>
<unique_constraint_schema>TEST</unique_constraint_schema>
<unique_constraint_name>PK_AUTHOR</unique_constraint_name>
</referential_constraint>
...
</referential_constraints>
</information_schema>
The above file can be made available to the code generator configuration by using the XMLDatabase
as follows:
<generator>
<database>
<name>org.jooq.meta.xml.XMLDatabase</name>
<properties>
If you already have a different XML format for your database, you can either XSL transform your own
format into the one above via an additional Maven plugin, or pass the location of an XSL file to the
XMLDatabase by providing an additional property:
<generator>
<database>
<name>org.jooq.meta.xml.XMLDatabase</name>
<properties>
...
This XML configuration can now be checked in and versioned, and modified independently from your
live database schema.
While the script uses pretty standard SQL constructs, you may well use some vendor-specific
extensions, and even DML statements in between to set up your schema - it doesn't matter. You will
simply need to set up your code generation configuration as follows:
XML configuration (standalone and Maven)
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<database>
<name>org.jooq.meta.extensions.ddl.DDLDatabase</name>
<properties>
<!-- Specify the location of your SQL script(s). You may use glob patterns, where:
     - ** matches any directory subtree
     - * matches any number of characters in a directory / file name
     - ? matches a single character in a directory / file name
-->
<property>
<key>scripts</key>
<value>src/main/resources/database.sql</value>
</property>
Programmatic configuration
configuration
    .withGenerator(new Generator()
        .withDatabase(new Database()
            .withName("org.jooq.meta.extensions.ddl.DDLDatabase")
            .withProperties(
                new Property()
                    .withKey("scripts")
                    .withValue("src/main/resources/database.sql"),
                new Property()
                    .withKey("sort")
                    .withValue("semantic"))));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
database {
name = 'org.jooq.meta.extensions.ddl.DDLDatabase'
properties {
property {
key = 'scripts'
value = 'src/main/resources/database.sql'
}
property {
key = 'sort'
value = 'semantic'
}
}
}
}
}
Dependencies
Note that the org.jooq.meta.extensions.ddl.DDLDatabase class is located in an external dependency,
which needs to be placed on the classpath of the jOOQ code generator. E.g. using Maven:
<dependency>
  <!-- Use org.jooq for the Open Source Edition
       org.jooq.pro for commercial editions,
       org.jooq.pro-java-8 for commercial editions with Java 8 support,
       org.jooq.pro-java-6 for commercial editions with Java 6 support,
       org.jooq.trial for the free trial edition -->
  <groupId>org.jooq</groupId>
  <artifactId>jooq-meta-extensions</artifactId>
  <version>3.11.9</version>
</dependency>
<configuration xmlns="http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd">
<generator>
<name>org.jooq.codegen.XMLGenerator</name>
</generator>
...
</configuration>
Programmatic configuration
configuration.withGenerator(new Generator()
.withName("org.jooq.codegen.XMLGenerator"));
Gradle configuration
myConfigurationName(sourceSets.main) {
generator {
name = 'org.jooq.codegen.XMLGenerator'
}
}
This configuration does not interfere with most of the remaining code generation configuration, e.g.
you can still specify the JDBC connection or the generation output target as usual.
<plugin>
  <!-- Specify the maven code generator plugin -->
  <!-- Use org.jooq for the Open Source Edition
       org.jooq.pro for commercial editions,
       org.jooq.pro-java-6 for commercial editions with Java 6 support,
       org.jooq.trial for the free trial edition -->
  <groupId>org.jooq</groupId>
  <artifactId>jooq-codegen-maven</artifactId>
  <version>3.11.9</version>
<executions>
<execution>
<id>jooq-codegen</id>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
...
</configuration>
</execution>
</executions>
</plugin>
<plugin>
...
<configuration>
<!-- A boolean property (or constant) can be specified here to tell the plugin not to do anything -->
<skip>${skip.jooq.generation}</skip>
<!-- Instead of providing an inline configuration here, you can specify an external XML configuration file here -->
<configurationFile>${externalfile}</configurationFile>
</configuration>
...
</plugin>
<dependencies>
  <dependency>
    <!-- JDBC driver -->
  </dependency>
  <dependency>
    <!-- Use org.jooq for the Open Source Edition
         org.jooq.pro for commercial editions,
         org.jooq.pro-java-6 for commercial editions with Java 6 support,
         org.jooq.trial for the free trial edition -->
    <!-- [...] -->
  </dependency>
</dependencies>
repositories {
mavenLocal()
mavenCentral()
}
dependencies {
compile 'org.jooq:jooq:3.11.9'
runtime 'com.h2database:h2:1.4.177'
testCompile 'junit:junit:4.11'
}
buildscript {
repositories {
mavenLocal()
mavenCentral()
}
dependencies {
classpath 'org.jooq:jooq-codegen:3.11.9'
classpath 'com.h2database:h2:1.4.177'
}
}
// Use your favourite XML builder to construct the code generation configuration file
// ----------------------------------------------------------------------------------
def writer = new StringWriter()
def xml = new groovy.xml.MarkupBuilder(writer)
.configuration('xmlns': 'http://www.jooq.org/xsd/jooq-codegen-3.11.0.xsd') {
jdbc() {
driver('org.h2.Driver')
url('jdbc:h2:~/test-gradle')
user('sa')
password('')
}
generator() {
database() {
}
// Watch out for this caveat when using MarkupBuilder with "reserved names"
// - https://github.com/jOOQ/jOOQ/issues/4797
// - http://stackoverflow.com/a/11389034/521799
// - https://groups.google.com/forum/#!topic/jooq-user/wi4S9rRxk4A
generate([:]) {
pojos true
daos true
}
target() {
packageName('org.jooq.example.gradle.db')
directory('src/main/java')
}
}
}
In case of conflict between the above default value and a more concrete, local configuration, the latter
prevails and the default is overridden.
7. Tools
These chapters hold some information about tools to be used with jOOQ
jOOQ has two annotations that are very interesting for the Checker Framework to type check, namely:
- org.jooq.Support: This annotation documents jOOQ DSL API with valuable information about
which database supports a given SQL clause or function, etc. For instance, only CUBRID,
Informix, and Oracle currently support the CONNECT BY clause.
- org.jooq.PlainSQL: This annotation documents jOOQ DSL API which operates on plain SQL. Plain
SQL being string-based SQL that is injected into a jOOQ expression tree, these API elements
introduce a certain SQL injection risk (just like JDBC in general), if users are not careful.
Using the optional jooq-checker module (available only from Maven Central), users can now type-check
their code to work only with a given set of dialects, or to forbid access to plain SQL.
Example:
A detailed blog post shows how this works in depth. By adding a simple dependency to your Maven
build:
<dependency>
<!-- Use org.jooq for the Open Source edition
org.jooq.pro for commercial editions,
org.jooq.pro-java-6 for commercial editions with Java 6 support,
org.jooq.trial for the free trial edition -->
<groupId>org.jooq</groupId>
<artifactId>jooq-checker</artifactId>
<version>3.11.9</version>
</dependency>
SQLDialectChecker
The SQLDialect checker reads all of the org.jooq.Allow and org.jooq.Require annotations in your source
code and checks if the jOOQ API you're using is allowed and/or required in a given context, where that
context can be any scope, including:
- A package
- A class
- A method
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<fork>true</fork>
<annotationProcessors>
<annotationProcessor>org.jooq.checker.SQLDialectChecker</annotationProcessor>
</annotationProcessors>
<compilerArgs>
<arg>-Xbootclasspath/p:1.8</arg>
</compilerArgs>
</configuration>
</plugin>
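For example, the following package-info.java is a minimal sketch (the package name is an assumption) that
allows only the Oracle dialect within that package:
// package-info.java (hypothetical package)
@Allow(SQLDialect.ORACLE)
package org.example.persistence;

import org.jooq.Allow;
import org.jooq.SQLDialect;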
And now, you'll no longer be able to use any SQL Server specific functionality that is not available in
Oracle, for instance. Perfect!
There are quite some delicate rules that play into this when you nest these annotations. Please refer
to this blog post for details.
PlainSQLChecker
This checker is much simpler. Just add the following compiler plugin to deactivate plain SQL usage by
default:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<fork>true</fork>
<annotationProcessors>
<annotationProcessor>org.jooq.checker.PlainSQLChecker</annotationProcessor>
</annotationProcessors>
<compilerArgs>
<arg>-Xbootclasspath/p:1.8</arg>
</compilerArgs>
</configuration>
</plugin>
From now on, you won't risk any SQL injection in your jOOQ code anymore, because your compiler
will reject all such API usage. If, however, you need to place an exception on a given package / class /
method, simply add the org.jooq.Allow.PlainSQL annotation, as such:
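For example (a sketch; the method and the SQL string are illustrative assumptions):
// This method is explicitly allowed to use the plain SQL API
@Allow.PlainSQL
public Condition activeDocuments() {
    return DSL.condition("json_value(document, '$.active') = 'true'");
}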
The Checker Framework does add some significant overhead in terms of compilation speed, and its
IDE tooling is not yet at a level where such checks can be fed into IDEs for real user feedback, but the
framework does work pretty well if you integrate it in your CI, nightly builds, etc.
See it in action:
package gudusoft.sql2jooq.readme;
import gudusoft.gsqlparser.EDbVendor;
import gudusoft.gsqlparser.sql2jooq.SQL2jOOQ;
import gudusoft.gsqlparser.sql2jooq.db.DatabaseMetaData;
import gudusoft.gsqlparser.sql2jooq.tool.DatabaseMetaUtil;
import java.sql.Connection;
import java.sql.DriverManager;
Class.forName("com.mysql.jdbc.Driver");
Connection conn = DriverManager.getConnection(url, userName, password);
SQL 2 jOOQ is a joint venture by Gudu Software Limited and Data Geekery GmbH. We will ship, test and
maintain this awesome new addition with our own deliverables. So far, SQL 2 jOOQ supports the MySQL
and PostgreSQL dialects and it is in an alpha stage. Please, community, provide as much feedback
as possible to make this great tool rock even more!
Please take note of the fact that the sql2jooq library is Open Source, but it depends on the commercial
gsp.jar parser, whose trial licensing terms can be seen here:
https://github.com/sqlparser/sql2jooq/blob/master/sql2jooq/LICENSE-GSQLPARSER.txt
For more information about the General SQL Parser, please refer to the product blog.
Please report any issues, ideas, wishes to the jOOQ user group or the sql2jooq GitHub project.
8. Reference
These chapters hold some general jOOQ reference information
For an up-to-date list of currently supported RDBMS and minimal versions, please refer to
http://www.jooq.org/legal/licensing/#databases.
This chapter should document the most important notes about SQL, JDBC and jOOQ data types.
Each of these wrapper types extends java.lang.Number, internally wrapping a higher-level integer type.
- YEAR TO MONTH: This interval type models a number of months and years
- DAY TO SECOND: This interval type models a number of days, hours, minutes, seconds and
milliseconds
Both interval types ship with a variety of subtypes, such as DAY TO HOUR, HOUR TO SECOND, etc. jOOQ
models these types as Java objects extending java.lang.Number: org.jooq.types.YearToMonth (where
Number.intValue() corresponds to the absolute number of months) and org.jooq.types.DayToSecond
(where Number.intValue() corresponds to the absolute number of milliseconds)
Interval arithmetic
In addition to the arithmetic expressions documented previously, interval arithmetic is also supported
by jOOQ. Essentially, the following operations are supported:
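For example (a minimal sketch using the sample database's AUTHOR.DATE_OF_BIRTH column; static imports of
org.jooq.impl.DSL are assumed, and the cast is purely illustrative):
// DATE + YEAR TO MONTH: shift a date by one year and six months
Field<Date> shifted = AUTHOR.DATE_OF_BIRTH.add(new YearToMonth(1, 6));

// TIMESTAMP - TIMESTAMP: the difference is a DAY TO SECOND interval
Field<DayToSecond> difference =
    timestampDiff(currentTimestamp(), AUTHOR.DATE_OF_BIRTH.cast(Timestamp.class));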
Field<Result<Record>> cursor;
In fact, such a cursor will be fetched immediately by jOOQ and wrapped in an org.jooq.Result object.
Field<Integer[]> intArray;
- H2
- HSQLDB
- Postgres
Performance implications
When binding TIMESTAMP variables to SQL statements, instead of truncating such variables to DATE,
the cost based optimiser may choose to widen the database column from DATE to TIMESTAMP using an
Oracle INTERNAL_FUNCTION(), which prevents index usage. Details about this behaviour can be seen
in this Stack Overflow question.
@Override
public final void sql(BindingSQLContext<Timestamp> ctx) throws SQLException {
    ctx.render().keyword("cast").sql('(')
       .visit(val(ctx.value()))
       .sql(' ').keyword("as date").sql(')');
}
Deprecated functionality
Historic versions of jOOQ used to support a <dateAsTimestamp/> flag, which can be used with the out-
of-the-box org.jooq.impl.DateAsTimestampBinding as a custom data type binding:
<database>
<!-- Use this flag to force DATE columns to be of type TIMESTAMP -->
<dateAsTimestamp>true</dateAsTimestamp>
<!-- Define a custom binding for such DATE as TIMESTAMP columns -->
<forcedTypes>
<forcedType>
<userType>java.sql.Timestamp</userType>
<binding>org.jooq.impl.DateAsTimestampBinding</binding>
<types>DATE</types>
</forcedType>
</forcedTypes>
</database>
For more information, please refer to the manual's section about custom data type bindings.
The above example also shows missing operator overloading capabilities, where "=" is replaced by "," in
jOOQ. Another example is row value expressions, which can be formed with parentheses only in SQL:

SQL:   (a, b) IN ((1, 2), (3, 4))
jOOQ:  row(a, b).in(row(1, 2), row(3, 4))
In this case, ROW is an actual (optional) SQL keyword, implemented by at least PostgreSQL.
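In jOOQ, the corresponding predicate can be written like this (a sketch; BOOK.ID and BOOK.AUTHOR_ID merely
stand in for the columns a and b above):
Condition condition = row(BOOK.ID, BOOK.AUTHOR_ID).in(row(1, 1), row(2, 1));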
SQL                        jOOQ
GROUP BY                   groupBy()
ORDER BY                   orderBy()
WHEN MATCHED THEN UPDATE   whenMatchedThenUpdate()
Future versions of jOOQ may use all-uppercased method names in addition to the camel-cased ones
(to prevent collisions with Java keywords):
SQL                        jOOQ (future)
GROUP BY                   GROUP_BY()
ORDER BY                   ORDER_BY()
WHEN MATCHED THEN UPDATE   WHEN_MATCHED_THEN_UPDATE()
- BEGIN .. END
- REPEAT .. UNTIL
- IF .. THEN .. ELSE .. END IF
jOOQ omits some of those keywords when it is too tedious to write them in Java.
The above example omits THEN and END keywords in Java. Future versions of jOOQ may comprise a
more complete DSL, including such keywords again though, to provide a more 1:1 match for the SQL
language.
The parentheses used for the WITHIN GROUP (..) and OVER (..) clauses are required in SQL but do not
seem to add any immediate value. In some cases, jOOQ omits them, although the above might be
optionally re-phrased in the future to form a more SQLesque experience:
- CASE
- ELSE
- FOR
- BOOLEAN
- CHAR
- DEFAULT
- DOUBLE
- ENUM
- FLOAT
- IF
- INT
- LONG
- PACKAGE
SQL          jOOQ
=            equal(), eq()
<>, !=       notEqual(), ne()
||           concat()
SET a = b    set(a, b)
For those users using jOOQ with Scala or Groovy, operator overloading and implicit conversion can be
leveraged to enhance jOOQ:
SQL          jOOQ (Scala / Groovy)
=            ===
<>, !=       <>, !==
||           ||
A more sophisticated example are common table expressions (CTE), which are currently not supported
by jOOQ:
WITH t(a, b) AS (
SELECT 1, 2 FROM DUAL
)
SELECT t.a, t.b
FROM t
Common table expressions define a "derived column list", just like table aliases can do. The formal
record type thus created cannot be typesafely verified by the Java compiler, i.e. it is not possible to
formally dereference t.a from t.
- To evade JDBC's verbosity and error-proneness due to string concatenation and index-based
variable binding
- To add lots of type-safety to your inline SQL
- To increase productivity when writing inline SQL using your favourite IDE's autocompletion
capabilities (see the sketch after this list)
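The following minimal sketch contrasts string-based JDBC with typesafe jOOQ (assuming a java.sql.Connection
named connection and the generated sample database classes):
// JDBC: concatenated strings and index-based variable binding
try (PreparedStatement stmt = connection.prepareStatement(
        "SELECT title FROM book WHERE author_id = ?")) {
    stmt.setInt(1, 1);
    // ...
}

// jOOQ: compile-time checked columns and bind values
create.select(BOOK.TITLE)
      .from(BOOK)
      .where(BOOK.AUTHOR_ID.eq(1))
      .fetch();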
With jOOQ being in the core of your application, you want to be sure that you can trust jOOQ. That is
why jOOQ is heavily unit and integration tested with a strong focus on integration tests:
Unit tests
Unit tests are performed against dummy JDBC interfaces using http://jmock.org/. These tests verify that
various org.jooq.QueryPart implementations render correct SQL and bind variables correctly.
Integration tests
This is the most important part of the jOOQ test suites. Some 1500 queries are currently run against
a standard integration test database. Both the test database and the queries are translated into every
one of the 14 supported SQL dialects to ensure that regressions are unlikely to be introduced into the
code base.
For libraries like jOOQ, integration tests are much more expressive than unit tests, as there are so many
subtle differences in SQL dialects. Simple mocks just don't give as much feedback as an actual database
instance.
jOOQ integration tests run the weirdest and most unrealistic queries. As a side-effect of these extensive
integration test suites, many corner-case bugs for JDBC drivers and/or open source databases have
been discovered, feature requests submitted through jOOQ and reported mainly to CUBRID, Derby,
H2, HSQLDB.
Routines r1 = ROUTINES.as("r1");
Routines r2 = ROUTINES.as("r2");
// Ignore the data type when there is at least one out parameter
DSL.when(exists(
selectOne()
.from(PARAMETERS)
.where(PARAMETERS.SPECIFIC_SCHEMA.eq(r1.SPECIFIC_SCHEMA))
.and(PARAMETERS.SPECIFIC_NAME.eq(r1.SPECIFIC_NAME))
.and(upper(PARAMETERS.PARAMETER_MODE).ne("IN"))),
val("void"))
.otherwise(r1.DATA_TYPE).as("data_type"),
r1.CHARACTER_MAXIMUM_LENGTH,
r1.NUMERIC_PRECISION,
r1.NUMERIC_SCALE,
r1.TYPE_UDT_NAME,
These rather complex queries show that the jOOQ API is fit for advanced SQL use-cases, compared to
the rather simple, often unrealistic queries in the integration test suite.
- There is only one place in the entire code base, which consumes values from a JDBC ResultSet
- There is only one place in the entire code base, which transforms jOOQ Records into custom
POJOs
Keeping things DRY leads to longer stack traces, but in turn, it also increases the relevance of highly
reusable code blocks. The chances that some part of the jOOQ code base slips past integration test coverage
decrease significantly.
- [N] in Row[N] has been raised from 8 to 22. This means that existing row value expressions with
degree >= 9 are now type-safe
- Subqueries returned from DSL.select(...) now implement Select<Record[N]>, not Select<Record>
- IN predicates and comparison predicates taking subselects changed incompatibly
- INSERT and MERGE statements now take typesafe VALUES() clauses
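For example, the additional typesafety can be used as follows (a sketch based on the sample database):
// Using the additional typesafety: degree and column types are known at compile time
Record2<String, Integer> record = create.select(BOOK.TITLE, BOOK.ID).from(BOOK).where(BOOK.ID.eq(1)).fetchOne();
Result<Record2<String, Integer>> result = create.select(BOOK.TITLE, BOOK.ID).from(BOOK).fetch();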
// But Record2 extends Record. You don't have to use the additional typesafety:
Record record = create.select(BOOK.TITLE, BOOK.ID).from(BOOK).where(BOOK.ID.eq(1)).fetchOne();
Result<?> result = create.select(BOOK.TITLE, BOOK.ID).from(BOOK).fetch();
Factory was split into DSL (query building) and DSLContext (query
execution)
The pre-existing Factory class has been split into two parts:
- The DSL: This class contains only static factory methods. All QueryParts constructed from
this class are "unattached", i.e. queries that are constructed through DSL cannot be executed
immediately. This is useful for subqueries.
The DSL class corresponds to the static part of the jOOQ 2.x Factory type.
- The DSLContext: This type holds a reference to a Configuration and can construct executable
("attached") QueryParts.
The DSLContext type corresponds to the non-static part of the jOOQ 2.x Factory /
FactoryOperations type.
// jOOQ 3.0
DSLContext create = DSL.using(connection, dialect);
create.selectOne()
.whereExists(
selectFrom(BOOK) // Create a static subselect from the DSL
).fetch(); // Execute the "attached" query
// jOOQ 2.6
Condition condition = BOOK.ID.equalAny(create.select(BOOK.ID).from(BOOK));
// jOOQ 3.0 adds some typesafety to comparison predicates involving quantified selects
QuantifiedSelect<Record1<Integer>> subselect = any(select(BOOK.ID).from(BOOK));
Condition condition = BOOK.ID.eq(subselect);
FieldProvider
The FieldProvider marker interface was removed. Its methods still exist on FieldProvider subtypes. Note that
they have changed names from getField() to field() and from getIndex() to indexOf().
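For example (a sketch; the 2.6 line is shown for comparison only):
// jOOQ 2.6
Field<?> title26 = record.getField("TITLE");

// jOOQ 3.0
Field<?> title30 = record.field("TITLE");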
GroupField
GroupField has been introduced as a DSL marker interface to denote fields that can be passed to
GROUP BY clauses. This includes all org.jooq.Field types. However, fields obtained from ROLLUP(),
CUBE(), and GROUPING SETS() functions no longer implement Field. Instead, they only implement
GroupField. An example:
// jOOQ 2.6
Field<?> field1a = Factory.rollup(...); // OK
Field<?> field2a = Factory.one(); // OK
// jOOQ 3.0
GroupField field1b = DSL.rollup(...); // OK
Field<?> field1c = DSL.rollup(...); // Compilation error
GroupField field2b = DSL.one(); // OK
Field<?> field2c = DSL.one(); // OK
NULL predicate
Beware! Previously, Field.eq(null) was translated internally to an IS NULL predicate. This is no longer the
case. Binding Java "null" to a comparison predicate will result in a regular comparison predicate (which
never returns true). This was changed for several reasons.
Here is an example how to check if a field has a given value, without applying SQL's ternary NULL logic:
// jOOQ 2.6
Condition condition1 = BOOK.TITLE.eq(possiblyNull);
// jOOQ 3.0
Condition condition2 = BOOK.TITLE.eq(possiblyNull).or(BOOK.TITLE.isNull().and(val(possiblyNull).isNull()));
Condition condition3 = BOOK.TITLE.isNotDistinctFrom(possiblyNull);
Configuration
DSLContext, ExecuteContext, RenderContext, BindContext no longer extend Configuration for
"convenience". From jOOQ 3.0 onwards, composition is chosen over inheritance as these objects are
not really configurations. Most importantly, in order to resolve confusion that used to arise because of
different lifecycle durations, these types are now no longer formally connected through inheritance.
ConnectionProvider
In order to allow for simpler connection / data source management, jOOQ externalised connection
handling in a new ConnectionProvider type. The previous two connection modes are maintained
backwards-compatibly (JDBC standalone connection mode, pooled DataSource mode). Other
connection modes can be injected using:
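For example, a custom provider can be sketched as follows (the DataSource-based implementation is an
illustrative assumption, not jOOQ's own org.jooq.impl.DataSourceConnectionProvider):
public class MyConnectionProvider implements ConnectionProvider {
    private final DataSource ds;

    public MyConnectionProvider(DataSource ds) {
        this.ds = ds;
    }

    // Acquire a connection from the underlying DataSource
    @Override
    public Connection acquire() {
        try {
            return ds.getConnection();
        }
        catch (SQLException e) {
            throw new DataAccessException("Error acquiring connection", e);
        }
    }

    // Return the connection to the DataSource (i.e. close the pooled handle)
    @Override
    public void release(Connection connection) {
        try {
            connection.close();
        }
        catch (SQLException e) {
            throw new DataAccessException("Error releasing connection", e);
        }
    }
}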
- Connection-related JDBC wrapper utility methods (commit, rollback, etc) have been moved to the
new DefaultConnectionProvider. They're no longer available from the DSLContext. This had been
confusing to some users who called upon these methods while operating in pooled DataSource
mode.
ExecuteListeners
ExecuteListeners can no longer be configured via Settings. Instead they have to be injected into the
Configuration. This resolves many class loader issues that were encountered before. It also helps
listener implementations control their lifecycles themselves.
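A minimal sketch of such an injection (MyExecuteListener stands for a hypothetical listener implementation):
Configuration configuration = new DefaultConfiguration()
    .set(connection)
    .set(SQLDialect.ORACLE)
    .set(new DefaultExecuteListenerProvider(new MyExecuteListener()));

DSLContext create = DSL.using(configuration);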
Object renames
These objects have been moved / renamed:
- jOOU: a library used to represent unsigned integer types was moved from org.jooq.util.unsigned
to org.jooq.util.types (which already contained INTERVAL data types)
Feature removals
Here are some minor features that have been removed in jOOQ 3.0
- The ant task for code generation was removed, as it was not up to date at all. Code generation
through ant can be performed easily by calling jOOQ's GenerationTool through a <java> target.
- The navigation methods and "foreign key setters" are no longer generated in Record classes, as
they are useful only to few users and the generated code is very collision-prone.
- The code generation configuration no longer accepts comma-separated regular expressions.
Use the regex pipe | instead.
- The code generation configuration can no longer be loaded from .properties files. Only XML
configurations are supported.
- The master data type feature is no longer supported. This feature was unlikely to behave exactly
as users expected. It is better if users write their own code generators to generate master enum
data types from their database tables. jOOQ's enum mapping and converter features sufficiently
cover interacting with such user-defined types.
- The DSL subtypes are no longer instantiable. As DSL now only contains static methods,
subclassing is no longer useful. There are still dialect-specific DSL types providing static methods
for dialect-specific functions. But the code generator no longer generates a schema-specific DSL.
- The concept of a "main key" is no longer supported. The code generator produces
UpdatableRecords only if the underlying table has a PRIMARY KEY. The reason for this removal
is the fact that "main keys" are not reliable enough. They were chosen arbitrarily among UNIQUE
KEYs.
- The UpdatableTable type has been removed. While adding significant complexity to the type
hierarchy, this type added little value over a simple Table.getPrimaryKey() != null check.
- The USE statement support has been removed from jOOQ. Its behaviour was ill-defined, while it
didn't work the same way (or didn't work at all) in some databases.
8.6. Credits
jOOQ lives in a very challenging ecosystem. The Java to SQL interface is still one of the most important
system interfaces. Yet there are still a lot of open questions, best practices and no "true" standard has
been established. This situation gave way to a lot of tools, APIs, utilities which essentially tackle the same
problem domain as jOOQ. jOOQ has gotten great inspiration from pre-existing tools and this section
should give them some credit. Here is a list of inspirational tools in alphabetical order:
- Hibernate: The de-facto standard (JPA) whose useful table-to-POJO mapping features have
influenced jOOQ's org.jooq.ResultQuery facilities
- JaQu: H2's own fluent API for querying databases
- JPA: The de-facto standard in the javax.persistence packages, supplied by Oracle. Its annotations
are useful to jOOQ as well.
- OneWebSQL: A commercial SQL abstraction API with support for DAO source code generation,
which has also been integrated into jOOQ
- QueryDSL: A "LINQ-port" to Java. It has a similar fluent API, a similar code-generation facility, yet
quite a different purpose. While jOOQ is all about SQL, QueryDSL (like LINQ) is mostly about
querying.
- SLICK: A "LINQ-like" database abstraction layer for Scala. Unlike LINQ, its API doesn't really
resemble SQL. Instead, it makes SQL look like Scala.
- Spring Data: Spring's JdbcTemplate knows RowMappers, which are reflected by jOOQ's
RecordHandler or RecordMapper