JSF/Primefaces and session timeout login redirection

If you use JSF and PrimeFaces you can encounter the case where the session expired and the next action your user does is an Ajax request. If you configured a form login then the Ajax request will get the form redirection as expected but… it is an Ajax call, so your user will basically see nothing except an unresponsive GUI.

To work around it you can implement a custom PhaseListener which checks whether the request is an Ajax request redirected to the login page and, if so, enforces the redirection through the JSF API.

Continue reading

JavaEE and Swagger: TomEE example

More and more applications are composed of REST services. In JavaEE land it means you develop and expose JAX-RS services.

Once developed and well tested with TomEE, the first thing you will realize is that to make an API useful you need to document it. There are a lot of ways to do it, but Swagger seems to be the trendy one, and it is indeed a nice solution as we'll see in this post.

Continue reading

Mix TomEE embedded and Angular 2 with Maven

For months now, a typical web application has been a JAX-RS backend on the server side and a JavaScript application on the client side.

This powerful architecture can sometimes reveal some challenges in the build pipeline.

However, today it is not that hard to mix both frontend and backend build tools to get a single build pipeline that is easy to integrate into a continuous integration solution.

To illustrate that, we'll dig into how to create an Angular 2 application packaged with Maven.
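One common way to wire such a mix (an assumption for illustration, not necessarily the setup the post ends up with) is to let frontend-maven-plugin drive npm from the Maven build; versions below are hypothetical:

```xml
<!-- hypothetical pom.xml fragment: frontend-maven-plugin runs the JS build inside mvn package -->
<plugin>
  <groupId>com.github.eirslett</groupId>
  <artifactId>frontend-maven-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <id>install-node-and-npm</id>
      <goals>
        <goal>install-node-and-npm</goal>
      </goals>
      <configuration>
        <nodeVersion>v6.9.1</nodeVersion>
      </configuration>
    </execution>
    <execution>
      <id>npm-build</id>
      <goals>
        <goal>npm</goal>
      </goals>
      <configuration>
        <arguments>run build</arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The nice side effect is that the continuous integration server only needs Maven: node and npm are downloaded by the plugin itself.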

Continue reading

JPA + Java8 Stream = paginated findAll()

For batch tasks it is quite common to need to browse a full table. Depending on the table, it can be done in memory without thinking much, or it can be too big and require pagination.

A common solution was to use a kind of PageResult object representing the current page and its index, and to let the client/caller iterate over PageResults.

With Java 8 streams the API can be more concise and efficient.
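A minimal sketch of the idea (helper and names are mine, not the post's): expose a page-by-page fetcher as one lazy Stream, so the caller never sees pages at all. With JPA the fetcher would typically be `em.createQuery(...).setFirstResult(page * pageSize).setMaxResults(pageSize).getResultList()`; here a plain List stands in for the table so the sketch runs standalone:

```java
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.function.IntFunction;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

public class PaginatedStreams {
    // Turns a page fetcher (page index -> page content) into a lazy Stream
    // that stops on the first page smaller than pageSize.
    public static <T> Stream<T> paginated(final IntFunction<List<T>> fetchPage, final int pageSize) {
        final Iterator<T> iterator = new Iterator<T>() {
            private int page = 0;
            private Iterator<T> current = Collections.emptyIterator();
            private boolean lastPageSeen = false;

            @Override
            public boolean hasNext() {
                while (!current.hasNext() && !lastPageSeen) {
                    final List<T> next = fetchPage.apply(page++);
                    lastPageSeen = next.size() < pageSize; // a short page is the last one
                    current = next.iterator();
                }
                return current.hasNext();
            }

            @Override
            public T next() {
                if (!hasNext()) {
                    throw new NoSuchElementException();
                }
                return;
            }
        };
        return
      , Spliterator.ORDERED), false);
    }

    public static void main(final String[] args) {
        // fake "table" standing in for the database
        final List<Integer> table = IntStream.range(0, 10).boxed().collect(Collectors.toList());
        final List<Integer> rows = paginated(
                page -> table.subList(Math.min(page * 3, table.size()), Math.min(page * 3 + 3, table.size())),
                3).collect(Collectors.toList());
        System.out.println(rows); // [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], fetched 3 by 3
    }
}
```

The caller just gets a `Stream<T>` and pages are only fetched as the stream is consumed, which keeps memory bounded for batch jobs.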

Continue reading

CDI Context: it is possible without scope annotations in your API!

If you never implemented a CDI context/scope, it is as simple as implementing this interface:

import java.lang.annotation.Annotation;

import javax.enterprise.context.spi.Contextual;
import javax.enterprise.context.spi.CreationalContext;

public interface Context {
   Class<? extends Annotation> getScope();
   <T> T get(Contextual<T> component, CreationalContext<T> creationalContext);
   <T> T get(Contextual<T> component);
   boolean isActive();
}

Note: in CDI 1.1 there is AlterableContext too, which just adds a destroy(Contextual) method. It is not important for this post so I will ignore it, but I would recommend you use it instead of Context if you can rely on CDI 1.1.

The Context implementation is quite simple:

  • isActive() returns true if the context is usable by the CDI container
  • getScope() returns the associated annotation (often @XXXScoped)
  • get(Contextual) returns the instance of the Contextual (~= Bean) for the “current” context
  • get(Contextual, CreationalContext) creates or returns the current instance

Creating a scope “annotation” is as easy as creating a runtime annotation:

// @NormalScope(passivating = false)
@Retention(RUNTIME)
@Target({ METHOD, TYPE, FIELD })
public @interface WrappingMethodScoped {
}

Note: the @NormalScope is optional since it can be done by an extension – this is what we’ll do.

Now that we know what a CDI scope is, let's see how to activate it programmatically.

Continue reading

@CacheResult: JCache + CDI to the rescue of microservices?

The JCache API comes with several built-in interceptors for CDI, making its usage decoupled from the cache API itself and more user friendly.

Let's have a look at this API.

CacheResult: the method execution killer

Probably one of the most common uses of a cache is to avoid paying the cost of a method each time you call it.

Reasons can be as different as:

  • Computation done by the method is expensive
  • The method contacts a remote service and you want to cut off the implied latency
  • The method accesses a rate limited resource
  • ….

In this case @CacheResult brings a nice and easy-to-set-up solution. By simply decorating the method with @CacheResult you will avoid the actual method invocation after the first call, as long as the result is cached.
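The effect can be sketched in plain Java with a ConcurrentHashMap memoizer (no JCache here, just the behavior the interceptor implements for you, keyed on the method parameters):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CacheResultSketch {
    private static final Map<String, Long> CACHE = new ConcurrentHashMap<>();
    static int invocations = 0; // counts real executions of the "expensive" body

    // Imagine this method decorated with @CacheResult: the interceptor would
    // do the computeIfAbsent for you, using the parameters as the cache key.
    static long expensive(final String input) {
        return CACHE.computeIfAbsent(input, in -> {
            invocations++;
            return (long) in.length() * 1_000_000L; // stand-in for a costly computation
        });
    }

    public static void main(final String[] args) {
        expensive("hello");
        expensive("hello");
        expensive("hello");
        System.out.println(invocations); // 1: the method body ran only once
    }
}
```

The JCache interceptors add on top of this eviction, expiry and cache resolution, which is exactly what you do not want to hand-roll in every service.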

Basic usage

Here is a sample using a service simulating a slow method:

Continue reading

OpenJPA and Serialization, or how to replace @AttributeConverter in JPA 2.0

JPA 2.1 introduced the @AttributeConverter annotation. The idea is to be able to implement and map programmatically the serialization of a field.
One typical example today is to use it to support Java 8's new types, such as LocalDateTime, in your entities. This type is not yet handled by the JPA specification (JPA 2.1 was too early).

OpenJPA being still on JPA 2.0, you surely think it is not possible to get this feature.
However, by digging a bit into OpenJPA, you will realize that it has provided this feature for years!

Of course it uses a vendor API, as all vendor APIs don't map 1-1 to the API introduced in the specification. Yet its use is quite simple, and if you think in terms of a migration path to JPA 2.1, you'll see that it's easy to write a one-shot tool to migrate all entities to the standard annotation/implementation (I'll deal with that point at the end of the post).

Externalize me, Factorize me!

OpenJPA provides two specific annotations for the conversion: org.apache.openjpa.persistence.Externalizer and org.apache.openjpa.persistence.Factory.

The first one is used to convert the instance to the value to serialize in the database. The second
one is the symmetric operation: it reads the serialized value and builds the instance used in the entity.

OpenJPA supports instance method references as well as static method references. In this case you can pass the “to convert” value
and a StoreContext instance giving you some information on the persistence environment.

Note: to ease later migration and keep the converter logic simple, sticking to String in the signature is not a bad idea.

Oops, java serialization, seriously?

At that point, you know you can write some code and wire it in your entity thanks to two annotations. Now:

  • you write your converter,
  • start to execute some JPA statements
  • and you realize that OpenJPA serializes the value you return in your externalizer.

Well, it sounds logical if you return a custom object… but it does so even for a String.

Actually, it is not really an issue, because the field is not fully considered as persistent. If you want a String to be
stored as a varchar instead of a Blob, you have to decorate your column with @org.apache.openjpa.persistence.Persistent.

Once it is done, you get the varchar column and SQL friendly values.

One sample

Let’s take a simple entity having a long ID and a LocalDateTime field.

Converter logic can be:

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;

public interface LocalDateTimes {
    ZoneId ZONE_ID = ZoneId.systemDefault();

    static String toString(final LocalDateTime time) {
        return time.atZone(ZONE_ID).format(DateTimeFormatter.ISO_DATE_TIME);
    }

    static LocalDateTime fromString(final String time) {
        return LocalDateTime.ofInstant(Instant.from(DateTimeFormatter.ISO_DATE_TIME.parse(time)), ZONE_ID);
    }
}
Our naked entity would be:

@Entity
public class DatedEntity { // + getters/setters
    @Id
    private long id;

    private LocalDateTime created;
}

Now let’s wire our converter:

public class DatedEntity { // unchanged parts elided
    private long id;

    @Persistent
    @Externalizer("com.rmannibucau.java8.LocalDateTimes.toString")
    @Factory("com.rmannibucau.java8.LocalDateTimes.fromString")
    private LocalDateTime created;
}

And here we are. If you dump the SQL statement used by OpenJPA to create the table, you'll see a varchar column for created.

Note: if you remove the @Persistent annotation you'll get a blob column instead, and the values would be serialized Java objects.

Without modifying entities

OpenJPA supports XML configuration through mapping files for externalizer/factory.

Just use OpenJPA's extended orm schema:

<entity-mappings xmlns="" xmlns:openjpa="" xmlns:orm="" xmlns:xsi="" version="2.0">
    <entity class="com.github.rmannibucau.domain.DatedEntity">
        <openjpa:persistent name="created"
                            externalizer="com.rmannibucau.java8.LocalDateTimes.toString"
                            factory="com.rmannibucau.java8.LocalDateTimes.fromString"/>
    </entity>
</entity-mappings>


Test it!

If you reuse the OpenJPARule I talked about in a previous post, you can write a test like:

public class DatedEntityTest {
    @Rule
    public final OpenJPARule $ = new OpenJPARule()
            .configure("openjpa.Log", "SQL=TRACE")
            .configure("openjpa.ConnectionFactoryProperties",
                "PrintParameters=true, PrettyPrint=true, PrettyPrintLineLength=80");

    @Test
    public void checkDate() {
        final LocalDateTime now =;

        final DatedEntity de = new DatedEntity();
        de.setId(1);
        de.setCreated(now);

        $.transaction((em) -> {
            em.persist(de);
            return null;
        });

        final DatedEntity loaded = $.transaction((em) -> em.find(DatedEntity.class, 1L));
        assertEquals(now, loaded.getCreated());

        $.transaction((em) -> {
            final DatedEntity entity = em.find(DatedEntity.class, 1L);
            em.remove(entity);
            return entity;
        });
    }
}
The log output looks like:

27  INFO   [main] openjpa.Runtime - Starting OpenJPA 2.4.0
144  INFO   [main] openjpa.jdbc.JDBC - Using dictionary class "org.apache.openjpa.jdbc.sql.HSQLDictionary".
919  INFO   [main] openjpa.jdbc.JDBC - Connected to HSQL Database Engine version 2.2 using JDBC driver HSQL Database Engine Driver version 2.3.2.
1271  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 1050065615> executing prepstmnt 265321659

1272  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 1050065615> [0 ms] spent
1283  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 270056930> executing stmnt 794075965
1284  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 270056930> [1 ms] spent
1554  INFO   [main] openjpa.Enhance - Creating subclass and redefining methods for "[class com.github.rmannibucau.openjpa.java8.DatedEntity]". This means that your application will be less efficient than it would if you ran the OpenJPA enhancer.
1698  INFO   [main] openjpa.Runtime - OpenJPA dynamically loaded the class enhancer. Any classes that were not enhanced at build time will be enhanced when they are loaded by the JVM.
1831  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 633240419> executing prepstmnt 1916575798
INSERT INTO DatedEntity (id, created)
    VALUES (?, ?)
[params=(long) 1, (String) 2015-05-24T09:36:20.229+02:00[Europe/Paris]]
1832  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 633240419> [0 ms] spent
1873  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 1222768327> executing prepstmnt 1193471756
SELECT t0.created
    FROM DatedEntity t0
    WHERE = ?
[params=(long) 1]
1874  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 1222768327> [0 ms] spent
1938  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 76659128> executing prepstmnt 2032169857
    WHERE id = ?
[params=(long) 1]
1941  TRACE  [main] openjpa.jdbc.SQL - <t 896644936, conn 76659128> [2 ms] spent

You can see that the serialized date is correctly bound to the prepared statement:

[params=(long) 1, (String) 2015-05-24T09:36:20.229+02:00[Europe/Paris]]

Preparing to JPA 2.1

This post is already too long to detail the implementation, but I'll try to give the overall idea of how to easily migrate entities to JPA 2.1 converters if needed.

If you don't have many entities, don't waste your time writing any tool. Instead, directly do the migration manually.
I know it may sound obvious, but any tool has an investment in terms of time, and here in particular, as you need to write it. Personally, if the migration takes less than 20 minutes by hand, I would go for the manual solution.

If you think it would take longer, or if you are in a big company and want to provide an out-of-the-box tool for all projects, here are a few points to succeed in writing such a tool:

  • wrap the conversion in an easy-to-run tool. Since that's a one-shot migration, I would write a simple main(String[]) which would integrate with Maven and Gradle easily. In fact, many developers will run it through exec-maven-plugin for Maven for instance.
    Though the packaging can also be an all-in-one bundle, making it easy to run without integrating with any tooling. Once again that's a one-shot
    tool, so there is no absolute need to update the project pom.
  • finding entities. Two choices here: either you rely on bytecode parsing or on source code parsing. Bytecode parsing implies the sources are already compiled, contrary to an annotation processor; the latter can be a good choice, but assuming compiled sources is not a big constraint. Thus, using any class finder like xbean-finder is an easy option as well.
  • finding converters: once you've found the entities (see previous step) you need to find the converters. It is not that hard: simply check for
    @Externalizer and @Factory in your entities. If you used xbean-finder in the previous step you can skip the entity finding and directly look up these annotations on fields and methods. Side note: if you configured them through XML, be sure to find the persistence.xml files and read the mappings; you can reuse OpenJPA's parser implementation to get them.
  • parse @Externalizer and @Factory to get the “converter” implementation. Here is the tip: once you get the converter, the idea is to simply generate a JPA 2.1 converter delegating to the old one. OpenJPA supports several signatures for the externalizers/factories; in practice the most common case is a static method, so a plain delegation is enough.
  • once you've generated all the converters (if they are reused across the code, no need to generate them twice), add the
    @Convert annotation to the entity fields to reference them, and remove the OpenJPA-specific imports and decoration on the field/method.
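The delegation idea of the last two steps can be sketched as a tiny source generator. Everything here is illustrative (class names, layout); the point is just that the generated JPA 2.1 converter is a plain two-line delegation to the old static methods:

```java
public class ConverterGenerator {
    // Generates the source of a JPA 2.1 AttributeConverter delegating to the
    // old externalizer/factory static methods. All names passed in are assumptions
    // supplied by the caller (typically extracted from @Externalizer/@Factory).
    static String generate(final String converterName, final String attributeType,
                           final String externalizer, final String factory) {
        return "@javax.persistence.Converter\n"
             + "public class " + converterName
             + " implements javax.persistence.AttributeConverter<" + attributeType + ", String> {\n"
             + "    @Override\n"
             + "    public String convertToDatabaseColumn(final " + attributeType + " attribute) {\n"
             + "        return " + externalizer + "(attribute);\n"
             + "    }\n"
             + "\n"
             + "    @Override\n"
             + "    public " + attributeType + " convertToEntityAttribute(final String dbData) {\n"
             + "        return " + factory + "(dbData);\n"
             + "    }\n"
             + "}\n";
    }

    public static void main(final String[] args) {
        System.out.println(generate(
                "LocalDateTimeConverter", "java.time.LocalDateTime",
                "com.rmannibucau.java8.LocalDateTimes.toString",
                "com.rmannibucau.java8.LocalDateTimes.fromString"));
    }
}
```

A real tool would write the result to the source tree and rewrite the entity annotations, but the generated body stays this simple.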

This can look a bit complicated, but it actually takes less than 1 hour of hacking.

Want more content? Keep up-to-date on my new blog

Or stay in touch on twitter @rmannibucau

CDI and @Startup: SOLVED!

While discussing the CDI 2.0 standalone container API on the CDI list, Jozef Hartinger sent an answer which is quite obvious when you know it, but which can change your life if you don't: how CDI 1.1 introduced @Startup without the need for EJBs.

Continue reading

CDI and Instance: 3 pitfalls you need to know

Recently on the OpenWebBeans mailing list, a nice discussion was started by Karl Kildén about his usage of CDI Instance.

This API is nice and useful, but it is not as trivial as it can look. Let's dig a bit into it.

Continue reading

Json API (JavaEE 7) + Java 8: collection to JsonArray

JavaEE 7 brings an API for JSON arrays (the surprisingly-named JsonArray ;)), Java 8 brings a stream API and lambdas, so now you can combine both to create a JsonArray from a Collection!
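With the JavaEE 7 API you would collect the stream into a JsonArrayBuilder obtained from Json.createArrayBuilder(). Since that needs a JSON-P provider on the classpath, here is the same stream-plus-collector shape with a plain joining() collector standing in for the builder (an illustrative stand-in, not the JSON-P API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class JsonArrayFromCollection {
    // Stand-in for collecting into a JsonArrayBuilder: same stream shape,
    // but the collector builds the JSON array text directly.
    static String toJsonArray(final List<String> values) {
                .map(s -> "\"" + s + "\"") // naive quoting, fine for this sketch
                .collect(Collectors.joining(",", "[", "]"));
    }

    public static void main(final String[] args) {
        System.out.println(toJsonArray(Arrays.asList("a", "b", "c"))); // ["a","b","c"]
    }
}
```

With JSON-P the map step becomes builder.add(...) inside a Collector, and the finisher calls build() to get the actual JsonArray.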

Continue reading