Tag Archives: cdi

@Throttled: the CDI extension


In Java EE, throttling is often done using a stateless bean because stateless beans are pooled by design and the pool provides a contention point. This is, however, IMO more a workaround than a solution for the throttling need, and a small CDI extension can be worth it.

Continue reading

CDI: replace the configuration by a register pattern


CDI doesn’t really have a configuration file. Of course beans.xml is used to activate a few features like interceptors, but you can’t register a bean in it, can’t add a qualifier on a bean, etc.

When it comes to writing a CDI library, the question of configuration hits you pretty quickly. Let’s see how to solve it with a fairly simple pattern that makes users’ lives much nicer.

Continue reading

YAML configuration for DeltaSpike


YAML is a nice and readable format for configuration, allowing you to set your properties hierarchically so they stay organized and readable.

Let’s see how to use this format with Apache DeltaSpike @ConfigProperty injections!

Continue reading

CDI Mapper: get rid of the proxy layer!


In a recent post (https://rmannibucau.wordpress.com/2015/12/01/write-your-own-cdi-extension-for-bean-mapping/) I explained how to implement a simple mapper with CDI integration. We can actually make it simpler by leveraging CDI a bit more.

Continue reading

DeltaSpike configuration: read where you want and decrypt passwords



DeltaSpike configuration is a very elegant configuration solution for CDI.

However, to make it fit your application you often need to integrate it:

  • to read the configuration from the source/location you desire
  • to use a custom algorithm to decrypt passwords or sensitive data

Read your own configuration file

There are generally three cases to add a custom configuration file:

  • respect a company convention
  • read it from outside the application (in ${tomee.base}/conf for instance ;))
  • read a custom format (yaml, xml, …)

In this post we will tackle the second one, since mixing the last two generally solves the first one, and the last one is mostly the same solution with some specific conversion logic I don’t want to get into for this post.

So our goal will be to read the configuration from ${catalina.base}/conf/my-app.properties and add it to the DeltaSpike properties.

To do so we just need DeltaSpike core:

<dependency>
  <groupId>org.apache.deltaspike.core</groupId>
  <artifactId>deltaspike-core-api</artifactId>
  <version>${deltaspike.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.deltaspike.core</groupId>
  <artifactId>deltaspike-core-impl</artifactId>
  <version>${deltaspike.version}</version>
</dependency>

Then we need to implement a custom org.apache.deltaspike.core.spi.config.ConfigSource reading ${catalina.base} from the corresponding system property (our implementation will fall back on the openejb.base property for OpenEJB embedded tests):

import org.apache.deltaspike.core.impl.config.PropertiesConfigSource;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// uses ${base}/conf/my-app.properties as source
public class MyConfigSource extends PropertiesConfigSource {
    public MyConfigSource() {
        super(loadProperties());
    }

    public String getConfigName() {
        return "MyAppConfig";
    }

    private static Properties loadProperties() {
        return new Properties() {{
            final File config = new File(
                System.getProperty("catalina.base", System.getProperty("openejb.base", "")),
                "conf/my-app.properties");
            if (config.isFile()) {
                try (final InputStream is = new BufferedInputStream(new FileInputStream(config))) {
                    load(is);
                } catch (final IOException e) {
                    throw new IllegalArgumentException(e);
                }
            }
        }};
    }
}

Then to “activate” it, just create a META-INF/services/org.apache.deltaspike.core.spi.config.ConfigSource file containing our fully qualified class name.
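For instance, assuming MyConfigSource lives in a com.company.config package (the package name is just an example), the service file contains a single line with the fully qualified class name:

# content of META-INF/services/org.apache.deltaspike.core.spi.config.ConfigSource
com.company.config.MyConfigSource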

Decrypt passwords with your own algorithm

To decrypt passwords, DeltaSpike uses org.apache.deltaspike.core.spi.config.ConfigFilter implementations. The interface has two methods:

  • filterValue: actual decryption
  • filterValueForLog: decryption for logging, in general I just log “xxxxxx” if my filter handles the value

For this post the decryption will just reverse the value, but a real implementation can use any cipher considered secure in your environment:

import org.apache.deltaspike.core.spi.config.ConfigFilter;

public class MyConfigFilter implements ConfigFilter {
    @Override
    public String filterValue(final String key, final String value) {
        return isEncrypted(key) ? decrypt(value) : value;
    }

    @Override // filter passwords and secrets in logs
    public String filterValueForLog(final String key, final String value) {
        return isEncrypted(key) ? "xxxxxx" : value;
    }

    // for the sample just "reverse" the string but in real life use some encryption
    private String decrypt(final String value) {
        return new StringBuilder(value).reverse().toString();
    }

    private boolean isEncrypted(final String key) {
        return key.contains("password") || key.contains("secret");
    }
}

As for the ConfigSource, and since these classes are used before CDI is started in order to configure DeltaSpike itself, don’t forget to register the filter by adding its fully qualified name in META-INF/services/org.apache.deltaspike.core.spi.config.ConfigFilter.

Now if you set for instance in your configuration:

my.password = tset

And get my.password injected:

@Inject
@ConfigProperty(name = "my.password")
private String pwd;

Then the pwd value will be test :).

Conclusion

The code of this post can be found here: https://github.com/rmannibucau/deltaspike-config-example.

The interesting part is to understand that DeltaSpike can be integrated with all kinds of configurations and environments, which is the main feature of a configuration API. Then you still get it integrated with CDI for free thanks to the DeltaSpike @ConfigProperty, which keeps your application simple and decoupled from your actual configuration system.

Write your own CDI extension for bean mapping


CDI descriptive bean mapping: how to write a CDI extension to map beans

The idea of this post is to show you how to end up with a CDI extension allowing you to get a mapper injected that is defined by nothing more than this:

@Mapper
public interface MyMapper {
    @Mapping(source = "inputId", target = "id")
    @Mapping(source = "employeeId")
    Output1 toOutput1(final Input2 input);

    @Mapping(source = "id")
    @Mapping(source = "name", target = "firstName")
    Output2 toOutput2(final Input1 input);
}

Of course the API is very (very) close to the MapStruct one and this post doesn’t intend to go that far, but the difference is that the extension will be entirely built on runtime analysis using CDI. Said otherwise, it is more dynamic and usable in real projects when you want a declarative API.

First define the API

The API is pretty straightforward:

  • @Mapper is marking an interface as a mapper – this could be optional but makes code cleaner IMO
  • @Mapping is a repeatable annotation defining which field – source – is read in the input (parameter) and which field – target – is set in the output (returned type). Small sugar there: if source and target are equal, target is optional.

Since this is just defining three annotations, I’ll just paste the code here:

@Target(TYPE)
@Retention(RUNTIME)
public @interface Mapper {
}

@Repeatable(Mappings.class)
@Target(METHOD)
@Retention(RUNTIME)
public @interface Mapping {
    String source();
    String target() default "";
}

@Target(METHOD)
@Retention(RUNTIME)
public @interface Mappings {
    Mapping[] value();
}

Creating instances from the interfaces

So how do we create an instance of a bean if we have such an interface? Just read all the metadata and create a proxy!

Creating a proxy is as simple as calling:

final MyMapper mapper = (MyMapper) Proxy.newProxyInstance(contextClassLoader, new Class<?>[] { MyMapper.class }, handler);

So the obvious thing is we need a handler able to do the conversion on each method invocation.

This is done by implementing java.lang.reflect.InvocationHandler. For this post’s implementation, the MapperHandler will read the metadata (annotations) from an AnnotatedType to build its runtime model (used to actually do the mapping), and it takes an AtomicReference to an optional Converter since our implementation will just abstract away the coercion of types to not make this post too long.
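The Converter contract itself is not detailed in this post; based on the single way it is used by the handler below, a minimal guess could be:

// hypothetical contract, inferred from its use in MapperHandler below
public interface Converter {
    Object to(Object value, Class<?> targetType);
}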

The idea is to build a model with a map of reader/writer pairs which will get used to map input to the output:

public class MapperHandler implements InvocationHandler {
    private final Map<Method, MappingMethod> mapping;
    private final AtomicReference<Converter> converter;

    public <T> MapperHandler(final AnnotatedType<Object> type, final AtomicReference<Converter> converter) {
        this.mapping = type.getMethods().stream()
            .filter(m -> m.isAnnotationPresent(Mappings.class) && m.getParameters().size() == 1)
            .collect(toMap(AnnotatedMethod::getJavaMember, MappingMethod::new));
        this.converter = converter;
    }

    @Override
    public Object invoke(final Object proxy, final Method method, final Object[] args) throws Throwable {
        if (method.getDeclaringClass() == Object.class) {
            try {
                return method.invoke(this, args);
            } catch (final InvocationTargetException ite) {
                throw ite.getCause();
            }
        }
        return mapping.get(method).map(args[0]);
    }

    private class MappingMethod {
        private final Class<?> from;
        private final Class<?> to;
        private final Map<Reader, Writer> mapping;

        public MappingMethod(final AnnotatedMethod<?> annotatedMethod) {
            if (annotatedMethod.getParameters().size() != 1) {
                throw new IllegalArgumentException("Mapping method needs to have one parameter.");
            }

            from = Class.class.cast(annotatedMethod.getParameters().iterator().next().getBaseType());
            to = annotatedMethod.getJavaMember().getReturnType();

            mapping = Stream.of(annotatedMethod.getAnnotation(Mappings.class).value())
                // can be extended to support field access
                .collect(toMap(m -> new Reader() {
                    private final Method method = findMethod(from, mtd -> mtd.getName().equals("get" + toUppercase(m.source())) && mtd.getParameterCount() == 0, m.source());

                    @Override
                    public Object get(final Object instance) {
                        try {
                            return method.invoke(instance);
                        } catch (final IllegalAccessException e) {
                            throw new IllegalStateException(e);
                        } catch (final InvocationTargetException e) {
                            throw new IllegalStateException(e.getCause());
                        }
                    }
                }, m -> new Writer() {
                    private final Method method = findMethod(to, mtd -> mtd.getName().equals("set" + toUppercase(targetField())) && mtd.getParameterCount() == 1, targetField());

                    @Override
                    public void set(final Object instance, final Object value) {
                        try {
                            final Converter converter = MapperHandler.this.converter.get();
                            final boolean convert = !(converter == null || method.getParameterTypes()[0].isInstance(value));
                            method.invoke(instance, convert ? converter.to(value, method.getParameterTypes()[0]) : value);
                        } catch (final IllegalAccessException e) {
                            throw new IllegalStateException("error invoking " + method, e);
                        } catch (final InvocationTargetException e) {
                            throw new IllegalStateException("error invoking " + method, e.getCause());
                        }
                    }

                    private String targetField() {
                        return m.target().isEmpty() ? m.source() : m.target();
                    }
                }));
        }

        public Object map(final Object args) {
            if (!from.isInstance(args)) {
                throw new IllegalArgumentException(args + " not an instance of " + from);
            }

            try {
                final Object newInstance = to.newInstance();
                mapping.forEach((r, w) -> ofNullable(r.get(args)).ifPresent(v -> w.set(newInstance, v)));
                return newInstance;
            } catch (final IllegalAccessException | InstantiationException e) {
                throw new IllegalStateException(e);
            }
        }
    }

    private static String toUppercase(final String m) {
        return Character.toUpperCase(m.charAt(0)) + (m.length() == 1 ? "" : m.substring(1));
    }

    private static Method findMethod(final Class<?> type, final Predicate<Method> matcher, final String name) {
        for (final Method m : type.getMethods()) {
            if (matcher.test(m)) {
                return m;
            }
        }
        throw new IllegalArgumentException("Missing " + name);
    }

    @FunctionalInterface
    private interface Reader {
        Object get(Object instance);
    }

    @FunctionalInterface
    private interface Writer {
        void set(Object instance, Object value);
    }
}

Be able to register our proxy as a CDI Bean

To be able to add an “implementation” to the CDI context we need to wrap our proxy in a javax.enterprise.inject.spi.Bean.

The implementation is straightforward and starts from the same input parameters as our handler:

public class MapperBean<T> implements Bean<T> {
    private final Set<Type> types;
    private final Set<Annotation> qualifiers;
    private final Class<T> clazz;
    private final Class<?>[] proxyTypes;
    private final MapperHandler handler;

    public MapperBean(final AnnotatedType at, final AtomicReference<Converter> converter) {
        clazz = at.getJavaClass();
        types = new HashSet<>(asList(clazz, Object.class));
        qualifiers = new HashSet<>(asList(DefaultLiteral.INSTANCE, AnyLiteral.INSTANCE));
        proxyTypes = new Class<?>[] { clazz };
        handler = new MapperHandler(at, converter);
    }

    @Override
    public Set<Type> getTypes() {
        return types;
    }

    @Override
    public Set<Annotation> getQualifiers() {
        return qualifiers;
    }

    @Override
    public Class<? extends Annotation> getScope() {
        return ApplicationScoped.class;
    }

    @Override
    public String getName() {
        return null;
    }

    @Override
    public boolean isNullable() {
        return false;
    }

    @Override
    public Set<InjectionPoint> getInjectionPoints() {
        return emptySet();
    }

    @Override
    public Class<?> getBeanClass() {
        return clazz;
    }

    @Override
    public Set<Class<? extends Annotation>> getStereotypes() {
        return emptySet();
    }

    @Override
    public boolean isAlternative() {
        return false;
    }

    @Override
    public T create(final CreationalContext<T> context) {
        final ClassLoader contextClassLoader = Thread.currentThread().getContextClassLoader();
        return (T) Proxy.newProxyInstance(
            contextClassLoader == null ? ClassLoader.getSystemClassLoader() : contextClassLoader,
            proxyTypes, handler);
    }

    @Override
    public void destroy(final T instance, final CreationalContext<T> context) {
        // no-op
    }
}

Things to note are:

  • We scoped our implementation @ApplicationScoped since the proxy is stateless
  • Most of the methods use default values since our proxy doesn’t need any injection or specific model
  • We set the @Default and @Any qualifiers to be able to retrieve our implementation without any specific qualifiers

Wire it all in an extension

Now that all our implementation is ready, we just need to make it real in a CDI extension (don’t forget to register it in META-INF/services/javax.enterprise.inject.spi.Extension).
This extension will be responsible for capturing mapper interface types and registering a MapperBean to make them available in the CDI context.

This sample implementation of the extension also handles the retrieval of an optional Converter, if you want to plug in some advanced coercion for type conversion:

public class MapperExtension implements Extension {
    private final Collection<AnnotatedType<?>> detectedMappers = new ArrayList<>();
    private final AtomicReference<Converter> converterRef = new AtomicReference<>();

    void captureMapper(@Observes final ProcessAnnotatedType<?> potentialMapper) {
        final AnnotatedType<?> annotatedType = potentialMapper.getAnnotatedType();
        if (annotatedType.isAnnotationPresent(Mapper.class)) {
            detectedMappers.add(annotatedType);
        }
    }

    void addMapperBeans(@Observes final AfterBeanDiscovery abd) {
        detectedMappers.stream().forEach(at -> abd.addBean(new MapperBean(at, converterRef)));
        detectedMappers.clear();
    }

    void findConverter(@Observes final AfterDeploymentValidation adv, final BeanManager beanManager) {
        final Set<Bean<?>> beans = beanManager.getBeans(Converter.class);
        final Bean<?> bean = beanManager.resolve(beans);
        // converter should be normal-scoped otherwise we need to release the creational context when shutdown event is fired
        ofNullable(bean).ifPresent(b -> converterRef.set(Converter.class.cast(beanManager.getReference(bean, Converter.class, null))));
    }
}
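To activate the coercion you then just need to provide a normal-scoped Converter bean in the deployment; a naive sketch (assuming the Converter contract guessed earlier, the class name is made up):

@ApplicationScoped
public class NaiveConverter implements Converter {
    @Override
    public Object to(final Object value, final Class<?> targetType) {
        // very naive coercion: only handles String targets, a real one would plug a conversion library
        if (targetType == String.class) {
            return String.valueOf(value);
        }
        if (targetType.isInstance(value)) {
            return value;
        }
        throw new IllegalArgumentException("Can't convert " + value + " to " + targetType.getName());
    }
}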

Use your CDI Mapper extension!

Now suppose you deploy your extension with the initial sample of this post; then you can simply use it as in this example:

@Path("test")
@ApplicationScoped
public class MyEndpoint {
  @Inject
  private MyMapper mapper;

  @Inject
  private MyService service;

  @GET
  @Path("{id}")
  public Output1 findOutput(@PathParam("id") String id) {
      return mapper.toOutput1(service.findInput2(id));
  }
}

What is nice about such a solution – this includes MapStruct 🙂 – is that you define your mapping in a well defined place. This means the behavior is well defined and dedicated to the mapping, which avoids a lot of boilerplate code on one side and makes it easy to understand and maintain on the other. The awesome CDI feature is that, thanks to AnnotatedType, you can change the mapping dynamically and programmatically if you need to, without changing the mappers (if they don’t belong to your own codebase for instance).

Happy mapping!

CDI Context: it is possible without scope annotations in your API!


If you have never implemented a CDI context/scope, it is as simple as implementing this interface:

public interface Context {
   Class<? extends Annotation> getScope();
   <T> T get(Contextual<T> component, CreationalContext<T> creationalContext);
   <T> T get(Contextual<T> component);
   boolean isActive();
}

Note: in CDI 1.1 there is AlterableContext too, which just adds a destroy(Contextual) method. It is not important for this post so I will ignore it, but I would recommend you use it instead of Context if you can rely on CDI 1.1.

The Context implementation is quite simple:

  • isActive() returns true if the context is usable by the CDI container
  • getScope() returns the associated annotation (often @XXXScoped)
  • get(Contextual) returns the instance of the Contextual (~= Bean) for “current” context
  • get(Contextual, CreationalContext) creates or returns current instance

Creating a scope “annotation” is as easy as creating a runtime annotation:

// @NormalScope(passivating=false)
@Target({ METHOD, TYPE, FIELD })
@Retention(RUNTIME)
public @interface WrappingMethodScoped {
}

Note: the @NormalScope is optional since it can be done by an extension – this is what we’ll do.
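To make the contract concrete, here is a naive, always-active sketch of such a context – it ignores the “wrapping method” semantics and just caches instances for the whole application lifetime; the real implementation is what the rest of the post is about:

import java.lang.annotation.Annotation;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import javax.enterprise.context.spi.Context;
import javax.enterprise.context.spi.Contextual;
import javax.enterprise.context.spi.CreationalContext;

public class WrappingMethodContext implements Context {
    private final ConcurrentMap<Contextual<?>, Object> instances = new ConcurrentHashMap<>();

    @Override
    public Class<? extends Annotation> getScope() {
        return WrappingMethodScoped.class;
    }

    @Override
    public <T> T get(final Contextual<T> component, final CreationalContext<T> creationalContext) {
        return (T) instances.computeIfAbsent(component, c -> component.create(creationalContext));
    }

    @Override
    public <T> T get(final Contextual<T> component) {
        return (T) instances.get(component);
    }

    @Override
    public boolean isActive() {
        return true; // naive: always active
    }
}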

Now that we know what a CDI scope is, let’s see how to activate it programmatically.
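As a teaser, the activation typically goes through an extension which declares the annotation as a scope and registers the context – a minimal sketch, to be registered in META-INF/services/javax.enterprise.inject.spi.Extension:

public class WrappingMethodScopedExtension implements Extension {
    void addScope(@Observes final BeforeBeanDiscovery beforeBeanDiscovery) {
        // the annotation doesn't carry @NormalScope so declare it as a (non passivating) normal scope here
        beforeBeanDiscovery.addScope(WrappingMethodScoped.class, true, false);
    }

    void addContext(@Observes final AfterBeanDiscovery afterBeanDiscovery) {
        afterBeanDiscovery.addContext(new WrappingMethodContext());
    }
}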

Continue reading

@CacheResult: JCache + CDI to the rescue of microservices?


The JCache API comes with several built-in CDI interceptors, making its usage decoupled from the cache API itself and more user friendly.

Let’s have a look to this API.

CacheResult: the method execution killer

Probably one of the most common uses of a cache is to avoid paying the cost of a method each time you call it.

Reasons can be as different as:

  • Computation done by the method is expensive
  • The method contacts a remote service and you want to cut off the implied latency
  • The method accesses a rate limited resource
  • ….

In this case @CacheResult brings a nice and easy-to-set-up solution. By simply decorating the method with @CacheResult you will avoid the actual method invocation after the first call, for as long as the result stays cached.
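To give the idea before diving into the sample, a minimal sketch of such a decorated method (the service and the cache name are made up for illustration):

import javax.cache.annotation.CacheResult;
import javax.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class QuoteService {
    // without cacheName a default name derived from the class and method is used
    @CacheResult(cacheName = "quotes")
    public String findQuote(final String symbol) {
        try { // simulate a slow/expensive computation or remote call
            Thread.sleep(2000);
        } catch (final InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return symbol + "-quote";
    }
}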

Basic usage

Here is a sample using a service simulating a slow method:

Continue reading

Lambda-CDI in Alpha!


Lambda + CDI in Alpha!

Lambdas are a nice new feature of Java 8 and I’m sure we’re still missing
a lot of their uses to get nicer APIs in all domains where Java can be used (i.e. all domains ;)).

However, their use is often limited to Java SE only. Typically, you may want to get some CDI beans “injected” as method
parameter(s) – this would be the same for Spring beans or any IoC container, but if you have browsed this blog
you’ll understand that I’ll deal with CDI.

Now, the question is: how to link both worlds and get the best of them?

An example

Here is a simple sample showing you what mixing CDI and lambdas can look like:

public class MyRuntimeBuilder {
    public void doSomething() {
       onEvent((MyEvent e, MyCdiEventProcessor processor) -> {
           processor.process(e);
       }).then((MyCdiWebSocket w) -> {
           w.send("Got it!");
       });
    }
}

So, why do we need lambdas here? There are several cases:

  • Laziness: if you need to “lookup” the instances after having called the builder (CDI not yet fully started for instance).
  • Contextuality: if you build a DSL, you would want to get “method scoped” instances, and not to rely on DSL “builder” injected instances
  • Closeness: make the “injections” closer to their use. In several cases it makes the code more readable.
  • Reusability: this way your lambdas can be reused (keep in mind a lambda can also be a method with “::” syntax)
  • Because it is fun 😉
  • And much more…

To make the contextuality point more obvious, think of JBatch: you define a set of steps and we advise you to use @Dependent instances
for JBatch components. If you add a DSL and a BatchBuilder API, you cannot count on injected components in your builder. If you have
N times the same component (N > 1), here is what you get:

public class MyBatchBuilder {
    public void define() {
       job().id("my-batch")
        .step().id("download-csv")
          .batchlet((HttpConnection connection) -> {
            connection.configureProxy("myproxy", "8080");
            connection.download("http://csvbase.com/mycsv.csv");
          })
        .step().id("download-xml")
          .batchlet((HttpConnection connection) -> {
            connection.configureTimeout(5, TimeUnit.MINUTES);
            connection.download("http://xmlbase.com/myxml.xml");
          });
    }
}

In the sample above, we can suppose an HttpConnection instance must be used only once. If we used it twice, its configuration would leak
between steps.

CDI: from reflection to parameter instances

In this part, we’ll suppose that we have a java.lang.reflect.Method. The question is: how do we get the CDI parameter instances to invoke the method with? Here, the solution is not provided by CDI out of the box, but it is not that far away.

You can get the parameter types and their annotations from the method (here, I’ll simplify a bit and skip annotations since a lambda doesn’t have parameter annotations). You just need to look up the associated instance from the bean type.

To do so, we’ll use the BeanManager “getInjectableReference” method and we’ll need an InjectionPoint and a CreationalContext.

To get the BeanManager, we’ll use the new CDI 1.1 “CDI” utility class. If you are on a CDI 1.0 server you can use the DeltaSpike BeanManagerProvider:

BeanManager bm = CDI.current().getBeanManager();

Then we’ll implement an InjectionPoint. In CDI 1.1 you can create it from an “AnnotatedParameter”. I’ll expand on that solution later, but for CDI 1.0 you can implement a custom injection point to get the same result. The easiest way is to create implementations of “AnnotatedMethod” and “AnnotatedParameter”. Both are linked as expected.

Here is a proposal for their implementation – note that you can enhance it to support more metadata for qualifiers for instance:

// AnnotatedMethodImpl
public class AnnotatedMethodImpl<T> implements AnnotatedMethod<T> {
    private final AnnotatedType<T> annotatedType;
    private final Method method;
    private final Set<Annotation> annotations;
    private final List<AnnotatedParameter<T>> parameters;

    public AnnotatedMethodImpl(AnnotatedType<T> annotatedType, Method method, int paramCount, Type[] paramTypes) {
        this.annotatedType = annotatedType;
        this.method = method;
        this.annotations = new HashSet<>(asList(method.getAnnotations()));
        this.parameters = new LinkedList<>();

        for (int i = 0; i < paramCount; i++) {
            this.parameters.add(new AnnotatedParameterImpl<>(this, i, paramTypes[i], new HashSet<>(asList(method.getParameterAnnotations()[i]))));
        }
    }

    @Override
    public List<AnnotatedParameter<T>> getParameters() {
        return parameters;
    }

    @Override
    public Method getJavaMember() {
        return method;
    }

    @Override
    public boolean isStatic() {
        return Modifier.isStatic(method.getModifiers());
    }

    @Override
    public AnnotatedType<T> getDeclaringType() {
        return annotatedType;
    }

    @Override
    public Type getBaseType() {
        return method.getDeclaringClass();
    }

    @Override
    public Set<Type> getTypeClosure() {
        return Collections.singleton(getBaseType());
    }

    @Override
    public Set<Annotation> getAnnotations() {
        return annotations;
    }

    @Override
    public <T extends Annotation> T getAnnotation(Class<T> annotationType) {
        return (T) annotations.stream().filter(a -> a.annotationType() == annotationType).findFirst().orElse(null);
    }

    @Override
    public boolean isAnnotationPresent(Class<? extends Annotation> annotationType) {
        return annotations.stream().filter(a -> a.annotationType() == annotationType).findFirst().isPresent();
    }
}

// AnnotatedParameterImpl
public class AnnotatedParameterImpl<T> implements AnnotatedParameter<T> {
    private final AnnotatedMethod<T> method;
    private final int position;
    private final Set<Type> types = new HashSet<>();
    private final Set<Annotation> annotations;
    private final Type baseType;

    public AnnotatedParameterImpl(AnnotatedMethod<T> method, int position, Type baseType, Set<Annotation> annotations) {
        this.method = method;
        this.baseType = baseType;
        this.position = position;

        this.types.add(baseType);
        this.types.add(Object.class);

        this.annotations = new HashSet<>(annotations);
    }

    @Override
    public int getPosition() {
        return position;
    }

    @Override
    public AnnotatedCallable<T> getDeclaringCallable() {
        return method;
    }

    @Override
    public Type getBaseType() {
        return baseType;
    }

    @Override
    public Set<Type> getTypeClosure() {
        return types;
    }

    @Override
    public <T extends Annotation> T getAnnotation(Class<T> annotationType) {
        for (Annotation a : annotations) {
            if (a.annotationType().getName().equals(annotationType.getName())) {
                return (T) a;
            }
        }
        return null;
    }

    @Override
    public Set<Annotation> getAnnotations() {
        return annotations;
    }

    @Override
    public boolean isAnnotationPresent(Class<? extends Annotation> annotationType) {
        return getAnnotation(annotationType) != null;
    }
}

NB: you surely noticed that I didn’t use “method.getParameterTypes()” (or the generic version) to get the parameter types. In fact, I used a provided array of “Type” because lambdas will not fill it properly (see the next part).

Now, we have all we need to create an injection point and then look up a parameter instance.

Let’s do it:

Type[] types = ...;
Method method = ...;

BeanManager bm = CDI.current().getBeanManager();

// create our Annotated*
AnnotatedMethod annotatedMethod = new AnnotatedMethodImpl<>(bm.createAnnotatedType(method.getDeclaringClass()), method, cdiBeanNumber, types);
List<AnnotatedParameter<?>> annotatedMethodParameters = annotatedMethod.getParameters();

// for all parameters, get the parameter instance
Parameters parameters = new Parameters(); // just a small wrapper to keep tracking of CreationalContexts
for (int i = 0; i < cdiBeanNumber; i++) {
    CreationalContext<?> creational = bm.createCreationalContext(null);
    Object instance = bm.getInjectableReference(bm.createInjectionPoint(annotatedMethodParameters.get(i)), creational);
    parameters.addInstance(instance, creational);
}

Note: cdiBeanNumber is optional and we can use method.getParameterCount(). But in my implementation I’ll support CDI (and non-CDI) parameters, so I need a limit.

Now that we have all our parameters, we just need to call the method with them!
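Calling it is then just a reflective invocation – a sketch, where lambdaInstance is the lambda itself and assuming the Parameters wrapper exposes toArray() and release() helpers (both names are hypothetical) returning the instances in order and releasing the CreationalContexts of the @Dependent instances:

try {
    method.invoke(lambdaInstance, parameters.toArray());
} catch (final IllegalAccessException e) {
    throw new IllegalStateException(e);
} catch (final InvocationTargetException e) {
    throw new IllegalStateException(e.getCause());
} finally {
    parameters.release(); // free @Dependent instances
}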

But wait, how do I get the Method? Are the method types right? Let’s dig into it!

Lambda: from instance to Method

Finding the lambda method is not that hard. You simply iterate over the lambda class methods and skip all Object, Serializable and default methods:

public static void invokeLambda(Object lambda, Object...args) {
    Class<?> lambdaClass = lambda.getClass();
    for (Method method : lambdaClass.getMethods()) {
        Class<?> declaringClass = method.getDeclaringClass();
        if (declaringClass == Object.class) {
            continue;
        }
        if (declaringClass == Serializable.class) {
            continue;
        }
        if (method.isDefault()) {
            continue;
        }

        // TODO: invoke it!
        return;
    }
    throw new IllegalArgumentException(lambda + " not a lambda");
}

Now that you have identified the “lambda method”, remember that it can be an actual lambda or a normal (Java 7 style) implementation of the functional interface. To check, an easy way is to look at the instance class name. If it contains “$$Lambda” then it is a class generated by the JVM to implement the lambda. If it is not the case, you can use “method.getGenericParameterTypes()” to get the method parameter types. If it is the case, you need a bit more code.
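Put in code, the dispatch could look like this (methodTypeParameters() being the constant-pool based helper shown a bit below):

// dispatch between a plain (java 7 style) implementation and a JVM generated lambda class
private static java.lang.reflect.Type[] parameterTypes(final Object instance, final Method method) {
    if (instance.getClass().getName().contains("$$Lambda")) {
        // generated class: the generic signature is not reliable, read the class metadata instead
        return methodTypeParameters(instance);
    }
    return method.getGenericParameterTypes();
}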

You have two cases to find the lambda parameter types:

  • Your functional interface is Serializable. In this case, serialize the lambda by calling its writeReplace method and cast the result to SerializedLambda. You can get the signature of a SerializedLambda (“getImplMethodSignature()”). From this signature, you can build a “MethodType” with “MethodType.fromMethodDescriptorString”, which will give you access to the parameter types (see the sketch just after this list).
  • If you don’t want to make your functional interface serializable to keep your API clean, you can get the “ConstantPool” from the lambda class, and read the types from this instance. This is the job of the second snippet below.
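The first case could be sketched like this, using java.lang.invoke.SerializedLambda and MethodType, and assuming a non-capturing lambda (captured variables would be prepended to the implementation method signature):

private static Class<?>[] serializedLambdaParameterTypes(final Object lambda) {
    try {
        // writeReplace is generated by the JVM for serializable lambdas
        final Method writeReplace = lambda.getClass().getDeclaredMethod("writeReplace");
        writeReplace.setAccessible(true);
        final SerializedLambda serialized = (SerializedLambda) writeReplace.invoke(lambda);
        final MethodType type = MethodType.fromMethodDescriptorString(
            serialized.getImplMethodSignature(),
            Thread.currentThread().getContextClassLoader());
        return type.parameterArray();
    } catch (final Exception e) {
        throw new IllegalStateException(e);
    }
}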

Note: in the next snippet there is a bit of reflection for the sun.* invocations. This is not mandatory – a plain cast would have worked – but it allows keeping the build configured to fail on sun.* imports, which is important so that wrong imports in other parts of the code don’t go unnoticed.

private  static java.lang.reflect.Type[] methodTypeParameters(Object lambdaInstance) {
    try {
        ConstantPool pool = ConstantPool.class.cast(GET_CONSTANT_POOL.invoke(lambdaInstance.getClass()));
        String[] methodRef = pool.getMemberRefInfoAt(pool.getSize() - 2);

        // Type[] types = jdk.internal.org.objectweb.asm.Type.getArgumentTypes(methodRef[2]);
        Object[] argTypes = (Object[]) GET_ARGUMENTS_TYPE.invoke(null, methodRef[2]);

        Collection<java.lang.reflect.Type> types = new ArrayList<>(argTypes.length);
        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        for (Object argType : argTypes) {
            Class<?> clazz = loader.loadClass(String.valueOf(GET_CLASS_NAME.invoke(argType)));
            types.add(clazz);
        }
        return types.toArray(new java.lang.reflect.Type[types.size()]);
    } catch (Exception e) {
        throw new IllegalStateException(e);
    }
}

This code is low level but the overall idea is:

  1. extract the types from the lambda class metadata,
  2. convert them into Java qualified names thanks to the shaded version of ASM included in the JVM,
  3. load the associated classes.

Put it all together

So now we are able to find the lambda parameter types, and to look up CDI beans by type. It is time to put it all together!

Click here to see the utility class with all that code: https://github.com/rmannibucau/lambda-cdi/blob/master/src/main/java/com/github/rmannibucau/blog/lambda/cdi/Lambdas.java

Lambdas utility use

Now we are able to call lambdas with CDI injections. Let’s see how to use our utility class.

First, define a few functional interfaces:

@FunctionalInterface
public interface TaskWithCdiParameter<T> {
    void work(T cdiBean);
}

@FunctionalInterface
public interface TaskWithCdiParameterAndArgument<T, A> {
    void work(T cdiBean, A arg);
}

Then define a class based on these interfaces:

// this class stack tasks then execute them sequentially
public abstract class MyFlowBuilder implements Runnable {
    private final Collection<Runnable> tasks = new LinkedList<>();

    // cdi parameter(s)
    protected <T> void work(final TaskWithCdiParameter<T> task) {
        tasks.add(() -> Lambdas.invokeLambda(task));
    }

    // cdi params + args
    protected <T, A> void work(final TaskWithCdiParameterAndArgument<T, A> task, A arg) {
        tasks.add(() -> Lambdas.invokeLambda(task, arg));
    }

    public abstract void defineFlow();

    @Override
    public void run() {
        tasks.stream().forEach(Runnable::run);
    }
}

Finally define your own builder:

MyFlowBuilder builder = new MyFlowBuilder() {
    @Override
    public void defineFlow() {
        work((ABean1 bean) -> bean.call());
        work((ABean1 bean1, ABean2 bean2) -> {
            bean1.call();
            bean2.call();
        });
        work((ABean3 bean, String arg) -> bean.call(arg), "param");
        work(new TaskWithCdiParameter<ABean1>() { // java 7 style
            @Override
            public void work(ABean1 cdiBean) {
                cdiBean.call();
            }
        });
    }
};
builder.defineFlow();
builder.run();

This example is simple but already shows you how to use CDI lambda injections. It also shows you how to mix them with plain parameters. This last case can easily be extended to support CDI qualifiers on lambdas 🙂

Conclusion

Creating an API with lambdas is not that obvious because you often need to:

  • Create several interfaces with N parameters (personally I use N in [0, 10] because I think N > 10
    makes the API too complicated and doesn’t bring anything more – you can still create another type to aggregate several injections)
  • Duplicate this API with and without plain parameters (same rule – that already makes 10*10 functional interfaces)
  • Potentially add some more behavior in the functional interfaces (a small tip here is to create other
    instances of your functional interfaces from the current one)
  • If the type is not enough (you can use qualifiers), then you can duplicate it again with
    qualifiers (10*10*2 interfaces)

To make it easy to maintain, you can generate them from templates (Cucumber does it). I won’t get
into the details here, because I’ll try to write another post about it soon.

However, keep in mind this has impacts on your API. You need to think about it carefully, because skipping this generation step will make your life easier… but can make your API less user-friendly.

Finally, linking your lambda API with any other library is not that hard since you can get almost
all the metadata you need (I would still love to see annotations in this metadata, but you can work around it with a custom signature enriching the lambda with metadata).

Don’t be afraid of lambdas and get in the game – a lot of new API styles are at our door!

CDI and @Startup: SOLVED!


Speaking on the CDI list about the CDI 2.0 standalone container API, Jozef Hartinger sent an answer which is quite obvious when you know it but which can change your life if you don’t: how to get @Startup behavior with CDI 1.1 without the need for EJBs.
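Without spoiling the full post: CDI 1.1 lets you observe the application scope initialization, so an eager “startup” bean can be sketched like this (the bean and method names are just examples):

@ApplicationScoped
public class StartupObserver {
    public void onStart(@Observes @Initialized(ApplicationScoped.class) final Object applicationContext) {
        // runs once when the application scope starts, i.e. at deployment time
    }
}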

Continue reading