
Implementing an annotation interface

Using annotations is an everyday task for a Java developer. If nothing else, the simple @Override annotation should ring a bell. Creating annotations is a bit more complex. Using the "home made" annotations at run-time via reflection, or creating a compile-time annotation processor, is again one more level of complexity. But we rarely "implement" an annotation interface. Somebody, secretly behind the scenes, certainly does it for us.

When we have an annotation

@Retention(RetentionPolicy.RUNTIME) // without runtime retention getAnnotation() would return null
public @interface AnnoWithDefMethod {
    String value() default "default value string";
}

then a class annotated with this annotation

@AnnoWithDefMethod("my default value")
public class AnnotatedClass {
}

and finally when we get the annotation during runtime executing

AnnoWithDefMethod awdm = AnnotatedClass.class.getAnnotation(AnnoWithDefMethod.class);

then what do we get into the variable awdm? It is an object. Objects are instances of classes, not interfaces. Which means that somebody under the hood of the Java runtime has “implemented” the annotation interface. We can even print out features of the object:

        System.out.println(awdm.value());
        System.out.println(awdm.getClass());
        System.out.println(awdm.annotationType());

to get a result something like

my default value
class com.sun.proxy.$Proxy1
interface AnnoWithDefMethod
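The fragments above can be put together into a minimal, self-contained sketch; the `main` method and the wrapping `AnnoDemo` class are only added here for demonstration, and the exact proxy class name printed varies between JDK versions:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class AnnoDemo {
    // runtime retention is required, otherwise getAnnotation() returns null
    @Retention(RetentionPolicy.RUNTIME)
    public @interface AnnoWithDefMethod {
        String value() default "default value string";
    }

    @AnnoWithDefMethod("my default value")
    public static class AnnotatedClass {
    }

    public static void main(String[] args) {
        AnnoWithDefMethod awdm =
                AnnotatedClass.class.getAnnotation(AnnoWithDefMethod.class);
        System.out.println(awdm.value());          // my default value
        System.out.println(awdm.getClass());       // some runtime-generated proxy class
        System.out.println(awdm.annotationType()); // the annotation interface itself
    }
}
```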

So we do not need to implement an annotation interface, but we can if we want to. But why would we want that? So far I have met one situation where that was the solution: configuring Guice dependency injection.

Guice is the DI container of Google. The binding configuration is given as Java code in a declarative manner, as described on the documentation page. You can bind a type to an implementation simply declaring

    bind(TransactionLog.class).to(DatabaseTransactionLog.class);
so that all TransactionLog instances injected will be of type DatabaseTransactionLog. If you want different implementations injected into different fields in your code, you have to signal that to Guice somehow, for example by creating an annotation, putting the annotation on the field or on the constructor argument, and declaring the binding

    bind(CreditCardProcessor.class)
        .annotatedWith(PayPal.class)
        .to(PayPalCreditCardProcessor.class);
This requires PayPal to be an annotation interface, and you are required to write a new annotation interface accompanying each CreditCardProcessor implementation, or even more than one, so that you can signal and separate the implementation types in the binding configuration. This may be overkill: just too many annotation classes.

Instead of that you can also use names. You can annotate the injection target with the annotation @Named("CheckoutProcessing") and configure the binding

    bind(CreditCardProcessor.class)
        .annotatedWith(Names.named("CheckoutProcessing"))
        .to(CheckoutCreditCardProcessor.class);
This is a technique that is well known and widely used in DI containers. You specify the type (interface), you create the implementations and finally you define the binding using names. There is no problem with this, except that it is hard to notice when you type porcessing instead of processing. Such a mistake remains hidden until the binding fails at run-time. A final static String constant holding the name only helps where it is actually used: nothing forces every annotation site to use the constant instead of a fresh, possibly misspelled literal, and the string is still duplicated between the annotation and the binding definition.

The idea is to use something other than a String. Something that is checked by the compiler. The obvious choice is to use a class. The implementation can be written learning from the code of NamedImpl, which is a class inside Guice implementing the Named annotation interface. The code is something like this (note: Klass is the annotation interface, not listed here):

class KlassImpl implements Klass {
    private final Class<?> value;

    private KlassImpl(Class<?> value) {
        this.value = value;
    }

    static Klass klass(Class<?> value) {
        return new KlassImpl(value);
    }

    public Class<? extends Annotation> annotationType() {
        return Klass.class;
    }

    public Class<?> value() {
        return value;
    }

    public boolean equals(Object o) {
        if (!(o instanceof Klass)) {
            return false;
        }
        return value.equals(((Klass) o).value());
    }

    public int hashCode() {
        // the algorithm prescribed by java.lang.annotation.Annotation
        return 127 * "value".hashCode() ^ value.hashCode();
    }
}

The actual binding will look something like

  @Inject
  public RealBillingService(@Klass(CheckoutProcessing.class) CreditCardProcessor processor,
      TransactionLog transactionLog) {
    // ...
  }

In this case any typo is likely to be discovered by the compiler. What actually happens behind the scenes, and why were we required to implement the annotation interface?

When the binding is configured we provide an object. Calling Klass.klass(CheckoutProcessing.class) creates an instance of KlassImpl. When Guice has to decide whether the actual binding configuration is the one to use to bind CheckoutCreditCardProcessor to the CreditCardProcessor argument in the constructor of RealBillingService, it simply calls the method equals() on the annotation objects. If the instance created by the Java runtime (remember that the Java runtime creates an instance of a class with a name like com.sun.proxy.$Proxy1) and the instance we provided are equal, then the binding configuration is used; otherwise some other binding has to match.

There is another catch. It is not enough to implement equals(). You may (and if you are a Java programmer (and you are, why else would you read this article (you are certainly not a lisp programmer)) you also should) remember that whenever you override equals() you also have to override hashCode(). And you should provide an implementation that performs the same calculation as the class created by the Java runtime. The reason is that the comparison may not be performed directly by the application. It may (and it does) happen that Guice looks the annotation objects up from a Map. In that case the hash code is used to identify the bucket in which the matching object should be, and equals() is used afterwards to check identity. If hashCode() returns different numbers for the object created by the Java runtime and for ours, the two will never even be matched up: equals() would return true, but it is never invoked, because the object is not found in the map.

The actual algorithm for hashCode() is described in the documentation of the interface java.lang.annotation.Annotation. I had seen this documentation before, but I only understood why the algorithm is defined there when I first used Guice and had to write a class implementing an annotation interface.
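A minimal sketch demonstrating the contract (the Name annotation and all names here are made up for the example): a hand-made implementation of a one-member annotation is equal to, and hashes the same as, the instance created by the Java runtime:

```java
import java.lang.annotation.Annotation;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class HandMadeAnnotationDemo {
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Name {
        String value();
    }

    // hand-made implementation mirroring what the runtime proxy does
    public static final class NameImpl implements Name {
        private final String value;

        public NameImpl(String value) {
            this.value = value;
        }

        public Class<? extends Annotation> annotationType() {
            return Name.class;
        }

        public String value() {
            return value;
        }

        @Override
        public boolean equals(Object o) {
            return o instanceof Name && value.equals(((Name) o).value());
        }

        @Override
        public int hashCode() {
            // per java.lang.annotation.Annotation:
            // (127 * memberName.hashCode()) ^ memberValue.hashCode(), summed over members
            return (127 * "value".hashCode()) ^ value.hashCode();
        }
    }

    @Name("payment")
    static class Annotated {
    }

    public static void main(String[] args) {
        Name real = Annotated.class.getAnnotation(Name.class);
        Name fake = new NameImpl("payment");
        System.out.println(real.equals(fake));                   // true
        System.out.println(fake.equals(real));                   // true
        System.out.println(real.hashCode() == fake.hashCode());  // true
    }
}
```

If NameImpl computed its hash code any other way, the last comparison would be false, and a Map-based lookup like the one Guice performs would miss the hand-made instance entirely.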

The final thing is that the class also has to implement annotationType(). Why? If I ever figure that out I will write about that.


Java compile in Java

In a previous post I wrote about how to generate a proxy during run-time, and we got as far as having Java source code generated. However, to use the class it has to be compiled and the generated byte code loaded into memory. That is "compile" time. Luckily, since Java 1.6 we have had access to the Java compiler during run time, and we can thus mix compile time into run time. Though that may lead to a plethora of awful things, generally resulting in unmaintainable, self-modifying code, in this very special case it may be useful: we can compile our run-time generated proxy.

Java compiler API

The Java compiler reads source files and generates class files. (Assembling them into JAR, WAR, EAR and other packages is the responsibility of a different tool.) The source files and class files do not necessarily need to be real operating system files residing on a magnetic disk, SSD or memory drive. After all, Java is usually good about abstraction when it comes to the run-time API, and this is the case now. The files are "abstract" files you provide access to via an API: they can be disk files, but at the same time they can be almost anything else. It would generally be a waste of resources to save the source code to disk just to let the compiler, running in the same process, read it back, and to do the same with the class files when they are ready.

The Java compiler, as an API available at run time, requires that you provide some simple API (or SPI, if you like the term) to access the source code and also to receive the generated byte code. In case we have the code in memory we can have the following code (from this file):

public Class<?> compile(String sourceCode, String canonicalClassName)
			throws Exception {
		JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
		List<JavaSourceFromString> sources = new LinkedList<>();
		String className = calculateSimpleClassName(canonicalClassName);
		sources.add(new JavaSourceFromString(className, sourceCode));

		StringWriter sw = new StringWriter();
		MemoryJavaFileManager fm = new MemoryJavaFileManager(
				compiler.getStandardFileManager(null, null, null));
		JavaCompiler.CompilationTask task = compiler.getTask(sw, fm, null,
				null, null, sources);

		Boolean compilationWasSuccessful =;
		if (compilationWasSuccessful) {
			ByteClassLoader byteClassLoader = new ByteClassLoader(new URL[0],
					classLoader, classesByteArraysMap(fm));

			Class<?> klass = byteClassLoader.loadClass(canonicalClassName);
			return klass;
		} else {
			compilerErrorOutput = sw.toString();
			return null;
		}
	}
This code is part of the open-source project Java Source Code Compiler (jscc).

The compiler instance is available through the ToolProvider, and to create a compilation task we have to invoke getTask(). The code writes the errors into a string via a StringWriter. The file manager (fm) is implemented in the same package; it simply stores the files as byte arrays in a map, where the keys are the "file names". This is where the class loader will get the bytes later when the class(es) are loaded. The code does not provide any diagnostic listener (see the documentation of the Java compiler in the runtime), compiler options or classes to be processed by annotation processors; these arguments are all null. The last argument is the list of source codes to compile. We compile only one single class in this tool, but since the compiler API is general and expects an iterable of sources, we provide a list. Since there is another level of abstraction, this list contains JavaSourceFromString objects.
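The JavaSourceFromString abstraction is a thin wrapper that serves source code from a String; the javadoc of javax.tools.JavaCompiler shows essentially this class, something like:

```java
import java.net.URI;
import javax.tools.SimpleJavaFileObject;

// a source "file" whose content lives in a String instead of on disk
public class JavaSourceFromString extends SimpleJavaFileObject {
    private final String code;

    public JavaSourceFromString(String name, String code) {
        // the URI is synthetic; the compiler only needs it to be unique and well-formed
        super(URI.create("string:///" + name.replace('.', '/') + Kind.SOURCE.extension),
                Kind.SOURCE);
        this.code = code;
    }

    @Override
    public CharSequence getCharContent(boolean ignoreEncodingErrors) {
        return code;
    }
}
```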

To start the compilation the created task has to be call()ed, and if the compilation was successful the class is loaded from the generated byte array or arrays. Note that in case there is a nested or inner class inside the top level class we compile, the compiler will create several classes. This is the reason we have to maintain a whole map of classes and not a single byte array, even though we compile only one source file. If the compilation was not successful, the error output is stored in a field and can be queried.
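As a stripped-down illustration of the same flow without the jscc helper classes (the Hello source and all names here are made up for the example, and the class files go to a temporary directory instead of a byte-array map):

```java
import java.io.StringWriter;
import java.lang.reflect.Method;
import java.net.URI;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import javax.tools.JavaCompiler;
import javax.tools.SimpleJavaFileObject;
import javax.tools.StandardJavaFileManager;
import javax.tools.StandardLocation;
import javax.tools.ToolProvider;

public class InMemoryCompileDemo {
    // a source "file" backed by a String (same idea as JavaSourceFromString)
    static class StringSource extends SimpleJavaFileObject {
        final String code;

        StringSource(String className, String code) {
            super(URI.create("string:///" + className.replace('.', '/') + ".java"),
                    Kind.SOURCE);
            this.code = code;
        }

        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    public static int compileAndRun() throws Exception {
        String source = "public class Hello { public int method() { return 1; } }";
        // returns null when running on a JRE that has no compiler
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        Path outDir = Files.createTempDirectory("classes");
        StandardJavaFileManager fm = compiler.getStandardFileManager(null, null, null);
        fm.setLocation(StandardLocation.CLASS_OUTPUT, List.of(outDir.toFile()));
        StringWriter errors = new StringWriter();
        JavaCompiler.CompilationTask task = compiler.getTask(errors, fm, null, null, null,
                List.of(new StringSource("Hello", source)));
        if (! { // the task is a Callable<Boolean>
            throw new IllegalStateException(errors.toString());
        }
        try (URLClassLoader loader =
                new URLClassLoader(new URL[] { outDir.toUri().toURL() })) {
            Class<?> hello = loader.loadClass("Hello");
            Object instance = hello.getDeclaredConstructor().newInstance();
            Method m = hello.getMethod("method");
            return (int) m.invoke(instance);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(compileAndRun());
    }
}
```

Using the standard file manager with a temporary output directory avoids writing a custom JavaFileManager; jscc keeps everything in memory instead, which is why it needs MemoryJavaFileManager and ByteClassLoader.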

The use of the class is very simple and you can find samples in the unit tests:

	private String loadJavaSource(String name) throws IOException {
		InputStream is = this.getClass().getResourceAsStream(name);
		byte[] buf = new byte[3000];
		int len =;
		return new String(buf, 0, len, "utf-8");
	}

	@Test
	public void given_PerfectSourceCodeWithSubClasses_when_CallingCompiler_then_ProperClassIsReturned()
			throws Exception {
		final String source = loadJavaSource("");
		Compiler compiler = new Compiler();
		Class<?> newClass = compiler.compile(source, "com.javax0.jscc.Test3");
		Object object = newClass.newInstance();
		Method f = newClass.getMethod("method");
		int i = (int) f.invoke(object);
		Assert.assertEquals(1, i);
	}

Note that the classes you create this way are only available to your code during run-time. You can create immutable versions of your objects, for example. If you want classes that are also available at compile time, you should use an annotation processor like scriapt.