Serverless and Spring Cloud Function

We have been discussing moving to a more serverless architecture at work, so I decided to do some research and see where the technology is now. When I was at Choose we used AWS Lambda to implement the backend of an abandoned shopping cart service. We would then use that data to drive an email campaign to encourage users to come back and finish purchasing an energy plan. It had a huge effect on conversion rates, and we were able to implement the service in about 25 lines of vanilla Java code. I opted not to use Spring, as I judged the startup times to be too slow for it to be worth it. To manage dependencies we used the Maven Shade plugin in our build process to build a fat jar.
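From memory, the Shade plugin wiring was only a few lines of pom configuration; the version here is illustrative, so treat this as a sketch rather than the exact build file we used:

```xml
<!-- Build a fat jar containing the function and all of its dependencies,
     so the Lambda runtime gets a single self-contained artifact. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <createDependencyReducedPom>false</createDependencyReducedPom>
  </configuration>
  <executions>
    <execution>
      <!-- Shade during package so `mvn package` produces the deployable jar. -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```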

In my current role we will deploy on AWS Lambda as well. One thing I am looking for when considering serverless frameworks is a nice developer flow, because back when we did the abandoned cart service, debugging was a painful experience. At that point I think we used a bunch of console logging messages to figure things out. That won't scale for a development team going heavy into serverless; it is too inefficient.

Then last week I came across this Spring Tips video about a new project, Project Riff. I am happy to see Pivotal getting into this space, as they build great frameworks and tools. Riff seems no different: it lets a developer easily install it on their laptop, and it appears to have first-class support for Google Cloud and for anyone's infrastructure that is already running Kubernetes.

So I really liked the Riff part of that video. But then I watched the Spring Cloud Function part. Everyone who knows me knows that I love the Spring Framework. I have been using it since 2008, and in terms of what it can do for you in Java server apps I don't think anything else comes close. It really accelerates developer productivity and lets developers focus on solving business problems. But as I watched this video I just didn't get it. Yeah, I get that people want to use Spring in serverless, but when you watch it take 5 seconds to stand up a function, to me that is a deal breaker. It is too slow: if you are being charged by the second for your service, paying 5 seconds of startup to execute a function that runs in less than a second is too expensive.
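For context, the programming model Spring Cloud Function pushes is just plain java.util.function types; the framework's job is wiring them to transports. Stripped of the Spring wiring (in the real thing this would be registered as a @Bean inside a @SpringBootApplication class; the uppercase name here is my own illustration), the function itself is nothing more than:

```java
import java.util.function.Function;

public class UppercaseFunction {

    // The business logic is a plain java.util.function.Function. Spring Cloud
    // Function's adapters map such a function to HTTP, messaging, or a FaaS
    // provider; here we just return it as a value.
    public static Function<String, String> uppercase() {
        return value -> value.toUpperCase();
    }

    public static void main(String[] args) {
        // Invoking it directly, with no container at all.
        System.out.println(uppercase().apply("hello serverless"));
    }
}
```

This is exactly why the startup cost stings: the unit of work is tiny, so any fixed per-invocation overhead dominates.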

I was pretty much ready to write Spring Cloud Function off as a bad idea based on that, but then I saw something on Twitter about Dave Syer demoing something at Spring I/O this week that brings massively faster startup times. I suspect it has something to do with the spring-context-indexer that Juergen talks about in this video, but I have been unable to find any examples or documentation on how to use it.

The other thing I need to take a look at is the AWS SAM Local tool. The big pain point I had doing Lambdas a couple of years ago was debugging them. This looks like it may solve that problem for us by giving us a convenient way to hook debugging into our development lifecycle.

Java 10, already!

Java 9, we hardly knew ye, yet here we are: today was the GA release of Java 10. The short life of 9 is especially noticeable for those of us using Spring Boot, as we just got official Java 9 support a couple of weeks ago and now 10 is out. From my standpoint the big feature of the release is the root certificates. Java 9 shipped without root certificates if you used OpenJDK, which in my mind made it less usable, since you had to deal with that yourself; you couldn't just run it and go. That is what was keeping me on Oracle's releases of Java. In Java 10 there is no difference between the Oracle JDK and OpenJDK on this front, so I will probably just deploy my apps on OpenJDK in the future. That simplifies Linux installation, as now you can just run apt-get install and go. Before, I had to add the PPA for Oracle's JDK, install it, and then deal with the strong crypto policy files. All those old pain points are gone.

The next big change is parallel full GC support in G1GC. Java 9 made G1 the default collector, and now it has been enhanced so that if a full collection has to run, it will at least run in parallel and reduce the pause time for your application.

The feature all the developers are talking about is local variable type inference. So now you can write:

var list = new ArrayList<String>();

Instead of:

List<String> list = new ArrayList<>();

The thinking is that the name of the variable is the most important part, and this may make the code more readable. I am not completely sold on this change. I think in some circumstances it may make the code more readable, but in other cases it will be worse, so I am taking a wait-and-see approach to using this style.
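To make the trade-off concrete, here is a small sketch of where I think var helps and where it hides information (all the names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class VarReadability {

    // Helps: the type is obvious from the right-hand side anyway,
    // so `var` just removes repetition.
    static List<String> buildNames() {
        var names = new ArrayList<String>();
        names.add("alice");
        return names;
    }

    // Hurts: at a call site, `var groups = buildGroups()` would hide that
    // this is a Map of String to List. An explicit type documents the shape.
    static Map<String, List<String>> buildGroups() {
        var groups = new HashMap<String, List<String>>();
        groups.put("admins", buildNames());
        return groups;
    }

    public static void main(String[] args) {
        System.out.println(buildGroups());
    }
}
```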

The other small developer feature that looks good to me is the set of APIs for creating unmodifiable collections. I could see myself using that a lot, and it will be nice to no longer need Guava for some of these things. Instead of the horrible old:

List<String> list = new ArrayList<>();
list.add("String 1");
list.add("String 2");

return Collections.unmodifiableList(list);

and then making sure that you aren't leaking the original reference, you can just do:

var list = List.of("String1", "String2");

or if you have the list shown in the first example and you want to lock it down you can use:

return List.copyOf(list);

which I think reads better than:

return Collections.unmodifiableList(new ArrayList<>(list));

and it is obvious that it is making a copy, so you don't have to worry about leaking the original reference.
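A quick sketch putting the new factory methods through their paces (a toy example of my own, not from the JDK docs):

```java
import java.util.ArrayList;
import java.util.List;

public class UnmodifiableLists {

    // List.copyOf makes a true unmodifiable copy of the input.
    static List<String> lockDown(List<String> original) {
        return List.copyOf(original);
    }

    public static void main(String[] args) {
        // The factory method gives an unmodifiable list directly.
        List<String> fixed = List.of("String1", "String2");

        // Locking down an existing mutable list.
        List<String> mutable = new ArrayList<>();
        mutable.add("String 1");
        List<String> locked = lockDown(mutable);

        // Mutating the original does not leak into the copy.
        mutable.add("String 2");
        System.out.println(locked.size()); // prints 1

        // Any attempt to modify the result throws.
        try {
            fixed.add("nope");
        } catch (UnsupportedOperationException e) {
            System.out.println("unmodifiable, as advertised");
        }
    }
}
```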

Other than that, there is better support for the VM to realize it is running in a Docker container and only assume the resources assigned to the container instead of reading the host OS values. Otherwise there isn't a lot there. That being said, if you have managed to get Java 9 into production already, you need to upgrade right away, as 9 is no longer supported and you need 10 to make sure you have all the latest security fixes.

Finally I will leave you with this:

(image: Java version output)

The times they are a-changing…

Java 9 Upgrade

After upgrading my test app to Spring Boot 2.0 yesterday, I decided to see how difficult the Java 9 upgrade was from there. I am happy to report that it was fairly trivial. I updated my Maven pom to set the Java version to 9 and ran mvn clean install.

Immediately I saw some NoClassDefFoundError exceptions around javax.transaction.Transaction. Some quick Google searching suggested the problem was in the Maven Surefire plugin. I found a workaround that said to set the version to 2.20.1 and add a command line flag of --add-modules javax.transaction. After doing that I was seeing errors around java.xml.bind, so after some more searching I added a second flag, --add-modules java.xml.bind. This fixed the issue. In the course of doing so I found a link to the issue on Apache's website. Reading through the comments, I ended up with a final configuration of Surefire 2.21.0 with the following options:

<argLine>--add-modules java.xml.bind</argLine>

Once I dropped that configuration into my pom, everything built and ran correctly. So once you get your app to Spring Boot 2.0, the jump to Java 9 is pretty seamless. My follow-up change now will be to switch to the new factory methods on List and Set in the app to take full advantage of the features. Once Java 10 drops in a couple of weeks, I will take the app to 10 and see how that goes.
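For completeness, that argLine lives inside the Surefire plugin declaration, roughly like this (reconstructed from memory, so treat the structure as a sketch):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.21.0</version>
  <configuration>
    <!-- Make the java.xml.bind module visible to the test JVM on Java 9. -->
    <argLine>--add-modules java.xml.bind</argLine>
  </configuration>
</plugin>
```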

Upgrading to Java 9


Ever since Java 9 was released last fall, I have been wanting to upgrade our software at work to the new platform. I am not interested in the new module stuff; mostly I just want the convenience methods like List.of() and the platform improvements. G1 by default looks good, the new compact representation for strings to save memory looks like a huge win, and all the performance numbers I have seen show it to be a big win. Unfortunately the upgrade is not as straightforward as one would hope.

Step 1 – Spring Support

Even though Spring 5.0, which fully supports Java 9, was released back in September, Spring Boot 2.0 is not yet released. I believe it is at RC2 now and about to be released in the next week or so, but at this point we will be less than a month from Java 10 before we have proper Spring Boot support for Java 9. All of our microservices are currently running on Spring Boot 1.5.10 except for one, which is still on the 1.3.x line of Spring Boot, and that brings us to our second Spring issue to resolve. Some contractors who originally started that service did a hack to lazy load @Formulas in Hibernate. While this worked in Hibernate 4.x, in Hibernate 5 the hack no longer works, so we can't even update that service until we refactor those entities and move that data into views. That is still a work in progress. The other microservices could be moved to 2.0 when it comes out, but I think there is a consensus that we don't want to move to 2.0 until we have all our services in a place where we can do that.

Step 2 – Gradle

We are currently running Gradle 3.5.1. Going to Java 9 is going to require Gradle 4.2 (which is also a requirement for the Spring Boot 2.0 Gradle plugin). Normally this is no big deal: update the wrapper portion of the build file to the new version, run ./gradlew wrapper, and check in the new Gradle wrapper. But in this case we still have a developer running IntelliJ 2016.1 and another on 2016.2. They would have to upgrade to at least 2017.2 in order for the new Gradle to work with their IDE. So we will need to convince the whole team to update their IDE if they haven't already, ideally at that point to 2018.1.

Step 3 – FlywayDB

We are currently using Flyway 4.2.0. This didn't work with Java 9 when I tested it, so we need to go to 5.0.x. Additionally, Spring Boot 2.0 requires Flyway 5.0.x. Ideally we want to upgrade this ahead of Spring to minimize risk: bring the new Flyway into production, and then once we are satisfied that it doesn't break anything, upgrade Spring Boot.

Step 4 – Docker Containers

Once we have all that software in place, we will have to update our Docker containers to pull in the new JVM. It sounds like on Linux, the Java 9 OpenJDK ships with an empty certificate store, so that means finding an Oracle or Azul Docker container as our base. We then have to verify that New Relic works with Java 9 so we keep our container monitoring. After doing all that, we can update our base image to Java 9.

Step 5 – Update Jenkins

We will need to add a new JVM to Jenkins so that as different builds switch to the new Java 9 container, we also switch to compiling them with the Java 9 JDK (though still with the Java 8 compiler flags at this point).

Will the pain never end…

Finally, after doing everything listed above, we could change our compiler flags to 9 and start using the features. It is still a painful road to get there. At this point it feels like by the time we finish the work that needs to be done (as obviously we need to keep shipping features and bug fixes while we lay this foundation on the side), Java 11 will be out and we will be moving to that.

Even though Java 9 support is so close to landing in Spring Boot, I still feel like I am at least 6 months away from getting to use any of it. This feels much more painful than when I moved an app from Java 7 to 8. That time it was just a matter of coordinating a JBoss container upgrade from 6.0.1 to 6.4. After we got that into production, we updated the VM to Java 8 (still with everything compiling on 7), then upgraded our Jenkins machine to have a Java 8 compiler, and finally updated our Maven pom to use a source and target of 8. I wonder if any Spring shops will even make it to 9, or if everyone is holding out for 11 at this point (also to get the LTS release). While I really like the sound of the 6-month release cadence, actually getting the dependencies in place and lined up to do an upgrade will probably mean most enterprises just go from LTS release to LTS release, and I would guess that Java 9 and 10 see very little production use and mostly end up as things for developers to play with on their machines.

TIL: default hashCode() in Java

I came across this blog post today, which I thought was really good. It is a deep dive into the default hashCode() implementation in Java. To me the most amazing takeaway is that calling the default hashCode() on an object disables biased locking for it, so if instances of a class are going to be locked on by multiple threads, you really need to override hashCode(). All in all it is an interesting look into the guts of the JVM and worth a read: default hashCode

AWS Lambda or should I call them nano services?

Recently at work I did a project using Amazon AWS Lambda. It is a cool concept: Amazon calls it serverless computing, but really what it does is abstract away the server so that you can just focus on a small task that needs to run.

In this case we had a REST endpoint that just stores some data in a database. If we think about a traditional Spring Boot microservice, we would probably use Spring Data JPA, point it at a MySQL DB, and then have some REST controllers that talk to a service tier which persists the data. With Spring Boot this isn't much code, but you still have an embedded Tomcat server and a fair amount of ceremony for doing something very simple. After building the app you need to deploy it to an Elastic Beanstalk instance, or an EC2 nano instance, or something similar. That is a lot of DevOps overhead for a very simple task. With Lambda we can create a simple class that takes a POJO (Jackson-style). With Lambda you don't have Hibernate; you are just dealing with raw JDBC, but when you are inserting one row into a database you don't really need an object-relational mapper. You then use Amazon's API Gateway to route requests on an endpoint to the Lambda function, and you are good to go.
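The handler shape is tiny. The real class implements the RequestHandler interface from the aws-lambda-java-core library; to keep this sketch self-contained I have stubbed that interface and replaced the JDBC insert with an in-memory list, and CartEvent with its fields is made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class CartEventHandler {

    // Stand-in for com.amazonaws.services.lambda.runtime.RequestHandler<I, O>
    // (the real method also receives a Context argument).
    interface RequestHandler<I, O> {
        O handleRequest(I input);
    }

    // POJO that API Gateway's JSON body is deserialized into, Jackson-style.
    static class CartEvent {
        String email;
        String planId;
    }

    // In the real function this is a raw JDBC insert; a list stands in for
    // the database so the sketch runs anywhere.
    static final List<String> savedRows = new ArrayList<>();

    static final RequestHandler<CartEvent, String> handler = event -> {
        // e.g. INSERT INTO abandoned_cart (email, plan_id) VALUES (?, ?)
        savedRows.add(event.email + ":" + event.planId);
        return "OK";
    };

    public static void main(String[] args) {
        CartEvent event = new CartEvent();
        event.email = "user@example.com";
        event.planId = "plan-42";
        System.out.println(handler.handleRequest(event)); // prints OK
    }
}
```

That one small class really is the whole backend; everything else is wiring in the AWS console.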

That got me thinking: S3 now has the ability to serve a bucket as a website, so you could drop an Angular app into an S3 bucket, serve that up, and point it at API Gateway, which then hits a Lambda and talks to an RDS instance. If your site didn't have much traffic, it would be a super cheap way to host it, as Lambda is billed based on how much compute time you use, and in our case the task runs so fast I am not sure we will even break out of the free tier with all the traffic we get. You could run an entire site with no server instances provisioned, which is cool. In reality I think as the app grew it would be hard to manage as separate Lambda functions and you would benefit greatly from a proper framework like Spring, but for something small and light this seems super cool. The other neat thing about Lambda is the wide language support: I wrote my Lambda in Java, and my boss just wrote a Lambda in Node to do some logging. It is a super cool concept worth checking out.

MacOS Sierra Slowdown update

I have an update on my slowdown issues on Sierra. It appears the real problem lies in the AWS Java SDK. After talking to the Spring Boot people via GitHub, they were able to narrow it down to an Amazon issue. I opened an issue with Amazon on GitHub, and they responded that the version of the SDK that ships in the current Spring Cloud release has the issue, and that it has been fixed in a newer version of the SDK. One of the big value propositions to me of Spring Boot, and of the release train concept in Spring Cloud and Spring Data, is that they are collections of dependencies that have all been tested together, which lowers the risk of using them together. So I opened a request with Spring Cloud AWS to upgrade their SDK. Unfortunately they don't seem very timely in responding to issues; I notice there are no responses on any of the issues raised in the last 2 weeks.

In the meantime there is a workaround that doesn't involve manually bumping your AWS SDK version: set the following property in the properties file of your Spring Boot app:
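If memory serves, the flag in question was the Spring Cloud AWS instance-profile property; treat this as a best-effort reconstruction rather than gospel:

```properties
# Skip EC2 instance-profile credential lookup on a local dev machine
cloud.aws.credentials.instanceProfile=false
```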

Obviously setting this flag for an app running on AWS that uses IAM profiles isn't good, but it is a fine local workaround on your development machine until the SDK gets updated in Spring Cloud.

Spring Boot and Security using Spring Data JPA for authentication

Recently one of my friends was working on a Spring Boot project, and he was having trouble finding an example of how to configure user login for his site with Spring Boot using JPA. I mentioned that there is some discussion of configuring security in Greg Turnquist's book Learning Spring Boot. He had just purchased Spring Boot in Action and wasn't rushing to grab another book, but he hadn't been able to find a good online tutorial.

Today I was browsing Twitter and came across exactly what my friend was looking for: Priyadarshini B wrote a great tutorial on doing just that on her blog. The post was so good I wanted to share it. So head over there and check it out; you won't be disappointed, as it walks you through the whole process.

JavaMug Spring Boot Discussion

I attended JavaMug last Wednesday, as the speaker was Craig Walls, author of Spring Boot in Action. When I heard about the book I had planned on purchasing it, but was disappointed there was no Kindle version on Amazon. It does state that if you purchase the print edition they will give you the Kindle one for free, but I am trying to move away from paper books in general.

Overall the talk was pretty good. It is nice that there is a Pivotal employee local to the area, so we can get talks like this. For most of it Craig demonstrated examples of what you can do with Spring Boot, since the audience had varying degrees of experience with it. It was held at Improving in Addison, which I had never been to, but they had some nice beer on tap (Kentucky Bourbon Barrel Ale). In a talk that is introducing a concept to people, it is hard to get as deep a dive as I would like, but I did enjoy the part of the demo playing with the metrics. That is something I haven't really explored, and of course it got me immediately thinking about how much I would like to use it at work. I think maybe this year I will attempt to convert our legacy app to Spring Boot. It will be painful, but more and more it seems like the benefits are so good that it is what we should be doing. Hopefully I can find the time at work.

Then at the end of the talk they did a raffle, and lo and behold I won the digital copy of Spring Boot in Action. I was pretty stoked about that, given it was what I wanted to begin with. When I get a chance to read it I will post a review up here.

Field injection is not evil

I am a big fan of the work Oliver Gierke has done on Spring Data. It is a framework I use daily at work, and it really is amazing. A while back Greg Turnquist posted something on Twitter against field injection. I asked what was wrong with field injection, and he pointed me to this post that Oliver had written about it.

This post is my attempt to argue against Oliver's position on the subject. It may wind up an epic failure, but if nothing else I figured it would help me think through my thoughts and assumptions on the issue and see if I can make an argument in favor of field injection. First head over to Oliver's post on the topic and read it before you continue.

Argument 1: "It's a null pointer begging to happen"

In Oliver's example he injects MyCollaborator into MyComponent with the @Inject annotation that you would use in Spring or in CDI. Then his argument is that a person would instantiate MyComponent outside of the container with MyComponent component = new MyComponent(), and any call that touches the collaborator would then blow up with a NullPointerException.

My first thought is: you are dealing with a container-managed component if it has @Inject in it, so if someone is standing up the component outside of the container, to me that already says they are doing something wrong. Given that, I don't find this a compelling argument. If you are using the component in a framework as it is designed to be used, the framework will tell you if it can't wire up the component due to missing dependencies, whether your container is Spring or Java EE.

I suppose one could imagine designing a component that could be used in Spring, in CDI, or standalone without a framework, and Oliver's constructor example could still work (assuming you don't blow up on the missing @Inject annotation, which I think could happen). So in my opinion this argument isn't really valid or a big concern; if someone is misusing your component, I am not too worried about them getting a NullPointerException in that scenario.

Let's consider his benefits of constructor injection. The first is that you can only create the component by providing all its dependencies, which forces the user of the component to provide everything. While I consider this a valid argument, I think the point is somewhat moot, since we are talking about designing a component that already has a lifecycle and whose dependencies a framework will inject.

His second benefit is that you communicate mandatory requirements publicly. I have to admit I find this his most compelling argument, and I don't have a counter to it.

His third argument is that you can then make those fields final. I do like making everything final, so that is also compelling to me, but not enough to outweigh the negatives.
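To ground those last two points, here is the contrast in miniature; the collaborator echoes Oliver's example, while the class bodies are my own stand-ins (with the container annotations shown as comments so the sketch runs anywhere):

```java
public class InjectionStyles {

    static class MyCollaborator {
        String work() { return "done"; }
    }

    // Field injection: the container reflects the dependency in, so the
    // field cannot be final and nothing forces a caller to supply it.
    static class FieldInjected {
        /* @Inject */ MyCollaborator collaborator;
    }

    // Constructor injection: the dependency is a public, mandatory
    // requirement, and the field can be final.
    static class ConstructorInjected {
        private final MyCollaborator collaborator;

        /* @Inject */ ConstructorInjected(MyCollaborator collaborator) {
            this.collaborator = collaborator;
        }

        String run() { return collaborator.work(); }
    }

    public static void main(String[] args) {
        // Outside a container the field-injected dependency is simply null...
        System.out.println(new FieldInjected().collaborator); // prints null
        // ...while the constructor version cannot even be built without it.
        System.out.println(new ConstructorInjected(new MyCollaborator()).run());
    }
}
```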

Let’s consider the argument that he tries to dispel:

An often faced argument I get is: “Constructors just get too verbose if I have 6 or 7 dependencies. With fields only, this is fine”. Awesome, you’ve effectively worked around a clear indicator that the code you write is doing way too much. An increase in the number of dependencies a type has should hurt, as it makes you think about whether you should split up the component into multiple ones.

I don't buy his counter-argument here. In the project I am working on, we have a bunch of service-level business logic components. We have a thin service layer that we expose to the presentation tier, and we do all the heavy lifting of our business logic in these hidden service-level components. Given the high volume of business logic code, we compose the different operations out of injected, reusable business logic components. This is also a huge benefit when working with a large team: you get fewer merge conflicts because the code is spread across more files, and when you are onboarding new employees, a lot of small business logic classes are easier to digest than a few massive classes with all the business logic in them.

Splitting those components up is what we have already done, which is why we inject them all over the place to reuse those business methods. I think when dealing with a non-trivial, extremely large app, breaking things into fine-grained components that are reusable in multiple places actually leads to needing to inject more fields, not fewer.

Argument 2: Testability

In this case Oliver argues that you have to do reflection magic in unit tests if you use field-level injection. To which I respond: if you are using a dependency injection framework, why wouldn't you use a dependency-injection-aware unit test framework? And of course we do; the answer is Mockito. With Mockito you structure your unit test like you would structure your component. So to test my service I might have something like below:

import static org.junit.Assert.assertTrue;
import static org.mockito.Mockito.when;

@RunWith(MockitoJUnitRunner.class)
public class MyServiceTest {

    @InjectMocks private final MyService service = new MyService();

    @Mock private Component requiredComponent;

    @Test
    public void myTest() {
        when(requiredComponent.methodCall()).thenReturn(new MockedResponse());

        final boolean result = service.methodIAmTesting();

        assertTrue(result);
    }
}

Now I have a unit test that is structured largely like my service. The testing framework injects the mocked dependencies into the service under test, and we wire up the results on the mocks so we can exercise the service. This strikes me as very testable, since I am writing tests that look a lot like the services themselves.

So that is my argument against. Basically, I think the number of constructor arguments does get extremely unreasonable in many real-world scenarios where you have a lot of finely defined business logic components that can be composed in many different ways to solve the problem. You end up with a lot more boilerplate, and even if you use Lombok to hide it, you then have ugly and somewhat cryptic annotations telling Lombok to put the @Inject annotation on the constructor. I think if you are running a dependency injection framework, it isn't reasonable for people to instantiate those objects outside of that framework, and likewise it is very reasonable to expect your unit test framework to also be dependency-injection driven.

Let me know if I have persuaded anyone to my point of view, or if you think I am completely wrong and Oliver is right about this. I would love to hear someone explain why my arguments are wrong and Oliver has been right all along, or, if there is a third even better way, what that way is.