Internet of Things coding workshop

The Event

On Thursday afternoon I attended a free Internet of Things Coding Workshop put on by AT&T and Texas Instruments at AT&T's M2M/CD Foundry in Plano, TX. While this type of programming isn't really relevant to my day job, I am interested in the topic, so I asked my boss if it was okay if I attended and he was fine with it. The hosts were kind enough to provide a free lunch, and I think there were about 50 attendees. This was the 3rd session they had done, and it sounds like the others were even busier. They also had a 4th session on the topic that evening.

I sat down, ate my lunch, and then started looking at the materials provided. They gave us a Texas Instruments MSP432 LaunchPad development kit and a TI SimpleLink Wi-Fi CC3100 Module BoosterPack to plug into it. There was also a USB stick which contained drivers for the hardware and an IDE called Energia. The drivers were available for both Windows and Mac, so I used the MacBook Pro that work provides me for the class. The driver installation was easy on the Mac, but Energia required me to load a legacy Java 6 environment. TI should really update it to run on the latest Java 8 VM on the Mac; that would be my only real complaint about the setup. The USB drive also had the lab book in PDF form (we received a paper copy as well), along with the presentations given in the session.

The toys:

The Texas Instruments hardware provided in the class

After I finished installing the software I started on Lab 1. The first thing we had to do was create an AT&T M2X account. M2X is AT&T's IoT service; it is what they are selling, hence the reason for sponsoring the event. It seems to be a great service for collecting time-series sensor data and doing things with that data. One of the things I most liked about the service is that everything is a REST call, and they have great documentation for what you can do with it. For the first lab we just created a virtual device on M2X and hit the endpoint with different REST calls to simulate data coming in. We set up a stream on the device that tracked speed in miles per hour; you would send different speeds into it via HTTP POST and get real-time updates on a graph. It was a nice introduction to working with M2X.
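For reference, one of those simulated speed updates looks roughly like this. It's a sketch in Python rather than the raw HTTP we used in the lab; the URL layout and the `X-M2X-KEY` header follow the M2X v2 REST API as I remember it from the docs, and the device ID, stream name, and key are placeholders:

```python
import json

# Sketch of the Lab 1 call: pushing one value to a device stream.
# "abc123", "speed", and "my-api-key" below are placeholders for your
# own M2X device ID, stream name, and API key.
API_BASE = "https://api-m2x.att.com/v2"

def build_stream_post(device_id, stream, value, api_key):
    """Build the URL, headers, and JSON body for posting one stream value."""
    url = f"{API_BASE}/devices/{device_id}/streams/{stream}/value"
    headers = {"X-M2X-KEY": api_key, "Content-Type": "application/json"}
    body = json.dumps({"value": value})
    return url, headers, body

url, headers, body = build_stream_post("abc123", "speed", 65, "my-api-key")
# Send it with any HTTP client, e.g. requests.post(url, headers=headers, data=body)
print(url)
```

Each call like this adds one point to the stream, which is what drives the real-time graph on the M2X dashboard.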

The second lab had us using our TI LaunchPad hardware with the M2X service. The goal here was to create an M2X stream for push-button data. We were going to simulate a sensor on a refrigerator door that would keep track of how long the door was open. The first thing we did was add a new stream which tracked seconds. Then they gave us the skeleton of a LaunchPad app for our board. We had to edit the Wi-Fi settings in the code, as well as add our device ID, stream name, and our API key from M2X. Then we added the main logic loop to the program. We would track the button reads and the state of whether it was pushed or not. Then we would subtract the time it was pushed from the time it was released and use the M2X client to send the stream value.

In Energia you are doing C programming, but the great thing about it is they have abstracted away the hardware, so in this case you are just calling a digitalRead function on a push button and it pulls in the value for you as an int. So you aren't dealing with the low-level networking to get the data out, nor are you dealing with the hardware, as they wrap it all for you in a library. In the Energia environment you select which board you are targeting and it compiles for the hardware that you have. This really takes the pain out of doing hardware-level programming, and I feel it creates an environment that is very accessible to anyone with a moderate amount of programming experience. In fact there was one father-son group in the class, and I do feel this would be accessible to children who had done some computer programming. Once we finished, we uploaded our code to the board and ran it. I was unable to get the serial line monitoring of my board working on the Mac, but the code was working: I would press and hold the button and get real-time graphs of how long the button was held.
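The lab code itself is C in Energia, but the main loop logic is simple enough to sketch in Python. Everything here is a stand-in: `digital_read` plays the role of Energia's digitalRead on the button pin, `send_value` plays the role of the M2X client call, and I'm assuming the button reads HIGH while pressed:

```python
import time

# Python sketch of the Lab 2 logic loop (the real lab is C in Energia).
HIGH, LOW = 1, 0

def run_once(digital_read, send_value, clock=time.monotonic):
    """Wait for one press/release cycle, send and return the hold time in seconds."""
    pressed_at = None
    while True:
        state = digital_read()
        if state == HIGH and pressed_at is None:
            pressed_at = clock()           # button just went down: note the time
        elif state == LOW and pressed_at is not None:
            held = clock() - pressed_at    # released: press duration in seconds
            send_value(held)               # push the value to the M2X stream
            return held

# Simulate a press held for 3 seconds with canned readings and a fake clock.
readings = iter([LOW, HIGH, HIGH, LOW])
times = iter([0.0, 3.0])
sent = []
held = run_once(lambda: next(readings), sent.append, clock=lambda: next(times))
print(held)  # 3.0
```

On the board, the same loop runs forever and the readings come from the physical pin instead of a canned list.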

My computer and board setup:

My Hardware Setup

On to Lab 3. For Lab 3 we learned about triggers in M2X. Basically you can watch your data stream, and when certain events happen it will fire a trigger for that event. For the first part we just pointed the trigger at a URL on RequestBin. After we saw the trigger work, we tried out the If This Then That service, better known as IFTTT. AT&T has a tutorial on using that service with M2X. We used it to send an SMS when our trigger condition was met. The great thing about triggers in general is that they are just a callback URL, so you can do almost anything with them.
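Since a trigger is just an HTTP callback, the receiving end can be anything that parses the POSTed JSON and acts on it. Here is a minimal Python sketch; the payload shape and the threshold are hypothetical simplifications (M2X documents the exact JSON it sends to your callback URL):

```python
import json

# Sketch of a trigger callback receiver. The {"value": ...} payload shape
# and the threshold of 30 are assumptions for illustration only.
def handle_trigger(raw_body, threshold=30):
    """Decide whether to act (e.g. fire an IFTTT SMS) on a trigger event."""
    event = json.loads(raw_body)
    value = event["value"]  # assumed field: the stream value that fired the trigger
    if value > threshold:
        return f"ALERT: stream hit {value}"
    return None

print(handle_trigger('{"value": 45}'))  # ALERT: stream hit 45
```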

Lab 4 used AT&T's Flow Designer to wire services together. Flow is basically a graphical way, in your web browser, to connect different services by dragging components onto a canvas and wiring them up. Your input can be something like a REST endpoint, a RabbitMQ message, or any number of options. You can then transform and apply different operations to the data that comes in and send it out somewhere else. In the first example we wired in a message that contained a Unix timestamp. We then transformed the date using some JavaScript operations on the payload to print out a nicely formatted date-time string. One of the interesting things about Flow is that it is AT&T's Platform as a Service (PaaS) offering, so they will spin up a Docker container with your service that you can test out on the spot. Flow Designer also lets you import and export flows as JSON objects, which is a nice way for people to share flows for different services. It is also built around a GitHub model, so you can build public flows or fork other users' flows. So there is an opportunity for people to design complex services here and share them.
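The transform we built in Flow was JavaScript, but the idea translates directly; here is the same unix-timestamp-to-string step in Python (the "timestamp" field name is a made-up stand-in for wherever the value sits in your payload):

```python
from datetime import datetime, timezone

# Sketch of the Lab 4 transform: format a unix timestamp from an incoming
# payload as a readable UTC string. "timestamp" is an assumed field name.
def format_payload_time(payload):
    ts = payload["timestamp"]
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S UTC")

print(format_payload_time({"timestamp": 1462406400}))  # 2016-05-05 00:00:00 UTC
```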

Lab 5 basically tied everything together. We went back to M2X and wired that trigger into a flow from Flow Designer. The end result was that we pushed the button on our TI board, which sent a message to M2X, which then triggered a flow in Flow Designer that operated on the incoming data.

All in all this was a great coding workshop. Ideally I wish there had been at least one more lab doing something more advanced with our TI boards, as for me that was probably the most interesting part. It is interesting to see AT&T position themselves as the network for IoT, and I think they will do well in the space with their offering. They have some very developer-friendly services to start messing around with. Their M2X data store reminds me of what you can do with Cassandra and time-series data, but in this case they take care of all the work of building and designing that data store and give you great real-time data out of the box. The other nice offering AT&T had was connectivity: if you developed an IoT app and needed connectivity, they would give us 3 small-form-factor SIMs, each with 10 MB of data, free for 3 months, so they are serious about wanting to be the network behind your service.

I finished Lab 5 at 3pm, when the rest of the class was still on Lab 3, and was about to head out when the guy from TI got up and gave their hardware presentation. It was very interesting, and they have tons of stuff you can plug into the boards for all sorts of capabilities, priced from $10-$30 per module. They also have this little BeagleBone board that competes with the Raspberry Pi. Like Arduino, BeagleBone is also open-source hardware. I had considered playing around with a Raspberry Pi for a while, but I haven't gotten around to getting one, so it was interesting to hear about a similar competing board. At the end of the presentation they did a drawing, and boy am I glad I didn't head back to work right at 3pm, as I won the raffle. I had the choice of either an LCD panel for my TI board, a sensor pack, or the BeagleBone board. I decided I wanted something else to plug into the LaunchPad, so I was torn between the LCD and the sensors.
I probably should have taken the sensors just to do more with the board, but I took the LCD as I liked the idea of being able to get some feedback without the serial line. The screen is the 430BOOST-SHARP96. After winning that, and with the other hardware, it was an amazing 3-hour-and-15-minute workshop for me, and if you see something like this in your area I strongly recommend attending; it is a blast. For more information on the types of things people can do with this kind of hardware, check out this TED Talk:


The downside of automatic updates

I have sort of taken for granted how easy all the updates are for a WordPress site. If there is a security update, WordPress just goes ahead and updates itself without me doing anything, and the plugin community is so active that there are frequent updates to all of the plugins I use. Because it always just works, I don't really hesitate to run any of the updates when I see them on the site.

Obviously what happened next is no surprise: yesterday I saw there was an update to 2 of my plugins, so I just clicked upgrade, and next thing you know, clicking on any admin page gave me a blank white page. My first question was: is my entire site down? So I checked, and nope, I could see all the content fine; I just couldn't get into the admin.

They say every crisis is an opportunity, and this sort of proves that point. I figured it was no big deal and I could deal with it later, and given that I don't have SSH keys for this site on my work computer anyway, I decided I would handle it last night. Last night I did some googling and found out how to deal with a bad plugin install (basically rename your plugins directory and hit the plugin page, which will disable all the plugins). I then renamed it back and updated the probably offending plugin, as it already had a new version (either the update I ran during the day crashed partway through, or it still needed the update), and then activated my plugins one by one, and life is good.

So I am back up and running, and the whole thing ended up being a good learning opportunity for me. It did, however, prevent me from working on what I wanted to write about last night, which was the Internet of Things Coding Workshop that I attended, but that will come next.

Spring Boot

Gotta love open source and GitHub

The Problem

I have been working on a project to make our app run in any Java container. Currently we run in JBoss, but ideally I would like the app to work in JBoss, Tomcat, TomEE, or WildFly. One of the challenges in making this change is removing JBoss-specific dependencies from our app and pulling those libs into the webapp as part of our project. We did the first piece of this a couple of years ago when we stopped using JBoss' version of Hibernate and pulled a newer version into our app. We have since upgraded JBoss versions, so this is somewhat moot since the bundled version and our version are now the same, but it is one less thing I will have to deal with as part of this project.

The first part I have decided to tackle is not using JBoss' built-in transaction manager and instead bundling one. Looking around, it seems like my choices are Atomikos, Bitronix, or JBossTM in standalone mode. Our situation is somewhat more complicated by the fact that we are not running an XA JDBC driver. Originally the company was using one, but at some point ripped it out because Microsoft SQL Server wouldn't work correctly or scale with it in there. We are aware of the limitations of not running in XA mode.


That alone rules out Bitronix, as they pretty much require XA to do their thing. I had also sort of ruled them out given that the commercial company behind them ceased operating some time ago, and ideally we want to be in a position to pay a company for support. I will say one nice thing about Bitronix: it does have good Spring Boot support.


Atomikos does have a paid supported version if we find we need it down the line, and they have support for a non-XA datasource. Another thing Atomikos has going for it is that we are already using it for our persistence unit tests, though not with the NonXADataSourceBean but rather with Apache Commons DBCP. I switched over to the NonXADataSourceBean version and most of the unit tests seemed to work, which was promising.
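For what it's worth, wiring the non-XA datasource in Spring looks roughly like this. This is a sketch based on the Atomikos docs as I remember them, not our actual config; the resource name, driver, URL, and credentials are placeholders:

```xml
<!-- Hypothetical Spring XML wiring; property names per the Atomikos docs. -->
<bean id="dataSource" class="com.atomikos.jdbc.nonxa.AtomikosNonXADataSourceBean">
    <property name="uniqueResourceName" value="mssql"/>
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
    <property name="url" value="jdbc:sqlserver://dbhost;databaseName=mydb"/>
    <property name="user" value="appuser"/>
    <property name="password" value="secret"/>
</bean>
```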

JBossTM aka Narayana

At the end of the day, though, my thinking was that the lowest-risk solution would be to use JBossTM in standalone mode. Given that we are already using JBoss' built-in version of it, this seems to be the least amount of change. The first challenge is actually finding anything out about the project: just searching for JBossTM or JBoss Transactions doesn't find much. I eventually discovered it is called Narayana. Ignore the horribly designed website; I just found that tonight, having originally found the project on the JBoss home pages and on GitHub. JBoss also has a quickstarts example repository on GitHub showing you how to use their stuff. The Spring example was sort of what I wanted, but not exactly, so I figured why not just do a quick Spring Boot app as a proof of concept. So I forked the repository and added a Spring Boot app. I hit a problem though: I created a unit test to try out rollbacks, and my insert wasn't rolling back on a RuntimeException. Just to make sure I wasn't completely crazy, I created a quick Spring Boot app doing the same thing using Atomikos. That test app worked perfectly.

At this point I feared I would be forced into switching to Atomikos, as that was all I could prove was working with a trivial test. I went over to the JBoss forums as a last-ditch effort and posted a question about it. Back in the day when I would post things to forums, I would often get very little response, or things would be vague, because it is sometimes hard on the other side of a forum to tell exactly what a person is asking. In this case I could just include the links to the two GitHub repositories as examples of what I was trying to do. I posted it and didn't hear anything back that day. The next morning I woke up and saw that at 3am a Red Hat engineer in China had forked my project and provided a fix. Immediately I could see what I was missing, and he made my test pass right away. Back in the day I probably would have given up and just gone with Atomikos, but with GitHub it is so easy to share sample code of what you are trying to do, and Spring Boot makes it so fast to stand up a sample app, that there is really no reason not to do so anymore.
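The behavior my failing test was checking for is easy to show in miniature. This is not the JTA/Spring setup, just an analogous Python/sqlite3 sketch of rollback-on-exception, which is exactly what my Narayana app wasn't doing:

```python
import sqlite3

# Analogous sketch (sqlite3, not JTA/Spring): an insert made inside a
# transaction should disappear when the transaction aborts on an exception.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")
conn.commit()

try:
    with conn:  # transaction scope: commits on success, rolls back on exception
        conn.execute("INSERT INTO accounts VALUES (1, 100)")
        raise RuntimeError("simulated failure")
except RuntimeError:
    pass

rows = conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(rows)  # 0 — the insert was rolled back
```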

One more note: while I was working on this experiment in configuring different JTA managers, the Narayana guys submitted code to Spring Boot 1.4 Milestone 2 to include a Narayana quickstarter, so it will now be possible to easily wire that into a Spring Boot app. I know I talk about the greatness of Spring Boot all the time, but once again it saved me a ton of time by allowing me to prototype a few options before going too far down the road of solving the problem. With GitHub allowing me to collaborate with people around the world, we truly live in some amazing times.


SSL Certificates and Google Domains

Recently I moved my domain hosting from GoDaddy to Google Domains. My main reason for doing so was to save money. Domain names on GoDaddy cost $3 more per year, plus they charge you for privacy on whois searches, whereas Google includes that for free. It was a fairly easy process to transfer my domain names in, but configuring the DNS was a little weird as their zone file editing interface is different from GoDaddy's. However, I thought I had it all working, so I was happy with my setup.

Then last night one of my friends mentioned that he had just renewed his SSL certificate. That got me thinking: I only had about 2 weeks left on my certificate and needed to renew as well. I had mentioned previously that I switched my SSL certificates to Let's Encrypt. The great thing about Let's Encrypt is that it is free and, once you get it set up, less hassle to renew than the other free certificate sites I had used in the past. The drawback seems to be that they only issue certificates that are good for 90 days. I updated my Let's Encrypt software (it is based out of a GitHub repository, so there is normally a new version when you need to renew). When I ran the renew command it failed on the www hostname. That is when I realized that I had misconfigured my CNAME record in the Google Domains DNS settings: www was completely broken and only the bare domain worked. It took me a little while to figure out. On GoDaddy I think you would create a CNAME for www and point it to @ or *, I don't recall which they used. Google doesn't let you point a CNAME to @. After some googling I found out that on their DNS setup you have to point to the hostname that is registered at @. So it ends up being a CNAME for www that points at the registered domain name itself. Once I got that taken care of, my certificate renewed without any issue.
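For anyone hitting the same thing, the working records end up looking something like this (example.com and the IP are placeholders for your own domain and address):

```
@    A      203.0.113.10     ; apex record for the registered domain
www  CNAME  example.com.     ; Google Domains: point www at the hostname itself, not @
```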

While I was in there messing around, I decided I would disable TLS 1.0. Doing so means dropping support for a ton of browsers, including IE10. But it is widely considered the next protocol to be broken, at this point pretty much everything supports TLS 1.2, and I expect the handful of readers I have to be running current browsers (whether on their phones or computers). I reran the Qualys SSL Test to make sure that I hadn't broken anything. All looked well, with a higher score now on the protocol section and, as expected, more of the test browsers failing.

In the course of running that test I noticed the HSTS preloading check that they do now. I didn't even realize such a thing existed. I did some research, added the preload directive to the HSTS header on my server, and submitted my site to the preload list for Chrome. We shall see if that works, or if I meet all the requirements, but I think I do. While I was editing my headers I noticed that I was doing the domain name rewriting wrong if the person came in over https on the www hostname. The code was working if they either came in without the www, or came in on www without https. So it ended up being a useful night, as I found 2 issues in my server config that I was able to fix. In the course of writing this post I realized I should add the other CNAMEs I have registered for this domain to my SSL certificate, so that will be the next thing on my agenda.
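For reference, here is what the preload-eligible header looks like in nginx syntax (an assumption for illustration; adjust for whatever server you run). The preload list requires a max-age of at least a year, includeSubDomains, and the preload token:

```nginx
# HSTS header meeting the Chrome preload list requirements (nginx example)
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
```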