In the past when I was a Java developer we would run Sonar on our projects for static analysis. I have always liked the dashboard view it provides and the way it can find all sorts of problems in a code base that are often overlooked. When I learned that Sonar supported Go I knew that I would eventually integrate it into our environment. Since I had already built out our continuous integration pipeline in Bitbucket, I figured it would be easy to integrate Sonar into our builds. Little did I know that there wasn’t much documentation out there on the internet showing how to do so.
I knew I needed to run the
sonar-scanner-cli against my project, but the only example I could find was a helpful blog post. I started out with that approach but had an immediate problem. Before I integrated the sonar step into my pipeline, my builds took about 4 minutes from the time a feature was merged into master until the software was running on Kubernetes in the cloud. After I followed that blog post, my builds were taking an extra 3.5 minutes. Given that Bitbucket bills by build minutes, and given the amount of code we ship, this was going to be unacceptable for us: we would chew through too much build time.
Setting up your pipeline
Before we go any further, though, let's set up a build pipeline for Go. This post assumes you are using Go modules, as that makes the pipeline much simpler. My initial pipelines were built before I modularized my code, and they required more setup. Given that modules are the default as of Go 1.14, I am not going to document the old approach.
Start at the top of your pipeline file and define it like so:
We will use the golang Docker image as the base container for our pipeline. We will then define caches for our Go module dependencies and our
.sonar directory. This speeds up subsequent builds, and as mentioned above, in Bitbucket Pipelines time is money.
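The top of the bitbucket-pipelines.yml might look something like this. This is a minimal sketch: the Go version and the cache names (`gomod`, `sonar`) are assumptions, so adjust them to your setup.

```yaml
# bitbucket-pipelines.yml (top of file) -- sketch, adjust to taste
image: golang:1.14            # base container used by the build steps

definitions:
  caches:
    gomod: /go/pkg/mod        # Go module download cache
    sonar: ~/.sonar           # sonar-scanner cache
```

The custom caches declared here are what the individual steps reference by name so that downloaded dependencies survive between runs.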
Once we have that defined, we declare our build steps. Let's look at the two steps that matter for our objective here:
- step: &build
    name: Build the app
    caches:
      - gomod
    script:
      - git config --global url."git@bitbucket.org:".insteadOf "https://bitbucket.org/"
      - go build ./...
      - go test -cover --coverprofile=coverage.out ./...
      - go vet ./...
    artifacts:
      - coverage.out
- step: &sonar
    name: Sonar code analysis
    image: sonarsource/sonar-scanner-cli
    caches:
      - sonar
    script:
      - export SONAR_LOGIN=$SONAR_API_TOKEN
      - export SONAR_PROJECT_BASE_DIR=.
      - /opt/sonar-scanner/bin/sonar-scanner -Dsonar.login=$SONAR_API_TOKEN
The first thing we do is a bit of git configuration. If you use libraries hosted in a private Bitbucket repository,
go get will fail to authenticate by default. This setting tells git that any requests to bitbucket.org should go over SSH instead of HTTPS. You will then need to have SSH keys configured for your pipeline.
The next thing we do is build our code and run the unit tests. We use Go's built-in code coverage and write the coverage to a file. We also run
go vet, which is great for finding issues in your code. At the end of the build step we save the coverage file as an artifact for use in later steps, and we also save the Go module cache so that the dependencies downloaded for your build don't need to be re-downloaded on every run.
The second step above is the sonar step. Since downloading the entire CLI tool each time was too slow, I figured I would start with the Docker image published by SonarSource. This loads much faster than copying the tool down. Once that image starts up, we set some environment variables and then invoke the scanner. This setup assumes that you have created an API token for your sonar instance and that you have configured it as a pipeline variable in Bitbucket.
There is just one thing missing now, and that is our sonar config. In the root of your project, create a
sonar-project.properties file to configure your settings. Mine looks similar to this:
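A sketch of such a properties file is below. The project key, name, and host URL are placeholders you would replace with your own values; the property names themselves are standard sonar-scanner settings, with sonar.go.coverage.reportPaths pointing at the coverage file produced in the build step.

```properties
# sonar-project.properties -- placeholder values, adjust to your project
sonar.projectKey=my-go-app
sonar.projectName=My Go App
sonar.host.url=https://sonar.example.com

# what to scan and what to skip
sonar.sources=.
sonar.exclusions=**/*_test.go,**/vendor/**
sonar.tests=.
sonar.test.inclusions=**/*_test.go

# coverage file written by `go test --coverprofile=coverage.out`
sonar.go.coverage.reportPaths=coverage.out
```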
Here we configure a project name and key and point to the URL of our sonar instance. Then we configure which sources to scan, which to exclude, and which tests to include, and we point Sonar at our code coverage file. With that in place, all that remains is to declare when our steps run. We have our pipeline run on master as shown below:
pipelines:
  default:
    - step: *build
  branches:
    master:
      - step: *build
      - step: *sonar
This says that whenever anything happens (someone pushes a commit to any branch), we run our build step. We then set up a branch-specific rule: whenever anyone merges code into master, we build the code, run the sonar analysis, and publish the results to our sonar server.
That is all you need to do to get a very basic pipeline up and running on bitbucket which will build your Go app and run sonar against all new features merged into your master branch. I hope this helps and saves you a bunch of time as it took me a while to figure out how to get it running well.