Maven

Maven user starting with package management in JavaScript

Gerbrand van Dieijen

I wanted to get started with JavaScript and AngularJS, a framework for creating frontends for apps - i.e. human user interfaces. The reason: software is eating the world, but JavaScript is eating all software.

I don't like the messy JavaScript approach of downloading .js files and storing them manually in your project directory, or worse, copy-pasting snippets. I'm used to programming Java, using Maven to manage my Java dependencies and brew (Mac) or apt-get (Ubuntu) to manage platform-specific dependencies. In this post I'll write down my experiences getting started with JavaScript development, with practical use of package managers.

Read more

Speeding up Require.js optimization in Maven

Sannie Kwakman

In the last couple of web applications we've been working on, we used the Require.js library to modularize our JavaScript code. This allows developers to separate large amounts of JavaScript code into smaller modules, which makes the code a lot easier to read and maintain.

Require.js also provides an optimizer script which compiles and compresses these modules into one file (or a few) for efficient delivery to the end user. Sadly, when running the optimizer as part of a Maven build, this process can take several minutes to complete. In this article I'll explain how to speed things up a bit.

The Rhino in the room

We use Require.js' optimizer as part of our application packaging and release cycle, which is currently powered by Maven. To run the optimizer with Maven we use the requirejs-maven-plugin. As the optimizer is written in JavaScript and Maven runs on the JVM, this plugin runs the optimizer script using the JRE's built-in Rhino JavaScript engine. Read more
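As a rough illustration of that setup, the sketch below wires the optimizer into a Maven build. The plugin version, configuration keys and the build-profile path are quoted from memory and chosen for illustration, so treat this as a starting point rather than the exact configuration from the full post.

```xml
<!-- Hypothetical pom.xml fragment: run the r.js optimizer during the build.
     Version number and build-profile location are illustrative only. -->
<plugin>
  <groupId>com.github.mcheely</groupId>
  <artifactId>requirejs-maven-plugin</artifactId>
  <version>2.0.0</version>
  <executions>
    <execution>
      <phase>prepare-package</phase>
      <goals>
        <goal>optimize</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <!-- points at a standard r.js build profile, e.g. src/main/config/build.js -->
    <configFile>${basedir}/src/main/config/build.js</configFile>
  </configuration>
</plugin>
```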

How to integrate FitNesse tests into Jenkins

Marcus Martina

In an ideal continuous integration pipeline, different levels of testing are involved. Individual software modules are typically validated through unit tests, whereas aggregates of software modules are validated through integration tests. When a continuous integration build tool like Jenkins is used, it is natural to define different build steps, each returning feedback and generating test reports and trend charts for a specific level of testing.

FitNesse is a lightweight testing framework that is meant to implement integration testing in a highly collaborative way, which makes it very suitable for agile software projects. With Jenkins and Maven it is quite easy to trigger the execution of FitNesse integration tests automatically. When properly configured and bootstrapped, Jenkins can treat the FitNesse test results in much the same way as it treats regular JUnit test results. Read more
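The full post explains the actual wiring; purely as an illustration of what "triggering FitNesse from Maven" can look like, here is a minimal sketch that launches the standalone FitNesse jar with a single suite command during the integration-test phase. The suite name, jar location and plugin version are assumptions, and the post itself may well use a dedicated FitNesse plugin instead.

```xml
<!-- Hypothetical pom.xml fragment: run a FitNesse suite from the build.
     Assumes the fitnesse-standalone jar and the wiki pages (FitNesseRoot)
     are available in the locations shown. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.2.1</version>
  <executions>
    <execution>
      <id>run-fitnesse-suite</id>
      <phase>integration-test</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>java</executable>
        <arguments>
          <argument>-jar</argument>
          <argument>${project.build.directory}/fitnesse-standalone.jar</argument>
          <!-- -c executes a single wiki command and exits;
               format=xml yields machine-readable results for Jenkins to pick up -->
          <argument>-c</argument>
          <argument>MyAcceptanceTests?suite&amp;format=xml</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
```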

Continuous releasing of Maven artifacts

Marcus Martina

Conceptually, Maven distinguishes between snapshot versions and release versions. For top-level Maven projects that are continuously integrated, it is unnatural to make this distinction. Especially when some form of continuous deployment is in place, it doesn't make much sense to create and deploy artifacts with a snapshot version at all. Snapshot versions cannot be linked to a unique revision in a version control system and can have ambiguous snapshot dependencies. That is why such artifacts shouldn't get deployed to a live environment.

Although it is possible to embed version control revision information as metadata into artifacts, for instance via the Manifest file, it is recommended to avoid snapshot versions altogether. Instead, only unique release versions should be continuously created and deployed. In fact, every single revision can be considered a release, as will be shown below.

In order to implement this concept of continuous releasing, the revision number needs to be made part of the artifact version. Read more
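A minimal sketch of that idea is shown below; the property name, version scheme and CI variable are assumptions for illustration, not necessarily what the full post uses. Note that older Maven versions warn about property expressions in the version element, so the post may take a different route, for instance setting the version explicitly with the versions-maven-plugin; newer Maven releases (3.5+) support the ${revision} placeholder as a "CI-friendly version".

```xml
<!-- Hypothetical pom.xml fragment: the VCS revision becomes part of the version. -->
<groupId>com.example</groupId>
<artifactId>my-app</artifactId>
<!-- The CI server supplies the revision, e.g.
       mvn clean deploy -Drevision=4711
     where 4711 could be Jenkins' SVN_REVISION or an abbreviated Git hash. -->
<version>1.0.${revision}</version>
```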

How to build true pipelines with Jenkins and Maven

Marcus Martina

The essence of creating a pipeline is breaking up a single build process into smaller steps, each having its own responsibility. In this way faster and more specific feedback can be returned. Let's define a true pipeline as a pipeline that is strictly associated with a single revision within a version control system. This makes sense, as ideally we want the build server to return full and accurate feedback for each individual revision.

As new revisions can be committed at any time, it is natural that multiple pipelines actually get executed next to each other. If needed, it is even possible to allow concurrent executions of the same build step for different pipelines. However, some measures need to be taken in order to guarantee that all steps executed within one pipeline are actually based on the same revision.

Within a Jenkins build server instance it is possible to create a true pipeline for a Maven-based project quite easily and efficiently. But in order to establish one, we need some extra Jenkins plugins to be provisioned and configured properly, as we will see below. Read more

jar-with-deps don't like META-INF/services

Andrew Phillips

Recently, I was preparing a connection checker for Deployit's powerful remote execution framework Overthere. To make the checker as compact as possible, I put together a jar-with-deps for distribution.
Tests and trial runs from the IDE worked, so I expected the dry run of the distribution to be a quick formality. Instead: boom!
It turns out that one of the libraries used by Overthere, TrueZIP - or indeed any code that utilizes Java's SPI mechanism - doesn't play well with the jar-with-deps idea. Read more
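The full post explains the clash in detail; as background, ServiceLoader discovers implementations through identically named META-INF/services/&lt;interface&gt; files, and a naive fat jar keeps only one of several files with the same path, silently dropping providers. One common workaround - not necessarily the one the post ends up with - is to build the fat jar with the Maven Shade plugin and its ServicesResourceTransformer, which concatenates those service files instead; the version number below is only an example.

```xml
<!-- Sketch: build a fat jar while merging META-INF/services entries. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- concatenates service provider files from all dependencies
               instead of letting the last one win -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```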

How Sonatype Nexus 1.9 ruined my day

Barend Garvelink

Update, 26-02: Brian Demers from Sonatype pointed out in the comments that Maven 2.0.10 and later are forwards-compatible with changes in the metadata format. If your Maven 2 version is one of the recommended versions on the download page, you will not have this problem.

Two days, in fact. Yesterday evening, after my colleagues went home, I brought down our Nexus 1.8.0.1 instance to upgrade it to 1.9 without interrupting their work. The download page for Nexus 1.9 contains the following instruction:

Sonatype has changed how the lucene indexes are stored on disk, it is required that users reindex all repositories in their nexus server to start benefitting from the changes (and for search to work properly).

Inconspicuous enough. Furthermore, clicking through from the change overview to the full change log reveals:

[NEXUS-3849] - Add full support for the new maven 3 snapshot metadata

What it doesn't reveal is that the rebuild metadata command in the repository administration screen, which would appear to be proper housekeeping at a time when you're reindexing the repositories, now generates Maven 3 style metadata and inadvertently breaks compatibility with Maven 2 (update: older versions). This is where the fun begins.

Read more

DocBook, FOP and Fonts

Wilfred Springer

I'm proud to say that - during my six years of employment at the company formerly known as Sun Microsystems - I wrote all my documents in DocBook. Of course there was the occasional warning that we were all expected to use StarOffice, but by making sure the DocBook-generated output resembled the printed material produced by HQ, it never turned into a big argument. And since my entire DocBook chain was built from open source, I had to use Apache FOP.

Apache FOP has a long history... Read more

Middleware integration testing with JUnit, Maven and VMware, part 3 (of 3)

Vincent Partington

Last year, before the Christmas holidays ;-), I described how we do middleware integration testing at XebiaLabs and the way we deploy test servlets by wrapping them in WAR and EAR files that get generated on the fly. There is only one thing left to explain: how do we integrate these tests into a continuous build using Maven and VMware?

Running the middleware integration tests

So let's start with the Maven configuration. As I mentioned in the first blog of this series, the integration tests are recognizable by the fact that their class names end in Itest. That means they won't get picked up by the default configuration of the Maven Surefire plugin. And that is fortunate, because we don't always want to run these tests. Firstly, they require a very specific test setup (the application server configurations should be in an expected state, see below) and secondly, they can take a long time to complete, which would get in the way of the quick turnaround we want from commit builds in our continuous integration system.
Read more
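To make the naming convention concrete, here is a minimal sketch of a Maven profile that opts in to the Itest classes. The profile id and layout are assumptions for illustration and not necessarily the configuration from the full post.

```xml
<!-- Hypothetical "itest" profile: only when activated are the *Itest classes run. -->
<profile>
  <id>itest</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <includes>
            <!-- Surefire's default includes only match Test*/*Test class names,
                 so the Itest suffix keeps these out of regular commit builds -->
            <include>**/*Itest.java</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
```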

Middleware integration testing with JUnit, Maven and VMware, part 2 (of 3)

Vincent Partington

Last week I wrote about the approach we use at XebiaLabs to test the integrations with the different middleware products our Java EE deployment automation product Deployit supports.

The blog finished with the promise that I would discuss how to test that an application can really use the configurations that our middleware integrations (a.k.a. steps) create. But before we delve into that, let us first answer the question as to why we need this. If the code can configure a datasource without the application server, it must be OK for an application to use it, right? Well, not always. While WebSphere and WebLogic contain some functionality to test the connection to the database and thereby verify whether the datasource has been configured correctly, this functionality is not available for other configurations such as JMS settings. And JBoss has no such functionality at all. So the question is: how can we prove that an application can really work with the configurations created by our steps?
Read more