Mastering Unit Testing Using Mockito and JUnit

Continuous Integration

In college, I was working on a critical steganography (image watermarking) project and simultaneously developing a module on my home computer, where I integrated my changes with other changes on the college server. Most of my time was wasted in integration. After manual integration, I would find everything broken; so, integration was terrifying.

When CI is not available, development teams or developers make changes to code and then all the code changes are brought together and merged. Sometimes, this merge is not very simple; it involves the integration of lots of conflicting changes. Often, after integration, weird bugs crop up and a working module may start to fail, as it involves a complete rework of numerous modules. Nothing goes as planned and the delivery is delayed. As a result, the predictability, cost, and customer service are affected.

CI is an extreme programming (XP) concept. It was introduced to prevent integration issues. In CI, developers commit the code periodically, and every commit is built. Automated tests verify the system integrity. It helps in the incremental development and periodic delivery of working software.

Benefits of CI

CI is meant to make sure that we're not breaking something unconsciously in our hurry. We want to run the tests continuously, and we need to be warned if they fail.

In a good software development team, we'd find test-driven development (TDD) as well as CI.

CI requires a listener tool to keep an eye on the version control system for changes. Whenever a change is committed, this tool automatically compiles and tests the application (sometimes it creates a WAR file, deploys the WAR/EAR file, and so on).

If compilation fails, or a test fails, or deployment fails, or something goes wrong, the CI tool immediately notifies the concerned team so that they can fix the issue.

CI is a concept; to support it, tools such as Sonar and FindBugs can be added to the build process to automatically monitor code quality and code coverage metrics. Good quality code gives us confidence that a team is following the right path. Technical debt can be identified very quickly, and the team can start reducing it. Often, CI tools have the ability to present dashboards of these quality metrics.

In a nutshell, CI tools improve predictability and provide quick feedback, which reduces potential risk and increases confidence in the build. Note, however, that CI by itself does not enforce code quality: a team can still write very poor quality code, even test poor quality code, and the CI tool will not care.

Numerous CI tools are available on the market, such as Go, Bamboo, TeamCity, CruiseControl, and Jenkins. However, CruiseControl and Jenkins are the most widely used.

Jenkins supports various build scripting tools. It integrates almost all sorts of projects and is easy to configure. In this chapter, we will work with Jenkins.

CI is just a generic conduit to run the commands; often, build tools are used to execute the commands, and then the CI tool collects the metrics produced by the commands or build tools. Jenkins needs build scripts to execute tests, compile the source code, or even deploy deliverables. Jenkins supports different build tools to execute the commands—Gradle, Maven, and Ant are the widely used ones. We will explore the build tools and then work with Jenkins.

Note

You can download the code for this chapter. Extract the ZIP file. It contains a folder named Packt. This folder has two subfolders: gradle and chapter02. The gradle folder contains the basic Gradle examples and the chapter02 folder contains the Java projects and Ant, Gradle, and Maven build scripts.

Gradle automation

Gradle is a build automation tool. Gradle has many benefits such as loose structure, ability to write scripts to build, simple two-pass project resolution, dependency management, remote plugins, and so on.

The best feature of Gradle is the ability to create a domain-specific language (DSL) for the build. An example would be generate-web-service-stubs or run-all-tests-in-parallel.

Note

A DSL is a programming language specialized for a domain; it focuses on a particular aspect of a system. HTML is an example of a DSL. We cannot build an entire system with a DSL, but DSLs are used to solve problems in a particular domain. The following are examples of DSLs:

  • A DSL for building Java projects
  • A DSL for drawing graphs

One of Gradle's unique selling points (USPs) is the incremental build. Gradle can be configured to rebuild a project only if any of its resources have changed. As a result, the overall build execution time decreases.

Gradle ships with numerous preloaded plugins for different project types. We can either use them as they are or override them.

Unlike Maven or Ant, Gradle is not XML based; it is based on a dynamic language called Groovy. Groovy is a developer-friendly Java Virtual Machine (JVM) language. Its syntax makes it easier to express the code intent and provides ways to effectively use expressions, collections, closures, and so on. Groovy programs run on the JVM; so, if we write Java code in a Groovy file, it will run. Groovy supports DSLs to make your code readable and maintainable.

Groovy's home page is http://groovy.codehaus.org/.

Tip

We can use Ant or Maven in a Gradle script. Gradle supports the Groovy syntax. Gradle provides support for Java, Web, Hibernate, GWT, Groovy, Scala, OSGi, and many other projects.

Big companies such as LinkedIn and Siemens use Gradle. Many open source projects, such as Spring, Hibernate, and Grails use Gradle.

Getting started

Java (jdk 1.5 +) needs to be installed before executing a Gradle script. The steps to do this are as follows:

  1. Go to the command prompt and run java -version; if Java is not installed or the version is older than 1.5, install the latest version from the Oracle site.
  2. Download Gradle from the Gradle website and extract the archive to any directory you want, then go to its bin directory. For example, if you extract the Gradle media under D:\Software\gradle-1.10, then open the command prompt and go to D:\Software\gradle-1.10\bin.
  3. Now, check the Gradle version using the gradle -v command. It will show you the version and other configuration details. To run Gradle from anywhere on your computer, create a GRADLE_HOME environment variable and set its value to the location where you extracted the Gradle media.
  4. Add %GRADLE_HOME%\bin (in Windows) to the PATH variable (export GRADLE_HOME and PATH in .bashrc in Linux and .bash_login in Mac).
  5. Open a new command prompt, go to any folder, and run the same gradle -v command again to check whether the PATH variable is set correctly.

The other option is to use the Gradle wrapper (gradlew), which allows the batch file (or shell script) to download the version of Gradle specific to each project. This is an industry standard for working with Gradle, as it ensures consistency among Gradle versions. The Gradle wrapper is checked into source code control along with the build scripts.
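The wrapper scripts can be generated from the build itself with a task of the built-in Wrapper type (a minimal sketch for the Gradle 1.x line used in this chapter; the pinned version number is an assumption):

```groovy
task wrapper(type: Wrapper) {
    // Pin the Gradle version that gradlew will download for every developer
    gradleVersion = '1.10'
}
```

Running gradle wrapper then generates the gradlew (shell) and gradlew.bat (Windows) scripts plus a supporting wrapper JAR, all of which are checked in.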

Gradling

In the programming world, "Hello World" is the starting point. In this section, we will write our first "Hello World" Gradle script. A Gradle script can build one or more projects. Each project can have one or more tasks. A task can be anything like compiling Java files or building a WAR file.

Tip

To execute a task, we will create a build.gradle file and execute the gradle command to run a build. Gradle will look for a file named build.gradle in the current directory. To execute a build file other than build.gradle, use the -b <file name> option.

We will create a task to print "Hello World" on the console. Perform the following steps:

  1. Open a text editor and enter the following:
    task firstTask << {
      println 'Hello world.'
    }

    Save the file as build.gradle.

  2. Open the command prompt and browse to the folder where you saved the build.gradle file. Run the gradle firstTask command, or if you saved the file under D:\Packt\gradle, simply open the command prompt and run gradle -b D:\Packt\gradle\build.gradle firstTask.

    The following information will be printed on the command prompt:

    :firstTask
    Hello world.
    BUILD SUCCESSFUL
    

Here, the task keyword defines a Gradle task named firstTask, and the << operator attaches a closure (the action) that the task executes. The println statement is Groovy's equivalent of Java's System.out.println.

When we execute the task using its name, the output shows the task name and then the Hello world message.

Tip

Using the -q option, we can turn off Gradle's progress messages. If we run gradle -q -b build.gradle firstTask, it will print only Hello world.

Ordering subtasks using doFirst and doLast

A task can contain many subtasks. Subtasks can be defined and ordered using the doFirst and doLast keywords. The following code snippet describes the Java method style task definition and subtask ordering:

task aTask() {
  doLast {
    println 'Executing last.'
  }

  doFirst {
    println 'Running 1st'
  }
}

Here, we defined a task named aTask using the Java method style. The task aTask contains two closure keywords: doLast and doFirst.

The doFirst closure is executed first when the task is invoked, and the doLast closure is executed at the end.

When we run gradle aTask, it prints the following messages:

:aTask
Running 1st
Executing last.

BUILD SUCCESSFUL

Default tasks

In Ant, we can define a default target; similarly, Gradle provides options for default tasks using the keyword defaultTasks 'taskName1', …'taskNameN'.

The defaultTasks 'aTask' keyword defines aTask as a default task. So now if we only execute gradle with no task name, then it will invoke the default task.
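The declaration sits at the top of the build script; the following is a minimal sketch that reuses the firstTask from the earlier example:

```groovy
// build.gradle
defaultTasks 'firstTask'

task firstTask << {
    println 'Hello world.'
}
```

Now running plain gradle (with no task name) executes firstTask.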

The task dependency

In Ant, a target depends on another target, for example, a Java code compile task may depend on cleaning of the output folder; similarly, in Gradle, a task may depend on another task. The dependency is defined using the dependsOn keyword. The following syntax is used to define a task dependency:

secondTask.dependsOn 'firstTask'

Here, secondTask depends on firstTask.

Another way of defining task dependency is passing the dependency in a method-like style. The following code snippet shows the method argument style:

task secondTask(dependsOn: 'firstTask') {
  doLast {
    println 'Running last'
  }

  doFirst {
    println 'Running first'
  }
}

Execute gradle secondTask; it will first execute the dependent task firstTask and then execute the task secondTask as follows:

:firstTask
Hello world.
:secondTask
Running first
Running last

Intertask dependency can also be defined with the list form secondTask.dependsOn = ['firstTask'], which accepts multiple task names at once.

Note

We can abbreviate each word of a task name in a camel case to execute a task. For example, the task name secondTask can be abbreviated to sT.

Daemon

Each time the gradle command is invoked, a new process is started, the Gradle classes and libraries are loaded, and the build is executed. Loading classes and libraries takes time, so execution time can be reduced if the JVM, Gradle classes, and libraries are not loaded on each invocation. The --daemon command-line option starts a background Java process and preloads the Gradle classes and libraries; so, the first execution takes time, but subsequent executions with the --daemon option take almost no time because only the build gets executed: the JVM, with the required Gradle classes and libraries, is already loaded. The daemon configuration is often put into the GRADLE_OPTS environment variable so that the flag is not needed on every call. The following screenshot shows the execution of the daemon:

Note that the first build took 31 seconds, whereas the second build took only 2 seconds.

To stop a daemon process, use the gradle --stop command-line option.
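To make the daemon permanent rather than passing --daemon on each call, the corresponding system property can be placed in the GRADLE_OPTS environment variable mentioned above (a sketch for a Unix-like shell; on Windows, set it as a regular environment variable instead):

```shell
# Add to .bashrc (Linux) or .bash_login (Mac)
export GRADLE_OPTS="-Dorg.gradle.daemon=true"
```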

Gradle plugins

Build scripts are monotonous; for example, in a Java build script, we define the source file location and third-party JAR location, clean the output folder, compile Java files, run tests, create a JAR file, and so on. Almost all Java project build scripts look similar.

This is similar to duplicate code. We resolve duplication by refactoring: moving the duplicates to a common place and sharing the common code. Gradle plugins solve this repetitive build-task problem the same way, by moving the duplicate tasks to a common place so that all projects share and inherit the common tasks instead of redefining them.

A plugin is a Gradle configuration extension; it comes with some preconfigured tasks that, together, do something useful. Gradle ships with a number of plugins that help us write neat and clean scripts.

In this chapter, we will explore the Java and Eclipse plugins.

The Eclipse plugin

The Eclipse plugin generates the project files necessary to import a project in Eclipse.

Any Eclipse project has two important files: a .project file and a .classpath file. The .project file contains the project information such as the project name and project nature. The .classpath file contains the classpath entries for the project.

Let's create a simple Gradle build with the Eclipse plugin using the following steps:

  1. Create a folder named eclipse, then a file named build.gradle, and add the following script:
    apply plugin: 'eclipse'

    To inherit a plugin nature, Gradle uses the apply plugin: '<plug-in name>' syntax.

  2. Open the command prompt and check all the available tasks using the gradle tasks --all command. This will list the available Eclipse plugin tasks for you.
  3. Now run the gradle eclipse command. It will generate only the .project file, as the command doesn't know what type of project needs to be built. You will see the following output on the command prompt:
    :eclipseProject
    :eclipse
    BUILD SUCCESSFUL
    
  4. To create a Java project, add apply plugin: 'java' to the build.gradle file and rerun the command. This time it will execute four tasks as follows:
    :eclipseClasspath
    :eclipseJdt
    :eclipseProject
    :eclipse
    
  5. Open the eclipse folder (the location where you put the build.gradle file). You will find the .project and .classpath files and a .settings folder. For a Java project, a Java Development Tools (JDT) configuration file is required. The .settings folder contains the org.eclipse.jdt.core.prefs file.

Now, we can launch Eclipse and import the project. We can edit the .project file and change the project name.

Normally, a Java project depends on third-party JARs, such as the JUnit JAR and Apache utility JARs. In the next section, we will learn how a classpath can be generated with JAR dependencies.

The Java plugin

The Java plugin provides some default tasks for your project that will compile and unit test your Java source code and bundle it into a JAR file.

The Java plugin defines the default values for many aspects of the project, such as the source files' location and Maven repository. We can follow the conventions or customize them if necessary; generally, if we follow the conventional defaults, then we don't need to do much in our build script.

Let's create a simple Gradle build script with the Java plugin and observe what the plugin offers. Perform the following steps:

  1. Create a java.gradle build file and add the apply plugin: 'java' line.
  2. Open the command prompt and type in gradle -b java.gradle tasks --all. This will list the Java plugin tasks for you.
  3. To build a project, we can use the build task; the build depends on many tasks. Execute the gradle -b java.gradle build command. The following screenshot shows the output:

Since no source code was available, the build script didn't build anything. However, we can see the list of available tasks—build tasks are dependent on compile, JAR creation, test execution, and so on.

The Java plugin comes with a convention that the build source files will be under src/main/java, relative to the project directory. Non-Java resource files such as XML and properties files will be under src/main/resources. Tests will be under src/test/java, and the test resources under src/test/resources.

To change the default source directory settings, use the sourceSets keyword, which lets us point Gradle at custom source file locations.
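For example, a project that keeps its production code directly under src and its tests under test could declare this with sourceSets (a sketch; the directory names here are assumptions, not the Gradle defaults):

```groovy
apply plugin: 'java'

sourceSets {
    main {
        java {
            srcDir 'src'    // instead of the default src/main/java
        }
    }
    test {
        java {
            srcDir 'test'   // instead of the default src/test/java
        }
    }
}
```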

A Gradle script must know the location of dependent libraries to compile files. In Gradle, library locations are declared as repositories. Gradle supports the local lib folder, external dependencies, and remote repositories.

Gradle also supports the following repositories:

  • Maven repository: A Maven repository can be configured on our local machine, on a network machine, or it can be the preconfigured central repository.
    • Maven central repository: Maven's central repository is located at http://repo1.maven.org/maven2. The following is an example of accessing the central repository:
      repositories {
          mavenCentral()
      }
    • Maven local repository: If we have a local Maven repository, we can use the mavenLocal() method to resolve dependencies as follows:
      repositories {
          mavenLocal()
      }

      Tip

      The maven() method can be used to access repositories configured on the intranet. The following is an example of accessing an intranet URL:

      repositories {
          maven {
              name = 'Our Maven repository name'
              url = '<intranet URL>'
          }
      }

      The mavenRepo() method can be used with the following code:

      repositories {
          mavenRepo(name: '<name of the repository>', url: '<URL>')
      }

      A secured Maven repository needs user credentials. Gradle provides the credentials keyword to pass user credentials. The following is an example of accessing a secured Maven repository:

      repositories {
          maven(name: '<repository name>') {
              credentials {
                  username = 'username'
                  password = 'password'
              }
              url = '<URL>'
          }
      }
  • Ivy repository: This is a remote or local Ivy repository. Gradle supports the same Maven-style methods for Ivy. The following is an example of accessing an Ivy repository and a secured Ivy repository:
    repositories {
        ivy(url: '<URL>', name: '<Name>')
        ivy {
            credentials {
                username = 'user name'
                password = 'password'
            }
            url = '<URL>'
        }
    }
  • Flat directory repository: This is a local or network directory. The following is an example of accessing a local directory:
    repositories {
        flatDir(dir: '../thirdPartyFolder', name: '3rd party library')
        flatDir {
            dirs '../springLib', '../lib/apacheLib', '../lib/junit' 
            name = 'Configured libraries for spring, apache and JUnit'
        }
    }

    Gradle uses flatDir() to locate a local or network-shared library folder. Here, dir is used to locate a single directory, and dirs, with directory locations separated by commas, is used to locate distributed folders.

In this section, we will create a Java project, write a test, execute the test, compile source or test files, and finally build a JAR file. Perform the following steps:

  1. Create a build.gradle build script file under packt\chapter02\java.
  2. Add Eclipse and Java plugin support using the following lines of code:
    apply plugin: 'eclipse'
    apply plugin: 'java'
  3. We will write a JUnit test, so our project will be dependent on JUnit JARs. Create a lib directory under packt\chapter02 and copy the hamcrest-core-1.3.jar and junit-4.11.jar JARs (we downloaded these JARs in Chapter 1, JUnit 4 – a Total Recall).
  4. In this example, we will use the flat directory repository. We created a lib directory for JUnit JARs. Add the following lines to the build.gradle file to configure our repository:
    repositories {
        flatDir(dir: '../lib', name: 'JUnit Library')
    }

    We have a single lib folder; so, we will use flatDir and dir conventions.

    A repository can have numerous library files, but we may need only some of them. For example, source file compilation doesn't require the JUnit JARs but test files and test execution need them.

    Gradle comes with dependency management. The dependencies keyword is used to define dependencies.

    The dependencies closure supports the following default types:

    • compile: These are the dependencies required to compile the source of the project.
    • runtime: These dependencies are required by the production classes at runtime. By default, they also include the compile-time dependencies.
    • testCompile: These dependencies are required to compile the test source of the project. By default, they also include the compiled production classes and the compile-time dependencies.
    • testRuntime: These dependencies are required to run the tests. By default, they also include the compile, runtime, and testCompile dependencies.

    Each dependency is identified by a coordinate: the group, name, and version of the dependent JAR.

    Websites such as http://mvnrepository.com provide the dependency string for a given JAR; for example, http://mvnrepository.com/artifact/org.springframework/spring-aop/3.1.1.RELEASE.

    Note

    Suppose we need to include the org.springframework.aop-3.1.1.RELEASE.jar file in our classpath (here org.springframework is the group, aop is the name, and 3.1.1.RELEASE is the version). We can simply write org.springframework:aop:3.1.1.RELEASE to identify aop.jar.

  5. Tests need JUnit JAR support. Add the following lines to our build.gradle file to add the JUnit dependency:
    dependencies {
        testCompile group: 'junit', name: 'junit', version: '4.11'
        testCompile group: '', name: 'hamcrest-core', version: '1.3'
    }

    Or simply add the following lines to the file:

    dependencies {
        testCompile 'junit:junit:4.11', ':hamcrest-core:1.3'
    }
  6. Generate an Eclipse project using the Eclipse plugin and issue the gradle eclipse command. The eclipse command will execute three tasks: eclipseClasspath, eclipseJdt, and eclipseProject.

    Go to the \chapter02\java folder, and you will find a .classpath and a .project file. Open the .classpath file and check whether junit-4.11.jar and hamcrest-core-1.3.jar have been added as classpathentry elements.

    The following screenshot shows the gradle eclipse command output:

    The following screenshot shows the content of the generated .classpath file:

  7. Launch Eclipse and import the project by navigating to File | Import | Existing Projects into Workspace. Now browse to the D:\Packt\chapter02\java folder and import the project. Eclipse will open the java project. The Java community's best practice is to keep the test and source code files in the same package but in different source folders. Java code files are stored under src/main/java, and test files under src/test/java. Source resources are stored under src/main/resources.

    We need to create the src/main/java, src/main/resources, and src/test/java folders directly under the Java project.

    The following screenshot displays the folder structure:

  8. Right-click on the leaf folders (the java and resources folders under src/main and src/test, respectively); a pop-up menu will open. Now, go to Build Path | Use as Source Folder.

    The following screenshot shows the action:

  9. We will create a Java class and unit test its behavior; the Java class will read from a properties file and return an enum type depending on the value provided in the file. Reading a file from a test is not recommended, as I/O operations are unpredictable and slow; the test may fail if the file cannot be read, and the I/O will slow down test execution. We could use mock objects to stub the file read but, for simplicity, we will add two methods to the service class: one will take a String argument and return an enum type, and the other will read from the properties file and call the first method with the value read. From the test, we will call the first method with a string. The following are the steps to configure the project:
    1. Add an environment.properties properties file under /java/src/main/resources and add env = DEV in that file.
    2. Create an enum file in the com.packt.gradle package under the /java/src/main/java source package:
      public enum EnvironmentType {
        DEV, PROD, TEST
      }
    3. Create a Java class to read the properties file as follows:
      package com.packt.gradle;
      
      import java.util.ResourceBundle;
      
      public class Environment {
        public String getName() {
          ResourceBundle resourceBundle = ResourceBundle.getBundle("environment");
          return resourceBundle.getString("env");
        }
      }
    4. Create an EnvironmentService class to return an enum type depending on the environment setup as follows:
      package com.packt.gradle;
      public class EnvironmentService {
      
        public EnvironmentType getEnvironmentType() {
          return getEnvironmentType(new Environment().getName());
        }
        public EnvironmentType getEnvironmentType(String name) {
          if("dev".equals(name)) {
            return EnvironmentType.DEV;
          }else if("prod".equals(name)) {
            return EnvironmentType.PROD;
          }
          return null;
        }
      }

      The getEnvironmentType() method calls the Environment class to read the properties file value and then calls the getEnvironmentType(String name) method with the read value to return an enum type.

    5. Add a test class under /src/test/java in the com.packt.gradle package. The following is the code:
      package com.packt.gradle;
      import static org.junit.Assert.*;
      import static org.hamcrest.CoreMatchers.*;
      import org.junit.Test;
      
      public class EnvironmentServiceTest {
        EnvironmentService service = new EnvironmentService();

        @Test
        public void returns_NULL_when_environment_not_configured() {
          assertNull(service.getEnvironmentType("xyz"));
        }

        @Test
        public void production_environment_configured() {
          EnvironmentType environmentType = service.getEnvironmentType("prod");
          assertThat(environmentType, is(EnvironmentType.PROD));
        }
      }

      Here, the returns_NULL_when_environment_not_configured() test passes xyz to the getEnvironmentType method and expects that the service will return null, assuming that there won't be any xyz environment. In another test, it passes the prod value to the getEnvironmentType method and expects that a type will be returned.

  10. Now open the command prompt and run gradle build; it will compile the source and test files, execute the test, and finally create a JAR file.

    To execute only the tests, run gradle test.

    Open the \chapter02\java\build folder, and you will find three important folders:

    • libs: This folder contains the build output JARs—Java.jar
    • reports: This folder contains the HTML test results
    • test-results: This folder contains the XML format test execution result and the time taken to execute each test

    The following screenshot shows the test execution result in the HTML format:

Gradle is an intelligent build tool, and it supports incremental build. Rerun the gradle build command. It will just skip the tasks and say UP-TO-DATE. The following is a screenshot of the incremental build:

If we make a change to the test class, only test tasks will be executed. The following are the test tasks: compileTestJava, testClasses, test, check, and build.

In the coming chapters, we will explore Gradle further. Do you want to dive deep now? If so, visit http://www.gradle.org/docs/current/userguide/userguide.html.

Maven project management

Maven is a project build tool. Using Maven, we can build a visible, reusable, and maintainable project infrastructure.

Maven provides plugins for visibility: code quality and best practices are made visible through the PMD/Checkstyle plugins, the XDOC plugin generates project content information, the JUnit report plugin makes the failure/success story visible to the team, project activity tracking plugins make the daily activity visible, the change log plugin generates the list of changes, and so on.

As a result, a developer knows which APIs or modules are available for use and doesn't reinvent the wheel (rather, he or she reuses the existing APIs or modules). This reduces duplication and allows a maintainable system to be created.

In this section, we will explore the Maven architecture and rebuild our Gradle project using Maven.

Installation

A prerequisite for Maven is the Java Development Kit (JDK). Make sure you have JDK installed on your computer.

The following are the steps to set up Maven:

  1. Download the Maven media. Go to http://maven.apache.org/download.html to get the latest version of Maven.
  2. After downloading Maven, extract the archive to a folder; for example, I extracted it to D:\Software\apache-maven-3.1.1.
  3. For Windows OS, create an environment variable named M2_HOME and point it to the Maven installation folder. Modify the PATH variable and append %M2_HOME%\bin.
  4. For Linux, we need to export the PATH and M2_HOME environment variables to the .bashrc file. Open the .bashrc file and edit it with the following text:
    export M2_HOME=/home/<location of Maven installation>
    export PATH=${PATH}:${M2_HOME}/bin
  5. For Mac, the .bash_login file needs to be modified with following text:
    export M2_HOME=/usr/local/<maven folder>
    export PATH=${PATH}:${M2_HOME}/bin
  6. Check the installation by executing the mvn -version command. This should print the Maven version. The following is a screenshot of the output:

Maven is now installed, so we can start exploring it. Eclipse users with the m2eclipse plugin installed already have an embedded Maven, which they can use directly from Eclipse without installing Maven separately.

The Archetype plugin

In Maven, Archetype is a project-template generation plugin.

Maven allows us to create a project infrastructure from scratch from a list of predefined project types. The Maven command mvn archetype:generate generates a new project skeleton.

The archetype:generate command loads a catalog of available project types. It tries to connect to the central Maven repository at http://repo1.maven.org/maven2, and downloads the archetype catalog.

Tip

To get the latest catalog, you should be connected to the Internet.

Follow the ensuing steps to generate a Java project skeleton:

  1. Create a folder hierarchy /Packt/chapter02/maven, open the command prompt, and browse to the /Packt/chapter02/maven folder.
  2. Issue a mvn archetype:generate command; you will see a large list of archetypes being downloaded, each with a number, a name, and a short description.

    It will prompt you to enter an archetype number. Choose the default maven-archetype-quickstart archetype; in my case, its number is 343.

    The following screenshot shows you that the number 343 is default:

    Tip

    To get the entire catalog on Windows OS, enter the mvn archetype:generate > archetype.txt command. This will populate the text file with the project type list.

  3. Enter 343 or just hit Enter to select the default. Next, it will prompt you to select a version. Hit Enter to select the default.
  4. Now it will ask you to provide a groupId. A groupId is the root package for multiple projects, and org.springframework is the groupId for all Spring projects. Enter org.packt as groupId.
  5. Next, it will ask for artifactId. This is the project name and aop is the artifactId for org.springframework.aop-3.1.1.RELEASE. Enter Demo for the artifactId.
  6. Maven will ask for the version and the default is 1.0-SNAPSHOT. The version is your project's version, and here 3.1.1.RELEASE is the version for the org.springframework.aop-3.1.1.RELEASE project. We will accept the default. Hit Enter to accept the default.
  7. Now you will be prompted to enter the package name. Enter com.packt.edu as the package name.
  8. Finally, it will show you what you entered. Review it and accept it as shown in the following screenshot:

    Open the /Packt/chapter02/maven folder; you will see the Demo project folder is created with the following file structure:

The Maven convention for the source Java file is src/main/java and the test source file is src/test/java.

Maven will automatically create a Java file App.java under src/main/java/com/packt/edu and a test file AppTest under src/test/java/com/packt/edu.

Also, it will create an XML file pom.xml directly under Demo. This file will be used for building the project. In the next section, we will read about the POM file.
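Putting these conventions together, the generated Demo project has the following layout (as described above, using the groupId, artifactId, and package name we entered):

```
Demo
├── pom.xml
└── src
    ├── main
    │   └── java
    │       └── com
    │           └── packt
    │               └── edu
    │                   └── App.java
    └── test
        └── java
            └── com
                └── packt
                    └── edu
                        └── AppTest.java
```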

The Project Object Model (POM) file

Every Maven project contains a pom.xml file, which is a project metadata file.

A POM file can contain the following sections:

  • Project coordinates such as <groupId/>, <artifactId/>, <version/>, <dependency>, and inheritance through <modules/> and <parent/>

    Open the pom.xml file in the Demo folder; it contains the following coordinate details:

     <groupId>org.packt</groupId>
     <artifactId>Demo</artifactId>
     <version>1.0-SNAPSHOT</version>
     <packaging>jar</packaging>
  • The build details in <build> and <reporting>
  • Project visibility details such as <name>, <organization>, <developers>, <url>, and <contributors>

    Our generated pom.xml contains the following details:

     <name>Demo</name>
     
    <url>http://maven.apache.org</url>
  • Project environment details such as <scm>, <repository>, and <mailingList>
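Putting these sections together, the generated pom.xml looks roughly like the following sketch (the archetype-generated file also declares the JUnit dependency, which we cover in the next section):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>

  <!-- Project coordinates entered during archetype generation -->
  <groupId>org.packt</groupId>
  <artifactId>Demo</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>

  <!-- Project visibility details -->
  <name>Demo</name>
  <url>http://maven.apache.org</url>
</project>
```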

Project dependency

In a multimodule project, a project can depend on many other projects. For example, say we depend on JUnit. Maven automatically discovers the required artifact dependencies. This is very useful as we depend on many open source projects; it is equally useful for closed source dependencies.

Do you remember the Gradle dependency closure? It has four default configurations: compile, runtime, testCompile, and testRuntime.

Similarly, Maven has the following dependency scopes:

  • compile: This is the compile-time classpath dependency and the default scope; if no scope is explicitly defined, compile is used.
  • runtime: This dependency is not needed for compilation but is required at runtime.
  • test: This dependency is required for test code compilation and test execution.
  • provided: This dependency is needed for compilation but is supplied at runtime by the JDK or the container.
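For example, a web application compiled against the Servlet API would typically declare it with the provided scope, since the servlet container supplies the JAR at runtime (a sketch; the coordinates and version here are for illustration only):

```xml
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.5</version>
  <!-- needed to compile, but not packaged into the WAR -->
  <scope>provided</scope>
</dependency>
```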

A parent project manages dependency versions for its children by declaring them in a <dependencyManagement> section, as in the following code snippet:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

All child projects inherit the managed version and scope by just adding the <dependency> tag, without a version, as follows:

<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
</dependency>

The build life cycle

The build life cycle clearly defines the process of building and distributing a particular project artifact.

Maven has the following three built-in build life cycles:

  • default: This life cycle handles compilation, testing, packaging, deployment, and many more functions
  • clean: This life cycle generally cleans the build artifacts generated by the previous build(s)
  • site: This life cycle takes care of the generation and deployment of the project's site documentation
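Each life cycle is a sequence of phases, and plugins bind their goals to these phases. For instance, to run extra Ant tasks during the package phase, one might bind the maven-antrun-plugin to it in pom.xml (a hedged sketch; this plugin and configuration are illustrative and not part of the generated Demo project):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <version>1.7</version>
      <executions>
        <execution>
          <!-- the run goal executes when the package phase is reached -->
          <phase>package</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <echo message="Packaging complete" />
            </target>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```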

Now, we will compile and test our Demo project.

In this section, we will work with the compile, test, and package phases of the default life cycle.

Compiling the project

Perform the following steps to compile the project:

  1. Open the command prompt and browse to \Packt\chapter02\maven\Demo. Maven needs a pom.xml file to compile a project.
  2. Type in mvn compile; it will compile the project and create class files under \Demo\target\classes. The following screenshot shows the output:

Testing the project

To execute the tests in Demo, open the command prompt and type in mvn test; it will download the JUnit and Surefire JARs, for test compilation and test report generation respectively, and then execute the tests. The following screenshot shows the output:

Packaging the project

The mvn package command compiles source code, compiles tests, executes tests, and finally builds a JAR. It will generate Demo-1.0-SNAPSHOT.jar in \Packt\chapter02\maven\Demo\target.

The clean life cycle

The mvn clean command removes the target folder and deletes all the content. Run the command and check that the target folder has been deleted from \Packt\chapter02\maven\Demo\.

The site life cycle

The mvn site command generates a detailed project report in HTML format under target/site. It includes About, Plugin Management, Distribution Management, Dependency Information, Source Repository, Mailing Lists, Issue Tracking, Continuous Integration, Project Plugins, Project License, Project Team, Project Summary, and Dependencies.

Refer to http://maven.apache.org/guides/index.html to explore more on Maven.

The next section covers Apache Ant.

Another neat tool (Ant)

Ant is a Java-based build tool from the Apache Software Foundation. Ant's build files are written in XML. You need Java to execute an Ant task.

Download Apache Ant from http://ant.apache.org/, extract it, create an ANT_HOME variable, and set the value to the extracted location. Edit PATH and append %ANT_HOME%\bin in Windows. For Mac or Linux OS, you need to export ANT_HOME and PATH as described in the Installation section of Maven project management earlier in this chapter.

Ant needs a build.xml file to execute tasks. Ant supports the -f option to specify a build script; so, the ant -f myBuildFile.xml command will work.

We will create a build script and execute the Maven project (\Packt\chapter02\maven\Demo) using Ant. Follow the ensuing steps:

  1. Create an XML file build.xml in \Packt\chapter02\maven\Demo.
  2. Add the following lines in the build.xml file:
    <?xml version="1.0"?>
    <project name="Demo"  basedir=".">
      <property name="src.dir" location="src/main/java" />
      <property name="build.dir" location="bin" />
      <property name="dist.dir" location="ant_output" />
    </project>

    The <project> tag is a defined tag in Ant. You can name your project, and Demo is the name of the project. Next, we will set properties; a property can have a name and value or location. Here, src.dir is a property name, and this property can be accessed from any task using the ${src.dir} syntax. The location attribute refers to a relative location from the build.xml file. Since src/main/java contains the source file, we set the location value to src/main/java. The other two properties, build.dir and dist.dir, will be used by the Java compiling task to compile class files and generate the JAR file.

  3. Do you remember the clean task in Maven? Unlike Maven, Ant doesn't provide built-in targets. We have to define a clean target to remove old build outputs, and we will call Ant's <delete> task to delete directories. Then, using the <mkdir> task, we will recreate the directories:
      <target name="clean">
        <delete dir="${build.dir}" />
        <delete dir="${dist.dir}" />
      </target>
      <target name="makedir">
        <mkdir dir="${build.dir}" />
        <mkdir dir="${dist.dir}" />
      </target>

    Note that we added two targets using the <target> tag. Each target is identified using a name. We will call the clean target to delete build.dir (generated .class files) and dist.dir (build output JARs).

  4. A compile task is built into Gradle and Maven, but Ant has no built-in compile target; so, we will create a target to compile the Java files as follows:
      <target name="compile" depends="clean, makedir">
        <javac srcdir="${src.dir}" destdir="${build.dir}">
        </javac>
      </target>

    Use the <javac> task to compile Java files. The <javac> task accepts the srcdir and destdir attributes; the compiler reads Java files from srcdir and writes class files to destdir.

    A target may depend on another target, and the depends attribute accepts comma-separated target names. Here, the compile target depends on clean and makedir.

  5. The compilation is done. Now, we will create a JAR from the class files using the <jar> task as follows:
      <target name="jar" depends="compile">
        <jar destfile="${dist.dir}\${ant.project.name}.jar" basedir="${build.dir}">
        </jar>
      </target>

    The jar target needs to know the class files' location and the destination. The destfile attribute refers to the destination JAR file name and location, and basedir refers to the class file location. Note that we used ${dist.dir}\${ant.project.name}.jar to represent the destination JAR file name and folder. Here, ${dist.dir} refers to the destination folder, and ${ant.project.name}.jar represents the JAR name. ${ant.project.name} is the name (Demo) we mentioned in the <project> tag.

  6. The Ant script is ready to compile and create a JAR. Open the command prompt, go to \Packt\chapter02\maven\Demo and issue the ant jar command. Here, jar depends on compile, and compile depends on clean and makedir. So, the jar command will create two directories, bin and ant_output, compile the Java file and generate the .class files in the bin folder, and finally create Demo.jar in the ant_output folder.
  7. The compilation is done; now, it's time to execute the tests. Tests need JUnit JARs and generated source class files to compile and execute. We have created the lib directory for Gradle in \Packt\chapter02\lib and kept the JUnit 4 JARs in it. We will use this lib. Add three properties for the test source file directory, library directory, and test report as follows:
      <property name="test.dir" location="src/test/java" />
      <property name="lib.dir" location="../../lib" />
      <property name="report.dir" location="${dist.dir}/report" />

    Note that the lib.dir location is relative to the build.xml location. The test.dir property points to src/test/java, and test reports will be generated inside ant_output/report.

  8. A <path> element allows us to refer to a directory or to a file path. We will define a jclass.path path to refer to all the JAR files under the lib directory and the generated .class files as follows:
      <path id="jclass.path">
        <fileset dir="${lib.dir}/">
          <include name="**/*" />
        </fileset>
        <pathelement location="${build.dir}" />
      </path>

    The <fileset> tag takes a directory location and <include> takes a file name or regular expression. The **/* value means all the directories and files in ${lib.dir}. The <pathelement> element refers to the bin directory where the compiled class files are placed.

  9. Now, we need to compile the test files. Add a testcompile target and use the <javac> task. Pass test.dir as srcdir for compilation. Add <classpath> to refer to the jclass.path value. This will compile the test files. Consider the following code snippet:
     <target name="testcompile" depends="compile">
        <javac srcdir="${test.dir}" destdir="${build.dir}">
          <classpath refid="jclass.path" />
        </javac>
      </target>
  10. Add another target to execute the JUnit tests. Ant has a <junit> task to run tests. Pass jclass.path to point to the lib directory and the generated class files as follows:
      <target name="test" depends="testcompile">
        <junit printsummary="on" fork="true" haltonfailure="yes">
          <classpath refid="jclass.path" />
          <formatter type="xml" />
          <batchtest todir="${report.dir}">
            <fileset dir="${test.dir}">
              <include name="**/*Test*.java" />
            </fileset>
          </batchtest>
        </junit>
      </target>

    Issue the ant test command. This command compiles and executes the tests.

    We can set a default task in the build.xml file in the <project> tag. The syntax is <project name="Demo" default="task name" basedir=".">. Now, we don't have to specify a target name.
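Assembled from the preceding steps, the complete build.xml looks like the following sketch (with test chosen here as the default target; one addition over the steps above is creating the report directory in makedir, which <batchtest> needs before it can write reports):

```xml
<?xml version="1.0"?>
<project name="Demo" default="test" basedir=".">
  <property name="src.dir" location="src/main/java" />
  <property name="build.dir" location="bin" />
  <property name="dist.dir" location="ant_output" />
  <property name="test.dir" location="src/test/java" />
  <property name="lib.dir" location="../../lib" />
  <property name="report.dir" location="${dist.dir}/report" />

  <!-- JUnit JARs from lib plus the compiled classes -->
  <path id="jclass.path">
    <fileset dir="${lib.dir}/">
      <include name="**/*" />
    </fileset>
    <pathelement location="${build.dir}" />
  </path>

  <target name="clean">
    <delete dir="${build.dir}" />
    <delete dir="${dist.dir}" />
  </target>

  <target name="makedir">
    <mkdir dir="${build.dir}" />
    <mkdir dir="${dist.dir}" />
    <!-- ensure the report directory exists for batchtest -->
    <mkdir dir="${report.dir}" />
  </target>

  <target name="compile" depends="clean, makedir">
    <javac srcdir="${src.dir}" destdir="${build.dir}" />
  </target>

  <target name="jar" depends="compile">
    <jar destfile="${dist.dir}\${ant.project.name}.jar" basedir="${build.dir}" />
  </target>

  <target name="testcompile" depends="compile">
    <javac srcdir="${test.dir}" destdir="${build.dir}">
      <classpath refid="jclass.path" />
    </javac>
  </target>

  <target name="test" depends="testcompile">
    <junit printsummary="on" fork="true" haltonfailure="yes">
      <classpath refid="jclass.path" />
      <formatter type="xml" />
      <batchtest todir="${report.dir}">
        <fileset dir="${test.dir}">
          <include name="**/*Test*.java" />
        </fileset>
      </batchtest>
    </junit>
  </target>
</project>
```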

Our Ant script is ready for compiling Java files, executing tests, and generating reports. In the next section, we will set up Jenkins and use the build scripts.

To explore more on how to compile web archives and learn advanced topics, go to http://ant.apache.org/.