In this chapter, you will be introduced to the concepts of Extreme Programming (XP) and Continuous Integration (CI), the benefits of CI, and JUnit test automation using various tools.
The following topics will be covered in this chapter:
In college, I was working on a critical steganography (image watermarking) project and simultaneously developing a module on my home computer, where I integrated my changes with other changes on the college server. Most of my time was wasted in integration. After manual integration, I would find everything broken; so, integration was terrifying.
When CI is not available, development teams or developers make changes to code and then all the code changes are brought together and merged. Sometimes, this merge is not very simple; it involves the integration of lots of conflicting changes. Often, after integration, weird bugs crop up and a working module may start to fail, as it involves a complete rework of numerous modules. Nothing goes as planned and the delivery is delayed. As a result, the predictability, cost, and customer service are affected.
CI is an XP concept. It was introduced to prevent integration issues. In CI, developers commit the code periodically, and every commit is built. Automated tests verify the system integrity. It helps in the incremental development and periodic delivery of the working software.
CI is meant to make sure that we're not breaking something unconsciously in our hurry. We want to run the tests continuously, and we need to be warned if they fail.
In a good software development team, we'd find test-driven development (TDD) as well as CI.
CI requires a listener tool to keep an eye on the version control system for changes. Whenever a change is committed, this tool automatically compiles and tests the application (sometimes it creates a WAR file, deploys the WAR/EAR file, and so on).
If compilation fails, or a test fails, or deployment fails, or something goes wrong, the CI tool immediately notifies the concerned team so that they can fix the issue.
CI is a concept, not a specific tool. Quality tools such as Sonar and FindBugs can be added to the build process to track code quality; they automatically monitor the code quality and code coverage metrics. Good quality code gives us confidence that a team is following the right path. Technical debt can be identified very quickly, and the team can start reducing it. Often, CI tools can present dashboards of quality metrics.
In a nutshell, CI tools encourage code quality and predictability and provide quick feedback, which reduces the potential risk and increases confidence in the build. However, CI is not a silver bullet: a team can still write very poor quality code, and even poor quality tests, and CI will not complain.
Numerous CI tools are available on the market, such as Go, Bamboo, TeamCity, CruiseControl, and Jenkins; CruiseControl and Jenkins are the most widely used ones.
Jenkins supports various build scripting tools. It integrates almost all sorts of projects and is easy to configure. In this chapter, we will work with Jenkins.
CI is just a generic conduit to run the commands; often, build tools are used to execute the commands, and then the CI tool collects the metrics produced by the commands or build tools. Jenkins needs build scripts to execute tests, compile the source code, or even deploy deliverables. Jenkins supports different build tools to execute the commands—Gradle, Maven, and Ant are the widely used ones. We will explore the build tools and then work with Jenkins.
You can download the code for this chapter. Extract the ZIP file; it contains a folder named Packt. This folder has two subfolders: gradle and chapter02. The gradle folder contains the basic Gradle examples, and the chapter02 folder contains the Java projects and the Ant, Gradle, and Maven build scripts.
Gradle is a build automation tool. Gradle has many benefits such as loose structure, ability to write scripts to build, simple two-pass project resolution, dependency management, remote plugins, and so on.
The best feature of Gradle is the ability to create a domain-specific language (DSL) for the build. An example would be generate-web-service-stubs or run-all-tests-in-parallel.
A DSL is a programming language specialized for a domain; it focuses on a particular aspect of a system. HTML is an example of a DSL. We cannot build an entire system with a DSL, but DSLs are used to solve problems in a particular domain. SQL (for querying), regular expressions (for text matching), and CSS (for styling) are other common examples of DSLs.
One of its unique selling points (USPs) is incremental builds: Gradle can be configured to rebuild a project only if any resource in the project has changed. As a result, the overall build execution time decreases.
Gradle comes with numerous preloaded plugins for different project types. We can use them as they are or override them.
Unlike Maven or Ant, Gradle is not XML based; it is based on a dynamic language called Groovy. Groovy is a developer-friendly Java Virtual Machine (JVM) language. Its syntax makes it easier to express the code intent and provides ways to effectively use expressions, collections, closures, and so on. Groovy programs run on JVM; so, if we write Java code in a Groovy file, it will run. Groovy supports DSL to make your code readable and maintainable.
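To give a flavor of Groovy's syntax (a standalone sketch, unrelated to any build script in this chapter), the following shows a list literal, a closure, and iteration:

```groovy
// A List literal, a closure assigned to a variable, and iteration with each.
def languages = ['Java', 'Groovy', 'Scala']
def upper = { it.toUpperCase() }        // 'it' is the implicit closure parameter
languages.each { println upper(it) }    // prints JAVA, GROOVY, SCALA
```

The same three lines written in plain Java syntax would also run in a Groovy file, which is what makes Groovy approachable for Java developers.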
Groovy's home page is http://groovy.codehaus.org/.
Big companies such as LinkedIn and Siemens use Gradle. Many open source projects, such as Spring, Hibernate, and Grails use Gradle.
Java (JDK 1.5 or higher) needs to be installed before executing a Gradle script. The steps to set up Gradle are as follows:

1. Run java -version; if Java is not installed or the version is older than 1.5, install the latest version from the Oracle site.
2. Extract the downloaded Gradle media; it contains a bin directory. Open the command prompt and go to the bin directory. You can extract the media to any directory you want. For example, if you extract the Gradle media under D:\Software\gradle-1.10, then open the command prompt and go to D:\Software\gradle-1.10\bin.
3. Run the gradle -v command. It will show you the version and other configuration details. To run Gradle from anywhere on your computer, create a GRADLE_HOME environment variable and set the value to the location where you extracted the Gradle media.
4. Add %GRADLE_HOME%\bin (in Windows) to the PATH variable (export GRADLE_HOME and PATH in .bashrc in Linux and .bash_login in Mac).
5. Run gradle -v again to check whether the PATH variable is set correctly.

The other option is to use the Gradle wrapper (gradlew) and allow the batch file (or shell script) to download the version of Gradle specific to each project. This is an industry standard for working with Gradle, which ensures consistency among Gradle versions. The Gradle wrapper is also checked into source code control along with the build artifacts.
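A wrapper can be generated with a small task in build.gradle (a sketch for Gradle 1.x; the version number below is an example, pick the one your project needs):

```groovy
// Running `gradle wrapper` generates gradlew, gradlew.bat,
// and the gradle/wrapper support files.
task wrapper(type: Wrapper) {
    gradleVersion = '1.10'  // example version
}
```

After generation, team members run ./gradlew (or gradlew.bat) instead of gradle, and the correct Gradle distribution is downloaded automatically on first use.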
In the programming world, "Hello World" is the starting point. In this section, we will write our first "Hello World" Gradle script. A Gradle script can build one or more projects. Each project can have one or more tasks. A task can be anything like compiling Java files or building a WAR file.
We will create a task to print "Hello World" on the console. Perform the following steps:
task firstTask << {
    println 'Hello world.'
}
Save the file as build.gradle. Open the command prompt and navigate to the directory that contains the build.gradle file. Run the gradle firstTask command, or if you saved the file under D:\Packt\gradle, simply open the command prompt and run gradle -b D:\Packt\gradle\build.gradle firstTask. The following information will be printed on the command prompt:
:firstTask
Hello world.

BUILD SUCCESSFUL
Here, the task keyword defines a Gradle task named firstTask, and the << operator (shorthand for doLast) attaches a single closure to execute. The println command is Groovy's equivalent of Java's System.out.println.

When we executed the task using its name, the output showed the task name and then printed the Hello world message.
A task can contain many subtasks. Subtasks can be defined and ordered using the doFirst and doLast keywords. The following code snippet describes the Java method style task definition and subtask ordering:
task aTask() {
    doLast {
        println 'Executing last.'
    }
    doFirst {
        println 'Running 1st'
    }
}
Here, we defined a task named aTask using the Java method style. The task aTask contains two closure keywords: doLast and doFirst. The doFirst closure is executed first when the task is invoked, and the doLast closure is executed at the end.
When we run gradle aTask, it prints the following messages:
:aTask
Running 1st
Executing last.

BUILD SUCCESSFUL
In Ant, we can define a default target; similarly, Gradle provides default tasks using the keyword defaultTasks 'taskName1', …, 'taskNameN'.

The line defaultTasks 'aTask' defines aTask as a default task. So now, if we execute gradle with no task name, it will invoke the default task.
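Putting the two pieces together, a minimal build.gradle such as the following sketch makes aTask run on a bare gradle invocation:

```groovy
defaultTasks 'aTask'

task aTask() {
    doFirst { println 'Running 1st' }
    doLast  { println 'Executing last.' }
}
// Running plain `gradle` (no task name) now executes aTask.
```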
In Ant, a target depends on another target; for example, a Java code compile task may depend on the cleaning of the output folder. Similarly, in Gradle, a task may depend on another task. The dependency is defined using the dependsOn keyword. The following syntax is used to define a task dependency:
secondTask.dependsOn 'firstTask'
Here, secondTask depends on firstTask.
Another way of defining task dependency is passing the dependency in a method-like style. The following code snippet shows the method argument style:
task secondTask (dependsOn: 'firstTask') {
doLast {
println 'Running last'
}
doFirst {
println 'Running first'
}
}
Execute gradle secondTask; it will first execute the dependent task firstTask and then execute the task secondTask, as follows:
:firstTask
Hello world.
:secondTask
Running first
Running last
Another way of defining intertask dependency is using secondTask.dependsOn = ['firstTask'] or secondTask.dependsOn 'firstTask'.
Each time the gradle command is invoked, a new process is started, the Gradle classes and libraries are loaded, and the build is executed. Loading classes and libraries takes time. Execution time can be reduced if the JVM, the Gradle classes, and the libraries are not loaded each time. The --daemon command-line option starts a new Java process and preloads the Gradle classes and libraries, so the first execution takes time. The next execution with the --daemon option takes almost no time because only the build gets executed; the JVM, with the required Gradle classes and libraries, is already loaded. The daemon configuration is often put into a GRADLE_OPTS environment variable, so the flag is not needed on every call. The following screenshot shows the execution of the daemon:

Note that the first build took 31 seconds, whereas the second build took only 2 seconds.
To stop a daemon process, use the gradle --stop command-line option.
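The daemon-related commands can be sketched as follows; the last line shows one common way to enable the daemon by default through GRADLE_OPTS, as mentioned above:

```shell
gradle --daemon build   # first run: starts the daemon, slower
gradle --daemon build   # later runs: reuses the daemon, much faster
gradle --stop           # stops the daemon process

# Enable the daemon for every invocation via an environment variable:
export GRADLE_OPTS="-Dorg.gradle.daemon=true"
```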
Build scripts are monotonous, for example, in a Java build script, we define the source file location, third-party JAR location, clean output folder, compile Java files, run tests, create JAR file, and so on. Almost all Java project build scripts look similar.
This is similar to duplicate code. We resolve duplicates by refactoring: moving the duplicated code to a common place and sharing it. Gradle plugins solve this repetitive build-task problem by moving the duplicated tasks to a common place so that all projects share and inherit the common tasks instead of redefining them.
A plugin is a Gradle configuration extension. It comes with some preconfigured tasks that, together, do something useful. Gradle ships with a number of plugins and helps us write neat and clean scripts.
In this chapter, we will explore the Java and Eclipse plugins.
The Eclipse plugin generates the project files necessary to import a project in Eclipse.
Any Eclipse project has two important files: a .project file and a .classpath file. The .project file contains project information such as the project name and project nature. The .classpath file contains the classpath entries for the project.
Let's create a simple Gradle build with the Eclipse plugin using the following steps:

1. Create a folder named eclipse and, inside it, a file named build.gradle, and add the following script:

   apply plugin: 'eclipse'

   To inherit a plugin's nature, Gradle uses the apply plugin: '<plug-in name>' syntax.
2. Run the gradle tasks --all command. This will list the available Eclipse plugin tasks for you.
3. Run the gradle eclipse command. It will generate only the .project file, as the command doesn't know what type of project needs to be built. You will see the following output on the command prompt:

   :eclipseProject
   :eclipse

   BUILD SUCCESSFUL
4. Add apply plugin: 'java' to the build.gradle file and rerun the command. This time, it will execute four tasks, as follows:

   :eclipseClasspath
   :eclipseJdt
   :eclipseProject
   :eclipse
5. Open the eclipse folder (the location where you put the build.gradle file). You will find the .project and .classpath files and a .settings folder. For a Java project, a Java Development Tools (JDT) configuration file is required; the .settings folder contains the org.eclipse.jdt.core.prefs file.

Now, we can launch Eclipse and import the project. We can edit the .project file and change the project name.
Normally, a Java project depends on third-party JARs, such as the JUnit JAR and Apache utility JARs. In the next section, we will learn how a classpath can be generated with JAR dependencies.
The Java plugin provides some default tasks for your project that will compile and unit test your Java source code and bundle it into a JAR file.
The Java plugin defines the default values for many aspects of the project, such as the source files' location and Maven repository. We can follow the conventions or customize them if necessary; generally, if we follow the conventional defaults, then we don't need to do much in our build script.
Let's create a simple Gradle build script with the Java plugin and observe what the plugin offers. Perform the following steps:
1. Create a java.gradle build file and add the apply plugin: 'java' line.
2. Run gradle -b java.gradle tasks --all. This will list the Java plugin tasks for you.
3. Run the gradle -b java.gradle build command. The following screenshot shows the output:

Since no source code was available, the build script didn't build anything. However, we can see the list of available tasks; the build task depends on compilation, JAR creation, test execution, and so on.
The Java plugin comes with a convention that the buildable source files will be under src/main/java, relative to the project directory. Non-Java resource files, such as XML and properties files, will be under src/main/resources. Tests will be under src/test/java, and the test resources under src/test/resources.
To change the default Gradle project source directory settings, use the sourceSets keyword; it allows us to override the default source file locations.
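For example, a project that keeps its sources under nonstandard folders (the folder names below are hypothetical) could remap them like this:

```groovy
sourceSets {
    main {
        java      { srcDir 'src/java' }       // instead of src/main/java
        resources { srcDir 'src/resources' }  // instead of src/main/resources
    }
    test {
        java      { srcDir 'src/tests' }      // instead of src/test/java
    }
}
```

Sticking to the conventional layout is usually preferable; sourceSets is mainly useful when adopting Gradle on an existing project whose layout cannot be changed.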
A Gradle script must know the location of the lib directory to compile files. The Gradle convention for library locations is repositories. Gradle supports the local lib folder, external dependencies, and remote repositories.
Gradle also supports the following repositories:

The mavenCentral() Groovy method can be used to load dependencies from the centralized Maven repository. The following is an example of accessing the central repository:

repositories {
    mavenCentral()
}

The mavenLocal() method resolves dependencies against the local Maven repository, as follows:

repositories {
    mavenLocal()
}
The maven() method can be used to access repositories configured on the intranet. The following is an example of accessing an intranet URL:

repositories {
    maven {
        name = 'Our Maven repository name'
        url = '<intranet URL>'
    }
}
The mavenRepo() method can be used with the following code:

repositories {
    mavenRepo(name: '<name of the repository>', url: '<URL>')
}
A secured Maven repository needs user credentials. Gradle provides the credentials keyword to pass user credentials. The following is an example of accessing a secured Maven repository:

repositories {
    maven(name: '<repository name>') {
        credentials {
            username = 'username'
            password = 'password'
        }
        url = '<URL>'
    }
}
Ivy repositories, with or without credentials, can be configured as follows:

repositories {
    ivy(url: '<URL>', name: '<Name>')
    ivy {
        credentials {
            username = 'user name'
            password = 'password'
        }
        url = '<URL>'
    }
}
A local or network-shared directory can be configured as a flat directory repository, as follows:

repositories {
    flatDir(dir: '../thirdPartyFolder', name: '3rd party library')
    flatDir {
        dirs '../springLib', '../lib/apacheLib', '../lib/junit'
        name = 'Configured libraries for spring, apache and JUnit'
    }
}
Gradle uses flatDir() to locate a local or network-shared library folder. Here, dir is used to locate a single directory, and dirs with comma-separated directory locations is used to locate distributed folders.
In this section, we will create a Java project, write a test, execute the test, compile source or test files, and finally build a JAR file. Perform the following steps:
1. Create a build.gradle build script file under packt\chapter02\java and add the following lines:

   apply plugin: 'eclipse'
   apply plugin: 'java'

2. Create a lib directory under packt\chapter02 and copy the hamcrest-core-1.3.jar and junit-4.11.jar JARs into it (we downloaded these JARs in Chapter 1, JUnit 4 – a Total Recall).
3. We will use the lib directory for the JUnit JARs. Add the following lines to the build.gradle file to configure our repository:

   repositories {
       flatDir(dir: '../lib', name: 'JUnit Library')
   }

We have a single lib folder; so, we will use the flatDir and dir conventions.
A repository can have numerous library files, but we may need only some of them. For example, source file compilation doesn't require the JUnit JARs but test files and test execution need them.
Gradle comes with dependency management. The dependencies keyword is used to define dependencies.
The dependencies closure supports the following default types: compile, runtime, testCompile, and testRuntime. Each dependency type needs a coordinate: the group, name, and version of a dependent JAR.
Some websites, such as mvnrepository.com, can help us to come up with a ready-to-copy-paste dependency string, such as http://mvnrepository.com/artifact/org.springframework/spring-aop/3.1.1.RELEASE.
4. Edit the build.gradle file to add the JUnit dependency:

   dependencies {
       testCompile group: 'junit', name: 'junit', version: '4.11'
       testCompile group: 'org.hamcrest', name: 'hamcrest-core', version: '1.3'
   }

Or simply add the following lines to the file:

   dependencies {
       testCompile 'junit:junit:4.11', 'org.hamcrest:hamcrest-core:1.3'
   }
5. Run the gradle eclipse command. The eclipse command will execute three tasks: eclipseClasspath, eclipseJdt, and eclipseProject.

Go to the chapter02\java folder, and you will find a .classpath and a .project file. Open the .classpath file and check whether junit-4.11 and hamcrest-core-1.3.jar have been added as classpathentry elements.
The following screenshot shows the gradle eclipse command output:

The following screenshot shows the content of the generated .classpath file:
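For reference, the generated entries look roughly like this (a hand-written sketch of the Eclipse .classpath format, not the exact generated file):

```xml
<classpath>
    <classpathentry kind="src" path="src/main/java"/>
    <classpathentry kind="lib" path="../lib/junit-4.11.jar"/>
    <classpathentry kind="lib" path="../lib/hamcrest-core-1.3.jar"/>
    <classpathentry kind="output" path="bin"/>
</classpath>
```

The important part is the two kind="lib" entries: they show that Gradle resolved the JUnit and Hamcrest dependencies from our flatDir repository and wired them into the Eclipse classpath.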
6. Launch Eclipse, go to the D:\Packt\chapter02\java folder, and import the project. Eclipse will open the java project. The Java community's best practice is to keep the test and source code files under the same package but in different source folders: Java code files are stored under src/main/java, test files are stored under src/test/java, and source resources are stored under src/main/resources. We need to create the src/main/java, src/main/resources, and src/test/java folders directly under the Java project.
7. Right-click the java and resources folders (under src/main and src/test, respectively); a pop-up menu will open. Now, go to Build Path | Use as Source Folder. The following screenshot shows the action:
We will build a service that returns an enum type depending on the value provided in a properties file. Reading a file from a test is not recommended, as I/O operations are unpredictable and slow; your test may fail to read the file, and the file access will slow down the test execution. We can use mock objects to stub the file read but, for simplicity, we will add two methods to the service class: one will take a String argument and return an enum type, and the other will read from a properties file and call the first method with the value read. From the test, we will call the first method with a string. The following are the steps to configure the project:

1. Create an environment.properties file under /java/src/main/resources and add env = DEV to that file.
2. Create an enum in the com.packt.gradle package under the /java/src/main/java source package:

   public enum EnvironmentType {
       DEV, PROD, TEST
   }

3. Create an Environment class to read the properties file, as follows:

   package com.packt.gradle;

   import java.util.ResourceBundle;

   public class Environment {
       public String getName() {
           ResourceBundle resourceBundle = ResourceBundle.getBundle("environment");
           return resourceBundle.getString("env");
       }
   }

4. Create an EnvironmentService class to return an enum type depending on the environment setup, as follows:

   package com.packt.gradle;

   public class EnvironmentService {
       public EnvironmentType getEnvironmentType() {
           return getEnvironmentType(new Environment().getName());
       }

       public EnvironmentType getEnvironmentType(String name) {
           if ("dev".equals(name)) {
               return EnvironmentType.DEV;
           } else if ("prod".equals(name)) {
               return EnvironmentType.PROD;
           }
           return null;
       }
   }
The getEnvironmentType() method calls the Environment class to read the properties file value and then calls the getEnvironmentType(String name) method with the read value to return an enum type.
5. Create a test class under /src/test/java in the com.packt.gradle package. The following is the code:

   package com.packt.gradle;

   import static org.junit.Assert.*;
   import static org.hamcrest.CoreMatchers.*;

   import org.junit.Test;

   public class EnvironmentServiceTest {
       EnvironmentService service = new EnvironmentService();

       @Test
       public void returns_NULL_when_environment_not_configured() {
           assertNull(service.getEnvironmentType("xyz"));
       }

       @Test
       public void production_environment_configured() {
           EnvironmentType environmentType = service.getEnvironmentType("prod");
           assertThat(environmentType, is(EnvironmentType.PROD));
       }
   }
Here, the returns_NULL_when_environment_not_configured() test passes xyz to the getEnvironmentType method and expects the service to return null, assuming that there won't be any xyz environment. The other test passes the prod value to the getEnvironmentType method and expects that a type will be returned.
6. Run gradle build; it will compile the source and test files, execute the tests, and finally create a JAR file. To execute only the tests, run gradle test.
Open the chapter02\java\build folder, and you will find three important folders:

- libs: This folder contains the build output JAR, java.jar
- reports: This folder contains the HTML test results
- test-results: This folder contains the XML-format test execution results and the time taken to execute each test

The following screenshot shows the test execution result in the HTML format:
Gradle is an intelligent build tool that supports incremental builds. Rerun the gradle build command; it will just skip the tasks and report UP-TO-DATE. The following is a screenshot of the incremental build:

If we make a change to the test class, only the test-related tasks will be executed. The following are the test tasks: compileTestJava, testClasses, test, check, and build.
In the coming chapters, we will explore more of Gradle. Do you want to dive deep now? If so, you can visit http://www.gradle.org/docs/current/userguide/userguide.html.
Maven is a project build tool. Using Maven, we can build a visible, reusable, and maintainable project infrastructure.
Maven provides plugins for visibility: code quality and best practices are made visible through the PMD/Checkstyle plugins, the XDOC plugin generates project content information, the JUnit report plugin makes the failure/success story visible to the team, the project activity tracking plugins make the daily activity visible, the change log plugin generates the list of changes, and so on.
As a result, a developer knows what APIs or modules are available for use, so he or she doesn't reinvent the wheel (rather, he or she reuses the existing APIs or modules). This reduces duplication and allows a maintainable system to be created.
In this section, we will explore the Maven architecture and rebuild our Gradle project using Maven.
A prerequisite for Maven is the Java Development Kit (JDK). Make sure you have JDK installed on your computer.
The following are the steps to set up Maven:
1. Download the Maven media and extract it to a folder, for example, D:\Software\apache-maven-3.1.1.
2. In Windows, create an environment variable named M2_HOME and point it to the Maven installation folder. Modify the PATH variable and append %M2_HOME%\bin.
3. In Linux, export the PATH and M2_HOME environment variables in the .bashrc file. Open the .bashrc file and edit it with the following text:

   export M2_HOME=/home/<location of Maven installation>
   export PATH=${PATH}:${M2_HOME}/bin

4. In Mac, the .bash_login file needs to be modified with the following text:

   export M2_HOME=/usr/local/<maven folder>
   export PATH=${PATH}:${M2_HOME}/bin

5. Verify the setup: run the mvn -version command. This should print the Maven version. The following is a screenshot of the output:

Maven is installed, so we can start exploring it. Eclipse users with the m2eclipse plugin installed already have Maven; they can use it directly from Eclipse and don't have to install Maven separately.
In Maven, Archetype is a project-template generation plugin.
Maven allows us to create a project infrastructure from scratch from a list of predefined project types. The mvn archetype:generate command generates a new project skeleton.
The archetype:generate command loads a catalog of available project types. It tries to connect to the central Maven repository at http://repo1.maven.org/maven2 and downloads the archetype catalog.
Follow the ensuing steps to generate a Java project skeleton:
1. Create a folder named /Packt/chapter02/maven, open the command prompt, and browse to the /Packt/chapter02/maven folder.
2. Execute the mvn archetype:generate command; you will see a large list of archetypes being downloaded, each with a number, a name, and a short description. It will prompt you to enter an archetype number; the default is the maven-archetype-quickstart archetype. In my case, the number is 343. The following screenshot shows that the number 343 is the default:
3. Enter 343 or just hit Enter to select the default. Next, it will prompt you to select a version. Hit Enter to select the default.
4. Next, it will ask for a groupId. A groupId is the root package for multiple projects; for example, org.springframework is the groupId for all Spring projects. Enter org.packt as the groupId.
5. Then, it will ask for an artifactId. This is the project name; for example, aop is the artifactId for org.springframework.aop-3.1.1.RELEASE. Enter Demo for the artifactId.
6. Next, it will prompt for a version, with the default 1.0-SNAPSHOT. The version is your project's version; here, 3.1.1.RELEASE is the version of the org.springframework.aop-3.1.1.RELEASE project. Hit Enter to accept the default.
7. Finally, enter com.packt.edu as the package name.

Open the /Packt/chapter02/maven folder; you will see that the Demo project folder has been created with the following file structure:
The Maven convention for the source Java files is src/main/java, and for the test source files it is src/test/java.

Maven will automatically create a Java file, App.java, under src/main/java/com/packt/edu and a test file, AppTest, under src/test/java/com/packt/edu.

Also, it will create an XML file, pom.xml, directly under Demo. This file will be used for building the project. In the next section, we will read about the POM file.
Every Maven project contains a pom.xml file, which is a project metadata file.
A POM file can contain the following sections:
- The basics: <groupId/>, <artifactId/>, <version/>, <dependency>, and inheritance through <modules/> and <parent/>

Open the pom.xml file in the Demo folder; it contains the following coordinate details:

<groupId>org.packt</groupId>
<artifactId>Demo</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
- Build settings: <build> and <reporting>
- More project information: <name>, <organization>, <developers>, <url>, and <contributors>

Our generated pom.xml contains the following details:

<name>Demo</name>
<url>http://maven.apache.org</url>
- Environment settings: <scm>, <repository>, and <mailingList>
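Put together, a minimal POM for the Demo project looks roughly like this (a sketch assembled from the fragments above; the modelVersion element is required boilerplate in every POM):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.packt</groupId>
    <artifactId>Demo</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>Demo</name>
    <url>http://maven.apache.org</url>
</project>
```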
In a multimodule project, a project can depend on many other projects. For example, say we depend on JUnit. Maven automatically discovers the required artifact dependencies. This is very useful as we depend on many open source projects, and it is equally helpful whether the project is open source or closed source.
Do you remember the Gradle dependency closure? It has four default types: compile, runtime, testCompile, and testRuntime.
Similarly, Maven has the following dependency scopes: compile, provided, runtime, test, and system.
A parent project defines dependencies using the following code snippet:
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
</dependencies>
All child projects inherit the dependency by just adding the <dependency> tag, as follows:

<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
</dependency>
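Strictly speaking, for a child to omit the <version> element, the parent usually declares the dependency inside a <dependencyManagement> section, which fixes versions centrally without forcing the dependency on every child (a sketch of that parent-POM pattern):

```xml
<!-- In the parent POM: the version and scope are fixed centrally. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

A child that lists the junit dependency with only groupId and artifactId then picks up version 4.11 and the test scope from the parent.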
The build life cycle clearly defines the process of building and distributing a particular project artifact.
Maven has the following three built-in build life cycles: default (which builds the project), clean (which removes the build output), and site (which generates the project documentation).
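Invoking a phase runs every phase before it in that life cycle; the common commands are sketched below (assuming Maven is on the PATH):

```shell
mvn clean      # clean life cycle: delete the target folder
mvn compile    # default life cycle up to compile
mvn test       # ...up to test: compile, then run the unit tests
mvn package    # ...up to package: compile, test, and build the JAR
mvn site       # site life cycle: generate the project site
```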
Now, we will compile and test our Demo project.
In this section, we will work with compile, test, and package targets of the default life cycle.
Perform the following steps to compile the project:
1. Open the command prompt and go to Packt\chapter02\maven\Demo. Maven needs a pom.xml file to compile a project.
2. Run mvn compile; it will compile the project and create class files under Demo\target\classes. The following screenshot shows the output:

The mvn clean command removes the target folder and deletes all its content. Run the command and check that the target folder has been deleted from Packt\chapter02\maven\Demo.
The mvn site command generates a detailed project report in HTML format under target/site. It includes About, Plugin Management, Distribution Management, Dependency Information, Source Repository, Mailing Lists, Issue Tracking, Continuous Integration, Project Plugins, Project License, Project Team, Project Summary, and Dependencies.
Refer to http://maven.apache.org/guides/index.html to explore more on Maven.
The next section covers the Apache Ant.
Ant is a Java-based build tool from the Apache Software Foundation. Ant's build files are written in XML. You need Java to execute an Ant task.
Download Apache Ant from http://ant.apache.org/, extract the media, create an ANT_HOME environment variable, and set its value to the extracted location. Edit PATH and append %ANT_HOME%\bin in Windows. For Mac or Linux OS, you need to export ANT_HOME and PATH as described in the Installation section of Maven project management earlier in this chapter.
Ant needs a build.xml file to execute tasks. Ant supports the -f option to specify a build script; so, the ant -f myBuildFile.xml command will work.
We will create a build script and build the Maven project (Packt\chapter02\maven\Demo) using Ant. Follow the ensuing steps:

1. Create a file named build.xml in Packt\chapter02\maven\Demo.
2. Add the following lines to the build.xml file:

   <?xml version="1.0"?>
   <project name="Demo" basedir=".">
       <property name="src.dir" location="src/main/java" />
       <property name="build.dir" location="bin" />
       <property name="dist.dir" location="ant_output" />
   </project>
The <project> tag is a predefined tag in Ant. You can name your project; Demo is the name of this project. Next, we set properties; a property has a name and a value or location. Here, src.dir is a property name, and this property can be accessed from any task using the ${src.dir} syntax. The location attribute refers to a location relative to the build.xml file. Since src/main/java contains the source files, we set the location value to src/main/java. The other two properties, build.dir and dist.dir, will be used by the Java compiling task to write the compiled class files and to generate the JAR file.
3. Add a clean target to remove old build outputs; we will call Ant's <delete> command to delete directories. Then, using the <mkdir> command, we will recreate the directories:

   <target name="clean">
       <delete dir="${build.dir}" />
       <delete dir="${dist.dir}" />
   </target>

   <target name="makedir">
       <mkdir dir="${build.dir}" />
       <mkdir dir="${dist.dir}" />
   </target>
Note that we added two targets using the <target> tag; each target is identified by a name. We will call the clean target to delete build.dir (generated .class files) and dist.dir (build output JARs).
4. Add a compile target to compile the Java source files:

   <target name="compile" depends="clean, makedir">
       <javac srcdir="${src.dir}" destdir="${build.dir}">
       </javac>
   </target>
Use the <javac> command to compile Java files. The <javac> command accepts srcdir and destdir; the compiler reads Java files from srcdir and writes the class files to destdir.

A target may depend on another, and depends allows us to pass comma-separated target names. Here, the compile target depends on clean and makedir.
5. Build a JAR from the class files using the <jar> command, as follows:

   <target name="jar" depends="compile">
       <jar destfile="${dist.dir}\${ant.project.name}.jar" basedir="${build.dir}">
       </jar>
   </target>
The jar target needs to know the class file location and the destination. The destfile attribute refers to the destination JAR file name and location, and basedir refers to the class file location. Note that we used ${dist.dir}\${ant.project.name}.jar to represent the destination JAR file name and folder. Here, ${dist.dir} refers to the destination folder, and ${ant.project.name}.jar represents the JAR name; ${ant.project.name} is the name (Demo) we mentioned in the <project> tag.
6. Open the command prompt, go to Packt\chapter02\maven\Demo, and issue the ant jar command. Here, jar depends on compile, and compile depends on clean and makedir. So, the jar command will create the two directories, bin and ant_output, compile the Java files and generate the .class files in the bin folder, and finally create a Demo.jar JAR in the ant_output folder.
7. We created a lib directory for Gradle in Packt\chapter02\lib and kept the JUnit 4 JARs in it. We will reuse this lib directory. Add three properties for the test source file directory, the library directory, and the test report directory, as follows:

   <property name="test.dir" location="src/test/java" />
   <property name="lib.dir" location="../../lib" />
   <property name="report.dir" location="${dist.dir}/report" />
Note that the lib.dir location is relative to the build.xml location. The test.dir property points to src/test/java, and the test reports will be generated inside ant_output/report.
8. Add a jclass.path path to refer to all JAR files under the lib directory and the generated .class files, as follows:

   <path id="jclass.path">
       <fileset dir="${lib.dir}/">
           <include name="**/*" />
       </fileset>
       <pathelement location="${build.dir}" />
   </path>
The <fileset> tag takes a directory location, and <include> takes a file name or a wildcard pattern. The **/* value matches all the directories and files in ${lib.dir}. The pathelement entry refers to the bin directory where the compiled class files are put.
9. Add a testcompile target and use the javac command; pass test.dir as srcdir for compilation, and add <classpath> to refer to the jclass.path value. This will compile the test files. Consider the following code snippet:

   <target name="testcompile" depends="compile">
       <javac srcdir="${test.dir}" destdir="${build.dir}">
           <classpath refid="jclass.path" />
       </javac>
   </target>
10. Add a test target and use the junit command to run the tests. Pass jclass.path to point to the lib directory and the generated files, as follows:

   <target name="test" depends="testcompile">
       <junit printsummary="on" fork="true" haltonfailure="yes">
           <classpath refid="jclass.path" />
           <formatter type="xml" />
           <batchtest todir="${report.dir}">
               <fileset dir="${test.dir}">
                   <include name="**/*Test*.java" />
               </fileset>
           </batchtest>
       </junit>
   </target>
Issue the ant test command. This command compiles and executes the tests.
We can set a default target in the build.xml file in the <project> tag. The syntax is <project name="Demo" default="task name" basedir=".">. Now, we don't have to specify a target name.
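For example, making test the default target of the build file above looks like this (a sketch; the targets themselves stay as already defined):

```xml
<!-- Running plain `ant` now executes the test target and its dependencies. -->
<project name="Demo" default="test" basedir=".">
    <!-- properties, clean, makedir, compile, jar, testcompile, test targets -->
</project>
```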
Our Ant script is ready for compiling Java files, executing tests, and generating reports. In the next section, we will set up Jenkins and use the build scripts.
To explore more on how to compile web archives and learn advanced topics, go to http://ant.apache.org/.