Setup CI

After a successful Terraform deployment, it is time to move on to service configuration: more specifically, Jenkins and the integration pipeline.

Jenkins initialization

With Jenkins running for the first time, we need to complete a short setup routine. First, we SSH into the node and retrieve the admin password stored in /var/lib/jenkins/secrets/initialAdminPassword:

[Figure: Jenkins initialization]
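
From the command line, retrieving it might look something like this (the node's address, login user, and key path depend on your own setup and are assumptions here):

$ ssh -i ~/.ssh/YOUR_KEY user@JENKINS_NODE_IP \
      "sudo cat /var/lib/jenkins/secrets/initialAdminPassword"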

We are mainly interested in the Pipeline group of plugins, which is included in the suggested set:

[Figure: Jenkins initialization]

After the plugin installation has completed, it is time to create our first user:

[Figure: Jenkins initialization]

With this, the initialization process is complete and Jenkins is ready for use:

[Figure: Jenkins initialization]

Writing a demo app

Before configuring the CI pipeline, it will help to have something to do some integration on. A basic Hello World type of PHP code will do, so with a sincere apology to all PHP developers out there, I present you with the source of our demo app:

src/index.php: 
<?php 
 
function greet($name) { 
  return "Hello $name!"; 
} 
 
$full_name = "Bobby D"; 
echo greet($full_name); 

Clapping fades... 
And naturally, a unit test for it: 
tests/indexTest.php: 

<?php 
require_once "src/index.php"; 
 
class IndexTest extends PHPUnit_Framework_TestCase 
{ 
  public function testGreet() { 
    global $full_name; 
    $expected = "Hello $full_name!"; 
    $actual = greet($full_name); 
    $this->assertEquals($expected, $actual); 
  } 
}
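
If you have Docker available locally, you can give the test a quick dry run with the same phpunit/phpunit image our pipeline will use later (run from the root of the demo-app folder):

$ docker run -v $(pwd):/app --rm phpunit/phpunit tests/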

There is a third file in our demo-app folder, curiously named Jenkinsfile, which we will discuss shortly.

Now let us get our code into a repository:

$ aws codecommit create-repository --repository-name demo-app 
      --repository-description "Demo app"
{
    "repositoryMetadata": {
        "repositoryName": "demo-app",
        "cloneUrlSsh":
            "ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/demo-app"
...
Then we clone it locally (replace SSH_KEY_ID as before):
$ git clone ssh://SSH_KEY_ID@git-codecommit.us-east-1.amazonaws.com/v1/repos/demo-app
...

Finally, we place our demo-app code into the empty repository, then commit and push all changes to CodeCommit.
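
Assuming the application files sit next to the freshly cloned repository, this boils down to a few familiar Git commands (the source path and commit message are illustrative):

$ cd demo-app
$ cp -r ../demo-app-code/* .    # src/, tests/ and the Jenkinsfile
$ git add -A
$ git commit -m "Add demo-app source, tests and Jenkinsfile"
$ git push origin master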

Defining the pipeline

It is time to decide on what the CI pipeline is meant to do for us. Here is a list of useful steps as a start:

  1. Checkout application source code from Git
  2. Run tests against it by running PHPUnit inside a Docker container (on the Jenkins host)
  3. Build application artefacts by executing FPM within a container on the Jenkins host
  4. Upload artefacts to an external store (for example, a Yum repository)

Translated into Jenkins pipeline code:

#!groovy 
 
node { 
 
  stage "Checkout Git repo" 
    checkout scm 
   
  stage "Run tests" 
    sh "docker run -v $(pwd):/app --rm phpunit/phpunit tests/" 
  stage "Build RPM" 
    sh "[ -d ./rpm ] || mkdir ./rpm" 
    sh "docker run -v $(pwd)/src:/data/demo-app -v $(pwd)/rpm:/data/rpm --rm tenzer/fpm fpm -s dir -t rpm -n demo-app -v $(git rev-parse --short HEAD) --description "Demo PHP app" --directories /var/www/demo-app --package /data/rpm/demo-app-$(git rev-parse --short HEAD).rpm /data/demo-app=/var/www/" 
 
  stage "Update YUM repo" 
    sh "[ -d ~/repo/rpm/demo-app/ ] || mkdir -p ~/repo/rpm/demo-app/" 
    sh "mv ./rpm/*.rpm ~/repo/rpm/demo-app/" 
    sh "createrepo ~/repo/" 
    sh "aws s3 sync ~/repo s3://MY_BUCKET_NAME/ --region us-east-1 --delete" 
 
  stage "Check YUM repo" 
    sh "yum clean all" 
    sh "yum info demo-app-$(git rev-parse --short HEAD)" 
} 

Generally speaking, defining a pipeline consists of setting out a series of tasks/stages. Let us review each of the preceding stages:

  • We start with a Git checkout of our demo-app code. The repository address is assumed to be that of the Jenkinsfile itself.
  • At the next stage we take advantage of Docker's isolation and spin up a container with everything needed for PHPUnit (ref: https://phpunit.de) to run a test against our demo-app source code. Take a look in the tests/ folder under ${GIT_URL}/Examples/Chapter-4/CodeCommit/demo-app/ if you would like to add more or modify it further.
  • If the tests pass, we move onto building an RPM artefact using a neat, user-friendly tool called FPM (ref: https://github.com/jordansissel/fpm), again in a Docker container. We use the short git commit hash as the version identifier for our demo-app.
  • We move our RPM artefact to a designated repository folder, create a YUM repository out of it using createrepo and sync all that data to an Amazon S3 bucket. The idea is to use this S3 based YUM repository later on for deploying our demo-app.
  • Finally, as a bonus, we check that the package we just synced can be retrieved via YUM.

Our pipeline is now defined, but before we can run it, we need to satisfy one (S3) dependency: we need to create an S3 bucket to store the RPM artefacts that the pipeline will produce, then update parts of the Jenkins and SaltStack code with the address of that bucket.

To interact with S3, we shall use the AWS CLI tool within the environment we configured for Terraform earlier:

$ aws s3 mb s3://MY_BUCKET_NAME

The bucket name is up to you, but keep in mind that the global S3 namespace is shared, so the more unique the name the better.

Next, we update our pipeline definition (Jenkinsfile). Look for the line containing MY_BUCKET_NAME:

sh "aws s3 sync ~/repo s3://MY_BUCKET_NAME/ --region us-east-1 
        --delete"

We also need to update SaltStack (again replacing MY_BUCKET_NAME):

[s3-repo] 
name=S3-repo 
baseurl=https://s3.amazonaws.com/MY_BUCKET_NAME 
enabled=1 
gpgcheck=0 

This repo file will be used in the last stage of our pipeline, as we will see in a moment. At this point you will need to commit and push both changes: the Jenkinsfile to the demo-app repository and the s3.repo file to the SaltStack one. Then SSH into the Jenkins node, pull the latest Salt code and apply the changes.
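
One way to do that, assuming a masterless salt-call setup and that the Salt code lives under /srv/salt on the node (both assumptions specific to your environment), would be:

$ ssh user@JENKINS_NODE_IP
$ cd /srv/salt && sudo git pull     # wherever your Salt states are checked out
$ sudo salt-call --local state.highstate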

Setting up the pipeline

Back to the Jenkins interface. After logging in, we click on the create new jobs link on the welcome page:

[Figure: Setting up the pipeline]

We select Pipeline as a job type and pick a name for it:

[Figure: Setting up the pipeline]

The next screen takes us to the job configuration details. At the top we choose to Discard old builds in order to keep our Jenkins workspace compact; here, we keep details of only the last five executions of this job:

[Figure: Setting up the pipeline]

Under Build Triggers we choose to poll our Git repository for changes every 5 minutes:

[Figure: Setting up the pipeline]
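
The schedule field uses Jenkins' cron-style syntax; a five-minute polling interval is typically written as:

H/5 * * * *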

Underneath, we choose Pipeline script from SCM, set SCM to Git and add the URL of our demo-app repository (that is https://git-codecommit.us-east-1.amazonaws.com/v1/repos/demo-app ) to be polled:

[Figure: Setting up the pipeline]

No need for extra credentials, as these will be fetched via the EC2 IAM Role. Note the Script Path referencing the Jenkinsfile we mentioned earlier. This is a great new feature which gives us Pipeline-as-Code functionality, as described here: https://jenkins.io/doc/pipeline/#loading-pipeline-scripts-from-scm.

With that we can keep our application code and the Jenkins pipeline definition conveniently together under revision control.

After we save the pipeline job, Jenkins will start polling the Git repository and trigger an execution whenever a change is detected (or you can click on Build Now to force a run).

Each successful build will result in an RPM package uploaded to our YUM repository. Go ahead and experiment, breaking the build by changing the demo-app source code so that the test fails.
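
To double-check that an artefact has actually landed in the bucket, the AWS CLI will list it for you (replace MY_BUCKET_NAME as before):

$ aws s3 ls s3://MY_BUCKET_NAME/rpm/demo-app/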

To troubleshoot, look at the Build History list, select the job that failed and examine its Console Output:

[Figure: Setting up the pipeline]

Now that you are familiar with our example pipeline, I encourage you to expand it: add more stages, make some of the tasks execute in parallel, enable chat or email notifications, or link pipelines so that they trigger each other.

You will appreciate the benefits of implementing a CI server as you continue to convert more of your daily, manual routines to Jenkins jobs.

You can be sure your teammates will love it too.

Note

Please remember to delete any AWS resources used in the preceding examples (VPC, EC2, S3, IAM, CodeCommit, etcetera) to avoid unnecessary charges.
