Using Docker to manage test database servers

One advantage Docker gives us is the ability to reproduce the production environment on our laptops. If the production environment is a Docker image, that image can run just as easily on a laptop as on the cloud hosting environment. Generally speaking, it's important to replicate the production environment when running tests, and Docker makes this easy to do.

In this section, we'll make minimal changes to the Docker environment we defined previously, and develop a shell script to automate executing the Notes test suite inside the appropriate containers.

Using Docker, we'll be able to easily test against a database, and have a simple method for starting and stopping a test version of our production environment. Let's get started.

Docker Compose to orchestrate test infrastructure

We had a great experience using Docker Compose to orchestrate the Notes application deployment. The whole system, with four independent services, is easily described in compose/docker-compose.yml. What we'll do is duplicate that file, then make a couple of small changes required to support test execution.

Let's start by making a new directory, test-compose, as a sibling to the notes, users, and compose directories. Copy compose/docker-compose.yml to the newly created test-compose directory. We'll be making several changes to this file and a couple of small changes to the existing Dockerfiles.

We want to change the container and network names so our test infrastructure doesn't clobber the production infrastructure. We'll constantly delete and recreate the test containers, so to keep the developers happy, we'll leave development infrastructure alone and perform testing on separate infrastructure. By maintaining separate test containers and networks, our test scripts can do anything they like without disturbing the development or production containers.

Consider this change to the db-auth and db-notes containers:

db-auth-test:
    build: ../db-auth
    container_name: db-auth-test
    networks:
      - authnet-test
..
db-notes-test:
    build: ../db-notes
    container_name: db-notes-test
    networks:
      - frontnet-test

This is the same as earlier, but with "-test" appended to container and network names.

That's the first change we must make: append -test to every container and network name in test-compose/docker-compose.yml. Everything we do with the tests will run on completely separate containers, hostnames, and networks from those of the development instance.
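The same -test suffix applies to the networks section at the bottom of test-compose/docker-compose.yml. A sketch, assuming the bridge driver used in the earlier deployment:

```yaml
networks:
  frontnet-test:
    driver: bridge
  authnet-test:
    driver: bridge
  notesauth-test:
    driver: bridge
```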

This change will affect the notesapp-test and userauth-test services because the database server hostnames are now db-auth-test and db-notes-test. There are several environment variables and configuration files to update.

Previously, we defined all environment variables in the Dockerfile. Now, however, we want to reuse the same containers in the test and production environments, with slightly tweaked environment variables reflecting where the container is executing. This raises the question of where, the Dockerfile or docker-compose.yml, to define a given environment variable. The rule is this: any environment variable whose value differs between the production and test environments must be defined in the corresponding docker-compose.yml, while all other variables can be defined in the corresponding Dockerfile.

userauth-test:
    build: ../users
    container_name: userauth-test
    environment:
      DEBUG: ""
      NODE_ENV: "test"
      SEQUELIZE_CONNECT: "userauth-test/sequelize-docker-mysql.yaml"
      HOST_USERS_TEST: "localhost"
    networks:
      - authnet-test
      - notesauth-test
    depends_on:
      - db-auth-test
    volumes:
      - ./reports-userauth:/reports
      - ./userauth:/usr/src/app/userauth-test
..
notesapp-test:
    build: ../notes
    container_name: notesapp-test
    environment:
      DEBUG: ""
      NODE_ENV: "test"
      SEQUELIZE_CONNECT: "notesmodel-test/sequelize-docker-mysql.yaml"
      USER_SERVICE_URL: "http://userauth-test:3333"
    networks:
      - frontnet-test
      - notesauth-test
    expose:
      - 3000
    ports:
      - "3000:3000"
    depends_on:
      - db-notes-test
      - userauth-test
    volumes:
      - ./reports-notes:/reports
      - ./notesmodel:/usr/src/app/notesmodel-test

Again, we changed the container and network names to append -test. We moved some of the environment variables from Dockerfile to docker-compose.yml. Finally, we added some data volumes to mount host directories inside the container.

The production values of the variables shown here must be copied into compose/docker-compose.yml. As you do so, delete each variable definition from the corresponding Dockerfile. What we'll end up with is this arrangement:

  • compose/docker-compose.yml holding environment variables for the production environment
  • test-compose/docker-compose.yml holding environment variables for the test environment
  • Dockerfiles hold the environment variables common to both environments
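Concretely, the split might look like this for the user authentication service. The PORT value and the production SEQUELIZE_CONNECT path here are illustrative assumptions, not taken from the earlier configuration:

```
# users/Dockerfile -- values identical in every environment
ENV PORT="3333"

# compose/docker-compose.yml -- production-specific values
#   environment:
#     NODE_ENV: "production"
#     SEQUELIZE_CONNECT: "sequelize-docker-mysql.yaml"

# test-compose/docker-compose.yml -- test-specific values
#   environment:
#     NODE_ENV: "test"
#     SEQUELIZE_CONNECT: "userauth-test/sequelize-docker-mysql.yaml"
```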

One option is to record no environment variables in the Dockerfiles, and instead put them all in the two docker-compose.yml files. You'd avoid deciding what goes where, but you'd end up with duplicated variables carrying identical values, and forgetting to update a variable definition in both locations is an invitation to disaster.
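If you take that route, YAML anchors can at least confine the shared block to one place per file. A sketch with illustrative variable values; note that top-level x-* extension fields require Compose file format 3.4 or later:

```yaml
x-userauth-env: &userauth-env
  DEBUG: ""
  SEQUELIZE_CONNECT: "sequelize-docker-mysql.yaml"

services:
  userauth:
    build: ../users
    environment:
      # Merge the shared variables, then add environment-specific ones
      <<: *userauth-env
      NODE_ENV: "production"
```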

Another task is setting up directories to store the test code. A common practice in Node.js projects is to put test code in the same directory as the application code. That practice, however, means taking care not to copy the test code to a production server, or to include it when publishing an npm module. As it stands, the Dockerfiles simply copy everything from the notes and users directories into the corresponding containers. We can either change the Dockerfiles, or we can mount the test code into the containers, as is done with the volumes sections of test-compose/docker-compose.yml. That way, the test code is injected into the running container rather than baked into the image.

Let's start with these shell commands:

$ mv notes/test test-compose/notesmodel
$ mkdir test-compose/userauth

The first command moves the Notes models test suite we just created into test-compose, and the second sets up a directory for a test suite we're about to write. With the volume definitions shown earlier, test-compose/notesmodel appears as notesmodel-test in the notes application directory, and test-compose/userauth appears as userauth-test in the users application directory.

Now add the test-compose/userauth/sequelize-docker-mysql.yaml file containing the following:

dbname: userauth
username: userauth
password: userauth
params:
    host: db-auth-test
    port: 3306
    dialect: mysql
    logging: false

This is the same as users/sequelize-docker-mysql.yaml, but for the hostname change.

Similarly we add test-compose/notesmodel/sequelize-docker-mysql.yaml containing the following:

dbname: notes
username: notes
password: notes
params:
    host: db-notes-test
    port: 3306
    dialect: mysql
    logging: false

Again, this is the same as notes/models/sequelize-docker-mysql.yaml but for the hostname change.

Package.json scripts for Dockerized test infrastructure

Now we can add a few package.json lines for the Dockerized test execution. We'll see later how we'll actually run the tests under Docker.

In notes/package.json, add the following to the scripts section:

"test-docker-notes-sequelize-sqlite": "MODEL_TO_TEST=../models/notes-sequelize mocha -R json notesmodel-test/test-model.js >/reports/notes-sequelize-sqlite.json",
"test-docker-notes-sequelize-mysql": "MODEL_TO_TEST=../models/notes-sequelize mocha -R json notesmodel-test/test-model.js >/reports/notes-sequelize-mysql.json",
"test-docker-notes-memory": "MODEL_TO_TEST=../models/notes-memory mocha -R json notesmodel-test/test-model.js >/reports/notes-memory.json",
"test-docker-notes-fs": "MODEL_TO_TEST=../models/notes-fs mocha -R json notesmodel-test/test-model.js >/reports/notes-fs.json",
"test-docker-notes-levelup": "MODEL_TO_TEST=../models/notes-levelup mocha -R json notesmodel-test/test-model.js >/reports/notes-levelup.json",
"test-docker-notes-sqlite3": "rm -f chap11.sqlite3 && sqlite3 chap11.sqlite3 --init models/schema-sqlite3.sql </dev/null && MODEL_TO_TEST=../models/notes-sqlite3 SQLITE_FILE=chap11.sqlite3 mocha -R json notesmodel-test/test-model.js >/reports/notes-sqlite3.json"

We removed the SEQUELIZE_CONNECT variable because it's now defined in test-compose/docker-compose.yml.

The Mocha invocation is different now. Previously, there were no arguments, but now we're executing it with -R json notesmodel-test/test-model.js and redirecting stdout to a file. Because the test is no longer in the test directory, we must explicitly tell Mocha which file to execute. With the -R option, we're also selecting a different results-reporting format. We could leave the reporting as it was, but the results would then be printed on screen mixed with a lot of other output. That would give us no opportunity to collect results data for publishing a report or showing a success/failure badge on a project dashboard website, and the large amount of output would make it hard to spot a test failure.

Mocha supports different reporter modules that print results in different formats. So far, we've used what Mocha calls the spec reporter. The HTML reporter is useful for generating test suite documentation. With the JSON reporter (-R json), test results are printed as JSON on stdout. We're redirecting that output to a file in /reports, a directory defined in the volumes section.
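For reference, the JSON reporter's output has roughly this shape; the counts here are made up, and the per-test arrays are elided:

```json
{
  "stats": {
    "suites": 1,
    "tests": 24,
    "passes": 24,
    "pending": 0,
    "failures": 0,
    "duration": 862
  },
  "tests": [ ],
  "passes": [ ],
  "failures": [ ],
  "pending": [ ]
}
```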

The test-compose/docker-compose.yml file contains volume declarations connecting the /reports container directory to a directory on the host filesystem. As a result, the report files are stored on the host in the named directories, letting us easily access the JSON test results data to generate a report.

In users/package.json, let's make a similar addition to the scripts section:

"test-docker": "mocha -R json userauth-test/test.js >/reports/userauth.json"

We still haven't written the corresponding test suite for the user authentication REST service.

Executing tests under Docker Compose

Now we're ready to execute some of the tests inside a container. In test-compose, let's make a shell script called run.sh:

docker-compose up --build --force-recreate -d

docker exec -it notesapp-test npm install mocha chai

docker exec -it notesapp-test npm run test-docker-notes-memory
docker exec -it notesapp-test npm run test-docker-notes-fs
docker exec -it notesapp-test npm run test-docker-notes-levelup
docker exec -it notesapp-test npm run test-docker-notes-sqlite3
docker exec -it notesapp-test npm run test-docker-notes-sequelize-sqlite
docker exec -it notesapp-test npm run test-docker-notes-sequelize-mysql

docker-compose stop

Tip

It's common practice to run tests from a continuous integration system such as Jenkins. Continuous integration systems automatically run builds and tests against software products, and the build and test results data can be used to automatically generate status pages. Visit https://jenkins.io/index.html for a good starting point with Jenkins.

After quite a lot of experimentation, these docker-compose up options were found to execute the tests most reliably. They ensure that the images are rebuilt and that new containers are created from those images. The -d option puts the containers in the background, so the script can go on to the next step and execute the tests.

Next, the script uses "docker exec" to execute commands inside the notesapp-test container. With the first we ensure that Mocha and Chai are installed, and with the subsequent commands, we execute the test suite. We ran these tests earlier outside the container, which was easier to do but at the cost of test execution running in a different environment than we have in production.

We've also been able to add test execution on the Sequelize MySQL combination. If you remember, that combination was left out of our test matrix earlier because it was "too difficult" to set up a test database. With test-compose/docker-compose.yml, we no longer have that excuse. But, we're still a little lazy because we've left the MongoDB model untested.

Testing on MongoDB would simply require defining a container for the MongoDB database and a little bit of configuration. Visit https://hub.docker.com/_/mongo/ for the official MongoDB container. We'll leave this as an exercise for you to try.
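As a starting point for that exercise, a service definition along these lines could be added to test-compose/docker-compose.yml. This is a sketch based on the official image; the db-notes-mongo-test name is an assumption, not part of the configuration shown earlier:

```yaml
db-notes-mongo-test:
    image: mongo
    container_name: db-notes-mongo-test
    networks:
      - frontnet-test
```

The notesapp-test service would then gain a depends_on entry for db-notes-mongo-test, plus an environment variable pointing the MongoDB model at the db-notes-mongo-test hostname.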

To run the tests, simply type:

$ sh -x run.sh

Lots of output will be printed concerning building the containers and executing the test commands. The test results will be left in the directories named as volumes in test-compose/docker-compose.yml.
