Continuous Integration Testing on Docker Cloud

Datetime: 2016-08-23 02:30:12 | Topic: Docker, Continuous Integration

This is a guest post by Stephen Pope & Kevin Kaland from Project Ricochet.

Docker Cloud is a SaaS solution hosted by Docker that gives teams the ability to easily manage, deploy, and scale their Dockerized applications.

The Docker Cloud service features some awesome continuous integration capabilities, especially its testing features. Once you understand the basics, I’ve found they are remarkably easy to use. Continuous integration covers a wide range of practices, like automated builds, build testing, and automated deployment. The Docker Cloud service makes features like automated builds and deployment quite obvious, but the testing features can be a little harder to find, even though they are in plain sight!

In this piece, my aim is to walk you through the Docker Cloud service’s testing capabilities in a straightforward manner. By the end, I hope you’ll agree that it’s really dead simple!

So, let’s begin with the first task. Before we can test our builds, we need to automate them. We’ll use GitHub to set this up here, but note that it works the same way in Bitbucket.

Set Up an Automated Build

1. Log into Docker Cloud using your Docker ID.

2. On the landing page (or in the left-hand menu), click on Repositories.

3. If you don’t already have a repository, you’ll need to click the Create button on the Repository page.

4. Click the Builds tab on the Repository page. If this is your first autobuild, you should see this screen:

To connect your GitHub account, click the Learn more link.

5. Once on the Cloud Settings page, look for the Source Providers section. Click on the plug icon to connect your GitHub account. Authorize the connection on the screen that follows.

6. When your GitHub account is connected, go back to the Repository page and click Configure Automated Builds. Now we are in business!

7. Select the GitHub source repository you want to build from.

8. In the Build Location section, choose the option to Build on Docker Cloud’s infrastructure and select a builder size to run the build process on. Accept the default Autotest option for now (we’ll describe the Autotest options in detail in a moment).

Make sure you are satisfied with the Tag Mappings; these map your Docker image build tags (e.g. latest, test, production, etc.) to your GitHub branches. Ensure that Autobuild is enabled. If your Dockerfile needs any Environment Variables at build time, you can add them here. (Ours doesn’t.) Once you’ve set everything up, click Save.

The specified tag will now be built when you push to the associated branch:

Set Up Automated Deployment

After the build images are created, you can enable automated deployment. If you are building images automatically, chances are you also want to automate the deployment of the updated images once they are built. Docker Cloud makes this easy:

1. To get started, you will need a service to deploy (a service is a collection of running containers of a particular Docker image). A good example of a service might be our production node app, running 7 containers with a set of environment variables set up for that specific instance of the app. You might also have an equivalent service for development and testing (where you can test code before production). Here is a good read on starting your first service.

2. Edit the service that is using the Docker image.

3. In the General Settings section, ensure that Autoredeploy is enabled:

4. Save changes and you should be set.
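For teams that define their services declaratively, the same setting can also be expressed in a Docker Cloud stackfile. Here is a hypothetical sketch for the production node app described earlier; the image name, container count, and environment variable are illustrative assumptions:

```yaml
# Hypothetical stackfile (docker-cloud.yml); names are placeholders.
web:
  image: yourusername/yourapp:production  # tag produced by the autobuild above
  target_num_containers: 7                # matches the 7-container example
  autoredeploy: true                      # redeploy when a new image is pushed
  environment:
    - NODE_ENV=production
```

With autoredeploy set to true, every new push of the production tag rolls out automatically, with no manual redeploy step.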

Autotest Builds before Deployment

Remember when I said testing your builds was dead simple? Well, check this out. All you need to do is enable Autotests.

On the Repository page, navigate to the Builds tab and then click Configure Automated Builds. Within the Autotest section, three options are available:

  • Off will test commits only to branches that are using Autobuild to build and push images.
  • Source repository will test commits to all branches of the source code repository, regardless of their Autobuild setting.
  • Source repository and external pull requests will test commits to all branches of the source code repository, including any pull requests opened against it.

Before you turn that on, you’ll need to set up a few assets in your repository to define the tests and how they should be run. You can find examples of this in our Production Meteor using Docker Git repo.

This boils down to a single basic file — plus some optional ones in case you need them.

Our docker-compose.test.yml will serve as the main entry point for testing. It lets you define a “sut” (system under test) service, alongside any other services needed to test your build. In our example, you may notice that it simply outputs “test passed”, but that line is where the magic happens: if the test command exits with 0, the test has passed; any non-zero exit code means it has failed. Essentially, you perform a simple command straight from the YAML file, or, for more complex tests, delegate to a more robust bash script.
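As a concrete sketch, a minimal docker-compose.test.yml of the kind just described could look like this (the echoed message is a stand-in for a real test command):

```yaml
# Minimal docker-compose.test.yml: the sut service is the test entry point.
sut:
  build: .
  command: echo "test passed"   # echo exits 0, so the build is marked as passing
```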

Let’s review a YAML compose file example from a blog on automated testing that uses a bash script and some additional features:

sut:
  build: .
  dockerfile: Dockerfile.test
  links:
    - web

web:
  build: .
  dockerfile: Dockerfile
  links:
    - redis

redis:
  image: redis

Here, we define a sut service, along with some build instructions and an additional dockerfile for the tests. With this, you should be able to build a separate image for testing, instead of using the image for your build. That enables you to have different packages and files for testing that won’t be included in your application build.

Dockerfile.test

FROM ubuntu:trusty

RUN apt-get update && apt-get install -yq curl && apt-get clean

WORKDIR /app

ADD test.sh /app/test.sh

CMD ["bash", "test.sh"]

Here you’ll notice the final CMD is a test.sh bash script. This script will execute and return a 0 or 1 based on the test results.

Let’s take a quick look at the test.sh script:

test.sh

sleep 5

if curl web | grep -q '<b>Visits:</b> '; then
  echo "Tests passed!"
  exit 0
else
  echo "Tests failed!"
  exit 1
fi

You’ll see the script is doing a simple curl call against the test application to see if some text appears on the page. If it does, the test passed. If not, the test will fail.
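To make the pass/fail convention concrete, here is a small, self-contained sketch of the same check run against local HTML fixtures instead of the live web service (the fixture contents and file paths are assumptions for illustration):

```shell
#!/bin/sh
# Reproduce the grep check from test.sh against local fixture files,
# mapping the result to the exit codes Docker Cloud looks for.
check_page() {
  if grep -q '<b>Visits:</b> ' "$1"; then
    echo "Tests passed!"
    return 0
  else
    echo "Tests failed!"
    return 1
  fi
}

# Fixtures standing in for the web service's response:
echo '<html><body><b>Visits:</b> 3</body></html>' > /tmp/page_ok.html
echo '<html><body>oops</body></html>' > /tmp/page_bad.html

check_page /tmp/page_ok.html          # prints "Tests passed!"
check_page /tmp/page_bad.html || true # prints "Tests failed!"
```

The `|| true` on the failing call is only there so the demo script itself exits cleanly; in a real test.sh, you want the non-zero exit code to propagate so Docker Cloud marks the build as failed.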

Remember how easy I said this was to implement on Docker Cloud? That’s all there is to it! Additionally, once you’ve mastered the basics, more advanced integrations can be done with build hooks.

Of course, building the tests for a complete application will be a much larger task than described here, but the point is that you’ll be able to focus on the tests, not on how to squeeze them into your CI workflow. Docker Cloud makes the setup and implementation super easy. Once you understand these basic components, you should be able to set up our test Meteor service in a matter of minutes.

Alright, that’s it for now. I hope this piece helped guide you through the process fairly easily, and more importantly showcased the cool CI testing workflow Docker Cloud has to offer. If you have additional questions or comments, head over to the Docker Cloud Forum, where Docker technical staff will be glad to help. Here are some related posts that should prove helpful on your journey. Enjoy!

Get Docker Cloud for Free – https://cloud.docker.com/




