Automating Habitat With AWS CodePipeline

Datetime: 2016-08-23 00:56:30          Topic: AWS, Docker

This article outlines a proof-of-concept (POC) for automating Habitat operations from AWS CodePipeline. Habitat is Chef's new application automation platform that provides a packaging system that results in apps that are "immutable and atomically deployed, with self-organizing peer relationships." Habitat is an innovative technology for packaging applications, but a Continuous Delivery pipeline is still required to automate deployments. For this exercise, I've opted to build a lightweight pipeline using CodePipeline and Lambda.

An in-depth analysis of how to use Habitat is beyond the scope of this post, but you can get a good introduction by following their tutorial. This POC essentially builds a CD pipeline to automate the steps described in the tutorial and builds the same demo app (mytutorialapp). It covers the "pre-artifact" stages of the pipeline (Source, Commit, Acceptance), but keep an eye out for a future post that will flesh out the rest.

Also be sure to read the article "Continuous deployment with Habitat," which provides a good overview of how the developers of Habitat intend it to be used in a pipeline, including links to some repos to help implement that vision using Chef Automate.

Technology Overview

Application

The application we're automating is called mytutorialapp. It is a simple "hello world" web app that runs on Nginx. The application code can be found in the hab-demo repository.
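Everything the pipeline does below keys off that repo's plan.sh. As a rough sketch in the spirit of the tutorial (the field values here are illustrative, not copied from the repo), a plan for an app like this looks something like:

# illustrative values; the pipeline's awk commands below read these fields
pkg_origin=myorigin
pkg_name=mytutorialapp
pkg_version=0.1.0
pkg_deps=(core/nginx)
pkg_expose=(8080)

do_build() {
  # nothing to compile for a static "hello world" page
  return 0
}

do_install() {
  # copy the app's files into the package
  cp -r "$PLAN_CONTEXT/." "$pkg_prefix"
}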

Pipeline

The pipeline is provisioned by a CloudFormation stack and implemented with CodePipeline. The pipeline uses a Lambda function as an action executor. This Lambda function delegates command execution to an EC2 instance via an SSM Run Command: aws:runShellScript. The pipeline code can be found in the hab-demo-pipeline repository. Here is a simplified diagram of the execution mechanics:
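In CLI terms, each action's dispatch boils down to a call like the following; the Lambda function uses the SDK equivalent, and the instance ID and command here are placeholders:

aws ssm send-command \
  --document-name "AWS-RunShellScript" \
  --instance-ids "i-0123456789abcdef0" \
  --parameters commands="hab pkg build ." \
  --comment "pipeline-runner action"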

Stack

The CloudFormation stack that provisions the pipeline also creates several supporting resources. Check out the pipeline.json template for details, but here is a screenshot to show what's included:
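You can also list the provisioned resources directly from the CLI (the stack name here is illustrative):

aws cloudformation describe-stack-resources \
  --stack-name hab-demo-pipeline \
  --query 'StackResources[].[LogicalResourceId,ResourceType]' \
  --output table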

Pipeline Stages

Here's an overview of the pipeline structure. For the purpose of this article, I've only implemented the Source, Commit, and Acceptance stages. This portion of the pipeline gets the source code from a Git repo, builds a Habitat package, builds a Docker test environment, deploys the Habitat package to the test environment, runs tests on it, and then publishes it to the Habitat Depot. All downstream pipeline stages can then source the package from the Depot. You can watch these stages execute with the CLI command shown after the list.

  • Source
    • Clone the app repo
  • Commit
    • Stage-SourceCode
    • Initialize-Habitat
    • Test-StaticAnalysis
    • Build-HabitatPackage
  • Acceptance
    • Create-TestEnvironment
    • Test-HabitatPackage
    • Publish-HabitatPackage
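Once the stack is up, stage progress can be watched from the CLI (the pipeline name here is illustrative):

aws codepipeline get-pipeline-state \
  --name hab-demo-pipeline \
  --query 'stageStates[].[stageName,latestExecution.status]' \
  --output table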

Action Details

Here are the details for the various pipeline actions. These action implementations are defined in a "pipeline-runner" Lambda function and invoked by CodePipeline. Upon invocation, the scripts are executed on an EC2 instance that is provisioned at the same time as the pipeline.
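Each invocation also has to report its outcome back to CodePipeline when the SSM command completes. In CLI terms (the job ID variable is a stand-in for the ID that CodePipeline passes to the Lambda function), that looks like:

# report success after the SSM command exits cleanly
aws codepipeline put-job-success-result --job-id "$CP_JOB_ID"

# or report failure, which halts the pipeline at this action
aws codepipeline put-job-failure-result --job-id "$CP_JOB_ID" \
  --failure-details type=JobFailed,message="action script failed"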

Commit Stage

Stage-SourceCode

Pulls down the source code artifact from S3 and unzips it.

aws configure set s3.signature_version s3v4
aws s3 cp s3://bettinger-pipeline-artifactbucket-1afmvg2ziqzwx/bettinger-pipeline-H/SourceOutp/SWuupfm.zip /tmp/SourceOutput.zip
rm -rf /tmp/SourceOutput && mkdir /tmp/SourceOutput && unzip /tmp/SourceOutput.zip -d /tmp/SourceOutput

Initialize-Habitat

Sets Habitat environment variables and generates/uploads a key to access my Origin on the Habitat Depot.

export HAB_AUTH_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
export HAB_ORIGIN=$(awk -F= '/^pkg_origin/{print $2}' /tmp/SourceOutput/plan.sh)
hab origin key generate $HAB_ORIGIN
hab origin key upload $HAB_ORIGIN

Test-StaticAnalysis

Runs static analysis on plan.sh using bash -n, which checks the script for syntax errors without executing it.

bash -n /tmp/SourceOutput/plan.sh

Build-HabitatPackage

Builds the Habitat package and copies the build results into a staging directory for the downstream publish action.

cd /tmp/SourceOutput
hab pkg build .
mkdir -p /tmp/pipeline/hab && cp -r /tmp/SourceOutput/results "$_"

Acceptance Stage


Create-TestEnvironment

Creates a Docker test environment by running a Habitat package export command inside the Habitat Studio.

export HAB_ORIGIN=$(awk -F= '/^pkg_origin/{print $2}' /tmp/SourceOutput/plan.sh)
export HAB_NAME=$(awk -F= '/^pkg_name/{print $2}' /tmp/SourceOutput/plan.sh)
if [ $(docker ps -a -q | wc -l) -gt 0 ]; then docker rm -f -v $(docker ps -a -q); fi
if [ $(docker images -q | wc -l) -gt 0 ]; then docker rmi -f $(docker images -q); fi
hab studio run "hab pkg export docker $HAB_ORIGIN/$HAB_NAME"
docker run -it -d -p 8080:8080 --name $HAB_NAME $HAB_ORIGIN/$HAB_NAME

Test-HabitatPackage

Runs a Bats test suite which verifies that the webserver is running and the "hello world" page is displayed.

bats --tap /tmp/SourceOutput/test.bats
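The actual suite lives in the app repo; a minimal test.bats along these lines (the URL and expected text are my assumptions) might look like:

#!/usr/bin/env bats

@test "web server responds on port 8080" {
  run curl -s -o /dev/null -w '%{http_code}' http://localhost:8080
  [ "$output" = "200" ]
}

@test "index page says hello" {
  run curl -s http://localhost:8080
  [[ "$output" == *"Hello"* ]]
}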

Publish-HabitatPackage

Uploads the Habitat package to the Depot. In a later pipeline stage, a package deployment can be sourced directly from the Depot.

cd /tmp/pipeline/hab/results
export HAB_AUTH_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
hab pkg upload $(awk -F= '/^pkg_artifact/{print $2}' /tmp/pipeline/hab/results/last_build.env)
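As a sketch of that later stage (the origin name is illustrative), a deployment host could pull and run the package with:

hab pkg install myorigin/mytutorialapp
hab start myorigin/mytutorialapp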

Wrapping up

This post provided an early look at a mechanism for automating Habitat deployments from AWS CodePipeline. There is still a lot of work to be done on this POC project, so keep an eye out for later posts that describe the mechanics of the rest of the pipeline.




