DevOps Meets Security: Security Testing Your AWS Application: Part I - Unit Testing

Datetime: 2016-08-23 02:50:36 | Topic: Unit Testing, DevOps

The adoption of DevOps practices allows organizations to be agile while deploying high-quality software to customers on a regular basis. The CI/CD pipeline is an important component of the DevOps model: it automates critical verification tasks, making fully automated software deployments possible. Security tests are an essential part of the CI/CD pipeline. They verify whether the code conforms to its security specifications. For example, a security unit test could enforce that a given software component must use server-side encryption to upload objects to an Amazon S3 bucket. Similarly, a security integration test could verify that the same component always enables S3 bucket versioning.

In this three-part blog post series, we’ll do a deep dive on automated security testing for AWS applications. In this first post, we’ll discuss how AWS Java developers can create security unit tests to verify the correctness of their AWS applications by testing individual units of code in isolation. In part II, we’ll go one step further and show how developers can create integration tests that, unlike unit tests, interact with real software components and AWS resources. Finally, in part III, we’ll walk through how the provided security tests can be incorporated into a CI/CD pipeline (created through AWS CodePipeline) to enforce security verification whenever new code changes are pushed into the code repository. Even though we focus on security, the tests provided can be easily generalized to other domains.

S3 Artifact Manager

We start by introducing a simple S3 wrapper component built to illustrate the security tests discussed in this series. The wrapper, represented by a Java class named S3ArtifactManager (full source code can be accessed here), uses AWS SDK for Java APIs to provide a more secure way to store objects in S3.

Here we show an excerpt of class S3ArtifactManager that describes a method called upload(), which can be used to securely upload objects to an S3 bucket. The method uses S3 bucket versioning to make sure each new upload of the same object preserves all previous versions of that object. A versionId is returned to clients each time an object (or a new version of it) is stored in the bucket, so that specific versions can be retrieved later. Versioning is enabled by creating a SetBucketVersioningConfigurationRequest object that takes a BucketVersioningConfiguration(BucketVersioningConfiguration.ENABLED) instance as a parameter and then calling s3.setBucketVersioningConfiguration() with that request object.

In addition, method upload() uses server-side encryption with Amazon S3-managed encryption keys (SSE-S3) to enforce that objects stored in the bucket are encrypted. We simply create a metadata object, set AES-256 as the encryption algorithm via objectMetadata.setSSEAlgorithm(), and attach the metadata object to the PutObjectRequest instance used to store the S3 object. Finally, the object is uploaded to S3 through s3.putObject() and its versionId is returned to the client.

public String upload(String s3Bucket, String s3Key, File file) 
   throws AmazonServiceException, AmazonClientException {
   if (!s3.doesBucketExist(s3Bucket)) {
      s3.createBucket(s3Bucket);
   }

   // enable bucket versioning
   SetBucketVersioningConfigurationRequest configRequest = 
      new SetBucketVersioningConfigurationRequest(s3Bucket, 
         new BucketVersioningConfiguration(BucketVersioningConfiguration.ENABLED));
   s3.setBucketVersioningConfiguration(configRequest);

   // enable server-side encryption (SSE-S3)
   PutObjectRequest request = new PutObjectRequest(s3Bucket, s3Key, file);
   ObjectMetadata objectMetadata = new ObjectMetadata();
   objectMetadata.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
   request.setMetadata(objectMetadata);

   // upload object to S3
   PutObjectResult putObjectResult = s3.putObject(request);

   return putObjectResult.getVersionId();
}

Because security is key in the cloud, security components like our S3ArtifactManager might interest individuals and organizations responsible for meeting security compliance requirements (for example, PCI). In this context, developers and other users of such components must be confident that the security functionality provided behaves as expected. A bug in the component (for example, an object that is stored unencrypted or that overwrites a previous version) can be disastrous. In addition, users must remain confident as new versions of the component are released. How can that confidence be achieved continuously?

It turns out that DevOps practices improve confidence. In a traditional software development approach, coding the logic of the upload() method and running a few manual tests might be enough, but in a DevOps setting, this is not acceptable. DevOps practices require mechanisms that automatically verify code behavior. In fact, these mechanisms are just as important as the code’s main logic. Which mechanisms are we talking about? Unit and integration tests!

In the next section, we’ll discuss how unit tests can be leveraged to verify the security behavior of our S3ArtifactManager wrapper. In parts II and III of this series, we’ll dive deep into integration tests and CI/CD automation, respectively.

Security Unit Tests

Next, we’ll create a suite of security unit tests to verify the behavior of our upload() method. We’ll use two popular Java test frameworks, JUnit and Mockito, to code the unit tests.

The primary purpose of unit tests is to test a unit of code in isolation. Here we define unit as the Java class under test (in our case, the S3ArtifactManager class). In order to isolate the class under test, we mock all other objects used in the class, such as the S3 client object. Mocking means that our unit tests will not interact with a real S3 resource and will not upload objects into an S3 bucket. Instead, they use a mock object with predefined behavior.
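To make the idea of "predefined behavior" concrete, here is a minimal, framework-free sketch of what a mock does: a hand-written stub returns a canned answer for doesBucketExist() and records calls to createBucket(), so the code under test never touches a real AWS resource. Note that SimpleS3Client and StubS3Client below are illustrative stand-ins, not AWS SDK or Mockito types.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for the S3 client interface (not the AWS SDK type).
interface SimpleS3Client {
    boolean doesBucketExist(String bucket);
    void createBucket(String bucket);
}

// Hand-written stub: canned answers plus a record of interactions.
class StubS3Client implements SimpleS3Client {
    private final boolean bucketExists;                    // predefined behavior
    final List<String> createdBuckets = new ArrayList<>(); // recorded calls

    StubS3Client(boolean bucketExists) {
        this.bucketExists = bucketExists;
    }

    @Override
    public boolean doesBucketExist(String bucket) {
        return bucketExists;
    }

    @Override
    public void createBucket(String bucket) {
        createdBuckets.add(bucket);
    }
}

public class MockingSketch {
    // Code under test: creates the bucket only if it does not already exist.
    static void ensureBucket(SimpleS3Client s3, String bucket) {
        if (!s3.doesBucketExist(bucket)) {
            s3.createBucket(bucket);
        }
    }

    public static void main(String[] args) {
        StubS3Client existing = new StubS3Client(true);
        ensureBucket(existing, "my-bucket");
        System.out.println("createBucket calls (existing): " + existing.createdBuckets.size());

        StubS3Client missing = new StubS3Client(false);
        ensureBucket(missing, "my-bucket");
        System.out.println("createBucket calls (missing): " + missing.createdBuckets.size());
    }
}
```

Mockito automates exactly this pattern through mock(), when(), and verify(), so we don’t have to hand-write one stub class per collaborator.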

Verifying bucket versioning enablement on S3 buckets

The first security unit test is named testUploadWillEnableVersioningOnExistingS3Bucket. It verifies that method upload() enables bucket versioning on an existing bucket upon uploading an object to that bucket. Note that we are using a Mockito mock object instead of a real object to represent an S3 client instance. For this reason, we need to specify the behavior of the mock object for the functionality used by method upload(). We use Mockito’s when statement to return true when s3Client.doesBucketExist() is called, because this is the condition we want to test. Then, method upload() is called using test values for the S3 bucket, key, and file parameters.

@Test
public void testUploadWillEnableVersioningOnExistingS3Bucket() {
   
   // set Mock behavior
   when(s3Client.doesBucketExist(s3Bucket)).thenReturn(true); 
   
   // call object under test
   String versionId = s3ArtifactManager.upload(s3Bucket, s3Key, file);
   
   // assert versionID is the expected value
   assertEquals("VersionId returned is incorrect", 
      VERSION_ID, versionId);
   
   // assert that a new bucket has NOT been created
   verify(s3Client, never()).createBucket(s3Bucket);
   
   // capture SetBucketVersioningConfigurationRequest object 
   ArgumentCaptor<SetBucketVersioningConfigurationRequest> 
      bucketVerConfigRequestCaptor = ArgumentCaptor.forClass(
         SetBucketVersioningConfigurationRequest.class);
   verify(s3Client).setBucketVersioningConfiguration(
      bucketVerConfigRequestCaptor.capture());
   
   // assert versioning is set on the bucket
   SetBucketVersioningConfigurationRequest bucketVerConfigRequest = 
      bucketVerConfigRequestCaptor.getValue();
   assertEquals("Versioning of S3 bucket could not be verified", 
      BucketVersioningConfiguration.ENABLED, 
      bucketVerConfigRequest.getVersioningConfiguration().getStatus());
}

The first verification checks that the versionId value returned matches the constant value expected by the test. Next, we verify that a call to s3Client.createBucket() has never been made, because the bucket already exists (as specified by the mocked doesBucketExist() call). These are standard verifications, not related to security.

We then move on to verifying security behavior. We use Mockito’s argument captor feature to capture the parameter passed to setBucketVersioningConfiguration(), which is a real object. Next, we check whether bucket versioning is enabled in that object by comparing the captured value with the constant BucketVersioningConfiguration.ENABLED. If this security verification fails, it means that versioning was not correctly configured. In that scenario, because a critical security assertion could not be verified, the CI/CD pipeline should be blocked until the code is fixed.
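The argument-captor pattern itself can be reduced to its essence without any framework: a test double records the argument it receives so the test can inspect it afterward. Here is a minimal sketch; VersioningRequest and VersioningClient are illustrative stand-ins, not AWS SDK types.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for the SDK request type (not the real AWS class).
class VersioningRequest {
    final String bucket;
    final String status;

    VersioningRequest(String bucket, String status) {
        this.bucket = bucket;
        this.status = status;
    }
}

// Illustrative stand-in for the S3 client interface.
interface VersioningClient {
    void setBucketVersioningConfiguration(VersioningRequest request);
}

// Hand-written "captor": records every request object it receives.
class CapturingClient implements VersioningClient {
    final List<VersioningRequest> captured = new ArrayList<>();

    @Override
    public void setBucketVersioningConfiguration(VersioningRequest request) {
        captured.add(request);
    }
}

public class CaptorSketch {
    // Code under test: always enables versioning on the given bucket.
    static void enableVersioning(VersioningClient client, String bucket) {
        client.setBucketVersioningConfiguration(
            new VersioningRequest(bucket, "Enabled"));
    }

    public static void main(String[] args) {
        CapturingClient client = new CapturingClient();
        enableVersioning(client, "my-bucket");

        // The test inspects the captured argument, just as
        // ArgumentCaptor.getValue() does in the Mockito version.
        VersioningRequest request = client.captured.get(0);
        System.out.println("captured status: " + request.status);
    }
}
```

Mockito’s ArgumentCaptor gives us this capability on any mocked method without writing the recording class by hand.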

We also created a security unit test to verify bucket versioning enablement for newly created buckets. We’ve omitted the code for brevity, but you can download the full source here. This test is similar to the one we just discussed. The main differences are that the mocked s3Client.doesBucketExist() call now returns false, and that the test verifies the createBucket API was called exactly once (verify(s3Client, times(1)).createBucket(s3Bucket)).
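Based on that description, the omitted test might look roughly like the following sketch. The test name and the fixture fields (s3Client, s3ArtifactManager, s3Bucket, s3Key, file) are assumed to match the previous test; see the downloadable source for the exact code.

```java
@Test
public void testUploadWillEnableVersioningOnNewS3Bucket() {

   // set mock behavior: the bucket does NOT exist yet
   when(s3Client.doesBucketExist(s3Bucket)).thenReturn(false);

   // call object under test
   s3ArtifactManager.upload(s3Bucket, s3Key, file);

   // assert that a new bucket HAS been created, exactly once
   verify(s3Client, times(1)).createBucket(s3Bucket);

   // assert versioning is enabled on the newly created bucket, too
   ArgumentCaptor<SetBucketVersioningConfigurationRequest> captor =
      ArgumentCaptor.forClass(SetBucketVersioningConfigurationRequest.class);
   verify(s3Client).setBucketVersioningConfiguration(captor.capture());
   assertEquals(BucketVersioningConfiguration.ENABLED,
      captor.getValue().getVersioningConfiguration().getStatus());
}
```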

Verifying server-side-encryption of uploaded S3 objects

The second security unit test verifies that uploaded S3 objects use server-side encryption with Amazon S3-managed encryption keys (SSE-S3). The verification once again uses Mockito’s argument captor, this time to capture the request object passed to putObject(). This object is used in two ways: first, to verify that no customer-provided key was used (because upload() is expected to use SSE-S3), and then to assert that the object’s metadata is non-null and returns AES256 as the encryption algorithm, the value expected for SSE-S3 encryption. Once again, if this security verification fails, the CI/CD pipeline should be blocked until the SSE-S3 implementation is fixed and verified.

@Test
public void testUploadAddsSSE_S3EncryptedObjectToBucket() {
   
   // call object under test
   s3ArtifactManager.upload(s3Bucket, s3Key, file);
   
   // capture putObjectRequest object
   ArgumentCaptor<PutObjectRequest> putObjectRequestCaptor = 
      ArgumentCaptor.forClass(PutObjectRequest.class);
   verify(s3Client).putObject(putObjectRequestCaptor.capture());
   PutObjectRequest putObjectRequest = 
      putObjectRequestCaptor.getValue();
   
   // assert that there's no customer key provided as 
   // we're expecting SSE-S3
   assertNull("A customer key was incorrectly used (SSE-C); "
      + "SSE-S3 encryption expected instead.", 
      putObjectRequest.getSSECustomerKey());
   
   // assert that the SSE-S3 'AES256' algorithm was set as part of 
   // the request's metadata 
   assertNotNull("PutObjectRequest's metadata object must be non-null "
      + "and enforce SSE-S3 encryption", putObjectRequest.getMetadata());
   assertEquals("Object has not been encrypted using SSE-S3 (AES256 "
      + "encryption algorithm)", AES256, 
      putObjectRequest.getMetadata().getSSEAlgorithm());
}

Running the Security Tests Locally

Setting Up

To set up, download the source code to your local workstation and make sure Maven is installed (the project’s pom.xml manages the remaining dependencies).

Running the Unit Tests

You can use Maven to run the provided security unit tests locally.

  • Navigate to the root directory where you installed the source code (this is where the pom.xml file resides)
  • Type mvn verify -DskipIntegrationTests=true to run the security unit tests

Expected output:

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Running amazonaws.samples.testing.unit.S3ArtifactManagerUnitTest

Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.155 sec

You’ll see in the output that all three security unit tests passed. That is, the individual units of code tested are behaving as expected in isolation.

Final Remarks

In the first part of this series, we have discussed how AWS SDK for Java developers can create unit tests that verify the behavior of individual software components in their AWS applications. We used mocks to replace actual objects (for example, an S3 client) and used Maven to trigger test execution.

In the second part of this series, we’ll discuss integration tests that will use real AWS objects and resources.  




