devops, aws · github, codebuild, continuous_integration, ci

In this post, I’d like to show how to achieve continuous integration using the AWS CodeBuild service.

Main goals to achieve:

  • Run unit tests in a Docker container
  • Kick off build automatically when code is pushed
  • Show build status
  • Send notifications when tests fail

CodeBuild Setup

Step 01: Project Configuration

Specify a unique name and make sure to tick the “Enable build badge” checkbox.

Step 02: Source Selection

In this step, select GitHub as the source provider and choose the “Repository in my GitHub account” option. AWS will use OAuth and redirect to GitHub to ask for authorization to access your repositories. After granting access, you should be able to see your repositories in the dropdown list:

I left the “Source version” field blank as I want the build to run for all branches and all commits.

Step 03: Webhook Configuration

Tick the “Rebuild every time a code change is pushed to this repository” checkbox and select PUSH from the event type list. You can select more events to trigger builds, but for CI purposes it should be enough to run a build every time code is pushed.

What this does is add a CodeBuild webhook in GitHub that looks like this:

Whenever a push happens in the repository, GitHub posts the details to the CodeBuild webhook and that triggers a build.
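
The payload GitHub sends with a push event is a JSON document. Trimmed down, it looks roughly like this (the repository, branch and commit values below are made-up placeholders):

{
  "ref": "refs/heads/feature/add-tests",
  "after": "<commit-sha>",
  "repository": { "full_name": "myuser/sample-repo" },
  "head_commit": { "message": "Add unit tests" }
}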

Step 04: Environment Configuration

The build takes place in a Docker container, so we have to provide a Docker image with the build tools installed. Alternatively, we can use one of the managed images that AWS provides. In this example I’ll use a managed image:
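
The managed image is selected by name in the console. For a .NET Core build like this one, the Ubuntu “standard” image family is the likely choice, identified by something along these lines (the exact tag depends on what is current in the console):

aws/codebuild/standard:2.0

Note that the standard 2.0 and later images require a runtime-versions section in buildspec.yml, which comes up again later in this post.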

Step 05: Other Configuration

In this example, I will accept the defaults for the rest of the settings because I don’t need to generate artifacts to run unit tests. It’s generally wise to enable CloudWatch logs so that you can monitor the build process closely. Since I accept the default path for buildspec.yml, I have to place it at the root of the repository.

Running the tests

The core responsibility of a CI pipeline is to run the unit tests. CodeBuild is a generic service and doesn’t come with any test tooling of its own, as that is application- and environment-dependent. Instead, we configure the build with a buildspec.yml file. In this example I’m building a .NET Core project, and making sure the unit tests are run first is as easy as this:

version: 0.2

phases:
  install:
    runtime-versions:
        dotnet: 2.2
  build:
    commands:
      - dotnet restore
      - dotnet test
      - dotnet publish Sample.UI.Web -c Release -o ./output

This way, CodeBuild executes the steps above and all the unit tests in the solution are run; if dotnet test fails (returns a non-zero exit code), the whole build fails.

Run build automatically

Now that the GitHub repository and the CodeBuild project are both ready, let’s see if a build kicks off when some code changes are pushed:

After pushing code to a feature branch I was also able to see that a build was triggered.

So I could confirm that builds run automatically.

As a side note, the failed builds were failing due to this error:

YAML_FILE_ERROR: This build image requires selecting at least one runtime version.

The solution was to explicitly specify the runtime in the buildspec.yml file by adding this bit:

install:
  runtime-versions:
      dotnet: 2.2

It’s also worth noting that the source version value in CodeBuild corresponds to the commit hash. For example, the commit hash shown below

appears on CodeBuild as

Showing build status

Showing the build status on the GitHub repository is very easy. We already enabled the build badge while creating the build project, so now we just have to copy the badge URL as shown below:

Then in the GitHub repository, edit the readme.md file and add the following:

![Build status](badge URL copied from AWS console)
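
The copied URL is a long link served by CodeBuild. Its general shape is roughly the following (region and uuid are placeholders), and the branch query parameter controls which branch’s status the badge reflects:

https://codebuild.<region>.amazonaws.com/badges?uuid=<badge-uuid>&branch=master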

Now if you go to the GitHub repository page and refresh, you can see the latest status of the build (of the master branch):

After I fixed the error and merged into the master branch, I could see the “build passing” badge as well:

Notifications

It would also be nice to be notified directly when a build fails. This can be achieved using CloudWatch Events. In this sample project I’m going to use SNS to send email notifications.

First, I went to CloudWatch Events, created a rule that matches CodeBuild build state change events and set the SNS topic as its target. I just gave it a name and saved the rule so that it looked like this:
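
For reference, the event pattern behind such a rule looks roughly like this; the detail-type is the documented “CodeBuild Build State Change” event, and the build-status filter below (narrowing it to failed builds only) is optional:

{
  "source": ["aws.codebuild"],
  "detail-type": ["CodeBuild Build State Change"],
  "detail": {
    "build-status": ["FAILED"]
  }
}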

After that, I intentionally broke a test and received an email with the build failure details in JSON format.

Conclusion

In this post I wanted to show a continuous integration pipeline using GitHub and CodeBuild. It could be further improved by posting the build status to Slack so that the whole team gets notified instantly. For the time being I’ve achieved the goals I set out initially, so I’ll wrap it up here.

Source Code

Source code can be found in the repo below, under the blog/CodeBuild_CI_Pipeline folder.

Resources

devops, aws · cloudformation, cdk

Introduction

Using AWS is great as it simplifies and improves infrastructure provisioning and maintenance quite a lot. But as you depend more and more on AWS, you quickly realize that managing everything through the AWS Management Console is not ideal.

Earlier this year, I published this post about AWS CloudFormation. This post will also discuss using CloudFormation, but takes it to a higher level by using the AWS Cloud Development Kit (CDK).

Levels of infrastructure

I liked the way the different approaches to infrastructure management are illustrated as levels here: Develop a Web App Using Amazon ECS and AWS Cloud Development Kit (CDK) - AWS Online Tech Talks (YouTube video)

In a nutshell, those levels are:

Level 0: By hand

This approach is simply using the AWS Management Console user interface to manage the infrastructure.

Pros:

  • Simple and easy
  • Helps to get results faster for exploratory projects

Cons:

  • Hard to reproduce
  • Possible inconsistencies based on people’s preferences
  • Error-prone
  • Slow for complex systems

Level 1: Imperative Infrastructure as Code

In this approach you write your own scripts using the AWS SDK and manage the resources programmatically (a minimal sketch follows the list of pros and cons below).

Pros:

  • Repeatable and reusable
  • Can be source-controlled

Cons:

  • Lots of boilerplate code
  • You need to handle all the edge cases yourself
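
As a rough illustration of this level, here is a minimal sketch using the AWS SDK for .NET that provisions a versioned S3 bucket imperatively (the bucket name is a made-up placeholder):

using System;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

class Program
{
    static async Task Main()
    {
        // Uses the SDK's default credential and region resolution.
        var s3 = new AmazonS3Client();

        // Create the bucket.
        await s3.PutBucketAsync(new PutBucketRequest { BucketName = "my-imperative-bucket" });

        // Enable versioning on the bucket.
        await s3.PutBucketVersioningAsync(new PutBucketVersioningRequest
        {
            BucketName = "my-imperative-bucket",
            VersioningConfig = new S3BucketVersioningConfig { Status = VersionStatus.Enabled }
        });

        Console.WriteLine("Bucket provisioned.");
    }
}

Even in this tiny example you can see the shape of the problem: checking whether the bucket already exists, retrying, and cleaning up on failure would all be extra code you have to write yourself.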

Level 2: Declarative Infrastructure as Code

Describe the infrastructure as a template (in JSON or YAML) and use a resource provisioning engine such as AWS CloudFormation or HashiCorp Terraform, which in turn uses the AWS SDK to manage the infrastructure (a minimal template follows the list of pros and cons below).

Pros:

  • No boilerplate
  • Creating and updating resources is handled automatically

Cons:

  • Templates can become verbose
  • Implementing logic is limited to some built-in helper functions
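
For comparison, the same versioned bucket expressed declaratively as a minimal CloudFormation template (YAML) looks like this:

Resources:
  SampleBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled

The engine figures out whether the bucket needs to be created, updated or left alone; you only describe the desired end state.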

Level 3: AWS Cloud Development Kit

In this approach, software is developed using the AWS CDK, which generates the input for AWS CloudFormation.

The application can be developed in a number of languages. At the time of this writing the following languages are supported: TypeScript, JavaScript, Python, Java, C#.

Pros:

  • Handles creation of underlying resources. For example, when creating a VPC it also automatically generates the YAML for all the other networking resources (routing tables, NAT gateways etc.) required by the VPC.
  • Helps with local workflow
  • CDK constructs are reusable. Can be developed by AWS or third parties and can be installed separately.
  • Ability to use familiar programming languages

Cons:

  • Extra installation

Basic Concepts

  • Construct: The basic building block of an AWS CDK app. A construct represents a “cloud component” and encapsulates everything AWS CloudFormation needs to create that component. Constructs can be developed in-house or obtained from the AWS Construct Library.
  • Stack: Constructs need to be created within the scope of a Stack. This corresponds to a CloudFormation template.
  • App: Stacks are created in the scope of an App, and an App can contain multiple stacks (see the sketch after this list).
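
To make the relationship concrete, here is roughly how those pieces fit together in a C# CDK entry point, using the stack class defined later in this post (the exact App method names have shifted between CDK releases; recent versions use Synth):

using Amazon.CDK;

class Program
{
    static void Main()
    {
        // The App is the root of the construct tree.
        var app = new App();

        // A Stack corresponds to a single CloudFormation template;
        // constructs such as buckets are created inside it.
        new CdkWorkoutStack(app, "CdkWorkoutStack", new StackProps());

        // Produce the CloudFormation template(s) into cdk.out.
        app.Synth();
    }
}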

CDK Basics

  • In order to create an application that uses CDK we need to install the CDK
  • The application can be created from one of 3 supported templates:
    • app: General purpose application. This is the default template
    • lib: Used to develop a CDK construct
    • sample-app: Creates an application pre-populated with sample CDK code.
  • The following diagram illustrates the app lifecycle:

Installing CDK

AWS CDK is developed using TypeScript. It’s available via npm:

npm install -g aws-cdk

Using CDK

In my example, I will use C#. A new project can be created by using cdk init command:

It expects two parameters: the template type (app, lib or sample-app) and the language (one of csharp, fsharp, java, javascript, python or typescript):

cdk init app --language csharp

To test my first app, I followed the sample app provided and added an S3 construct to my code:

public class CdkWorkoutStack : Stack
{
    public CdkWorkoutStack(Construct parent, string id, IStackProps props) : base(parent, id, props)
    {
        _ = new Bucket(this, "CdkBucket", new BucketProps
        {
            Versioned = true
        });
    }
}

The next step is to synthesize a CloudFormation template by running

cdk synth

This generates a folder called cdk.out which contains a file named CdkWorkoutStack.template.json with the following contents:

{
  "Resources": {
    "CdkBucket2FB0D10E": {
      "Type": "AWS::S3::Bucket",
      "Properties": {
        "VersioningConfiguration": {
          "Status": "Enabled"
        }
      },
      "UpdateReplacePolicy": "Retain",
      "DeletionPolicy": "Retain",
      "Metadata": {
        "aws:cdk:path": "CdkWorkoutStack/CdkBucket/Resource"
      }
    }
  }
}  

The final step is to create the resources by running

cdk deploy

This produces the following results:

And, not surprisingly, a CloudFormation stack containing the bucket is created in AWS.

In the next version I’m going to change some properties of the bucket, such as blocking public access:

public class CdkWorkoutStack : Stack
{
    public CdkWorkoutStack(Construct parent, string id, IStackProps props) : base(parent, id, props)
    {
        _ = new Bucket(this, "CdkBucket", new BucketProps
        {
            Versioned = true,
            BlockPublicAccess = new BlockPublicAccess(new BlockPublicAccessOptions()
            {
                BlockPublicAcls = true,
                BlockPublicPolicy = true
            })
        });
    }
}

After this change in the code we can review what’s going to be updated by running the following command:

cdk diff

and it produces the following result outlining the changes between the current code and the deployed version:

And the stack can be deleted by running the following command:

cdk destroy

In the CDK version I’m using, the default deletion policy is Retain. That’s why, when I deleted the stack via CDK, it didn’t delete the bucket. In order to change that behaviour I had to set the removal policy to Destroy, like so:

public class CdkWorkoutStack : Stack
{
    public CdkWorkoutStack(Construct parent, string id, IStackProps props) : base(parent, id, props)
    {
        var bucket = new Bucket(this, "CdkBucket", new BucketProps
        {
            Versioned = true,
            BlockPublicAccess = new BlockPublicAccess(new BlockPublicAccessOptions()
            {
                BlockPublicAcls = true,
                BlockPublicPolicy = true
            })
        });

        var resource = bucket.Node.FindChild("Resource") as Amazon.CDK.CfnResource;
        resource.ApplyRemovalPolicy(RemovalPolicy.DESTROY);
    }
}
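
As a side note, later CDK releases expose a RemovalPolicy property directly on the bucket props, which should make the FindChild workaround unnecessary. A sketch of that variant, assuming a newer package version:

_ = new Bucket(this, "CdkBucket", new BucketProps
{
    Versioned = true,
    // Equivalent to DeletionPolicy: Delete in the generated template.
    RemovalPolicy = RemovalPolicy.DESTROY
});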

This time it produces a CloudFormation template with a different policy:

{
  "Resources": {
    "CdkBucket2FB0D10E": {
      "Type": "AWS::S3::Bucket",
      "Properties": {
        "PublicAccessBlockConfiguration": {
          "BlockPublicAcls": true,
          "BlockPublicPolicy": true
        },
        "VersioningConfiguration": {
          "Status": "Enabled"
        }
      },
      "UpdateReplacePolicy": "Delete",
      "DeletionPolicy": "Delete",
      "Metadata": {
        "aws:cdk:path": "CdkWorkoutStack/CdkBucket/Resource"
      }
    }
  }
}

This time, after the destroy command, the bucket was deleted as well.

Conclusion

I think infrastructure as code is definitely the way to manage cloud resources, and CDK provides a great way to simplify the process. It’s still at an early stage for the time being, as the NuGet packages are in developer preview, but it’s good enough to rely on and start developing with.

Resources

hobby · lego, mindstorms, ev3

In an earlier post I made an introduction to building robots with Lego Mindstorms. In this post I will not develop my own robot but will build one of Lego’s own creations: Track3r.

Lego Track3r

Building

The instructions are quite simple and I was able to build it in a few hours.

Lab in use

Connecting

I tested it with the IR remote control and it works, but I think the easiest and best way to control the robot is the iOS app.

To control the robot via the app, you first need to have Bluetooth enabled on both your phone and the EV3 brick:

After you install the app, connect it to the brick:

Once connected, select the robot you want to control (Track3r in this case):

and the rest is just taking it for a test drive:

Conclusion

One problem I had was that I ran out of batteries very quickly. The EV3 brick takes 6 AA batteries and I use rechargeable ones. After playing around with the robot for about half an hour, the batteries were flat. Unfortunately, the design of the robot doesn’t take replacing batteries into account: it seems it would be a bit painful to remove the brick just to replace the batteries and put it back in. There are other models I want to build anyway, and at least I now have more experience with building and controlling the pre-designed robots. I think the most fun is programming the brick, so I will explore more of that in the future.

Resources