
aws certification

I’ve been working with AWS for years. Since I love everything about it and plan to use it for the foreseeable future, I’ve decided to go ahead and get the official certificates. This is to make sure I’ve covered all the important aspects of AWS fully. It also motivates me to develop more projects and blog posts on it.

Overview

There are 2 main categories of tracks:

  • Role-based Certifications
  • Specialty Certifications

The tracks and exam paths to take are shown in the diagram below:

My plan is to start with the Cloud Practitioner exam, continue with the AWS Solutions Architect track, and then move on to the Developer and SysOps tracks.

Costs

I think it’s important to analyze costs first to assess whether or not this is a journey you want to start.

Individual Exam Costs

|Exam Name|Cost|Notes|
|---|---|---|
|AWS Certified Cloud Practitioner|100 USD|Optional|
|Associate-level exams|150 USD| |
|Professional-level exams|300 USD| |
|Specialty exams|300 USD| |
|Recertification exams|75 USD|Recertification is required every two years for all AWS Certifications|
|Associate-level practice exams|20 USD| |
|Professional-level practice exams|40 USD| |

Total Tracks Costs

|Exam Track|Total Cost|With VAT|Notes|
|---|---|---|---|
|AWS Solutions Architect|450 USD|540 USD| |
|AWS Certified DevOps Engineer|450 USD|540 USD| |
|All Associate Level Exams|300 USD|360 USD|3 exams|
|All Professional Level Exams|600 USD|720 USD|2 exams (there’s no professional level for Developer; both associate-level exams lead to DevOps Engineer)|
|All Exams|1150 USD|1380 USD|Includes the optional Cloud Practitioner exam|

The total cost is not exactly cheap, but I think in the end it’s worth it.

Taking the Exams

It all starts with the aws.training site. Just sign in with your Amazon account or create a new one; this allows you to take the free online courses. To take the exams, you need a separate account. I think this is because they partnered with a 3rd party to provide the exams.

Registration is quite similar: just provide your name and address and search for an exam centre.

Online Training

Free Courses

AWS Training

This is the official certification site of AWS. It allows the user to enroll in courses and view their transcript.

It’s a bit hard to find the actual course after you enrol because you can’t jump to the contents from the search results. What you should do is first go to My Transcript, and under current courses you should see the course and a link that says “Open”. Clicking that link takes you to the actual content.

There is more content on the site; I’ll discover more as I go along.

edX

edX have recently launched 3 free AWS courses.

Various Training Resources

Conclusion

As my favourite saying goes: “It’s not about the destination, it’s about the journey”.

AWS certification for me is not a destination. It just helps me stay on course and stay motivated to create more projects and blog posts in a timely manner.

I’m hoping to see this journey through to completion. I’ll be posting more on AWS and my certification journey soon.

Resources

devaws api_gateway

API Gateway is Amazon’s managed API service. Serverless architecture is growing on me more every day. I think leveraging infinite auto-scaling and only paying for what you use makes perfect sense. But for a customer-facing API, the first thing that needs to be set up is a custom domain, which might be a bit involved when SSL certificates come into play. In this post I’d like to create an API from scratch and assign a custom domain name to it.

Step 1: Create an API

Creating an API is straightforward: just assign a meaningful name and description. However, to me it was a bit confusing when it came to choosing the endpoint type.

The two options provided are: Regional and Edge optimized.

  • Edge-optimized API endpoint: The API is deployed to the specified region and a CloudFront distribution is created. API requests are routed to the nearest CloudFront Point of Presence (POP).

  • Regional API endpoint: This type was added in November 2017. The main goal is to prevent a roundtrip for in-region requests. API requests are targeted directly to the region-specific API Gateway without going through any CloudFront distribution.

Custom domain names are supported for both endpoint types.

In this example, I’ll use Regional endpoint type. For further reading, here’s a nice blog post about endpoint types.
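For reference, the same API can be created from the AWS CLI as well. Here’s a minimal sketch, assuming the CLI is already configured; the API name is just an illustrative placeholder:

# Create a REST API with a Regional endpoint (the name is a placeholder)
aws apigateway create-rest-api --name {API_NAME} --endpoint-configuration types=REGIONAL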

Step 2: Create a resource and method

For demonstration purposes I created a resource called customer and a GET method which calls a mock endpoint.
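If you prefer the CLI, the console steps above map roughly to the following sketch; the values in curly braces are placeholders you’d get back from the earlier calls:

# Find the root resource ID of the API
aws apigateway get-resources --rest-api-id {API_ID}

# Create the customer resource under the root resource
aws apigateway create-resource --rest-api-id {API_ID} --parent-id {ROOT_RESOURCE_ID} --path-part customer

# Add a GET method and back it with a mock integration
aws apigateway put-method --rest-api-id {API_ID} --resource-id {RESOURCE_ID} --http-method GET --authorization-type NONE
aws apigateway put-integration --rest-api-id {API_ID} --resource-id {RESOURCE_ID} --http-method GET --type MOCK --request-templates '{"application/json": "{\"statusCode\": 200}"}'

# Add the 200 response for the method and the mock integration
aws apigateway put-method-response --rest-api-id {API_ID} --resource-id {RESOURCE_ID} --http-method GET --status-code 200
aws apigateway put-integration-response --rest-api-id {API_ID} --resource-id {RESOURCE_ID} --http-method GET --status-code 200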

Step 3: Deploy the API

From the Actions menu in Resources tab, I selected Deploy API.
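For reference, the deployment can also be done with a single CLI call; a minimal sketch, with the API ID as a placeholder:

# Deploy the API, creating the test stage as part of the deployment
aws apigateway create-deployment --rest-api-id {API_ID} --stage-name test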

Deployment requires a stage. Since this is the first deployment, I had to create a new stage called test; a new stage can be created while deploying. After the deployment, the test stage looks like this:

At this point API Gateway had already assigned a non-user-friendly URL:

https://81dkdt6q81.execute-api.eu-west-2.amazonaws.com/test

This is the root domain of the API. So I was able to call the endpoint like this:

https://81dkdt6q81.execute-api.eu-west-2.amazonaws.com/test/albums

My goal was to get it working with my own domain such as:

https://hmdb.myvirtualhome.net/albums

Step 4: Generate the certificate in ACM

I’m using Route53 for all my domains and using ACM (AWS Certificate Manager) for generating SSL/TLS certificates. Before creating the custom domain name I needed my certificate available.

The wizard is quite simple: I just added the subdomain for the API and selected DNS validation.
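The CLI equivalent of the wizard would look roughly like this; the subdomain is a placeholder, and since the API uses a Regional endpoint the certificate is requested in the same region as the API:

# Request a DNS-validated certificate for the API's subdomain
aws acm request-certificate --domain-name {API_SUBDOMAIN} --validation-method DNS --region eu-west-2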

After the review comes the validation process. Since I’m using Route 53 and ACM plays well with it, it simply provided a nice big button that said Create record in Route 53.

After clicking and confirming I got this confirmation message:

After waiting for about 3 minutes, the certificate was issued:
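If you’d rather poll the status from the CLI instead of refreshing the console, something like this works; the certificate ARN is a placeholder:

# Check whether the certificate has been issued yet
aws acm describe-certificate --certificate-arn {CERTIFICATE_ARN} --query 'Certificate.Status'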

Step 5: Create Custom Domain Name in API Gateway

Now that the certificate was ready I had to go back to API Gateway to create the custom domain name and associate it with the newly created cert.

First, I clicked on Custom Domain Names on the left menu and filled out the details. Make sure that your subdomain matches the one the certificate was generated for.

I assigned the /test path to the test stage I had created earlier. I will use the root path for the production stage when I deploy the final version.

After creating the custom domain, take note of the Target Domain Name generated by AWS.
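As a rough CLI sketch, the custom domain and the base path mapping could be created like this; the domain, certificate ARN and API ID are placeholders:

# Create the custom domain name for a Regional endpoint, using the ACM certificate
aws apigateway create-domain-name --domain-name {API_SUBDOMAIN} --regional-certificate-arn {CERTIFICATE_ARN} --endpoint-configuration types=REGIONAL

# Map the /test base path to the test stage of the API
aws apigateway create-base-path-mapping --domain-name {API_SUBDOMAIN} --rest-api-id {API_ID} --stage test --base-path test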

Step 6: Create A Record in Route 53

I had to also point DNS to the domain generated by API Gateway.

Since I was using a regional endpoint I had to map the custom domain name to the target domain name mentioned in the previous step.

Now the problem was when I tried to do it via AWS Management Console, it failed as explained in this StackOverflow answer.

So I had to do it via CLI as below:

aws route53 change-resource-record-sets --hosted-zone-id {ZONE_ID_OF_MY_DOMAIN} --change-batch file://changedns.json

where the contents of changedns.json were:

{
  "Changes": [
    {
      "Action": "CREATE",
      "ResourceRecordSet": {
        "Name": "api.hmdb.myvirtualhome.net",
        "Type": "A",
        "AliasTarget": {
          "DNSName": "d-xyz.execute-api.eu-west-2.amazonaws.com",
          "HostedZoneId": "ZJ5UAJN8Y3Z2Q",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}

In the JSON above, DNSName is the Target Domain Name created by AWS in Step 5. The HostedZoneId (ZJ5UAJN8Y3Z2Q), on the other hand, is the hosted zone ID of API Gateway itself, which is listed here.

UPDATE

If you are having issues running the command above, it might mean you don’t have a default profile set up with permissions to change DNS settings. To fix that:

1. Create a new user with no permissions

Go to the IAM console and create a new user. Skip all the steps and download the credentials as a .csv file in the last step.

2. Assign required permissions

Create a new policy using the JSON template below and attach it to the new user

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "route53:ChangeResourceRecordSets",
            "Resource": "arn:aws:route53:::hostedzone/{ZONE ID OF YOUR DOMAIN}"
        }
    ]
}

3. Create a new profile for the user

aws configure --profile temp-route53-profile

and set the Access/Secret keys along with the region of your hosted zone.
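The prompts look roughly like this; the values are placeholders, and the region is just the one used elsewhere in this post:

$ aws configure --profile temp-route53-profile
AWS Access Key ID [None]: {ACCESS KEY}
AWS Secret Access Key [None]: {SECRET KEY}
Default region name [None]: eu-west-2
Default output format [None]: json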

Then you run the first CLI command, providing the profile name:

aws route53 change-resource-record-sets --hosted-zone-id {ZONE_ID_OF_MY_DOMAIN} --change-batch file://changedns.json --profile temp-route53-profile

An important point here is to get your hosted zone ID from Route 53. API Gateway also shows a hosted zone ID, but that is actually the AWS API Gateway zone ID. We use that zone ID in our DNS configuration (the changedns.json file in this example), but the hosted zone ID we provide on the command line is our own domain’s zone ID, which can be found in Route 53.

Step 7: Test

So after creating the alias for my API, I visited the URL in a browser and got the green padlock, indicating that it loaded the correct SSL certificate.

Resources

devaws route53, angular, dotnet_core, dynamic_dns, csharp

A few years back I developed a project called DynDns53. I was fed up with the dynamic DNS tools available and thought I could easily achieve the same functionality myself since I had already been using AWS Route53.

Fast forward a few years: due to some neglect on my part and technology moving so fast, the project started to feel outdated and abandoned. So I decided to revise it.

Key improvements in this version are:

  • Core library is now available in NuGet so anyone can build their own clients around it
  • A new client built with .NET Core so that it runs on all platforms now
  • A Docker version is available that runs the .NET Core client
  • A new client built with Angular 5 to replace the legacy AngularJS
  • CI integration: Travis is running the unit tests of core library
  • Revised WPF and Windows Service clients and fixed bugs
  • Added more detailed documentation on how to set up the environment for various clients

I also kept the old repository but renamed it to dyndns53-legacy. I might archive it at some point as I’m not planning to support it any longer.

Available on NuGet

NuGet is a great way of installing and updating libraries. I thought it would be a good idea to make use of it in this project so that it can be used without cloning the repository.

With .NET Core it’s quite easy to create a NuGet package. Just navigate to the project folder (where the .csproj file is located) and run this:

dotnet pack -c Release

The default configuration it uses is Debug, so make sure you’re using the correct build and a matching pack command. You should be able to see a screen similar to this:

Then push it to NuGet:

dotnet nuget push ./bin/Release/DynDns53.CoreLib.1.0.0.nupkg -k {NUGET.ORG API_KEY} -s https://api.nuget.org/v3/index.json

To double-check you can go to your NuGet account page and under Manage Packages you should be able to see your newly published package:

Now we play the waiting game, because it may take some time for the package to be processed by NuGet. For example, I saw the warning shown in the screenshot 15 minutes after I pushed the package:

Generally this is a quick process but the first time I published my package, I got my confirmation email about 7 hours later so your mileage may vary.

If you need to update your package after it’s been published, make sure to increment the version number before running dotnet pack. In order to do that, you can simply edit the .csproj file and change the Version value:

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <PackageId>DynDns53.CoreLib</PackageId>
    <Version>1.0.1</Version>
    <Authors>Volkan Paksoy</Authors>
    <Company></Company>
  </PropertyGroup>

Notes

  • Regarding the NuGet API key: they recently changed their approach to keys. Now you only have one chance to save your key somewhere else. If you don’t save it, you won’t be able to access it via their UI. You can create a new one of course, so it’s no big deal, but to avoid key pollution you might want to save it in a safe place for future reference.

  • If you are publishing packages frequently, you may not be able to get the updates even after they have been published. The reason for that is that the packages are cached locally, so make sure to clean your cache before you try to update the packages. On Mac, Visual Studio doesn’t have a Clean Cache option as of this writing (unlike Windows), so you have to go to your user folder and remove the packages under the {user}/.nuget/packages folder, or clear the cache from the command line as shown in the sketch after this list. After this, you can update the packages and you should get the latest validated version from NuGet.
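As an alternative to deleting the folder by hand, the .NET CLI can clear the caches as well; a quick sketch:

# Clear the global packages folder (the {user}/.nuget/packages folder mentioned above)
dotnet nuget locals global-packages --clear

# Or clear all local NuGet caches at once
dotnet nuget locals all --clear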

.NET Core Client

Prerequisites

First, you’d need an IAM user who has access to Route53. You can use the policy template below to give the minimum possible permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "route53:ListResourceRecordSets",
                "route53:ChangeResourceRecordSets"
            ],
            "Resource": "arn:aws:route53:::hostedzone/{ZONE ID}"
        }
    ]
}

Only two actions are performed, so as long as you remember to update the policy with the new zone IDs when you need to manage other domains, this should work fine for you.

Usage

Basic usage is very straightforward. Once compiled you can supply the IAM Access and Secret Keys and the domains to update with their Route53 Zone IDs as shown below:

dotnet DynDns53.Client.DotNetCore.dll --AccessKey {ACCESS KEY} --SecretKey {SECRET KEY} --Domains ZoneId1:Domain1 ZoneId2:Domain2 

Notes

  • The .NET Core console application uses the NuGet package. One difference between .NET Core and classic .NET applications is that the packages are no longer stored along with the application. Instead, they are downloaded to the user’s folder under the .nuget folder (e.g. on a Mac it’s located at /Users/{USERNAME}/.nuget/packages)

Available on Docker Hub

Even though it’s not a complex application, I think it’s easier and hassle-free to run it in a self-contained Docker container. Currently it only supports Linux containers. I might develop a multi-architecture image in the future if need be, but for now Linux-only is sufficient for my needs.

Usage

You can get the image from Docker hub with the following command:

docker pull volkanpaksoy/dyndns53

and running it is very similar to running the .NET Core Client as that’s what’s running inside the container anyway:

docker run -d volkanpaksoy/dyndns53 --AccessKey {ACCESS KEY} --SecretKey {SECRET KEY} --Domains ZoneId1:Domain1 ZoneId2:Domain2 --Interval 300

The command above runs the container in daemon mode so that it keeps updating the DNS every 5 minutes (300 seconds).

Notes

  • I had an older Visual Studio 2017 for Mac installation and it didn’t have Docker support. The setup is not granular enough to pick specific features, so my solution was to reinstall the whole thing, at which point Docker support became available in my project.

  • After adding Docker support the default build configuration becomes docker-compose. But it doesn’t work straight away as it throws an exception saying

      ERROR: for dyndns53.client.dotnetcore  Cannot start service dyndns53.client.dotnetcore: Mounts denied:
      The path /usr/local/share/dotnet/sdk/NuGetFallbackFolder
      is not shared from OS X and is not known to Docker.
      You can configure shared paths from Docker -> Preferences... -> File Sharing.
      See https://docs.docker.com/docker-for-mac/osxfs/#namespaces for more info.
    

I added the folder it mentions in the error message to shared folders as shown below and it worked fine afterwards:

  • Currently it only works on Linux containers. There’s a nice article here about creating multi-architecture Docker images. I’ll try to make mine multi-arch as well when I revisit the project or when there is an actual need for it.

Angular 5 Client

I’ve updated the web-based client using Angular 5 and Bootstrap 4 (Currently in Beta) which now looks like this:

I kept a copy of the old version which was developed with AngularJS. It’s available at this address: http://legacy.dyndns53.myvirtualhome.net/

Notes

  • After I added AWS SDK package I started getting a nasty error:

      ERROR in node_modules/aws-sdk/lib/http_response.d.ts(1,25): error TS2307: Cannot find module 'stream'.
    

    Fortunately the solution is easy, as shown in the accepted answer here: just remove the “types”: [] line in the tsconfig.app.json file. Make sure you’re updating the correct file though, as there is a similarly named tsconfig.json in the root. What we are after is the tsconfig.app.json under the src folder.

  • In this project, I use 3 different IP checkers (AWS, DynDns and a custom one I developed myself a while back, running on Heroku). Calling these from other clients is fine, but from within the web application I bumped into CORS issues. There are a couple of possible solutions for this:

    1. Create your own API to return the IP address: In the previous version, I created an API with AWS API Gateway which uses a very simple Lambda function to return the caller’s IP address

       exports.handler = function(event, context) {
           context.succeed({
               "ip": event.ip
           });
       };
      

      I created a GET method for my API and used the above Lambda function. Now that I had full control over it, I was able to enable CORS as shown below:

    2. The other solution is “tricking” the browser by injecting CORS headers using a Chrome extension. There are a number of them, but I use the one aptly named “Allow-Control-Allow-Origin: *”

      After installing it, you just enable it and getting the external IP works fine.

      It’s good practice to filter it for your specific needs so that it doesn’t affect other sites (I had some issues with Google Docs when it was turned on).

CI Integration

I created a Travis integration, which is free since my project is open source. It runs the unit tests of the core library automatically. I also added the shiny badge to the project’s readme file that shows the build status.

Resources