aws, dev, route53, angularjs

I’ve been using my own dynamic DNS application (which I named DynDns53 and blogged about here). So far it has had a WPF client and I was happy with it, but I thought that if I developed a web-based application I wouldn’t have to install anything (which is what I’m shooting for these days) and could achieve the same results.

So I built a JavaScript client with the AngularJS framework. The idea is exactly the same; the only difference is that it all happens inside the browser.

DynDns53 web client

Ingredients

To have a dynamic DNS client you need the following:

  1. A way to get your external IP address
  2. A way to update your DNS record
  3. An application that performs steps 1 and 2 perpetually

Step 1: Getting the IP Address

I have done this and blogged about it several times now. (Feels like I’m repeating myself a bit; I guess I have to find something original to work with. But first I have to finish this project and have closure!)

Since it’s a simple GET request it sounds easy, but I quickly hit the CORS wall when I tried the following bit:

app.factory('ExternalIP', function ($http) {
    // Blocked by the browser: checkip.amazonaws.com doesn't send CORS headers
    return $http.get('http://checkip.amazonaws.com', { cache: false });
});

In my WPF client I can call whatever service I want, whenever I want, but when running inside the browser things are a bit different. So I decided to take a detour and create my own service that allows cross-origin resource sharing.

AWS Lambda & API Gateway

First I thought I could do it without a Lambda function at all by using the HTTP proxy integration and simply returning whatever the external site returns:

Unfortunately this didn’t work because it was returning the IP of the AWS machine that actually runs the API Gateway. So I had to get the client’s IP from the request and send it back from my own Lambda function.

Turns out that in order to get at the request details you need to fiddle with a mapping template and assign the client’s IP address to a variable:
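
For reference, a minimal body mapping template that does this could look like the following (API Gateway exposes the caller’s address as $context.identity.sourceIp):

{
    "ip" : "$context.identity.sourceIp"
}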

This can later be referred to in the Lambda function through the event parameter:

exports.handler = function(event, context) {
    // 'ip' was populated by the API Gateway mapping template above
    context.succeed({
        "ip": event.ip
    });
};

And now that we have our own service, we can enable CORS on it and call it from our client inside the browser:
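
The factory itself barely changes; the endpoint below is just a placeholder for whatever invoke URL API Gateway generates for your deployment:

app.factory('ExternalIP', function ($http) {
    // Hypothetical invoke URL; use the one API Gateway gives you
    return $http.get('https://abc123.execute-api.us-east-1.amazonaws.com/prod/whatismyip', { cache: false });
});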

Step 2: Updating DNS

This bit is very similar to the WPF version. Instead of the AWS .NET SDK I just used the JavaScript SDK. AWS has a great SDK builder which lets you select just the pieces you need:

It also shows whether the service supports CORS. It’s a relief that Route53 does, so we can keep going.

The whole source code is on GitHub but here’s the gist of it: loop through all the subdomains, get all the resource records in the zone, find the matching record and update it with the new IP:

  $scope.updateAllDomains = function() {
      angular.forEach($rootScope.domainList.domains, function(value, key) {
        $scope.updateDomainInfo(value.name, value.zoneId);
      });
  } 
  $scope.updateDomainInfo = function(domainName, zoneId) {
    var options = {
      'accessKeyId': $rootScope.accessKey,
      'secretAccessKey': $rootScope.secretKey
    };
    var route53 = new AWS.Route53(options);
    
    var params = {
      HostedZoneId: zoneId
    };

    route53.listResourceRecordSets(params, function(err, data) {
        if (err) { 
          $rootScope.$emit('rootScope:log', err.message);
          console.log(err.message);
        } else {
          angular.forEach(data.ResourceRecordSets, function(value, key) {
              if (value.Name.slice(0, -1) == domainName) {
                var externalIPAddress = "";
                ExternalIP.then(function(response){
                     externalIPAddress = response.data.ip;
                     $scope.changeIP(domainName, zoneId, externalIPAddress)
                 });
              }
          });
        }
    });
  }
  $scope.changeIP = function(domainName, zoneId, newIPAddress) {
    var options = {
      'accessKeyId': $rootScope.accessKey,
      'secretAccessKey': $rootScope.secretKey
    };

    var route53 = new AWS.Route53(options);
    var params = {
      ChangeBatch: {
        Changes: [
          {
            Action: 'UPSERT',
            ResourceRecordSet: {
              Name: domainName,
              Type: 'A',
              TTL: 300,
              ResourceRecords: [ {
                  Value: newIPAddress
                }
              ]
            }
          }
        ]
      },
      HostedZoneId: zoneId
    };

    route53.changeResourceRecordSets(params, function(err, data) {
      if (err) { 
        $rootScope.$emit('rootScope:log', err.message); 
      }
      else { 
        var logMessage = "Updated domain: " + domainName + " ZoneID: " + zoneId + " with IP Address: " + newIPAddress;
        $rootScope.$emit('rootScope:log', logMessage);
      }
    });
  }

The only part that tripped me up was that I wasn’t setting the TTL in the changeResourceRecordSets parameters and was getting an error, but I found a StackOverflow question that helped me get past the issue.

Step 3: A tool to bind them

Now the fun part: an AngularJS client to call these services. The UI is straightforward; it basically just requires the user to enter their AWS IAM keys and the domains to update.

I didn’t want to deal with the hassle of sending the keys to a remote server and hosting them securely. Instead I thought it would be simpler to use the browser’s HTML5 local storage. This way the keys never leave the browser.

It also only updates the IP address if it has changed, which saves unnecessary API calls.
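
A minimal sketch of both ideas, with hypothetical storage key names (accessKey, secretKey, lastIp) rather than whatever the actual client uses:

// Keys stay in the browser's local storage and are never sent to a server
localStorage.setItem('accessKey', $rootScope.accessKey);
localStorage.setItem('secretKey', $rootScope.secretKey);

// Skip the Route53 call when the external IP hasn't changed since last time
function shouldUpdate(currentIp) {
    var lastIp = localStorage.getItem('lastIp');
    if (lastIp === currentIp) {
        return false;
    }
    localStorage.setItem('lastIp', currentIp);
    return true;
}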

Also it’s possible to view what’s going on in the event log area.

I guess I can have my closure now and move on!


aws, ssl, aws_certificate_manager, acm

Paying a ton of money for a digital certificate, which costs nothing to generate, has always bugged me. Fortunately it isn’t just me, and recently I heard about Let’s Encrypt.

I was just planning to give it a go, but then I noticed a new service on the AWS Management Console:

Apparently AWS is now issuing free SSL certificates, which was too tempting to pass up, so I decided to dive in.

Enter AWS Certificate Manager

Requesting a certificate takes just seconds as it’s a 3-step process:

First, enter the list of domains you want the certificates for:

Wildcard SSL certificates don’t cover the zone apex, so I had to enter both. (Hey, it’s free, so no complaints here!)

Then review, confirm, and the request is made:

A verification email is sent to the email addresses listed in the confirmation step.

At this point I could define MX records and use Google Apps to create a new user and receive the verification email. The problem is I don’t want all this hassle and certainly don’t need another email account to monitor.

SES to the rescue

I had always considered SES a simple SMTP service for sending emails, but while dabbling with alternatives I realized that we can now receive emails too!

To receive emails you need to verify your domain first. An MX record pointing to the AWS SMTP server must also be added. Fortunately, since everything here is AWS, it can all be done automatically using Route53:
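
For reference, the record it creates is an MX record on the domain whose value looks something like this (the endpoint is region-specific; us-east-1 shown here):

10 inbound-smtp.us-east-1.amazonaws.com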

After this we can move on; we’ll receive a confirmation email once the domain has been verified:

In the next step we decide what to do with the incoming mails. We can bounce them, call a Lambda function, create an SNS notification etc. These all sound fun to experiment with, but in this case I’ll opt for simplicity and just drop them into an S3 bucket.

The great thing is I can even assign a prefix, so I can use a single bucket to collect emails from a bunch of different addresses, all separated into their own folders.

In step 3, we can specify more options. Another pleasant surprise was to see spam and virus protection:

After reviewing everything and confirming, we are ready to receive emails in our bucket. In fact the nice folks at AWS are so considerate that they even sent us a test email already:

Back to certificates

OK, after a short detour we are back to getting our SSL certificate. As I didn’t have my mailbox set up during the validation step, I had to go to the Actions menu and select Resend validation email.

And after requesting it I immediately received the email containing a link to verify ownership of the domain.

After the approval process we get ourselves a nice free wildcard SSL certificate:

Test drive

To leverage the new certificate we need to use CloudFront to create a distribution. Here again we benefit from the integrated services. The certificate we have been issued can be selected from the dropdown list:

So after entering the basics, like the domain name and default page, I created the distribution and pointed the Route53 records to this distribution instead of the S3 bucket.
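
For what it’s worth, that record change can also be made programmatically with the Route53 SDK’s changeResourceRecordSets, much like the DynDns53 client earlier. A minimal sketch with placeholder names (Z2FDTNDATAQYW2 is the fixed hosted zone ID that all CloudFront alias records use):

var route53 = new AWS.Route53();  // credentials configured as before
var params = {
  ChangeBatch: {
    Changes: [{
      Action: 'UPSERT',
      ResourceRecordSet: {
        Name: 'example.com',  // placeholder apex domain
        Type: 'A',
        AliasTarget: {
          // placeholder: your distribution's domain name
          DNSName: 'dxxxxxxxxxxxxx.cloudfront.net',
          // fixed hosted zone ID used by all CloudFront aliases
          HostedZoneId: 'Z2FDTNDATAQYW2',
          EvaluateTargetHealth: false
        }
      }
    }]
  },
  HostedZoneId: 'MY_HOSTED_ZONE_ID'  // placeholder: your zone's ID
};
route53.changeResourceRecordSets(params, function(err, data) { /* ... */ });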

And finally, after waiting (quite a bit) for the CloudFront distribution to be deployed, we can see that little green padlock we’ve been looking forward to seeing:

UPDATE 1 [03/03/2016]

Yesterday I was so excited about discovering this that I didn’t look any further into things like downloading the certificate and using it on my own servers.

Today, unfortunately, I realized that its usability is quite limited: it only works with AWS Elastic Load Balancer and CloudFront. I was hoping to use it with API Gateway, but even though that’s another AWS service it’s not integrated with ACM yet.

I do hope they make the certificate bits available so we can have full control over them and deploy them wherever we want. So I guess Let’s Encrypt is a better option for now, considering this limitation.


dev, csharp, gadget, fitbit, aria
"What gets measured, gets managed." - Peter Drucker

It’s important to have goals, especially SMART goals. The “M” in S.M.A.R.T. stands for Measurable. Having enough data about a process helps tremendously in improving that process. To that end, I started to collect exercise data from my Microsoft Band, which I blogged about here.

Weight tracking is also crucial for me. I used to record my weight manually on a piece of paper but, for obvious reasons, I abandoned that quickly and decided to give the Fitbit Aria a shot.

Fitbit Aria Wi-Fi Smart Scale

The Aria is basically a scale that can connect to your Wi-Fi network and automatically send your weight readings to Fitbit, where they can be viewed via the Fitbit web application.

Setup

Since it doesn’t have a keyboard or any other way to interact with it directly, setup is carried out by running a program on your computer.

It’s mostly just following the steps on the setup tool. You basically let it connect to your Wi-Fi network so that it can synchronize with Fitbit servers.

Putting the scale into setup mode proved to be tricky in the past though. It was also not easy to change the Wi-Fi network, so I had to reset the scale back to factory settings and run the setup tool again.

Getting the data via API

Here comes the fun part! Similar to my MS Band workout demo, I developed a WPF program to get my data from Fitbit’s API. Ultimately the goal is to combine all this data in one application and make sense of it.

Like the MS Health API, Fitbit uses OAuth 2.0 authorization and requires a registered application.

The endpoint that returns weight data accepts a few different formats depending on your needs. As I wanted a range instead of a single day, I used the following format:

https://api.fitbit.com/1/user/{user ID}/body/log/weight/date/{startDate}/{endDate}.json

This call returns an array of the following JSON objects:

{
	"bmi": xx.xx,
	"date": "yyyy-mm-dd",
	"fat": xx.xxxxxxxxxxxxxxx,
	"logId": xxxxxxxxxxxxx,
	"source": "Aria",
	"time": "hh:mm:ss",
	"weight": xx.xx
}

Sample application

The bulk of the application is very similar to the MS Band sample: it first opens an authorization window and, once the user consents to the app being granted some privileges, it uses the access token to retrieve the actual data.

There are a few minor differences though:

  • Unlike the MS Health API, it requires an Authorization header in the authorization code request calls, which is basically the Base64-encoded client ID and client secret:
string base64String = Convert.ToBase64String(Encoding.UTF8.GetBytes($"{Settings.Default.ClientID}:{Settings.Default.ClientSecret}"));
request.AddHeader("Authorization", $"Basic {base64String}");
  • It requires a POST request to the redeem URL. Apparently RestSharp has a weird behaviour here. You’d think a method called AddBody could be used to send the request body, right? Not quite! It doesn’t transmit the header properly, so I kept getting a missing field error. So instead I used AddParameter:
string requestBody = $"client_id={Settings.Default.ClientID}&grant_type=authorization_code&redirect_uri={_redirectUri}&code={code}";
request.AddParameter("application/x-www-form-urlencoded", requestBody, ParameterType.RequestBody);

I found a lot of SO questions and a hilarious blog post addressing the issue. It’s good to know I wasn’t alone in this!

The rest is very straightforward: make the request, parse the JSON and assign the list to the chart:

public void GetWeightData()
{
    var endDate = DateTime.Today.ToString("yyyy-MM-dd");
    var startDate = DateTime.Today.AddDays(-30).ToString("yyyy-MM-dd");
    var url = $"https://api.fitbit.com/1/user/XXXXXX/body/log/weight/date/{startDate}/{endDate}.json";
    var response = SendRequest(url);

    ParseWeightData(response);
}

public void ParseWeightData(string rawContent)
{
    var weightJsonArray = JObject.Parse(rawContent)["weight"].ToArray();
    foreach (var weightJson in weightJsonArray)
    {
        var weight = new FitbitWeightResult();
        weight.Weight = weightJson["weight"]?.Value<decimal>() ?? 0;
        weight.Date = weightJson["date"].Value<DateTime>();
        WeightResultList.Add(weight);
    }
}

And the output is:

Conclusion

So far I’ve managed to collect walking data from the MS Band and weight data from the Fitbit Aria. In this demo I limited the scope to weight data only, but the Fitbit API can also be used to track sleep, exercise and nutrition.

I currently use My Fitness Pal to log what I eat. They too have an API but, even though I’ve requested one twice, they haven’t given me a key yet! The good news is Fitbit has a key and I can get my MFP logs through the Fitbit API. I also log my sleep on Fitbit manually, so the next step is to combine all of this in one application to have a nice overview.
