devaws ses, lambda

A few years ago AWS announced a new SES feature: Incoming Emails. So far I have only used it once to receive domain verification emails to an S3 bucket but haven’t built a meaningful project. In this blog post my goal is to develop a sample project to demonstrate receiving emails with SES and processing those emails automatically by triggering Lambda functions.

As a demo project, I will build a system that automatically responds to the sender with my latest CV, as shown in the diagram below.

Receiving Email with Amazon Simple Email Service

Amazon Simple Email Service (SES) is Amazon’s SMTP server. Its core functionality has always been sending emails, but Amazon kept adding more features such as using templates and receiving emails.

Step 1: Verify a New Domain

First, we need a verified domain to receive emails. If you already have one you can skip this step.

  • Step 1.1: In the SES console, click Domains –> Verify a New Domain
  • Step 1.2: Enter the domain name to verify and click Verify This Domain

  • Step 1.3: In the new dialog click Use Route 53

(This assumes your domain is in Route 53. If not, you have to verify it by other means.)

  • Step 1.4: Make sure you check the Email Receiving Record checkbox and proceed

  • Step 1.5 Confirm verification status

Go back to the Domains page in the SES console and make sure the verification has completed successfully.

In my example, it only took about 2 minutes.

Step 2: Create a Lambda function to send the CV

In the next step we will configure SES to specify what to do with received emails. But first we need the actual Lambda function to do the work. Then we will connect it to SES so that it runs every time an email arrives at a specific address.

  • Step 2.1: Create a Lambda function from scratch

  • Step 2.2: Create an SNS topic

SES will publish emails to this topic. We will do the plumbing and give necessary permissions later.

  • Step 2.3: Create subscription for the Lambda function to SNS topic

Now we tie the topic to our Lambda function by creating a subscription.

  • Step 2.4: Attach necessary permissions to the new role

In my example, I store my CV in an S3 bucket, so the role's policies need to allow receiving SNS notifications, reading from the S3 bucket and sending emails.

By default a new Lambda role comes with AWSLambdaBasicExecutionRole attached to it.

First add this to have read-only access to a single bucket (the actions shown are the typical minimal set for reading objects):

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject",
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::{BUCKET NAME}",
                    "arn:aws:s3:::{BUCKET NAME}/*"
                ]
            }
        ]
    }

Then this to be able to send emails:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "ses:SendEmail",
                    "ses:SendRawEmail"
                ],
                "Resource": "*"
            }
        ]
    }

I like to keep these small, modular policies so that I can reuse them in other projects.

After adding the policies you should be able to see these in your Lambda function’s access list when you refresh the function’s page:

Step 3: Develop the Lambda function

In this example I’m going to use .NET Core 2.0 and C# to create the Lambda function.

  • Step 3.1: Install Lambda templates

On Windows, AWS Lambda function templates come with the AWS Visual Studio extension, but on Mac we have to install them via the command line.

dotnet new -i Amazon.Lambda.Templates::*

  • Step 3.2: Create the Lambda function

dotnet new lambda.EmptyFunction --name SendEmailWithAttachmentFromS3 --profile default --region eu-west-1

  • Step 3.3: Implement the function

Now it’s time for the actual implementation. I’m not going to paste the whole code here; the best place to get it is its GitHub repository.
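The actual function is written in C#; as a rough Python sketch of the same flow (the function names below are mine for illustration, not from the repository):

```python
import json
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def get_sender(event):
    # SES publishes the receipt notification to SNS; the actual SES
    # notification is a JSON string inside the SNS record
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    return message["mail"]["source"]


def build_reply(from_address, to_address, cv_bytes):
    # Build a raw MIME message with the CV (fetched from S3) attached
    msg = MIMEMultipart()
    msg["Subject"] = "My Latest CV"
    msg["From"] = from_address
    msg["To"] = to_address
    msg.attach(MIMEText("Please find my latest CV attached."))
    attachment = MIMEApplication(cv_bytes)
    attachment.add_header("Content-Disposition", "attachment", filename="cv.pdf")
    msg.attach(attachment)
    return msg

# In the real handler: read the CV from S3 with GetObject, then send the
# MIME message via SES SendRawEmail, using the permissions added in Step 2.4
```

The key point is that the Lambda never sees the raw SMTP traffic: it only receives the SNS-wrapped notification, extracts the sender and replies with a raw MIME message.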

  • Step 3.4 Deploy the function

Create an IAM user with access to Lambda deployment and create a profile locally named deploy-lambda-profile.

dotnet restore
dotnet lambda deploy-function send_cv
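The deploy-function command reads its settings from an aws-lambda-tools-defaults.json file in the project folder; a sketch of what it might contain (the values here are illustrative, check the repository for the real one):

```json
{
  "profile": "deploy-lambda-profile",
  "region": "eu-west-1",
  "configuration": "Release",
  "framework": "netcoreapp2.0",
  "function-runtime": "dotnetcore2.0",
  "function-handler": "SendEmailWithAttachmentFromS3::SendEmailWithAttachmentFromS3.Function::FunctionHandler",
  "function-memory-size": 256,
  "function-timeout": 30
}
```

Any of these can also be overridden on the command line.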

Step 4: Create a Receipt Rule

Now that we have a verified domain, we need a rule to receive emails.

In my example project, I’m going to use an email address that will send my latest CV back to the sender’s email address.

  • Step 4.1: In the Email Receiving section click on Rule Sets –> Create a Receipt Rule

  • Step 4.2: Add a recipient

  • Step 4.3: Add an Action

Now we choose what to do when an email is received. In this example I want it to be published to the SNS topic that I created earlier. Since a rule supports multiple actions, I could invoke the Lambda function directly and add more actions later if need be. But I’d rather use a standard approach where all events are published to SNS and the interested parties subscribe to the topic; that way I can change the subscriber or add more processing in the future without touching the rule configuration.

I chose UTF-8 because I’m not expecting any data in the message body, so it doesn’t matter too much in this example.

  • Step 4.4 Give it a name and create the rule.

Step 5: Test end-to-end

Now that it’s all set up, it is time to test.

  • Step 5.1: Send a blank email to the address configured in the receipt rule (or any other address if you’re setting up your own)

  • Step 5.2: A few seconds later, receive an email with the attachment:

The second email is optional. Basically, I created an email subscriber too, so that whenever a blank email is received I get notified by SNS directly. This helps me keep an eye on the traffic, if there is any.


aws certification

I’ve been working with AWS for years. Since I love everything about it and am planning to use it for the foreseeable future, I’ve decided to go ahead and get the official certificates. This is to make sure I’ve covered all the important aspects of AWS fully. It also motivates me to develop more projects and blog posts on it.


There are 2 main categories of tracks:

  • Role-based Certifications
  • Specialty Certifications

The tracks and exam paths to take are shown in the diagram below:

My plan is to start with Cloud Practitioner exam, continue with AWS Solutions Architect track and move on to developer and sysops tracks.


I think it’s important to analyze costs first to assess whether or not this is a journey you want to start.

Individual Exam Costs

|Exam Name|Cost|Notes|
|---|---|---|
|AWS Certified Cloud Practitioner|100 USD|Optional|
|Associate-level exams|150 USD| |
|Professional-level exams|300 USD| |
|Specialty exams|300 USD| |
|Recertification exams|75 USD|Recertification is required every two years for all AWS Certifications|
|Associate-level practice exams|20 USD| |
|Professional-level practice exams|40 USD| |

Total Tracks Costs

|Exam Track|Total Cost|With VAT|Notes|
|---|---|---|---|
|AWS Solutions Architect|450 USD|540 USD| |
|AWS Certified DevOps Engineer|450 USD|540 USD| |
|All Associate Level Exams|300 USD|360 USD|3 Exams|
|All Professional Level Exams|600 USD|720 USD|2 Exams (There’s no professional level for developer; both associate level exams lead to DevOps engineer)|
|All Exams|1150 USD|1380 USD|Includes the optional Cloud Practitioner exam|
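The With VAT column corresponds to a 20% VAT rate (my assumption, inferred from the figures); a quick sanity check:

```python
# Track totals from the table above, in USD
VAT_RATE = 0.20  # assumed rate implied by the With VAT column

tracks = {
    "AWS Solutions Architect": 450,
    "AWS Certified DevOps Engineer": 450,
    "All Associate Level Exams": 300,
    "All Professional Level Exams": 600,
    "All Exams": 1150,
}

# Compute the VAT-inclusive prices
with_vat = {name: round(cost * (1 + VAT_RATE)) for name, cost in tracks.items()}

for name, total in with_vat.items():
    print(f"{name}: {total} USD")
```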

The total cost is not exactly cheap, but I think in the end it’s worth it.

Taking the Exams

It all starts with the AWS Training site. Just sign in with your Amazon account or create a new one. This allows you to take the free online courses. To take the exams you’d need a separate account; I think this is because they partnered with a 3rd party to provide this.

Registration is quite similar. Just provide name and address and search for an exam centre.

Online Training

Free Courses

AWS Training

This is the official certification site of AWS. It allows the user to enroll in courses and view their transcript.

It’s a bit hard to find the actual course after you enrol because you can’t jump to the contents from search results. What you should do is first go to My Transcript; under Current Courses you should see the course and a link that says “Open”. Clicking that link takes you to the actual content.

There is more content on the site. I’ll discover more as I go along.


edX has recently launched 3 free AWS courses.

Various Training Resources


As my favourite saying goes: “It’s not about the destination, it’s about the journey”.

AWS certification for me is not a destination. It simply helps me stay on course and motivated to create more projects and blog posts in a timely manner.

I’m hoping to see this journey to completion. I’ll be posting more on AWS and my journey on certification soon.


devaws api_gateway

API Gateway is Amazon’s managed API service. Serverless architecture is growing on me more every day. I think leveraging infinite auto-scaling and only paying for what you use makes perfect sense. But for a customer-facing API, the first thing that needs to be set up is a custom domain, which might be a bit involved when SSL certificates come into play. In this post I’d like to create an API from scratch and assign a custom domain name to it.

Step 1: Create an API

Creating an API is straightforward: Just assign a meaningful name and description. However, to me it was a bit confusing when it came to choosing the endpoint type.

The two options provided are: Regional and Edge optimized.

  • Edge-optimized API endpoint: The API is deployed to the specified region and a CloudFront distribution is created. API requests are routed to the nearest CloudFront Point of Presence (POP).

  • Regional API endpoint: This type was added in November 2017. The main goal is to prevent a roundtrip for in-region requests. API requests are targeted directly to the region-specific API Gateway without going through any CloudFront distribution.

Custom domain names are supported for both endpoint types.

In this example, I’ll use Regional endpoint type. For further reading, here’s a nice blog post about endpoint types.

Step 2: Create a resource and method

For demonstration purposes I created a resource called customer and a GET method which calls a mock endpoint.

Step 3: Deploy the API

From the Actions menu in Resources tab, I selected Deploy API.

Deployment requires a stage. Since this is the first deployment, I had to create a new stage called test. A new stage can be created while deploying. After the deployment test stage looks like this:

At this point API Gateway assigned a non-user-friendly URL already:

This is the root domain of the API. So I was able to call the endpoint like this:

My goal was to get it working with my own domain such as:

Step 4: Generate the certificate in ACM

I’m using Route53 for all my domains and using ACM (AWS Certificate Manager) for generating SSL/TLS certificates. Before creating the custom domain name I needed my certificate available.

The wizard is quite simple: I just added the subdomain for the API and selected DNS validation.

After the review comes the validation process. Since I’m using Route 53 and ACM plays well with it, it simply provided a nice big button that said Create record in Route 53.

After clicking and confirming I got this confirmation message:

After waiting for about 3 minutes, the certificate had already been issued:

Step 5: Create Custom Domain Name in API Gateway

Now that the certificate was ready I had to go back to API Gateway to create the custom domain name and associate it with the newly created cert.

First, I clicked on Custom Domain Names on the left menu and filled out the details. Make sure your subdomain matches the one the certificate was generated for.

I assigned /test path to the test stage I had created earlier. I will use root path for the production stage when I deploy the final version.

After creating the custom domain, take note of Target Domain Name generated by AWS.

Step 6: Create A Record in Route 53

I had to also point DNS to the domain generated by API Gateway.

Since I was using a regional endpoint I had to map the custom domain name to the target domain name mentioned in the previous step.

Now the problem was when I tried to do it via AWS Management Console, it failed as explained in this StackOverflow answer.

So I had to do it via CLI as below:

aws route53 change-resource-record-sets --hosted-zone-id {ZONE_ID_OF_MY_DOMAIN} --change-batch file://changedns.json

whereas the contents of changedns.json were:

    {
      "Changes": [
        {
          "Action": "CREATE",
          "ResourceRecordSet": {
            "Name": "",
            "Type": "A",
            "AliasTarget": {
              "DNSName": "",
              "HostedZoneId": "ZJ5UAJN8Y3Z2Q",
              "EvaluateTargetHealth": false
            }
          }
        }
      ]
    }
In the JSON above, DNSName is the Target Domain Name created by AWS in Step 5. The HostedZoneId (ZJ5UAJN8Y3Z2Q), on the other hand, is the zone ID of API Gateway, which is listed here.
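If you prefer generating the change batch programmatically instead of editing the JSON by hand, here is a minimal Python sketch (the build_change_batch helper and the example domain values are mine, not part of the original setup):

```python
import json


def build_change_batch(record_name, target_domain, apigw_zone_id):
    """Build the Route 53 change batch for an alias A record.

    record_name:   your custom domain (the API subdomain)
    target_domain: the Target Domain Name generated by API Gateway
    apigw_zone_id: API Gateway's hosted zone ID (NOT your domain's zone ID)
    """
    return {
        "Changes": [
            {
                "Action": "CREATE",
                "ResourceRecordSet": {
                    "Name": record_name,
                    "Type": "A",
                    "AliasTarget": {
                        "DNSName": target_domain,
                        "HostedZoneId": apigw_zone_id,
                        "EvaluateTargetHealth": False,
                    },
                },
            }
        ]
    }


# Write it out as changedns.json for the CLI command above
batch = build_change_batch("api.example.com.",
                           "d-abc123.execute-api.eu-west-1.amazonaws.com.",
                           "ZJ5UAJN8Y3Z2Q")
with open("changedns.json", "w") as f:
    json.dump(batch, f, indent=2)
```

Keeping the two zone IDs as named parameters makes it harder to mix them up, which is exactly the mistake the troubleshooting section below is about.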


If you are having issues running the command above, it might mean you don’t have a default profile set up with permissions to change DNS settings. To fix that:

1. Create a new user with no permissions

Go to IAM console and create a new user. Skip all the steps and download the credentials as .csv in the last step.

2. Assign required permissions

Create a new policy using the JSON template below and attach it to the new user

    "Version": "2012-10-17",
    "Statement": [
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "route53:ChangeResourceRecordSets",
            "Resource": "arn:aws:route53:::hostedzone/{ZONE ID OF YOUR DOMAIN}"

3. Create a new profile for the user

aws configure --profile temp-route53-profile

and set the Access/Secret keys along with the region of your hosted zone.

Then run the first CLI command, providing the profile name:

aws route53 change-resource-record-sets --hosted-zone-id {ZONE_ID_OF_MY_DOMAIN} --change-batch file://changedns.json --profile temp-route53-profile

An important point here is to get your hosted zone ID from Route 53. API Gateway also shows a hosted zone ID, but that is actually the AWS API Gateway zone ID. We use that zone ID inside the DNS configuration (the changedns.json file in this example), but the hosted zone ID we pass on the command line is our own domain’s, which can be found in Route 53.

Step 7: Test

So after creating the alias for my API, I visited the URL in a browser and got the green padlock, indicating that the correct SSL certificate was loaded.