
hobby raspberry_pi, game

I used to love my Commodore 64 when I was a child. Now that we have the ability to emulate old machines and relive those memories, I decided to give it a go. Apparently building a MAME machine is a popular project. I’ve found this C64 emulator: http://www.mascal.it/rpi64_e.html.

It’s pretty straightforward: download the image, write it to an SD card using a tool (I used Win32DiskImager), then upload your game ROMs to the RPi and let the good times roll!

One of my favourite games was Donkey Kong so I decided to start with that.

Donkey Kong

It loaded nice and dandy, but I couldn’t play it with the keyboard. So I’ll either buy an old joystick and figure out a way to connect it to the RPi, or dig a little deeper to find out the key mapping.

aws s3

We all know backups are good, but most of the time you won’t need a backup from a year ago. Just keep enough copies to recover from a possible failure and get rid of the rest. The other day I was working on cleaning up old security camera images, which become meaningless very quickly. The images are uploaded to Amazon S3. My first approach was to delete the older ones with a scheduled script (roughly like the sketch below), but then I discovered an easier and more effective way.
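For reference, the scheduled-script approach would look roughly like this. It’s a minimal sketch using boto3, with a hypothetical bucket name, prefix, and 30-day retention window picked for illustration; it isn’t the exact script I ran.

```python
import boto3
from datetime import datetime, timedelta, timezone

# Hypothetical bucket/prefix names and retention window, for illustration only.
BUCKET = "my-camera-uploads"
PREFIX = "snapshots/"
MAX_AGE = timedelta(days=30)

s3 = boto3.client("s3")
cutoff = datetime.now(timezone.utc) - MAX_AGE

# Page through the objects and delete anything older than the cutoff.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    old_keys = [
        {"Key": obj["Key"]}
        for obj in page.get("Contents", [])
        if obj["LastModified"] < cutoff
    ]
    if old_keys:
        # delete_objects accepts up to 1000 keys per call, matching the page size.
        s3.delete_objects(Bucket=BUCKET, Delete={"Objects": old_keys})
```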

Let AWS do the work!

It’s possible to loop through thousands of objects and delete them, but the alternative is to set an expiration date for each object. To activate this, select the folder and make sure the properties panel is visible. Expand the Lifecycle section, click Add rule and set the number of days until expiration. Make sure “Apply to Entire Bucket” is checked so that any newly uploaded files comply with this rule. It’s as easy as that!
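If you’d rather script it than click through the console, the same kind of rule can be created with boto3. A minimal sketch, assuming a hypothetical bucket name and an example 30-day expiration:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket name; the 30-day expiration is an example value.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-camera-uploads",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-images",
                "Filter": {"Prefix": ""},  # empty prefix = apply to the entire bucket
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            }
        ]
    },
)
```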

S3 Lifecycle

One thing to note is that this process runs once a day, so don’t expect your bucket to get cleaned up immediately. But don’t forget to check the next day to ensure everything is working as expected!
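One way to check on it is to read the rule back, again with boto3 and the same hypothetical bucket name as above:

```python
import boto3

s3 = boto3.client("s3")

# Read back the lifecycle rules to confirm the expiration rule is enabled.
resp = s3.get_bucket_lifecycle_configuration(Bucket="my-camera-uploads")
for rule in resp["Rules"]:
    print(rule["ID"], rule["Status"], rule.get("Expiration"))
```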

Resources

devops aws, github_pages, static_website

Sometimes you need a microsite with no server-side code; all you need is to display a pretty-looking entry page. In such scenarios you don’t need to use your own servers and waste precious resources on such trivial sites. The two ways to achieve this (that I know of) are:

  • Using Amazon Web Services S3
  • Using GitHub Pages

Both methods are very well documented. You can find the links to the official tutorials in the Resources section below.

AWS S3

First, my favourite IT company: Amazon! The S3 method requires creating a bucket with the name of your domain or subdomain. In the bucket’s properties, enable Static Website Hosting and point it to your index document. In order for this to work, you have to use AWS Route 53 as your DNS provider. In Route 53, all you have to do is define an A record as an alias and select the S3 bucket that contains your site. If you have multiple accounts, make sure the Route 53 DNS records and the S3 bucket are under the same account; otherwise you cannot point to the bucket as an endpoint.
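If you prefer to script the bucket side, here’s a minimal boto3 sketch. The bucket name and the index/error document names are assumptions for illustration; the Route 53 alias record itself is easiest to add in the console as described above.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket named after the domain you want to serve.
bucket = "www.example.com"

# Turn on static website hosting for the bucket.
# Note: the bucket also needs a policy allowing public read access to the objects.
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},  # optional error page
    },
)
```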

GitHub Pages

The GitHub method is also quite easy. All it takes is creating a public repository, creating a branch called “gh-pages” and checking in your source code. To let GitHub know that you want to host a site there, you have to create a file called CNAME that contains the domain name. Then, in your DNS settings, you have to point your site to GitHub’s IP address. The downside of this method is that, obviously, your site will stop working if GitHub decides to change their IP address.
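As a quick sanity check after changing DNS, a small sketch like this can confirm that the A record resolves to the address you configured. The domain and expected IP below are placeholders, not GitHub’s actual address.

```python
import socket

# Placeholder values for illustration; use your own domain and the IP you configured.
DOMAIN = "www.example.com"
EXPECTED_IPS = {"192.0.2.1"}  # documentation/example address, not a real GitHub IP

# Resolve the domain and compare against the configured address(es).
_, _, addresses = socket.gethostbyname_ex(DOMAIN)
print("resolved:", addresses)
print("matches expected:", bool(set(addresses) & EXPECTED_IPS))
```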

Resources