Virtualization

VMware is one of my favourite IT companies. They specialize in one area and they create very nice products. And they mind their own business; you don't read about them in patent dispute news. As virtualization is the key technology behind cloud computing, VMware is, in a way, one of the pioneers that made it happen. Microsoft is said to be advancing with Hyper-V 3.0, but I'll stick with VMware Workstation for now. As of version 8.0, VMware Workstation comes with a cool feature called VM Sharing. As the name implies, you can share a whole machine, connect to it from another Workstation application and manage it as if it were a local machine. So if you need to access a virtual machine from multiple computers, you can do so without creating multiple copies of the machine. All you have to do is open the VM you want to share and select VM -> Manage -> Share. Keep in mind that the machine must be powered off.

VMWare

The sharing wizard is very simple. It asks whether you want to clone the machine or move it under the shared VMs folder. I like moving it because I don't want to deal with multiple copies. Then, on the client side, select File -> Connect to Server.

VMWare

Then provide the hostname / IP address along with administrator credentials, and you'll see the shared VMs under the (not surprisingly) Shared VMs section at the bottom of the left menu.

VMWare

The rest is exactly the same as the regular process. You can manage the remote virtual machine as if it resided in your local environment.

VMWare

Amazon Web Services, Cloud Computing, Development, Tips & Tricks

AWS must be short for awesome! I love using it. It makes managing virtual machines so much easier, yet provides full power to the user through its API. Thanks to Jeff Bezos's vision, every function you see in the management console can be accessed via the API as well. Back in 2002 Jeff Bezos mandated that all teams expose their data and functionality through service interfaces. This approach makes complete sense: it makes separating layers much easier and makes the code testable. That's why I'm currently big on ServiceStack and Web API, but that's a discussion for another post. In this post I'd like to share some of the tips & tricks I picked up during my involvement with AWS. Of course, as with many IT-related things, this is an ongoing process and I may post a sequel in the future. Currently my tips are as follows:

TIP 01: Always create production servers with termination protection on. If there is one thing I don't like about AWS, it's that the management console gives you no way of separating production machines from test/staging ones. So first use a clear naming convention to distinguish them, but sometimes that's not enough. In the heat of the moment you can attempt to stop or terminate a production instance. If you don't have termination protection enabled, that attempt can end in tragedy; if you have it on, simply nothing happens and you get to keep your job. If you forgot to turn it on while creating an instance, you can always change it by right-clicking the instance and selecting Change Termination Protection.

AWS Termination Protection
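
If you script your environment, the same protection can be flipped from the command line. Here's a minimal sketch using the AWS CLI (assuming it is installed and configured; the instance ID is a placeholder):

aws ec2 modify-instance-attribute --instance-id {instance id} --disable-api-termination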

TIP 02: You can change the instance type in a few minutes. One of my favourite features is that you can stop an instance and change its type. This way you can upgrade or downgrade a machine within minutes. So don't worry if you are not sure what instance size a specific job needs. Just ballpark it, observe, and upgrade or downgrade at an idle time.
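
As a sketch, the whole stop/resize/start cycle can also be done with the AWS CLI (the instance ID and target type below are placeholders):

aws ec2 stop-instances --instance-ids {instance id}
aws ec2 modify-instance-attribute --instance-id {instance id} --instance-type "{\"Value\": \"m1.large\"}"
aws ec2 start-instances --instance-ids {instance id}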

TIP 03: Use auto-scaling. This feature is not available via the management console, but it is available through the API. You could write your own application, but it's even easier to use the command-line developer tools. Basically you create one scaling policy for scaling up and one for scaling down. You define the alarm conditions, and when those conditions are met the policy you specify is executed. This way, if your web servers are under heavy load, for example, you can automatically launch another machine. They all have to be under the same load balancer, of course. You can find more about auto-scaling here: http://aws.amazon.com/autoscaling/
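
As an illustration, here's roughly what a scale-up policy and the CloudWatch alarm that fires it look like with the AWS CLI; the group and policy names are made up for the example:

aws autoscaling put-scaling-policy --auto-scaling-group-name {group name} --policy-name scale-up --adjustment-type ChangeInCapacity --scaling-adjustment 1

aws cloudwatch put-metric-alarm --alarm-name {group name}-high-cpu --namespace AWS/EC2 --metric-name CPUUtilization --statistic Average --period 300 --evaluation-periods 2 --threshold 70 --comparison-operator GreaterThanThreshold --dimensions Name=AutoScalingGroupName,Value={group name} --alarm-actions {policy ARN returned by the previous command}

With this pair in place, two consecutive 5-minute periods above 70% CPU across the group add one instance; a mirror-image policy with --scaling-adjustment -1 handles scaling down.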

TIP 04: Use Multi-AZ (Availability Zone) deployment. Regions contain several availability zones, which are isolated data centres. While a single instance cannot span data centres, you can spread your instances across different AZs, so if one data centre goes down the other instances can still respond. It's the simple principle of not putting all your eggs in one basket.
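
For illustration, spreading instances across AZs is just a matter of the placement parameter at launch (the AMI ID and zone names are placeholders):

aws ec2 run-instances --image-id {AMI id} --instance-type m1.small --placement AvailabilityZone=eu-west-1a
aws ec2 run-instances --image-id {AMI id} --instance-type m1.small --placement AvailabilityZone=eu-west-1b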

TIP 05: Customize the management console. The AWS management console comes with a cool feature: it lets you pin your favourite services to the top of the page for easy access. There are a bunch of services, but most likely you'll need EC2 and S3 available at all times. At least I do. You can pin them by simply dragging the service name and dropping it onto the top bar. After that, they are always one click away.

AWS Customize Menu

TIP 06: Change the disk size while launching the instance. This is especially handy for Windows instances, as they demand more space than Linux ones. The default size for a Windows server is 35GB. That's actually quite enough for a standard Windows installation, but I guess Amazon is reserving some of the space for some reason, because when you launch the machine you only get around 3GB of free disk space, which to me sounds terrifying. If a log file gets a little out of hand, it can bring down the whole machine. So it's best to get some free space upfront, at least for the peace of mind if nothing else.

AWS Change Disk Size

AWS Change Disk Size
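
The same can be done when launching from the command line by overriding the root volume's block device mapping. A sketch with the AWS CLI (the AMI ID is a placeholder, and /dev/sda1 assumes the usual Windows root device):

aws ec2 run-instances --image-id {AMI id} --instance-type m1.small --block-device-mappings "[{\"DeviceName\":\"/dev/sda1\",\"Ebs\":{\"VolumeSize\":100}}]"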

TIP 07: Don't forget to delete manually attached EBS volumes. When you terminate an instance, make sure you delete all attached EBS volumes that are not set to auto-delete. The default volume that comes with the instance has the Delete on termination option checked in the wizard, so it is cleaned up automatically. But if you create a volume manually and attach it to an instance, there is no option to set this flag, so you have to delete it manually. AWS is kind enough to warn you about them when you terminate the instance. If you don't take care of them immediately and you have auto-scaling, you may end up terminating lots of instances and leaving behind unused disks that you keep paying for.

AWS Delete Instance
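
Orphaned volumes are also easy to hunt down from the command line, since unattached volumes report the status "available". A quick sketch with the AWS CLI:

aws ec2 describe-volumes --filters Name=status,Values=available
aws ec2 delete-volume --volume-id {volume id}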

TIP 08: Reserve as early as you can. This is another budget tip. If you are certain about the size of an instance, buy a reserved instance of that type. A reserved instance is not a technical concept: when you buy one, you simply start paying less by the hour for an instance of that type. For a comparison of how much you can save, check out: http://aws.amazon.com/pricing/ec2/

Development

It's been a while since I started using StyleCop in my projects. Last year I managed to sneak it into my company's projects as well. Applying it to existing projects and fixing all the errors was a tiring process at first, but I believe it was worth it. It really helps with consistency: regardless of who wrote a certain block of code, it's very easy to read because everybody adheres to the same rules across the company. Here are a few tips for managing this:

01. Force StyleCop warnings to be treated as errors. Actually, I hate warnings in general. That's why I set "treat warnings as errors" to All on the projects I work on. This helps eliminate many potential bugs before they become an issue.

Treat warnings as errors

Unfortunately, StyleCop warnings are not included in this. But with a little tweak we can turn this feature on for StyleCop warnings as well. Just add the following line to your project's .csproj file inside the first PropertyGroup tag:

<StyleCopTreatErrorsAsWarnings>false</StyleCopTreatErrorsAsWarnings>

The wording is the opposite of Visual Studio's (treat errors as warnings instead of the other way around), so we have to set this to false. After reloading the project, you won't be able to build successfully without fixing all the StyleCop rule violations (which is a good thing!).

02. Integrate StyleCop with MSBuild. Naturally, if the process is not automatic it won't work. If, as a developer, it's left to me to right-click the project and run StyleCop manually, I'd forget after a few times. The easiest way to integrate it with MSBuild is adding the StyleCop.MSBuild NuGet package to your project. Alternatively, if you have installed the full StyleCop application, there is a StyleCop.Targets file under your installation directory. By importing that file into the project you can achieve MSBuild integration.

For a multi-developer environment, it's best to use a fixed path so that when someone new starts working on the project they can still build it. To accomplish that, we mapped the R drive to a folder that contains the targets file so the build doesn't break. Needless to say, new developers have to do the mapping to make this work.
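
The import itself is a single line inside the .csproj file. With the mapped-drive approach it could look like this (the exact folder under R: is hypothetical; use whatever path you mapped):

<Import Project="R:\StyleCop\StyleCop.targets" />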

03. Run StyleCop on the server as well. The problem with manually enabling the treat-warnings-as-errors feature on the developer's system is that it can easily be forgotten or temporarily disabled for some reason. When a developer forgets to re-enable it, they can check in code that violates the code convention rules. To avoid that, we should reject such code at source control during the check-in process. This is where SVNStyleCop comes in. As the name implies, this solution works only for SVN; I haven't looked into other source control systems like Git or TFS for this feature yet. You can get SVNStyleCop here: http://svnstylecop.codeplex.com/

The way it works is quite simple, and the official page has a good tutorial about it. Mainly, you override the pre-commit hook and run StyleCop before the code is committed. The downside is that you have to maintain a separate copy of the rules and StyleCop files, so when you update your rules you have to remember to update them on the server as well.

04. Use symbolic links to maintain one global rule set. Windows Vista (and above) comes with a handy utility called mklink. By entering the following command you can create a link to the Settings.StyleCop file anywhere you please (note: the /H switch actually creates a hard link, which only works within the same volume; omit it for a true symbolic link).

mklink /H Settings.StyleCop {Path for the actual file}

This way all projects use the same settings file. The problem is that it's a tad cumbersome, especially if your solution contains lots of projects; a small loop can help, as sketched below.
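
For example, a one-off loop at the command prompt could create the link for every project folder in one go (the projects root is a placeholder; without a switch, mklink creates a file symbolic link, which requires an elevated prompt):

for /D %p in ({projects root}\*) do mklink "%p\Settings.StyleCop" "{Path for the actual file}"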

05. A better approach for one rule set to rule them all. I was pondering how to minimize the effort of deploying StyleCop, and it hit me: our beloved NuGet could take care of this as well. StyleCop already has a package in the official NuGet repository, but the problem is that it comes with its own StyleCop rule file, so it's not quite suitable for a team environment. It's not ideal even for a single developer, because all projects would have different rules, which can quickly become a maintenance nightmare. The idea is to create a package that contains the StyleCop rules and libraries. When the package is installed, it copies the libraries, rules and targets file under the project. An install script can also add the import element and the treat-warnings-as-errors setting mentioned in tips 1 & 2. The advantages of this method are:

  • All projects installing the package use the same rule set, downloaded from the server
  • MSBuild integration is done automatically
  • The treat-warnings-as-errors update is done automatically
  • No configuration needed (e.g. mapping drives, creating symbolic links, etc.)

The disadvantage is that if the rules are updated, the package needs to be re-installed for the projects. It's still not perfect, but compared to the other methods I think it's a neat way of distributing and enforcing StyleCop rules.

Tips & Tricks

I like Windows Live Writer and I use it for blogging. The problem is that I start multiple posts at once, take some notes on them and save them as drafts. Sometimes, when I'm on a different machine, I want to add some notes to the existing drafts, but (you guessed it) the drafts are saved locally on another machine. I already have Dropbox installed on almost all my machines, so I decided to harness it for the task.

STEP 01: Delete the My Weblog Posts folder on the destination machine. The local folder is created automatically under %UserProfile%\Documents\My Weblog Posts. Delete this folder, making sure Live Writer is closed before you do.

STEP 02: Create a directory junction. A directory junction is a mapping to another folder. In Windows 7 you can use the mklink command to create junctions as well as symbolic and hard links (strictly speaking, the /D switch below creates a directory symbolic link; use /J for a true junction):

mklink /D "%UserProfile%\Documents\My Weblog Posts" "{PATH_TO_DROPBOX_ROOT}\My Weblog Posts"

Enter the correct path to your Dropbox folder and that's it. Now you can enjoy the ease of synchronized blog drafts.

Security

I learned a neat trick to force Windows to check for a plugged-in USB device before allowing you to log on to the system. The tool to use for this is syskey, an ancient utility introduced with Windows NT SP3. Here's how to do it:

  1. Insert your USB drive. As syskey only supports floppy disks, change the drive letter to A.

  2. Run syskey (From command prompt or by pressing WinKey + R then entering syskey)

  3. Select Store Startup Key on Floppy Disk

SysKey

After you restart the machine, Windows will check for your “floppy” USB drive; if it is not there, it will display the error message: “This computer is configured to use a floppy disk during startup. Please insert the disk and click OK”. After you insert the disk you can log on by entering your password.

Testing

Know thy limits! This is especially important when you're developing a system that expects high traffic. Moving systems onto the cloud makes it easier to adapt and scale out to match the load, but you have to prepare for node failures and sudden traffic spikes. You also have to make sure your system stays responsive under sustained heavy load. Below are 3 tools I recommend for testing your system against such situations:

01. JMeter

Apache JMeter is a Java-based open-source desktop application. I posted a basic introduction to JMeter here, but it has many advanced features which I'm planning to cover in a post in the near future.

02. Siege

Siege is an HTTP load testing tool. It's not as complex as JMeter, but it does the job well and is very simple to use. It runs on UNIX variants but not on Windows. It can be obtained from here.
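
A typical run looks like this sketch: 25 concurrent simulated users hammering a URL for one minute (the URL is a placeholder):

siege -c 25 -t 1M http://www.example.com/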

03. Chaos Monkey

Chaos Monkey

Originally developed by Netflix and open-sourced later, it is an AWS-specific tool. What it does is connect to your AWS system and terminate instances randomly, so that you can observe your system in such worst-case scenarios. The good thing about it is that it selects its “victims” by looking at a tag you provide. So if you don't want it to touch a critical single node, such as a database server, you can easily exclude it. The source code can be downloaded from here.

Security

Yubikey

I love my Yubikey so much that I recently bought another one. I couldn’t find a good use for it yet but I’m sure I will someday :-)

If you don't know what a Yubikey is, check out its vendor here. Basically, it is a one-time password (OTP) generator that presents itself as a USB input device. It doesn't require batteries to operate, so you can use one anywhere without worrying about such issues. I'm trying to incorporate it into my daily life so that I can leverage two-factor authentication as much as possible.

Today I found another use: the Yubikey WordPress plugin. Using this plugin, I can now log in to my blog with my password plus an OTP generated by the Yubikey. Yubikey has a web API, and the plugin calls that API to authenticate your device. To learn more about the settings, visit the plugin's site: http://wordpress.org/extend/plugins/yubikey-plugin/installation/

Testing

One of the key goals when developing a web application is to make it scalable, meaning it should handle lots of traffic without degrading performance. But most of the time we only care about performance when it becomes a problem, and by then it's generally too late to make radical design changes. Therefore, upfront automated load testing is very helpful for gauging your application's performance and being aware of its limits. One popular tool used for load testing is JMeter.

JMeter Basics

  • Thread Group: Each thread acts like a single user. All elements must be under a thread group.
  • Listener: Provides access to the information gathered by JMeter. Some listener examples are Aggregate Report, Graph Results and Summary Report.
  • Logical Controller: Lets you add constructs to control the flow of your tests, such as If, While and ForEach.
  • Sampler: Tells JMeter to send requests to the server and wait for a response.

When you launch JMeter there are 2 items in the left pane: Test Plan and Workbench. Test Plan is the real deal; it is the actual sequence of events that are fired. Workbench is where you can store test elements.

Creating a load test plan can be accomplished in 2 simple steps:

  1. Create a thread group: Everything runs under a thread group. Think of each thread as a user.

    JMeter

  2. Insert an HTTP request: Set the host name and page you want to call.

    JMeter

That's it! If all you need is to create some heavy load, you can create a few different HTTP requests and start bombarding your server right away.
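
Once you've saved the plan, you don't even need the GUI for the actual run; JMeter's non-GUI mode is much lighter on resources. A sketch (file names are placeholders):

jmeter -n -t {test plan}.jmx -l results.jtl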

A bit of trivia: JMeter is mentioned in the book “We Are Anonymous”. Apparently it can also be used as a DDoS tool!

Anonymous


Online Education

Last week another online course, An Introduction to Computer Networks, started at Stanford University. It began on the 8th of October and they have released a good deal of material for the first week. I hope I'll follow it through to the end. If you're interested, you'd better hurry, because it's not easy to catch up once the videos pile up!

Here’s the link to access the site: https://class.stanford.edu/networking

UPDATE: The above link seems to have stopped working. This should be the current one now: Stanford CS144 Networking Class

Productivity

Focusing on tasks and getting them done is crucial. This simple fact is so easy to comprehend it's a no-brainer, but the reality is that it's much harder to accomplish than it sounds. I've been seeking methods to organize my tasks for a long time. My favourite approach is J.D. Meier's system, which he describes in his book Getting Results the Agile Way (http://gettingresults.com). It's freely available on that site. You can also purchase a hard copy here.

The key to this system is choosing the 3 most important tasks every day. This is very simple and easy to apply to real life. Another important aspect is the 3 weekly goals, which are of course more complex than the daily ones. Reviewing your progress on Fridays is important too. He calls this pattern Monday Vision, Daily Outcomes and Friday Reflection. This approach is very effective and realistic, and the key to succeeding with it is being realistic yourself. Know your limitations and don't stuff your lists with everything you want to get done; it's impossible to complete everything at once. If you pick 3 tasks and actually get them done, you feel better about yourself because the progress is visible.

I also recommend an excellent episode of Hanselminutes, which can be found here. That episode introduced me to the Pomodoro Technique, a simple and effective way to boost performance and focus. The idea is to divide your tasks into small work units called Pomodoros, which are 25 minutes each (that's the default value; you can change it to your preference). During those 25 minutes you sever all ties with the outside world (no email, no Twitter, no nothing!) and focus on one task only. As focus-impaired people like me know, this is not easy to do. I tried setting it to an hour, hoping I'd get a big task done without any disruptions, but ended up getting lost along the way. So I believe the default value is quite realistic. After you complete a Pomodoro, you take a 5-minute break. I try to think of tasks in terms of Pomodoros; it helps me plan my daily outcomes too.

Site news

I've been postponing this for a long time, but I finally did it: I started using AWS (Amazon Web Services). My blog doesn't get much traffic, so I don't actually need the scalability that is EC2's strongest point, but I wanted to play with the cloud and decided to move my blog first. It's quite easy to do, and I'll explain how to migrate your existing blog to the cloud within minutes:

STEP 01: Backup your existing blog

mysqldump -u root -p {database name} > blog_backup.sql

Download your WordPress installation folder and the backup you just created from your existing server. I used WinSCP to get my files; it can be downloaded here.

STEP 02: Create AWS EC2 instance

The fun part starts now. Log in to your AWS account and go to the EC2 console. Click Launch Instance and follow the wizard. For my blog I selected a Micro instance, but it depends on your needs. I selected a 64-bit Amazon Linux AMI for the instance.

STEP 03: Assign an IP to your server

Our machine has just started running, but to update our domain's DNS records we need an IP. On the left menu, click the Elastic IPs link and allocate a new IP address. Elastic IPs are free as long as they are associated with a running instance (you pay a small hourly fee for unattached ones).

AWS

Right-click on the IP, select Associate and choose your instance. After this step we have a running machine with a public IP.
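
For the record, both actions are scriptable too. A sketch with the AWS CLI, using the classic (non-VPC) addressing of that era:

aws ec2 allocate-address
aws ec2 associate-address --instance-id {instance id} --public-ip {allocated IP}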

STEP 04: SSH into the machine

SSH is enabled by default, and you must have created a keypair to access your machine during Step 2. I use PuTTY as my SSH client, which can be downloaded from here. It's best to switch to root during the installations, so type:

sudo su

STEP 05: Install required programs

First install Apache:

yum install httpd

Then PHP:

yum install php php-mysql

Then MySQL:

yum install mysql-server

If you use SSL like me you also need to install SSL module for Apache:

yum install mod_ssl

Start Apache and MySQL

service httpd start

service mysqld start

STEP 06: Customize MySQL and import your blog

  1. Run the following command to set root password and harden the default installation

    mysql_secure_installation

  2. Login to MySQL

    mysql -u root -p

  3. Create your database, user and grant access to that user

    create database {database name};

    create user '{user name}'@'localhost' identified by '{password}';

    grant all privileges on {database name}.* to '{user name}'@'localhost' with grant option;

  4. Switch to database and import data

    use {database name};

    source {path/to/mysqldump file you uploaded}

STEP 07: Copy blog files

Copy the WordPress files you uploaded to /var/www/html/{directory name}.
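
Assuming you uploaded the files to your home directory, the copy boils down to something like this (paths are placeholders; on Amazon Linux, Apache runs as the apache user):

mkdir -p /var/www/html/{directory name}
cp -R ~/wordpress/* /var/www/html/{directory name}/
chown -R apache:apache /var/www/html/{directory name}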

STEP 08: Configure Apache

Enter the following command to edit the configuration file:

vi /etc/httpd/conf/httpd.conf

Go to the end of the file and create a new virtual host like this:

NameVirtualHost *:80
<VirtualHost *:80>
	ServerAdmin webmaster@localhost
	DocumentRoot /var/www/html/{directory name}
	ServerName {Your domain name}
	ServerAlias www.{Your domain name}
</VirtualHost>

And restart the Apache service:

service httpd restart

STEP 09: Enable access to FTP, HTTP and HTTPS

One last step before testing your blog is opening ports 21 (for installing themes, plugins, etc.), 80 (for viewing!) and 443 (if you're going to use SSL) in the AWS EC2 console. To do this, click Security Groups on the left menu, add the ports, press Add Rule and then Apply Rule Changes.
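
If you prefer the command line, the same rules can be added with the AWS CLI. A sketch (the security group name is a placeholder):

aws ec2 authorize-security-group-ingress --group-name {security group} --protocol tcp --port 21 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-name {security group} --protocol tcp --port 80 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-name {security group} --protocol tcp --port 443 --cidr 0.0.0.0/0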

STEP 10: Install FTP server

  1. Enter the following command and install the FTP server:

    yum install vsftpd

  2. Create a certificate to be used with FTPS connections:

    openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout /etc/vsftpd/vsftpd.pem -out /etc/vsftpd/vsftpd.pem

  3. Edit the configuration file:

    vi /etc/vsftpd/vsftpd.conf

    Disable anonymous access and add these lines to the end of the file. Then save the file and exit:

     ssl_enable=YES
     allow_anon_ssl=NO
     force_local_data_ssl=NO
     force_local_logins_ssl=NO
     ssl_tlsv1=YES
     ssl_sslv2=NO
     ssl_sslv3=NO
     rsa_cert_file=/etc/vsftpd/vsftpd.pem
    
  4. Start the service

    service vsftpd start

  5. Create a user for FTP access and set the password

    useradd {user name}
    passwd {user name}

STEP 11 (Optional): Store the FTP credentials in wp-config.php

If you don’t want to enter the FTP credentials every time, you can
set them in the wp-config.php file:

 define('FTP_BASE', '/var/www/html/{folder name}/');
 define('FTP_CONTENT_DIR', '/var/www/html/{folder name}/wp-content/');
 define('FTP_PLUGIN_DIR', '/var/www/html/{folder name}/wp-content/plugins/');
 define('FTP_USER', '{user name}');
 define('FTP_PASS', '{password}');
 define('FTP_HOST', '{hostname}');
 define('FTP_SSL', true);

STEP 12 (Optional): Install SSL certificate for the blog

This step is optional, but using SSL is strongly recommended when connecting to your blog as administrator.

Upload your private key and certificate files to your server and copy them into the SSL folders:

mkdir /etc/ssl/private
mv filename.key /etc/ssl/private/
mv filename.crt /etc/ssl/certs/
mv CARootCert.crt /etc/ssl/certs/

Modify Apache configuration file for SSL:

NameVirtualHost *:443
<VirtualHost *:443>
 DocumentRoot /var/www/html/{folder name}
 ServerName {domain name}
 ServerAlias www.{domain name}
 SSLEngine on
 SSLProtocol all -SSLv2
 SSLCipherSuite ALL:!ADH:!EXPORT:!SSLv2:RC4+RSA:+HIGH:+MEDIUM:+LOW
 SSLCertificateFile /etc/ssl/certs/filename.crt
 SSLCertificateKeyFile /etc/ssl/private/filename.key
 SSLCACertificateFile /etc/ssl/certs/CARootCert.crt
</VirtualHost>

STEP 13 (Optional): Fix the “Unable to locate Themes directory” error

If you get an “Unable to locate Themes directory” error add the following snippet to wp-config.php.

if (is_admin()) {
    add_filter('filesystem_method', create_function('$a', 'return "direct";'));
    define('FS_CHMOD_DIR', 0751);
}

STEP 14: Enjoy!

That's it! You have installed a WordPress application, imported your old posts, and secured your blog with FTPS and HTTPS access. Time to celebrate!

Online Education, Review

Recently there has been an explosion in online courses for higher education. It's a great chance for people looking for comprehensive academic classes. I started with a bunch of these classes, but time proved that I should have selected more wisely, because they are not fluffy little tutorials: you really have to take the time to watch the videos, take the exams and hand in the assignments. So here are the ones I find useful, mixed with some of the older resources I have used for similar purposes.

Udacity

Udacity is one of my favourites. It's very forgiving about deadlines and it uses YouTube extensively. You can even take the quizzes and answer questions right on the video and submit your results. It has very nice courses, including Applied Cryptography and Artificial Intelligence.

Coursera

Coursera has a ton of Stanford classes like Cryptography, Machine Learning and Computer Networks. I had started the Machine Learning course when it was in a sort of beta phase and running on ml-class.org. Princeton University offers nice computer science classes on this site as well.

MitX

I don't know if this was a beta site, but it offers only one class by MIT: Circuits and Electronics. More about MIT courses below…

EdX

edX has a small but quality selection of courses from MIT, Harvard and Berkeley. Software as a Service and Artificial Intelligence courses from Berkeley look delicious.

Academic Earth

I personally haven't tried this site but heard about it from a friend. From the looks of it, I think they mostly have older material. They have some video courses from Stanford which I had seen a few years ago on iTunesU. Still, it may be worth keeping an eye on.

iTunesU

iTunesU was a gold mine in my eyes when I first discovered it. Watching courses from Stanford while commuting was a great way to make good use of the time. The more interactive courses have overshadowed it now, but it's not dead yet. Far from it: Apple released an iTunesU app which allows you to download your favourite courses very easily.

Khan Academy

Frankly, I'm not a big fan of it. Compared to the rest, it has rather simple, entry-level tutorials. But it still shoulders an important mission and helps a lot of people reach some valuable resources, so I thought it was worth mentioning.


Heavy Metal

This was my second festival experience after Wacken '10. I was quite satisfied, and hopefully I'll be back next year. Here are my notes about the festival. Hopefully I'll remember to check back on this post next year and apply the todo list.

Arrival

After a 3-hour drive, we arrived at the festival area by coach. After about 15 minutes of waiting, I got my wristband and got in. The first order of business was finding a spot to pitch my tent. This was a daunting task because I had never done it before. After struggling with the instructions for 10 minutes, I started setting it up. It was easier than I expected, and it looks like all tents work the same way, so I guess next time will be a breeze. After the tent was set up, the boring hours began! It was 2 P.M. and there was absolutely nothing to do around, as the festival area wasn't going to open until 5. I turned to my book and read for a few hours (the only reading session I had throughout my stay). The rest of the time passed with eating junk food and drinking beer while watching some “warm-up” bands.

Festival

Friday morning: after a very uncomfortable sleep, a nice sunny day began, with a ton of nice gigs ahead. After getting an airbed from the festival area, I started patrolling the grounds. There are a number of choices for food, but everything was triple the normal price. I don't understand why they try to rip people off at festivals. You already have thousands of people in a confined space who have nothing to do all day; even at regular prices they could make a fortune, but they are greedy as always. Saturday and Sunday were exact copies of Friday, actually. Only the bands changed; my activities between the gigs were pretty much the same.

Bands

Here are my notes on the bands I’ve seen:

Friday – Ronnie James Dio Stage

Malefice: The opening band. I hadn't heard of them before, but they sounded strong and their albums are definitely worth a look.

Moonsorrow: I'm not too big on black metal, but I've been enjoying their music for a decade now. I hadn't seen them live before, so I was wondering how it would be. They didn't communicate with the audience (as none of the black metal bands do). Overall it was a nice watch.

Moonsorrow

Iced Earth: I don't know anything about their work after Something Wicked This Way Comes, but I'm happy to have seen them for the first time. Apparently they have a new lead singer. One funny moment was him dropping the microphone in the middle of a song; a unique incident considering all the concerts I've seen so far. He was quite chatty and I think he's a good frontman. It may be worth checking out their new stuff.

Iced Earth

Sepultura: Another legend and another first live show for me. I can't believe how I managed to miss them all these years, but I finally caught them. They have enough material for a much longer setlist, but they were only allowed 30 minutes. They killed with Roots as the closing song!

Sepultura

Watain: Had never seen them before and never will again! Classic black metal with similar-sounding songs. One interesting scene was the singer bringing out a cup full of some liquid that looked like blood and spilling it over the crowd in the front row.

Watain

Behemoth: I love these guys! They deserved to be a headliner, and put on a great show. Flames and burning crosses were nice visual elements.

Behemoth

Saturday – Ronnie James Dio Stage

Benediction: I wonder why they got the earliest spot of the day; I think they are a great band. Even at 11 in the morning, they managed to create mosh pits in the crowd. It was my first time seeing them live, so it felt good to be able to mark another band as seen.

Benediction

I am I: With a lead singer resembling Captain Jack Sparrow, they seemed to be a good band, but power metal was never my thing!

I am I

Cthonic: A band from Taiwan, definitely not a country on the metal map IMO. I didn't like the band, except for the beautiful bass player. It was funny when the lead singer tried to make the crowd shout “Cheers” in Taiwanese.

Cthonic

Crowbar: I used to have one of their albums (on cassette!), and it was one of the most boring albums I've ever listened to. The show was very boring too. Good thing I watched it from the beer tent!

Mayhem: I cannot understand how this band is still around. It was horrible! The worst metal band ever. They don't deserve that spot in the middle of the day on the main stage; they should be opening the New Blood stage! Boring, pointless screams with no melody or lyrics.

Mayhem

Sanctuary: Apparently they are an old band; the singer mentioned they were touring the UK with Megadeth 20 years ago. I didn't like their music much, and I wasn't surprised that I had never heard of them, but rather wondered how they had survived all these years with this music. I hated them after the singer asked the crowd to chant their name and then said “Thank you for being so obedient”. I hate arrogant people!

Sanctuary

Hatebreed: One of my top bands ever. I've seen them before and knew about the brutal pits, but couldn't resist again. Classic setlist. It was fun to watch Jamey invite a 7-year-old kid and his dad to the stage. (I think the organizers tried to ban swearing on stage after this; Jamey was swearing as usual with the boy up there.)

Hatebreed

Testament: Classic Testament. I've seen them a couple of times before. Chuck Billy was awesome again with that stupid grin on his face :-). During the show they put up a banner saying “Randy Free”, which meant nothing to me at the time. Later I learned about the incident here.

Testament

Machine Head: Another first show for me. I'm glad I finally saw them. I think they are the new Metallica (the Metallica of their career peak, the Master of Puppets and ...And Justice for All era). Beautiful show. I hope I can catch them again some time soon.

Machine Head

Sunday – Ronnie James Dio Stage

Corrosion of Conformity: Pure rock 'n' roll. They have no crew, so the lead singer/bass player had to deal with the technical issues during the show himself.

Evile: A very nice thrash metal band, definitely on my watch list now. I hope I can catch them around.

Nile: Pure death metal. I'm not a big fan of them. It wasn't too bad for passing the time, but nothing special.

The Black Dahlia Murder: Not one of my favourite bands, but the frontman had a nice dialogue with the audience. One interesting detail: one guy tried to crowd-surf completely naked!

Paradise Lost: I used to like them in the old days; not so much anymore, but I still enjoy their music. I didn't understand Nick Holmes' attitude, though: making fun of the festival name and undermining common concert rituals like asking the crowd a question to make them make some noise. A dying band!

Dimmu Borgir: Another first show, and a great one. It wasn't dark yet when they took the stage, but they still managed to get the crowd in the mood. Good dialogue with the audience, which I didn't expect.

Alice Cooper: An amazing rock show. I had seen him at Wacken, but this time I was much closer to the stage and enjoyed it a lot more. Best show I've ever seen. Even though I'm not a fan of his music, I enjoyed the classics.

Friday - Sunday – Other stages

SaDoKa

During the breaks at the main stage, I sneaked over to the other stages. I couldn't find many interesting acts except Sa-Da-Ko. I really enjoyed their music and have them on my radar now. I'll follow their albums.

After the live shows, there was a party at the Sophie Lancaster stage with The 5 DJs of the Apocalypse. They played almost all the classic metal hits. The dancing girls on the stage on Sunday night were very hot!

Bloodstock 2013

I don't know if I can make it there, but if I do, here is my todo list based on this year's experience:

  • Try to get a closer camping spot to the showers.
  • Gear to buy:
    • Same tent
    • A better/thicker sleeping bag
    • Airbed (from the festival area the first day)
    • Suntan cream
  • Check the weather first, bring rubber boots if it may rain (This year I was lucky)
  • Get more info about the bands before the festival
  • Don't bring a camera. Use a phone or just Google for photos and videos; there are plenty of them around.

Off the Top of My Head

I've been using my own domain in my home network for a few years now, but I hadn't tried Windows Server Update Services (WSUS) until recently. After reorganizing my network, I ended up with almost 20 virtual machines. Since most of them run various flavours of Windows, keeping them up to date became an issue. That's when I recalled the existence of WSUS. What it does, basically, is let you download Windows updates to one of your own servers and then distribute them to the other machines over the LAN. So no more downloading 300 MB service packs over and over again.

Installing WSUS is quite easy. You just have to open Server Manager and add the Windows Server Update Services role.

SQL Server Roles

Then you can select which products you want to download updates for, and also which types of updates you want.

SQL Server SQL Server

The trickiest part is making the client machines in the domain download updates from your server. That is achieved through Group Policy Management. TechNet has a nice article describing the necessary steps: http://technet.microsoft.com/en-us/library/cc708574(WS.10).aspx
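
Under the hood, that group policy just sets a few registry values on each client. As a sketch, the equivalent manual configuration would be something like this (the server name is a placeholder and 8530 is a commonly used WSUS port; your setup may differ):

reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUServer /t REG_SZ /d "http://{wsus server}:8530"
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUStatusServer /t REG_SZ /d "http://{wsus server}:8530"
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v UseWUServer /t REG_DWORD /d 1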

Obviously, home users are not its normal target. But having seen its benefits, I strongly recommend it to anyone running their own domain.

Development

I had been planning to play around with Visual Studio LightSwitch for a while, and finally I could spare some time. For me the best way of learning is by doing, so before I started playing I had to come up with a project first.

To find an appropriate project, we first have to analyse what LightSwitch is and what it's good for. In a nutshell, Visual Studio LightSwitch is a fast and easy way to develop data-driven line-of-business (LOB) applications. As a developer, I am generally not very fond of such tools because they impose many limitations, whereas writing code gives us unlimited freedom. But I also hate wasting time on boilerplate code. Creating add/edit/search screens for entities is a trivial and boring task, and such forms should always be generated by a tool to sustain consistency; otherwise, especially in large applications and organizations, inconsistencies creep in. By the way, this reminds me of one of the worst forms I have ever seen in a Microsoft application: the reporting form in Team Foundation Server 2010, shown below. Even the tabs in the same form are inconsistent. But I digress! Let's move on.

TFS Reporting

For a long time I had been looking for a nice open-source application to manage my movie collection. I tried a few but couldn't find exactly what I wanted. So while trying to come up with a project idea, I decided to create a simple movie manager application. It's mainly data entry and search, so it sounded like an ideal project type for LightSwitch.

The result was amazing! Not that I created a complex, fully functioning application, but within minutes I had a simple database, two forms to enter movie and director information, and a movie search form.

Movie Entry

Search Entry

The screens are customizable, but even the default templates produce very satisfying results. They cover all the basic needs for data entry, validation and search. So, having such a tool in my arsenal and always preferring to develop my own software instead of using someone else's, I decided to develop my movie and TV show management program with LightSwitch. Considering this is only version 1.0, I think there's great potential in it; there are so many applications to develop but not enough time.

Off the Top of My Head

After I downloaded the bits and tried to install it for the first time on a virtual machine, I sadly discovered there was something wrong with the ISO image: it gave an error while extracting files at 62%. A quick hash check (which should have been done right after the download) showed that the file was corrupt, and there went 5 GB of my bandwidth down the drain! Lesson learned: always perform a hash check and verify the file if a hash is provided by the source.

Anyway, after downloading it successfully, I installed it on a VMware virtual machine. I followed the step-by-step guide at http://vlkn.me/q5jyJN, but there aren't really any tricks to it. I started using it on the virtual machine a little and, like everybody else, I hated it! Obviously the tile interface is designed for tablets, or touch-enabled screens to be more generic.

So, my first interaction with it was negative for 2 reasons:

  1. It's always a bit off-putting to test a new OS in a VM, because you revert back to your original OS all the time and never get the full experience.
  2. In this case, it's clear that without a touch screen I wouldn't enjoy using it much. (I'm not sure how well people will react when it RTMs.)

Then I suddenly remembered that I already had a touch-screen notebook! 3 years ago I bought a notebook with a 12.1” screen, which back then was called a tablet PC because it had a rotating screen. Until this day I had only used its touch capabilities a few times, for experimental purposes. My initial intention to use it as an e-book reader soon proved preposterous, as it weighs a solid 2.2 kg! (An iPad 2 is 600 grams, by the way.)

But my long wait to find a legitimate use for it is over! Finally I can use its tablet-ish functionality!

First I installed it on a VHD and booted off of that. Scott Hanselman has a great blog post walking you through the steps: http://vlkn.me/n5Opva

So I immediately installed it, but again I was not satisfied! The performance wasn't great and it also messed up the boot loader. Even though I could opt to boot into my old Win 7 installation, it just restarted and couldn't load it. I tried to repair it using the Win 7 DVD, but to no avail. Using the repair functionality of Windows 8 did the trick, though, and I could boot into my Win 7 again. And to fix it once and for all, I changed the default OS in the system settings.

BCD

Now I could freely choose the OS I wanted, but the performance issues remained. Then I decided to use the hard drive from my broken PS3. So I switched disks and installed it for the third time, this time on its own personal disk!

Performance is still not outstanding, but then again my notebook never had outstanding performance regardless of the OS it was running. Finally, here's a little video I took. The screen is resistive, so it's a bit hard to use with a finger, but it's still the closest thing I have to a tablet running Windows 8 at the moment.

I will be playing around with it now and hopefully post more on this subject later.

Off the Top of My Head

Recently I decided to buy a surveillance camera to set up in my room, for two reasons mainly:

  1. Security: Shocking but true!
  2. Research: I always wondered how these devices operate, how they are installed, what protocols they use, etc.

I ordered one, but the shipment never arrived. After waiting for two months, and after a battle for a refund, I ended up where I started. (By the way, in this instance I was cheap enough to shop at a company called TMart.com. I'm glad I finally got a refund, but I'm never ever going to shop there again, and I strongly recommend everyone stay away.)

In the meantime, it occurred to me that I had 2 laptops with webcams and an external USB webcam that I plug into my desktop PC. With 3 cameras I should be able to set up a small security system. So I started searching for software to turn my cameras into one, and surprisingly I found an open-source option. It's called iSpyConnect (http://www.ispyconnect.com); better yet, it's written in C#. It supports cool features like uploading to YouTube, but most features that need server support require a subscription. In the free version you are allowed to upload pictures via FTP to one server. But since I have the source code, I'm planning to make my own changes.

So for now I have the webcams and the required software. I can't use my MacBook and its webcam since it's not supported, but I tested the setup with two webcams (one facing the door and the other facing the window). When it detected motion, it started recording video and also uploaded pictures to my FTP server on the Internet. So even if a burglar notices the system and somehow manages to delete the local copy of the video feed, there's still good evidence safe and sound out in the cloud.

I am planning to improve the system and I will be posting more details about it as I go along.

Off the Top of My Head

I use RSS feeds extensively to follow tech news. I love Google Reader and I've been using it since forever. But lately I realized that I didn't have much experience with tweaking its settings; I had never felt the urge to go in and manage my subscriptions. Until ten days ago.

I decided to eliminate some feeds because they seemed to have been inactive for a long time. So I clicked the Manage subscriptions link which, by the way, is in a horrible place from a UI standpoint. It is not even always visible: when you hover over feeds, the URL of the feed covers the link.

Google Reader

After fiddling a little with the labels, I made a horrible mistake: I selected all items and clicked Unsubscribe. As one may easily guess, it deleted all my subscriptions.

Google Reader

I had an OPML backup from long ago, but I didn't even know where it was; even if I had found it, it would probably have been outdated beyond use. Lesson learnt: start backing up RSS feeds regularly and automatically. While I was desperately pondering what to do to recover my beloved little messengers, it hit me: I had an application on my iPad called Mr. Reader. It syncs with Google Reader, so I also had my entire list of feeds on the iPad. I was hoping the app supported OPML export so that everything would be back to normal in 5 minutes. Unfortunately, it didn't! At least I was lucky that the iPad was offline at that moment, so it couldn't sync and kept my feeds on the device. (Needless to say, I immediately turned off network access, quarantining my list!) I contacted the app's support, which is the developer himself; he was very kind, responding quickly and offering me a few solutions. One of them was extracting the data from the iPad using a free tool called JuicePhone (http://www.addpod.de/juicephone). I installed it on my Mac immediately, hooked up my iPad and extracted all my data. Lesson learnt: start backing up the iPad regularly via JuicePhone as well as iTunes.

After a quick examination, I found out that the app uses an SQLite database to store its data. I downloaded SQLite Expert (http://www.sqliteexpert.com).

sqlite expert

It has a free version called Personal Edition, and it has quite a nice UI. Browsing through the tables and viewing their data, I felt quite relieved when I saw that the list of my feeds was safe and sound.

sqlite expert
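
By the way, a GUI isn't strictly necessary for this kind of inspection; the standard sqlite3 shell would do as a quick check. A sketch (the file and table names are placeholders, since I don't remember the exact schema):

sqlite3 {extracted database file} ".tables"
sqlite3 {extracted database file} "SELECT * FROM {feeds table};"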

Now that I have all my feeds, I think it's a great chance to reorganize them and add or remove feeds in a controlled way. By the way, after I got my list back I sent an email to the author of the app, thanking him and telling him that I had managed to extract my data. A few days later the app updated itself, mentioning some change to the database. Then I added a new feed and applied the same steps above to see if it still works; the database seemed to be the same. I mean, the app synced, deleted all my subscriptions and added the new test feed, but the list in the table was the same as before. Maybe he decided to keep the data somewhere private to keep it from people like me. Anyway, his advice worked out perfectly for me, so I thank him again from here.

Off the Top of My Head

Recently I was looking for software to manage my backups and came across GoodSync (http://www.goodsync.com/). It is very effective and supports a wide variety of channels. (I will try to review GoodSync and my other favourite tools in detail in another blog post.)

After the trial period it started to impose limitations. Since I was happy with the tool, I decided to purchase it. It's not very pricey; I think it well deserves the $30. But they also provide another option, which is paying via TrialPay.

I vaguely remembered the term when I saw it, but I had never tried or examined it thoroughly before. Basically, there is a list of offers to select from, such as subscribing to a service or buying a product. After you select one and complete the required steps, you wait until TrialPay confirms it. And after that, voila! They send you your product key and that's it. Of course, if the TrialPay offers don't tickle your fancy you might find it wasteful, but for me the list was quite attractive. For example, one offer was a free 14-day trial of usenet.nl; I subscribed for free and now have a license for GoodSync. Another nice offer was registering at GoDaddy and making a purchase of at least $5. I used this offer too, to buy another piece of software. Since I was already planning to buy a few domain names, the timing couldn't have been better. And it didn't take much to convince my brother to sign up, as long as I was paying.

So, from now on, whenever I see a TrialPay option I will jump right in to see the offers available at the moment. If you're interested in purchasing GoodSync via TrialPay, here's the link:

http://www.goodsync.com/trialpay_dl_bn.html

UPDATE: The link above has been removed as it was broken.

Review

I recently finished reading Mark Russinovich's technothriller (or cyberthriller) novel Zero Day.

Zero Day

Frankly speaking, I wouldn't have expected such an intriguing book from such a highly technical person. The book does not require any technical skills to follow, and the main idea behind it is very compelling in itself: the world we live in is so tied up and bound to computers that an attack crippling our digital lives would have a huge impact on every aspect of our lives.

I personally enjoyed the fast pace of the book. Even though it's not a technical book, it's closely related and a real mind-opener. It certainly convinced me to invest more of my time in security, since a breach would be catastrophic on both personal and professional levels (imagine you're a Sony developer responsible for one of those hacks!). The only thing I don't like about the book is the perfect characters. Our heroes are athletic, gorgeous and brilliant; they have it all. It's not envy (or maybe a little), but I'd prefer Sue and Vlad as my heroes. They sound more real and personable to me.

I heard that Mark Russinovich is working on a sequel to the novel. My guess is that this time it will involve botnets and Chinese hackers, since to me those sound like a huge threat. Of course, we'll wait and see…