
C# 6.0 New Features - Auto-Properties with Initializers

dev csharp

Currently, in Visual Studio 2013, if you have a line like this:

public int MyProperty { get; }


you’d get a compilation error like this:

But the same code compiles happily in VS 2015. This feature was added so that auto-properties don’t get in the way of immutable data types.

Another new feature of auto-properties is initializers. For example, the following code compiles and runs with the new C#:

public class AutoInit
{
    public string FirstName { get; } = "Unknown";
    public string LastName { get; } = "Unknown";

    public AutoInit()
    {
        Console.WriteLine(string.Format("{0} {1}", FirstName, LastName));
        FirstName = "Volkan";
        LastName = "Paksoy";
        Console.WriteLine(string.Format("{0} {1}", FirstName, LastName));
    }
}


and the output, unsurprisingly, looks like this:

When I first ran this code successfully I was surprised that I had managed to set values without a setter. It turns out that under the covers the compiler generates a read-only backing field for the property and assigns the value directly to that field instead of calling a setter method. This can easily be seen using a decompiler:
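The decompiled output looks roughly like the following hand-written sketch (the field names here are illustrative; the compiler actually uses reserved names such as <FirstName>k__BackingField):

```csharp
using System;

public class AutoInit
{
    // Sketch of what the compiler generates: read-only backing fields,
    // initialized in place with the property initializer values.
    private readonly string _firstName = "Unknown";
    private readonly string _lastName = "Unknown";

    public string FirstName { get { return _firstName; } }
    public string LastName { get { return _lastName; } }

    public AutoInit()
    {
        Console.WriteLine(string.Format("{0} {1}", _firstName, _lastName));
        _firstName = "Volkan";   // the assignment goes straight to the field
        _lastName = "Paksoy";    // rather than through a setter
        Console.WriteLine(string.Format("{0} {1}", _firstName, _lastName));
    }
}
```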

As it’s backed by a read-only field, the property can only be set inside the constructor. So if you added the following method, the class wouldn’t compile:

public void SetValue()
{
    FirstName = "another name";
}


It’s a small improvement, but it provides an alternative way to write the same code in fewer lines.
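For comparison, here’s a sketch of how the same immutable property would typically be written before C# 6.0, next to the new one-liner:

```csharp
// Before C# 6.0: an explicit read-only backing field plus a property.
public class PersonOld
{
    private readonly string _firstName = "Unknown";
    public string FirstName { get { return _firstName; } }
}

// With C# 6.0: the same behavior in a single line.
public class PersonNew
{
    public string FirstName { get; } = "Unknown";
}
```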

C# 6.0 New Features

dev csharp

A new Microsoft

These are exciting times to work with Microsoft technologies, as the company seems to be changing its approach drastically. They are open-sourcing a ton of projects, including the .NET Framework itself. Maybe the best of it all is that the next version of ASP.NET will be cross-platform. There are already some proof-of-concept projects that run an ASP.NET vNext application on a Raspberry Pi. I like the direction they are taking, so I think it’s a good time to catch up with these new installments of my favorite IDE and programming language.

New features in a nutshell

Looking at the new features, it feels like they are all about improving productivity and reducing clutter, with shorthands and less code overall. (This is also confirmed by Mads Torgersen in his video on Channel 9.)

If you check out the resources at the end of this post you’ll notice that there is quite a flux in the features mentioned across various sources. I’ll use the Channel 9 video as my primary source: it features a PM on the language team and it’s the most recent, so it sounds like the most credible of them all.

Here’s the list of new features:

1. Auto-Properties with Initializers
2. Using statements for static classes
3. String interpolation
4. Expression-bodied methods
5. Index initializers
6. Null-conditional operators
7. nameof operator
8. Exception-handling improvements

I was planning to go over all of these features in this post but with sample code and outputs it quickly became quite lengthy so I decided to create a separate post for each of them. Watch this space for the posts about each feature.

Resources

aws s3, powershell

Cloud computing is a relatively new concept, especially when compared to FTP, which dates back to the ’70s (History of FTP server). So not every device supports S3 uploads. If you cannot force a device to upload directly to S3, but have control over the FTP server machine (and assuming it’s running Windows), you can create a simple PowerShell script to upload the files to S3.

FTP to S3

First you need to install AWS Tools for Windows. I tested on a couple of machines and the results were dramatically different. My main development machine runs Windows 8.1 with PowerShell v4, and I had no issues using the AWS cmdlets in that environment. The VM I tested on had PS v2, and I had to make some changes first.
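If you’re not sure which version a machine is running, you can check it with the built-in $PSVersionTable variable (available from v2 onwards):

```powershell
# Prints the PowerShell engine version, e.g. 2.0 or 4.0
$PSVersionTable.PSVersion
```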

v2 vs. v4

The problem is that the AWS module is not loaded automatically, and you have to load it yourself with this command:

import-module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"


After this command you can use the AWS cmdlets, but when you close the shell and open another one, it will be completely oblivious and deny knowing what you’re talking about! To automate this process you need to add the command to your profile. The AWS documentation tells you to edit your profile straight away, but the profile did not exist in my case. So first check whether the profile exists:

Test-Path $profile


If you get “False” like I did, then you need to create a new profile first. To create the profile, run the following command:

New-Item -path $profile -type file -force


then you can edit the profile by simply running

notepad $profile


Add the import-module command above and save the file. From now on, every time you run PowerShell it will be ready to run the AWS cmdlets.

Time to upload

Now that we are done with the troubleshooting, we can finally upload our files. The cmdlet we need is called Write-S3Object. The parameters it requires are the target bucket name, the source file, the target path, and the credentials:

Write-S3Object -BucketName my-bucket -File file.txt -Key subfolder/remote-file.txt -CannedACLName Private -AccessKey accessKey -SecretKey secretKey


Most likely you would like to upload a bunch of files under a folder. To accomplish that, you can create a simple PowerShell script like this one:

$results = Get-ChildItem .\path\to\files -Recurse -Include "*.pdf"
foreach ($path in $results) {
    Write-Host $path
    $filename = [System.IO.Path]::GetFileName($path)
    Write-S3Object -BucketName my-bucket -File $path -Key subfolder/$filename -CannedACLName Private -AccessKey accessKey -SecretKey secretKey
}
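If you also want to preserve the local folder structure in S3 instead of flattening everything under a single prefix, a variation like the following should work (a sketch; the backup/ prefix and the .\path\to\files folder are placeholders to adjust for your setup):

```powershell
# Upload recursively, keeping the relative folder structure in the S3 key.
# "backup/" and the local path are placeholders - adjust to your setup.
$root = (Resolve-Path .\path\to\files).Path
$files = Get-ChildItem $root -Recurse -Include "*.pdf"
foreach ($file in $files) {
    # Turn C:\...\files\sub\doc.pdf into sub/doc.pdf
    $relative = $file.FullName.Substring($root.Length + 1).Replace("\", "/")
    Write-S3Object -BucketName my-bucket -File $file.FullName -Key "backup/$relative" -CannedACLName Private -AccessKey accessKey -SecretKey secretKey
}
```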