Monday, November 23, 2015

Using Powershell to work with Json

Today I decided to look into working with Json in Powershell.  Json is rapidly overtaking Xml as the preferred format for describing projects and build artifacts, so it makes sense to learn how to integrate it with tools such as AppVeyor scripts, Visual Studio Team Services Build Tasks or Octopus deployment steps.

A quick search online led me to discover the following two Powershell cmdlets that can be used when working with Json:

• ConvertFrom-Json
• ConvertTo-Json

Using cmder, I created a new Powershell tab and started typing:

> cd \temp
> md jsontests
> new-item "testjson.js"
> notepad "testjson.js"

I then added the following content to the file:

    Name: "Darren Neimke",
    Age: "42",
    Gender: "Male" 

Flicking back to the console, I typed the following Powershell command to confirm that I could read the content:

Get-Content "testjson.js"

Piping the raw content to ConvertFrom-Json (Get-Content "testjson.js" | ConvertFrom-Json) produced a PSCustomObject with Name, Age, and Gender properties.

To expand my use of Powershell, I opened the Powershell ISE and created the following script:

$path = ".\testjson.js"
$raw = Get-Content $path -raw

$obj = ConvertFrom-Json $raw
$obj.Age = 45     # I always lie about my age!

Write-Host $obj   # Dump obj to console

Set-Content $path $obj

The ISE amazed me with how it was able to infer the schema of the $obj instance and provide me with IntelliSense after that!

Running that script updated the value of the Age property and saved it back to the file.

Things I Learned:

  • Using ISE to create a Powershell script
  • How to pass the content of a file to another cmdlet using piping and variables
  • Updating Json content using variables
  • Saving a file


EntityFramework and the challenge of Entity Serialization

Let's take the following couple of entities:

public class Parent
{
    public int Id { get; set; }

    public string Name { get; set; }

    public List<Child> Children { get; set; }
}

public class Child
{
    public int Id { get; set; }

    public string Name { get; set; }

    public int ParentId { get; set; }

    public Parent Parent { get; set; }
}

And pass them through an Entity Framework query that looks like this:

var parent = db.Parent
                 .Include(par => par.Children)
                 .Where(par => par.Name == "Somename")
                 .FirstOrDefault();

It's interesting to see that we can then write the following LINQ to query the result:

var result = parent.Children[0];

Here, result is a Child whose Parent property points back to a Parent, which has a collection of Children, each of which has a Parent ... oh never-mind, I'm sure you see where this ends!

When building a Web API application, we might think of exposing this type of query through a Controller action.  In such a case, how should the serializer deal with the cascading references?
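For concreteness, a naive controller action of the sort described above might look something like this (a sketch only; the context class and route are assumptions, not code from a real project):

using System.Data.Entity;   // Include(lambda) extension in EF6; Microsoft.Data.Entity in EF7
using System.Linq;
using System.Web.Http;

public class ParentsController : ApiController
{
    // Hypothetical context exposing the Parent DbSet shown earlier.
    private readonly MyContext db = new MyContext();

    [HttpGet]
    public Parent Get(string name)
    {
        // Returning the entity graph directly hands the circular
        // Parent -> Children -> Parent references straight to the serializer.
        return db.Parent
                 .Include(par => par.Children)
                 .FirstOrDefault(par => par.Name == name);
    }
}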

One solution is to use a serializer setting such as ReferenceLoopHandling, found in the Json.Net library, to ignore circular references.  This setting tells the serializer to exclude reference properties after they have been encountered once.
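In an ASP.NET Web API project that uses Json.Net as its formatter, the setting can be applied globally.  Here is a minimal sketch, assuming the usual HttpConfiguration-based configuration class:

using Newtonsoft.Json;
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Skip reference properties once they have already been serialized,
        // rather than throwing on the Parent -> Children -> Parent loop.
        config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
            ReferenceLoopHandling.Ignore;
    }
}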

Another solution is to shape the data to return specific fields from the service operation.

var parentView = new
{
    ParentId = parent.Id,
    ParentName = parent.Name,
    ChildCount = parent.Children.Count,
    Children = parent.Children.Select(c =>
        new {
            Id = c.Id,
            Name = c.Name
        })
};
This approach helps to control the shape of the data and to have greater certainty over what is being returned.

Taking this one step further we would create custom Data Contract classes and return those instead of the loosey-goosey approach of returning anonymous types.

var parentDataContract = db.Parent.Include(par => par.Children)
                            .Select(par =>
                                new ParentView
                                {
                                    Id = par.Id,
                                    Name = par.Name,
                                    Children = par.Children.Select(c =>
                                        new ParentView.ChildView
                                        {
                                            Id = c.Id,
                                            Name = c.Name
                                        })
                                });
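The ParentView contract itself isn't shown in this post, but a plausible sketch of its shape, with names inferred from the query above rather than taken from real code, would be:

using System.Collections.Generic;

public class ParentView
{
    public int Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<ChildView> Children { get; set; }

    public class ChildView
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}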

This approach gives us better static checks across the application, allows for reuse of Data Contracts across separate operations, and allows us to see where different contracts are being used.  From a versioning and maintenance point of view, this would be the gold standard.

What is your approach to designing service endpoints?   Do you mix RESTful with RPC-style design all in the same Controllers or do you separate them out into their own classes of service?

Saturday, November 21, 2015

Developing from the Command Line

As I have mentioned, the developer workflow has changed quite a bit.  In case you haven't heard or kept up, it looks something like this:

  • From the command line, use Yeoman to generate a new project  > yo webapp
  • From the command line, initialize the folder as a new Git repository  > git init
  • From the command line, open the new project using an Editor of your choice  > code .

As you can see, much more is being done from the command line.  New tools such as cmder are being used to gain quick access to command windows for Powershell/Node/etc to assist and speed up this flow.  Cmder is great because it has transparency, allows you to have multiple tabs, and is easily summoned and hidden away using CTRL+`.

For my task today I decided to initialize a Git repository, add a file, make changes to the file, and commit those files to a Git branch all from the command line.  I wanted to use Powershell to create the project folder and the initial file so that I could tick my "One thing per day" goal of using PS for something at least once per day!

I found the New-Item (alias: ni) cmdlet which allows you to create a variety of item types.  To create a new folder, give it the -ItemType of 'directory' and then the name of the folder that you wish to create.  E.g.

> New-Item -ItemType directory myNewDirectory

I then went ahead and used Git to initialize a repo in the new folder:

> git init

New-Item can also be used to create files; just give it the name of the file that you want to create:

> ni "file1.txt"
> notepad "file1.txt"

This adds a new file named file1.txt and opens it in Notepad.

> git add .
> git commit -a -m "Adding file1.txt"

This will commit the changes to your Git repo.  It's easy to visualize what's happening in Git Extensions.

The folder can be opened in VS Code by typing code followed by a dot "." to open the folder that you are currently in:

> code .

After playing around with Git for a while, I wanted to delete my test folder, so I typed Remove- and pressed CTRL+SPACE to find out if Powershell had a Remove-Item command.

And sure enough it did.  So I finished with the following PS command to blow away my test folder:

> rm "\testdir" -force

What I Learned:

  • When using cmder, I can start typing the name of a command and then use CTRL+SPACE to find all matching cmdlets

Friday, November 20, 2015

Doing 1 thing each day using Powershell

While at Ignite on the Gold Coast this week, it became very obvious to me that there are some really key technologies that I need to be across.  Watching people develop code and seeing them zip around using the Command line and various package managers highlighted where things are at with the developer workflow.

Technologies that I have committed to being across are:
  • Yeoman
  • Powershell
  • Git
  • Grunt/Gulp
  • Chocolatey
  • Visual Studio Team Services
To ensure that I remain curious and stay on track, I'm going to try and do one thing with Powershell each day.  Today's task is…

Delete project.lock.json files from a solution
This was a bigger issue in the past than it seems to be now, but I found that I regularly needed to manually delete the DNX lock files that were being generated by dnu.

The final product:

gci "\repos\dneimke\EF7Demo.CoffeeStore\*" -include "project.lock.json" -recurse | foreach($​_) {rm $ _​.fullname}

What I Learned:
  • The Get-Help cmdlet is a great resource for learning about how other cmdlets work
  • Get-Alias lists the aliases for all cmdlets
  • Get-ChildItem takes a -Path which makes it easy to list items in a folder - e.g. Get-ChildItem -Path \repos\test
  • -Include seems to be a better way to target a specific file pattern than -Filter


Wednesday, November 18, 2015

Getting Started with EF7 - Adding EF7 to a new Project

While on the Gold Coast at Ignite, I presented on the new version of Entity Framework (EF7) while my colleague Jon spoke about the broader topic of .NET vNext.

In my talk, I gave 4 demos:
  1. Walkthrough showing how to add EF7 to a new project
  2. Using SQL Profiler to show the queries that EF7 generates for various scenarios
  3. Adding Migrations and Seeding to your application
  4. Using the new InMemoryProvider to easily unit test code that depends on EF7 data contexts
For the first demo I really wanted to show how simple it is to get started with EF7 and how the new component architecture works with respect to Nuget packaging.

I started off by creating a new Console Application (Package) project from the Web templates.

The solution must be configured so that the runtime version is aligned with a runtime that you have installed, and so that Nuget knows which package source contains the versions of the dependencies that you want to use.

To find which runtime versions are installed on your machine, use the dnvm list command:

> dnvm list

Running that command showed that my machine is currently configured to use the 1.0.0-rc2-16183 clr x64 runtime, so the first thing I do is to change the global.json solution file to match.

  "projects": [ "src", "test" ],
  "sdk": {
    "version": "1.0.0-rc2-16183"

Nuget needs to know where to look when it restores packages.  This is done by adding the package source to the Nuget.Config file.

In this case I want to get the EF packages from the ASPNETCIDev source, which is hosted on MyGet, as it contains the most recent packages from the ASP.NET daily CI build process.

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="" value="" />
    <add key="aspnetcidev" value="" />
  </packageSources>
</configuration>

The last thing to do is to add references for the EF7 dependencies that I need.  In this case I'm after the following 3 packages:

  1. EntityFramework.Core: Contains core logic for DbSets, DbContexts, DataAnnotations, Querying, ChangeTracking, and Configuration among other things. 
  2. EntityFramework.Commands: A set of commands that can be used to create Migrations, Update Databases, and to scaffold an application from an existing database.
  3. EntityFramework.MicrosoftSqlServer: A database provider for using EF against a Microsoft SQL Server database.

Each of these packages represents a single project in the EntityFramework repository, which is available to view in this GitHub repository.

"dependencies": {
  "EntityFramework.Core": "7.0.0-*",
  "EntityFramework.Commands": "7.0.0-*",
  "EntityFramework.MicrosoftSqlServer": "7.0.0-*"
"commands": {
  "ConsoleApp10": "ConsoleApp10",
  "ef": "EntityFramework.Commands"

The "ef" command which is added to the project commands can be used to run the EF commands using DNX.

At this point, EF7 is configured and available to use. To test this, jump to the command line at the root of your project and type dnx ef. You should see the Magic Unicorn splash screen.

I finished the demo by creating a simple DbContext which contained a couple of DbSet entities and created a Migration using the EntityFramework Commands so that I could start working against a database.
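To give a flavour of that last step, here is a minimal sketch of the kind of context involved; the entity and class names are illustrative rather than the ones from my demo:

using Microsoft.Data.Entity;   // EF7 namespace at the time of writing

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public int ProductId { get; set; }
    public Product Product { get; set; }
}

public class StoreContext : DbContext
{
    public DbSet<Product> Products { get; set; }
    public DbSet<Order> Orders { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // Uses the EntityFramework.MicrosoftSqlServer provider referenced above.
        optionsBuilder.UseSqlServer(@"Server=(localdb)\mssqllocaldb;Database=StoreDemo;Trusted_Connection=True;");
    }
}

With a context like that in place, dnx ef migrations add Initial followed by dnx ef database update creates and applies the first migration.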

What I hoped to achieve through this demo was to show the configuration points and how they connect the application to its environment and dependencies.

It is important to take note of the benefits achieved from the EF7 'ground up' rewrite, which has delivered multiple lightweight packages and empowers application developers to take only what they need in terms of dependencies.  Don't need Commands?  Simple, don't take that dependency!  Over time, this will enable more rapid innovation from Microsoft's end and increase flexibility and performance on the application side.

If you are interested in taking a look at the sample code from my demos, you can find it in this GitHub repository.

Sunday, July 14, 2013

Problem trying to run a Web Application using Visual Studio 2013 Preview

I'm putting this here in the hope of saving some other poor soul the hour I lost this afternoon while playing around with a fresh Visual Studio 2013 Preview installation (Visual Studio Version 12).

I created a new Web Application (MVC Template) and pressed F5 to start it in debug mode.

I was immediately presented with the following error dialog from within Visual Studio.

Microsoft Visual Studio.  
Process with an Id of xxxx is not running. 

I also saw the following text in my debug console window: The program '[xxxx] iisexpress.exe' has exited with code 0 (0x0)

The problem turned out to be an incorrect configuration for an IIS Express application pool setting, which I found in my \Users\{your username}\Documents\IISExpress\config\applicationhost.config configuration file.

The managedRuntimeVersion was mis-configured by default, so I needed to change the framework version for the default app pool to match the current framework version that is installed for .NET 4.  In my case, that meant changing the version to v4.0.30319.
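For reference, the setting lives in the <applicationPools> section of that file; from memory, the relevant entry looks something like the following (other attributes omitted, and the values on your machine may differ):

<applicationPools>
    <add name="Clr4IntegratedAppPool"
         managedRuntimeVersion="v4.0.30319"
         managedPipelineMode="Integrated" />
</applicationPools>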

Thursday, September 13, 2012

How to Use Google Plus Circles

A common question I hear is "How can I get value from Circles in Google+?", so I thought I'd list a few ways that I find them helpful.  Firstly, here's an image that shows what my Circle strategy looks like:

As you can see, I don't have hundreds of Circles - although I wouldn't necessarily be opposed to having that many either.  Here are my top 5 reasons for using Circles:

An obvious initial reason for segregating users into Circles is to limit the scope of information that you share.  Again, this is a key driver behind my [Colleagues] and [Family] Circles.  Having the ability to post information directly to those groups keeps information from being seen by people that it might not be relevant for.

Example: I'm at the beach and I use my mobile phone to take some photos.  When I get home, the photos are instantly sync'd with my G+ account as soon as my phone hits the Wifi.  From there it's just a couple of clicks to share those photos with my family members by posting them to the [Family] Circle.

Noise filtering
I know how frustrated I feel when I see other people clogging up my feed with their own personal interests (e.g. excessive posts about cats) and so, I believe that it is really important to be aware of and manage the amount of "noise" that I emit to other individuals.

Example: I'm confident my [Hockey] friends don't mind me posting several updates a day about hockey related stuff (e.g. pictures, embedded YouTube videos, etc.).  However, add that to my [Developer Community] related posts, and a few other general posts, and suddenly I'm at risk of having people de-circle me because I'm too noisy.

Having a [Hockey] circle allows me to post hockey-specific stuff to just those members and thus reduces the amount of "noise" that I'm sending to people with no interest in hockey whatsoever.

Tip: A neat feature is [Your Circles].  [Your Circles] is one step back from [Public] and allows you to easily share information with the widest scoped audience.  In Settings you can manage how wide that scope is by managing which Circles are included in the [Your Circles] scope:

The Internet provides us with unlimited opportunities to access information, but managing the signal-to-noise ratio is a constant challenge.  [News and Information] and [Tech News] are Circles where I've added lots of providers and therefore receive a great deal of information.  To deal with the resultant "noise", I then tune the volume of information I receive from them in my main feed by using the following tools.

  • Tune the amount of information displayed in the main feed
  • Click on filters to display all items for a given Circle
  • Drag Circles to change their order so that the most common ones are displayed first

Search and Organize
In addition to the above mentioned benefits, posting to Circles acts as a way of grouping so that content can  easily be found later from among the masses of other posted content.  

Example: Although I may not remember the exact content of something posted, I may be able to find it by recalling that it was [Hockey] related.  Given that knowledge I could filter my main feed by the [Hockey] Circle and then scroll through the reduced amount of information to locate a post I'm after.

Integration across the Google landscape of products 
Given the integration of G+ across the Google sphere of products, it shouldn't come as a surprise that your investment in Circles can be leveraged in other applications.

Example: Circles flow through into Gmail, and therefore provide a useful way to find and organize communication from contacts by filtering based on the Circles they belong to.  This is a key driver behind having my [Hockey], [Family], and [Business] Circles.