Wednesday, January 4, 2012

My thoughts on Boiler Plate code

During the same conversation about how I structure code, the question came up: “what boiler plate(s) do you recommend?”
My answer: none!
You may be asking why. This comes back to the idea of vertical, context-specific slices. Any boiler plate code would be covered by the infrastructure: MVC/P, data access, logging, etc. And these frameworks really are boiler plate, as they implement compound design patterns.
And if some form of boiler plate code does emerge from the vertical slices, encapsulate it in its own object and inject that component into the calling code. Favor composition over inheritance. It pays off every time.
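As a minimal sketch of what I mean, suppose retry logic keeps emerging around service calls. Rather than push it into a base class, pull it into its own component and inject it. All the names here (RetryPolicy, PlaceOrderHandler) are hypothetical, just for illustration:

```csharp
using System;

// Hypothetical component: the emergent boiler plate (retrying) lives here.
public class RetryPolicy
{
    private readonly int attempts;

    public RetryPolicy(int attempts)
    {
        this.attempts = attempts;
    }

    // Runs the action, retrying on failure up to the configured attempt count.
    public T Execute<T>(Func<T> action)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                return action();
            }
            catch
            {
                if (attempt >= attempts) throw; // out of retries: let it bubble up
            }
        }
    }
}

public class PlaceOrderHandler
{
    private readonly RetryPolicy retry;

    // The shared boiler plate arrives by injection, not by inheritance.
    public PlaceOrderHandler(RetryPolicy retry)
    {
        this.retry = retry;
    }

    public string Handle(int orderId)
    {
        return retry.Execute(() => "order " + orderId + " placed");
    }
}
```

The calling code composes the two: `new PlaceOrderHandler(new RetryPolicy(3)).Handle(42)`. Any other slice that needs retrying takes the same component, and no inheritance hierarchy is involved.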

Architectural Design

I was asked recently, “how do you go about designing an application?” Let’s assume we know the problem domain and what the application needs to do. The next step is how we go about solving the problem.
Traditionally you have stacked “layers”:
[image: the traditional n-tier stack]
The problem here is the assumption that one layer builds on top of the other and that all functionality must fall into a specific layer. You typically end up with Façade classes turning into God classes. Anything that could ever be done to a user goes in the UserFacade class. Orders => OrderFacade… you get the idea. I find this happens because it lines up nicely with the picture of n-tier you have in your head.
Instead I like to think in vertical slices, where each feature is a slice. Think of a slice as a specific context in which the user accesses the system. Search is a very common feature; so is placing an order. These features will most likely consist of components touching all the “layers” of n-tier, but they are only used in one specific context. (Sorry, I can’t find an image to demonstrate what I’m writing.)
How does this impact the code? When designing by the slice you end up with many small, context-specific objects, each designed to do one specific task. So you have many objects, but each object is responsible for only one small aspect of the system. Reading the code becomes much easier, and managing it becomes easier as well, because you can keep all of a feature’s components next to each other.
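Here is a minimal sketch of one slice, the Search feature. Everything the feature needs sits in one small handler instead of being spread across a ProductFacade and service layers. All the names are hypothetical, and a simple list stands in for real data access:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical slice for a Search feature; all names are illustrative.
public class SearchProductsQuery
{
    public string Term { get; set; }
}

public class SearchProductsHandler
{
    // Stands in for the data access the slice would own.
    private readonly IList<string> catalog;

    public SearchProductsHandler(IList<string> catalog)
    {
        this.catalog = catalog;
    }

    // The whole feature lives here: input, lookup, and result shaping.
    public IList<string> Handle(SearchProductsQuery query)
    {
        return catalog.Where(p => p.Contains(query.Term)).ToList();
    }
}
```

Placing an order would get its own query/handler pair in the same way. Each slice still touches the familiar “layers”, but no layer-wide Façade class ever accumulates.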

Tuesday, December 20, 2011

Up and running with PS and Git

Today’s post by Phil Haack mentioned using PowerShell over Git Bash. Up until now I thought I had to use Git Bash to talk to GitHub and use git. I like the idea of using PowerShell; it falls in line with other Windows conventions.
So I finally got it all up and running. To start I edited my profile script and replaced the explicit path to git

set-alias git "c:\Program Files (x86)\Git\bin\git.exe"

with an addition to the path variable

$env:path += ";" + (Get-Item "Env:ProgramFiles(x86)").Value + "\Git\bin"

This allows access to all the executables in the Git directory!
Next I installed PsGet, which is dead simple. After that I cloned posh-git to my user folder and ran .\install.ps1. This updates my PowerShell profile with a link to some PS scripts by Keith.
I finally feel like I have a command line that’s friendly for development.

Thursday, December 15, 2011

Taking a few steps back

I’ve realized something over the last few nights:
  1. Altering/extending isn’t as easy as it seems.
  2. I should take my time and learn the tools I’m using.

Common sense really. Don’t start a new project with new technology and new practices and expect everything to magically work the first time.

So I decided to take a step back and work on the first piece of the puzzle: getting PowerShell properly configured. The key is profiles. Once you know the term to search for, there is plenty of information on the topic. I happened to stumble across the term.

First thing to add to my user profile is git

set-alias git "c:\Program Files (x86)\Git\bin\git.exe"

Simple enough.

Up next is getting the NuGet power tools configured. I want to use them to manage my project dependencies.

Monday, December 12, 2011

Update on OSS project(s)

The Rhino.Etl enhancement to support dynamics didn’t go as smoothly as I planned. I need to back up and re-think exactly how to do this.

So on to project/idea #2: a RavenDB job store for Quartz. This led to some new adventures with NuGet. It’s actually quite simple to use; however, I don’t like how it dumps everything into a single directory. I like to separate the release libraries from the development libraries, but I’m sure I can resolve that with some simple PowerShell scripting.

Now that the basics are in place I will review the AdoJobStore included with Quartz, and then real development can begin. If you’re interested, check out the repository: https://github.com/jmeckley/Quartz.RavenJobStore.

Thursday, December 8, 2011

“Gitting” up and running… slowly

For my first real project using GitHub I decided to fork Rhino.Etl, my favorite ETL library, and try to update it a bit. To start I wanted to target .NET 4.0. Then I wanted to replace the Boo references with the DLR. I don’t use Boo scripts and wanted to remove my dependency on that assembly. I may clean up the code a bit too. There seems to be a lot of noise, but that could just be features I don’t use. Finally, I still want to keep Rhino.Dsl in the mix because it is an existing feature of Rhino.Etl.
On top of all this I am learning Git and xunit and psake and… well, there is plenty to learn. So after two evenings of battling with the command line I finally have:
  1. Rhino.Etl targeting 4.0
  2. The build script running
I do have 3 failing tests which I need to look into, but the build script is working!
To get to this point I first forked hibernatingrhinos/rhino-etl, which hasn’t been touched in a while. I then pulled in changes from Nathan Palmer’s fork, which was using a newer version of psake. That caused some conflicts, which I resolved via Notepad++ (ouch!).
Then came the task of targeting the 4.0 framework. Step one: change the target framework of each project. Step two: build from VS. That was a quick win. Now the build script.
For reasons I don’t understand, msbuild kept pointing to v3.5. It didn’t know what 4.0 was and kept reverting to 3.5. I tried setting $framework_version, but it kept reverting. Finally I resolved it by brute force and used the absolute path to the v4.0 msbuild. Another step closer.
Then came xunit. I needed to use xunit.console.clr4.exe to target the 4.0 framework; otherwise the build would fail, complaining about a framework version mismatch.

Tuesday, December 6, 2011

Time to get coding

VS 2010 Pro arrived! Time to install and begin fleshing out the ideas bouncing around in my head.