I had a particular set of tests that, when run with a large test data set, seemed to be taking a long time, so I wanted to check that it was the setup and teardown of the tests that was taking the time and not the tests themselves.
I could have manually put some instrumentation into my tests but I’d rather use a tool for the job, so I wanted to run the profiler over my tests. It turns out you can do this for MSTest fairly easily. First, load the test view (Test –> Windows –> Test View) – seeing as I was using the ReSharper test runner this window wasn’t already open, so this is a fairly important step! Now, from the context menu of the test you want to run, select ‘Create Performance Session’ :
Up pops the Performance Wizard, which basically asks whether to Sample or Instrument the process – I chose the latter for accuracy. You now have the ‘Performance Explorer’ window with the configured performance profile :
Now all I have to do is click the third button on the toolbar, ‘Launch with Profiling’, which runs the process and, once completed, presents me with a summary of the worst performers :
At this point all you have to do is analyse the results and figure out where the slowest sections are. I actually found it easiest to change the view from ‘Summary’ to ‘Modules’, then drill down to the method I knew was the one under test and confirm that it was running within an acceptable timescale.
Of course, if you can’t afford Visual Studio Team System then there are other offerings out there… and JetBrains’ dotTrace is one of them, which will probably cost a lot less!
I’m not sure why I had so much trouble trying to do this… but since I did, I think it’s worth a blog post. Basically I was trying to return an int value from a stored procedure but couldn’t quite work out the syntax, and was initially distracted by trying to use AddOutParameter() . However, the key was to use AddParameter() , specifying the direction as ReturnValue :
db.AddParameter(cmd, "@return_value", DbType.Int32, ParameterDirection.ReturnValue, null, DataRowVersion.Default, null);
Then simply access the named parameter :
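Put together, the whole flow looks something like this (the stored procedure name is made up; Database , AddParameter and GetParameterValue come from Enterprise Library’s data access block):

```csharp
Database db = DatabaseFactory.CreateDatabase();
DbCommand cmd = db.GetStoredProcCommand("usp_DoSomething"); // hypothetical sproc name
db.AddParameter(cmd, "@return_value", DbType.Int32,
    ParameterDirection.ReturnValue, null, DataRowVersion.Default, null);
db.ExecuteNonQuery(cmd);

// The stored procedure's RETURN value is now available via the named parameter.
int returnValue = (int)db.GetParameterValue(cmd, "@return_value");
```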
Phew, that’s the title out of the way!!
I had a simple plan – run some complex, optimised SQL and return a graph of entities back to the app using Entity Framework. The first problem is that it doesn’t support returning anything but a single entity type from a stored procedure, so that’s eager loading done for! The second problem is that the entity has to exist in the model, so creating a POCO is out of the question, and I’m not so sure about creating arbitrary data transfer objects in the model either (EF V2 should help with this.)
So I resorted to running the sproc then lazy loading the child objects I need. What I needed to do though was traverse a couple of references and test a field to limit the returned rows. And here’s what I ended up with after much poking around :
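The shape of it was something like this – entity names are hypothetical (Child1/Child2 references, a Child3 collection and an IsActive flag standing in for the real model):

```csharp
// Run the sproc (mapped as a function import), then lazy load what we need.
Parent parent = ctx.GetParentGraph(parentId).First(); // hypothetical function import

parent.Child1Reference.Load();
parent.Child2Reference.Load();

// Only attach the Child3 rows that pass the filter, using an 'inner query'
// built from the collection's source query.
var filtered = parent.Child3.CreateSourceQuery()
                            .Where(c => c.IsActive); // hypothetical filter field
parent.Child3.Attach(filtered);
```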
There are two straightforward lazy loads using .Load() , but I’m only attaching entities for Child3 based on the results of the ‘inner query’, which uses .CreateSourceQuery() to create a queryable.
There’s a very handy assembly-level attribute called InternalsVisibleTo that makes internally declared members accessible to a specific assembly. This probably isn’t wise for most situations, but it’s very handy for testing internals from a separate test assembly. It’s also handy to provide a parameterised constructor that can accept different dependencies whilst the default constructor is hard-wired to the standard dependencies – this seems like an easy way to retrofit DI for testing purposes onto existing applications without disturbing the app structure too much or introducing a confusing DI factory.
It’s a bit of a code smell to add code simply for testing purposes, and in particular to follow different paths through the code during testing whilst production code follows another (default constructor in production, parameterised in test). However, I think it’s a reasonable compromise if it means adding valuable tests without too much extra work!
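A sketch of the combination – class, interface and assembly names are all made up for illustration:

```csharp
// In AssemblyInfo.cs – InternalsVisibleTo is applied at the assembly level.
[assembly: InternalsVisibleTo("MyApp.Tests")] // test assembly name is an assumption

public class OrderProcessor // hypothetical class
{
    private readonly IPaymentGateway gateway; // hypothetical dependency

    // Default constructor: the production path, hard-wired to the standard dependency.
    public OrderProcessor() : this(new PaymentGateway()) { }

    // Parameterised constructor: internal, so only the test assembly named
    // in InternalsVisibleTo can supply a different dependency.
    internal OrderProcessor(IPaymentGateway gateway)
    {
        this.gateway = gateway;
    }
}
```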
Given this enum :
Instead of this code :
I would rather see this :
I can’t add an extension to the Enum system class but I can create a new generic Enum class which strongly types the results :
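The snippets didn’t survive here, so this is the idea reconstructed with an example enum of my own (the original enum isn’t shown):

```csharp
using System;

public enum Status { Active, Suspended, Deleted } // example enum, not the original

// A generic Enum class that strongly types the results of System.Enum's helpers.
public static class Enum<T> where T : struct
{
    public static T Parse(string value)
    {
        // Fully qualify System.Enum to avoid clashing with this class's name.
        return (T)System.Enum.Parse(typeof(T), value, true);
    }
}

// Instead of:   Status s = (Status)Enum.Parse(typeof(Status), "Active");
// I'd rather:   Status s = Enum<Status>.Parse("Active");
```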
Here’s a question. What’s wrong with this code :
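(The snippet didn’t survive, but from the points that follow it was along these lines, UserAccess being the class in question:)

```csharp
// Somewhere in the page's code-behind:
UserAccess userAccess = (UserAccess)Session["UserAccess"];
```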
To be honest, there’s nothing wrong with it. And most applications would be absolutely fine. However, I find there are things I don’t like :
- The need to type cast.
- The use of a string to index the session – if this code is repeated elsewhere, only testing will discover whether one of the keys was mistyped.
- The use of the Session object itself!!
That last point is probably my biggest reason for writing the code differently. When UserAccess is fetched elsewhere in the code, each line will be hardcoded with the method of access. If I changed my mind and wanted to place the object elsewhere – in view state or a configuration file, for example – then I’d have more places to edit. So what would I replace it with?
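Something like this – a reconstruction of the idea: a lazily cached, virtual property on the page, with the session key typed in exactly one place:

```csharp
private UserAccess userAccess; // cached so we only hit Session (and cast) once

protected virtual UserAccess UserAccess
{
    get
    {
        if (userAccess == null)
            userAccess = (UserAccess)Session["UserAccess"];
        return userAccess;
    }
    set
    {
        userAccess = value;
        Session["UserAccess"] = value;
    }
}
```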
Seems like a lot of effort, but some of that could be reduced by using templates if you don’t want to type too much! I cache the object, so no matter how many times I use the property I’m not hitting the Session object and casting. I’ve also marked the property as virtual so a subclass of the page could override the access and replace it with something entirely different – which could be very useful for wrapping the page’s code-behind in a unit test and therefore removing any need for ASP.NET objects.
I’ve called out this fairly small idea because it illustrates some important concepts, including writing code that maintains the original intent (line 3 now doesn’t have to include any code that indicates where or how the user object is stored) and isolating persistence mechanisms even within a class.
A long time ago I wrote some really basic code to colour the output displayed in the Windows cmd.exe. I’ve just come across it hidden in one of my folders and thought I’d resurrect it since it’s something I’d like to look at in PowerShell when I have the opportunity (although changing the colour in PowerShell is easy with Out-Host.) The tool was originally knocked together because I’d wrapped grep.exe in a batch file but wasn’t happy with the output – I wanted matching strings that I was searching for to be highlighted in different colours.
As an example here’s some test text…
This is a <style textcolour='red'>quick</style> test to <style textcolour='Blue'>see if</style>
the colourise utility works and identifies
<style textcolour='Yellow'>all</style> the necessary items.
This is a test to check <style textcolour='Magenta'>it matches colours at the end.</style>
<style textcolour='Yellow'>This has <style textcolour='Red'>nested <style textcolour='Blue'>colouring</style> that </style> currently </style> isn't supported.
which renders as :
Note that nested styles are not supported.
The code to accomplish this is below.
static void Main(string[] args)
{
    string lineInput;
    Stack<ConsoleColor> oldColourStack = new Stack<ConsoleColor>();
    Regex rx = new Regex(@"(?<pretext>.*?)<style textcolour='(?<colour>[^']*)'>(?<colouredtext>.*?)</style>");
    while ((lineInput = Console.ReadLine()) != null)
    {
        MatchCollection matches = rx.Matches(lineInput);
        if (matches.Count == 0) { Console.WriteLine(lineInput); continue; }
        foreach (Match match in matches)
        {
            Console.Write(match.Groups["pretext"].Value);
            oldColourStack.Push(Console.ForegroundColor);
            Console.ForegroundColor = (ConsoleColor)Enum.Parse(typeof(ConsoleColor), match.Groups["colour"].Value, true);
            Console.Write(match.Groups["colouredtext"].Value);
            Console.ForegroundColor = oldColourStack.Pop();
        }
        Match lastmatch = matches[matches.Count - 1];
        Console.WriteLine(lineInput.Substring(lastmatch.Index + lastmatch.Length));
    }
}
EDM Generator (EdmGen.exe) validates and generates an Entity Data Model (EDM) from an existing database. Mapping will be one-to-one. Available in .NET Framework 3.5 SP1.
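A typical full-generation run looks something like this (the AdventureWorks names are just an example, and the connection string is an assumption):

```shell
EdmGen.exe /mode:FullGeneration /project:AdventureWorks ^
  /provider:System.Data.SqlClient ^
  /connectionstring:"server=.;integrated security=true;initial catalog=AdventureWorks"
```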
Available in VS2008 SP1 :
- Entity Data Model Wizard
- Entity Designer
- Update Model Wizard
Entities are required to have keys. If an entity doesn’t have one, the generation tool will infer one (generating a DefiningQuery element in the store schema, which renders the entity read-only until the mapping is manually confirmed and the element removed.)
Tables representing many-to-many relationships will not be generated as entities but rather as relationships.
See Entity Framework Terminology.
The Conceptual Model
An EDM schema defining entities and associations called the Conceptual Schema Definition Language (CSDL.) Each entity has : a name, a key and a set of properties (of type simple, scalar or complex and can be nullable or have a default value.)
The Storage Model
Uses Store Schema Definition Language (SSDL) to define the logical model for persistent data. The types used are those from the storage model (e.g. SQL Server.)
The Mapping Specification
Uses Mapping Specification Language (MSL) to connect conceptual types to the storage model.
The following diagram highlights how EF integrates with ADO.NET Data Providers and where developer interaction occurs. There are three methods for generating queries against the EDM :
- Entity SQL
- Language-Integrated Query (LINQ)
- Object query builder methods
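Roughly, the three look like this (the context and entity names are invented):

```csharp
using (AdventureWorksEntities ctx = new AdventureWorksEntities()) // hypothetical context
{
    // 1. Entity SQL
    var esql = new ObjectQuery<Product>(
        "SELECT VALUE p FROM AdventureWorksEntities.Products AS p", ctx);

    // 2. LINQ to Entities
    var linq = from p in ctx.Products
               where p.ListPrice > 100
               select p;

    // 3. Object query builder methods
    var builder = ctx.Products.Where("it.ListPrice > @price",
        new ObjectParameter("price", 100));
}
```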
Working with Entity Data
Referenced objects are not automatically loaded and therefore the Load method on the EntityReference (for one-to-one relationship) or the EntityCollection (for a one-to-many relationship) must be called to load the related data into the object context. An alternative is to specify a query path that defines the related object to load.
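For example (entity and property names are assumptions):

```csharp
// Explicit loading via the relationship ends:
if (!order.CustomerReference.IsLoaded) // EntityReference
    order.CustomerReference.Load();
order.OrderLines.Load();               // EntityCollection (one-to-many)

// Alternatively, specify a query path up front:
var orders = ctx.Orders.Include("Customer.Address");
```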