Benchmarking Enum.ToString() performance in Visual Studio Code

I have recently been working quite a lot in Visual Studio Code (VSC), writing Go code. With the Go extension for Visual Studio Code, writing code is an easy and pleasant experience.

For .NET, however, I was using the full version of Visual Studio. There were several reasons for that. Initially, VSC was designed for building ASP.NET code, and I rarely work on web applications alone. Technically speaking, one could have used command-line tools like msbuild or csc for building other types of .NET components, but the experience wasn’t acceptable, especially if you tried to use NuGet packages in your code.

Recently I thought: why not give it another try, since new tools for .NET development have been added to VSC.

The scenario

Recently we were investigating performance issues related to a server-side application. One thing that caught our attention was code like this:

return EnumTypeValue.ToString();  

On its own this code wouldn’t be a problem, if not for the fact that it is called about 1,000,000 times per second.
It is a problem that has been discussed on Stack Overflow, and these days it is easy to find the code behind Enum.ToString() on GitHub.
For example, here the code spends time evaluating whether there’s a Flags attribute (through reflection, of course):

private static String InternalFormat(RuntimeType eT, Object value)
{
    Contract.Requires(eT != null);
    Contract.Requires(value != null);
    if (!eT.IsDefined(typeof(System.FlagsAttribute), false)) // Not marked with Flags attribute
    {
        // Try to see if its one of the enum values, then we return a String back else the value
        String retval = GetName(eT, value);
        if (retval == null)
            return value.ToString();
        return retval;
    }
    else // These are flags OR'ed together (We treat everything as unsigned types)
    {
        return InternalFlagsFormat(eT, value);
    }
}

As a result, a single-line ToString() call can cause a measurable performance hit under a heavy workload.
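To get a rough feel for the cost, a quick Stopwatch sketch can be used (the enum and iteration count below are illustrative, not the application's actual types); a proper benchmark harness gives far more reliable numbers than a raw loop like this:

```csharp
using System;
using System.Diagnostics;

// Illustrative enum - stands in for the application's real enum type.
enum SampleStatus { New, Fetched, Done }

class EnumToStringTiming
{
    // Times repeated Enum.ToString() calls and returns the elapsed milliseconds.
    public static long TimeEnumToString(int iterations)
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            // Each call goes through the reflection-based name lookup shown above.
            string s = SampleStatus.Fetched.ToString();
        }
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    static void Main()
    {
        Console.WriteLine("1000000 calls took " + TimeEnumToString(1000000) + " ms");
    }
}
```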

Could I write a benchmark for this case in VSC now?

The problem is obvious and the solution is clear; the question for me was whether it is reasonably possible to write .NET code that would benchmark that scenario using Visual Studio Code.

On the machine I had only VSC installed, nothing else. I opened VSC and, just after creating the first file with a .cs extension, I was greeted with a message suggesting that I install the recommended extensions.

A nice surprise after choosing “Show recommendations” was a list of two extensions: C# and Mono Debug.

I decided to go with the C# extension only. This resulted in yet another prompt telling me that I needed to install the .NET CLI tools.

Note: I don’t have the full version of Visual Studio installed on this particular machine, so I needed to download the .NET Core SDK for Windows.

I installed it (a restart of the machine was required) and went back to VSC and the command line.

First things first – set up a new project in a folder:

dotnet new 

Guess what type of application I got by default? A console application! Back to the command-line roots.
Next, I added some code in separate files:

using System;
using System.Collections.Generic;
using System.Linq;

public enum SomeEnum { First, Second, Third } // members are illustrative

public static class SomeEnumRepository
{
    private static Dictionary<SomeEnum, string> _namesBySomeEnum;

    static SomeEnumRepository()
    {
        // Build the name cache once, so ToString's reflection cost is paid only here
        _namesBySomeEnum = Enum.GetValues(typeof(SomeEnum))
            .Cast<SomeEnum>()
            .ToDictionary(k => k, k => k.ToString());
    }

    public static string ToStringFromDictionary(this SomeEnum se)
    {
        return _namesBySomeEnum[se];
    }
}

public class EnumToStringBenchmark
{
    public string EnumToString(SomeEnum value) { return value.ToString(); }
    public string EnumToStringCustom(SomeEnum value) { return value.ToStringFromDictionary(); }
}

The coding experience in C# has improved a lot since the last time I tried it (thanks to the extension):

  • Missing using statements can be added from warnings
  • cw expands to Console.WriteLine (love it!)
  • If something is missing, a restore is suggested automatically and, if executed, references to the packages are added to the project.json file
  • Shift-Alt-F formats the code

dotnet run -c Release

builds and runs the code.

So far so good, but that was only half way. I wanted to run the benchmark using BenchmarkDotNet, which meant that some NuGet packages would be involved in the process. Of course, I could have used timers, but they wouldn’t give me an easy way to look into GC and memory usage statistics. I added the dependencies to the project.json file and was immediately greeted with a suggestion to “Restore” dependencies, which took nearly 6 seconds and then immediately failed, because the diagnostics module requires .NET 4.x. Because of that, the reference to coreclr had to be removed and net461 added to the project.json file. The resulting file looks like this:

{
  "version": "1.0.0-*",
  "buildOptions": {
    "debugType": "portable",
    "emitEntryPoint": true
  },
  "dependencies": {
    "BenchmarkDotNet": "*"
  },
  "frameworks": {
    "net461": {}
  }
}

It was easy to figure out that you need to type the NuGet package name in the dependencies section. Another nice touch: IntelliSense suggests package versions.
It was slightly harder with the framework version because, at least on my machine, I could not have both net461 and coreclr configured at the same time, even without the diagnostics module.

Note: I needed the .NET Framework developer pack to be installed on my machine.

The code was still building, so I needed to attach the benchmark to the existing code and look at the results.

public class Program
{
    public static void Main(string[] args)
    {
        var config = ManualConfig.Create(DefaultConfig.Instance)
            .With(new Job { LaunchCount = 1, WarmupCount = 2, TargetCount = 10, Runtime = Runtime.Clr })
            .With(new MemoryDiagnoser());
        BenchmarkRunner.Run<EnumToStringBenchmark>(config);
    }
}

public class EnumToStringBenchmark
{
    private Random _rnd;
    private SomeEnum[] _possibleValues;
    private static int _count;

    [Setup]
    public void SetupData()
    {
        _possibleValues = (SomeEnum[])Enum.GetValues(typeof(SomeEnum));
        _count = _possibleValues.Length;
        _rnd = new Random();
    }

    [Benchmark]
    public string EnumToString() { return _possibleValues[_rnd.Next(_count)].ToString(); }

    [Benchmark]
    public string EnumToStringCustom() { return _possibleValues[_rnd.Next(_count)].ToStringFromDictionary(); }
}

These are the results on my machine:

Method             | Median      | StdDev    | Gen 0  | Bytes Alloc/Op
EnumToString       | 497.5330 ns | 1.0337 ns | 334.00 | 25,99
EnumToStringCustom |  28.6341 ns | 0.0456 ns |      – | 0,00

The code is available on GitHub.

The verdict

There’s still some mess, especially in the area of handling different frameworks and runtimes; however, the state looks much better now than even half a year ago. I could definitely start building server-side .NET apps using Visual Studio Code, and I like that.
Also, we already know that project.json is almost gone before even making a full appearance. That means inevitable changes, issues and a “not quite there yet” status.

Resources from my Techday 2012 presentation on SQL Server transaction performance

Recently I gave a presentation on SQL Server transaction performance diagnostics and improvement on Microsoft organized TechDay events in Lithuania and Latvia.

The time for the talk was very limited and the topic really big and complex, so I decided to briefly cover aspects like blocking & deadlocks, query plans and, finally, some diagnostic tools, including my new favorite – extended events (xEvents).

For those who want to know more about the topics I talked about: you will have to dig deeper on your own, but hopefully you now have some idea where to start.

Here are a couple of good places to start:

There are many more sources, but you can definitely start with those links and Google/Bing further, absorbing all the good information along the way.

Of course, I have posted the slides and scripts of my presentation here. And if you go through these links, or at least take a look at extended events and the system_health session – then my goal is achieved.

And make sure you know your data and pass this knowledge on to your SQL Server in the form of good physical design, indexes, statistics and appropriate queries.

Thoughts on Windows Azure Mobile Services

I was excited when I saw the Windows Azure Mobile Services announcement. The idea itself is great – all those different services joined into a single, simple, cohesive API. While SDKs for other platforms (even Windows Phone) are currently not available, that is not a big problem, because underneath it all uses the same HTTP.
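Since everything goes over plain HTTP, a platform without an SDK can still talk to a mobile service directly. A minimal sketch of building such a request (the service URL, table name and application key are placeholders; the X-ZUMO-APPLICATION header is how the service expects the application key to be passed):

```csharp
using System;
using System.Net.Http;

class MobileServiceSketch
{
    // Builds a GET request against a Mobile Services table endpoint.
    // The URL and key below are placeholders, not a real service.
    public static HttpRequestMessage BuildRequest(string table)
    {
        var request = new HttpRequestMessage(
            HttpMethod.Get,
            "https://yourservice.azure-mobile.net/tables/" + table);
        // The application key travels in the X-ZUMO-APPLICATION header.
        request.Headers.Add("X-ZUMO-APPLICATION", "your-application-key");
        return request;
    }

    static void Main()
    {
        var request = BuildRequest("TodoItem");
        Console.WriteLine(request.Method + " " + request.RequestUri);
        // With a real service you would send it via new HttpClient().SendAsync(request)
        // and read the JSON response.
    }
}
```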

Mobile services provide several important features: data storage, authentication, authorization and push notifications. In fact, if you think about developing apps where the user has to share state (e.g. between devices) – that’s a perfect fit. Consider the fact that, according to the Windows Azure pricing site, you can

Run up to 10 Mobile Services for free in a multitenant environment

This is great, isn’t it? And the provided ToDo List sample application demonstrates exactly that. Everything is good, except for one thing – mobile services don’t include the SQL Server cost. And it does cost…

The suggested model is valid if you’re developing a multitenant application and you want to hold the data as part of your service – then you pay for the data storage and bandwidth.

It’s another story if you are really making yet another “to do” application (YATDA); then it is up to the user to decide whether they want to keep the data locally or make it available on all their devices and potentially pay for the storage and bandwidth usage.

So, even after the introduction of mobile services, the fact remains the same: there is still no per-user (not per-application) storage for application data that should be available on the user’s multiple devices.

Windows 8 has the ability to sync settings, but that works only between Windows 8 machines and only if you have switched to a Microsoft account. Really? Thanks, but no, thanks.

Storing application data on SkyDrive is also not an option – its “fair use” requirements prohibit storing anything other than documents and pictures.

Mobile services, on the other hand, require an app developer to own, manage and pay for them. And most probably the developer will want that money back, either in the form of payments or ads. Can you imagine a YATDA that has “Sign up for the free trial”, as shown in the sample? Me neither.

So what’s missing in the story? IMO, only one thing – this service should also be available as part of the other Windows Online services, together with Outlook, SkyDrive, Calendar and others. Ideally, each Windows Online user could have a 100MB or 1GB database allocated for free that they could use for all the applications they want to have available on all devices.

I hope it will happen sometime in the near future, because this would make the “three-screen” strategy possible and easy.

Notes from #agileturas conference

It so happens that I attended the Agile Tour Vilnius 2011 conference.

A lot of great speakers delivered interesting presentations on various Agile aspects from technical to management, for beginners and for those with some experience.

Keynotes were presented by J.B. Rainsberger, Mary Poppendieck and Jurgen Appelo.

Ok, here are a few things that I noted during the conference (typically, I try to take away the top 3 things for myself), so here it comes:

  • Scrum Master is not your scrum secretary (by Paul Goddard). The Scrum Master is not your agile PM; he is the one who should help the team become self-organizing (which also means effective).
  • Jurgen reinforced the above by talking about “How to change the world” (no, not the usual “change management” stuff about surviving change), referencing the complex adaptive systems theory discussed in his Management 3.0 book.
  • Change cannot be implemented with a flip of a switch. It is more like dancing (guiding) toward the desired result by constantly adjusting direction.

I guess that’s it for now. Time to look at our own team and have some dance :)

Parameterized queries, filtered indexes and LINQ

Sometimes I like to cite “Remember, remember the fifth of November” when I trip the very mine that I teach others to avoid.

This time it happened, when reviewing some database structure and related code. So, here it goes.

The database had a table which can be scripted as shown below. As you can probably tell, we are talking about a table that holds a queue of jobs to be performed.

CREATE TABLE [dbo].[Jobs](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [Priority] [tinyint] NOT NULL,
    [DateInserted] [datetime] NOT NULL,
    [DateAllocated] [datetime] NULL,
    [DateFetched] [datetime] NULL,
    [Status] [tinyint] NOT NULL,
    [SerializedData] [xml] NULL,
    [ParentId] [int] NULL,
    [ReferenceID] [bigint] NULL
)

ALTER TABLE [dbo].[Jobs] ADD CONSTRAINT [DF_Jobs_DateInserted] DEFAULT (getdate()) FOR [DateInserted]

ALTER TABLE [dbo].[Jobs] ADD CONSTRAINT [DF_Jobs_Status] DEFAULT (1) FOR [Status]

The use case for the table was simple:

  • One part of the application inserts new jobs based on customer activities, which sets the default date and status (1)
  • Then a service fetches jobs by taking the Ids of a specified number of new (status=1) jobs, setting DateFetched and updating the status to (2)
  • Later, at some point, the XML data is retrieved using the Ids obtained previously

Initially, the code was LINQ to SQL in the program itself and looked something like this:

var q = ctx.Jobs
            .Where(j => j.Status == 1)
            .Select(j => new { j.Id, j.Priority, j.DateInserted, j.Status })
            .Take(100);

For testing purposes, I loaded the table with 100k rows, out of which only 401 had status=1. The equivalent plain SQL query is:

SELECT TOP (100) Id, [Priority], DateInserted FROM Jobs
WHERE Status=1

Running the query without any additional indexes, you will get the results, but there will definitely be a table scan, with a few more logical reads than you might expect:

(100 row(s) affected)
Table 'Jobs'. Scan count 1, logical reads 648


Now, because the second step looks for new records (status=1), which should normally be a small number compared to the whole table, I immediately thought about creating a non-clustered index on the Status column, filtered on status=1, with the appropriate included columns (to get a small covering index). So I created one:

CREATE NONCLUSTERED INDEX IX_Jobs_Status_Filtered -- index name is illustrative
ON Jobs ([Status])
INCLUDE ([Priority], [DateInserted])
WHERE ([Status]=1)

This, of course, had a nice effect on the previous query:

(100 row(s) affected)
Table 'Jobs'. Scan count 1, logical reads 2

For a moment I thought that the rest should be fine, because the query and the LINQ statement were fairly straightforward, with not much magic in them, but somehow I decided to see what query LINQ actually produces.

The query was almost the same, with one exception – parameterization, where the parameter value was, of course, equal to (1):
SELECT TOP (100) [t0].[Id], [t0].[Priority], [t0].[DateInserted], [t0].[Status]
FROM [dbo].[Jobs] AS [t0]
WHERE [t0].[Status] = @p0
However, the shocking discovery at that point was the query plan:
Yes, SQL Server was NOT using the non-clustered index I had created moments ago. WTF!
I also checked the query actually shown by SQL Server Profiler:
exec sp_executesql N'SELECT TOP (5) [t0].[Id], 
[t0].[Priority], [t0].[DateInserted], [t0].[Status]
FROM [dbo].[Jobs] AS [t0]
WHERE [t0].[Status] = @p0',N'@p0 int',@p0=1
Again, the result was obviously the same.
OK, so it was clearly related to the parameterization of the query rather than some LINQ issue. It just happens that LINQ parameterizes queries (and for good reason).
A quick Bing check led to the filtered index documentation, which has a clear statement on the subject: “In some cases, a parameterized query does not contain enough information at compile time for the query optimizer to choose a filtered index. It might be possible to rewrite the query to provide the missing information.”
And this was the moment when I cited the famous phrase. Was I sleeping during my own sessions, where I talked about query plans, their caching, etc., as described here?
The problem becomes obvious when you realize that there are other possible parameter values for which a query plan using the filtered non-clustered index would be invalid, so the optimizer has to produce a plan that is correct for any value of the parameter.
In my case, because of this and some other issues, I decided to use a stored procedure, which basically does the whole job (retrieving the data and updating the state in one nice transaction):
UPDATE TOP (@BatchSize) Jobs 
SET Status = 2, DateFetched = GETUTCDATE()
OUTPUT inserted.Id, inserted.[Priority], inserted.DateInserted
WHERE Status = 1

Works like a charm :)

Just added 3TB disk to Windows Home Server v1.0

I recently bought a 3TB WD drive. You may know that drives over 2.19TB are not supported on Windows XP/2003 systems due to the lack of GPT support. Windows Home Server, of course, doesn’t work with drives larger than 2TB either, but there are some workarounds.

First, you will need a driver for the HBA that comes with the WD drive. I found that it is a HighPoint Rocket 620, and you can find Windows XP/2003 drivers here.

So, after you connect the drive and install the drivers, you should see the 3TB drive in Disk Manager. The rest is fairly simple – just follow these posts:

For now, I’m running WHS v1.0 with 3TB drives and Drive Extender nicely (no need to upgrade to 2011).

A little note: if the Hitachi GPT Disk Manager doesn’t want to work with your disk, that isn’t a problem. It still installs the GPT driver, so you can use Disk Manager to create a GPT partition.

Windows Phone 7 from “user” perspective

One month ago I was thinking about which type of phone I’d like to have. I was choosing between Android and Windows based phones. After reading some reviews, talking to people and getting some hands-on experience, I selected a Windows Phone 7 device: the HTC Trophy.

There were couple of reasons for this:

  • Didn’t want a toy to play “hacking” games with
  • Spending time figuring out OS version upgradeability is not for me
  • Wanted to know what works and what doesn’t early
  • Form factor

In other words, I wanted a more “user oriented” phone than a smart device.

Of course, WP7 misses a few things that I was used to on previous (Windows) phones. One of those features is fast contact lookup using the keypad, where pressing keys searches numbers and names at the same time. It was super convenient and fast, and I used only this approach for dialing. Now it’s gone. Bad, very bad. In the end, the loss of this particular feature reduced the ergonomics of the phone a lot.

Actually, when reading reviews I kept wondering: everybody was talking about screens, sensitivity and applications, but no one (at least that I found) noted that WP7 misses this “phone capability”. I mean, when someone talks about a smart phone, they really mean a smart device.

The second feature that I missed – once(!) – was the ability to connect to hidden WiFi networks. Other than that, I’m pretty satisfied with the OS, applications and hardware.

And no, “copy/paste” is not on my wish list at all. I don’t have the slightest idea when I would use it. I didn’t have a single case during the month when I missed copy/paste. I just don’t… and I don’t understand why people put it at the top of the list. For me – bring back the fast contact search.

I have been using the phone for more than a month, and one thing is sure: my behavior and usage have changed a bit. Before WP7, I barely used any smart phone as more than a phone with a smart address book (synced with Outlook). Initially my home screen had only a few apps:

  • Phone for the easiest access to the recently dialed numbers
  • People – the contact store
  • Hotmail – if you don’t know it, it is like Gmail, but it integrates better with Office documents
  • Messaging for SMS and parking (in Lithuania, you can pay for car park using “specially encoded” SMS messages)
  • Calendar
  • Twitter – for those who don’t know it, it is a stream of 140-character-sized crap that you can get addicted to. And it made it onto my initial home screen. It stays, so far…
  • Settings – to deal with Wireless and some other necessities
  • Weather

One month has passed and the screen has seen some changes:

  • REMOVED: the weather app – used really infrequently, and the live tile updating was crap. In other words – useless.
  • ADDED: Xbox Live – oh yeah, baby… take a look at my Bejeweled Live achievements. Only 3 to go…
  • ADDED: Outlook – corporate is corporate. Interestingly, the phone became the place where both corporate meetings and private appointments are right in front of me and easily accessible. Much better than Outlook itself.
  • ADDED: Parking-LT – finally an app that makes paying for parking easy. No SMS encoding anymore. Big thanks to the developers.
  • ADDED: Amazon Kindle – I never thought it would end up on my home screen; however, it did. The main reason is that I use public transportation to get to the office quite a lot, and I use that time to read books. I prefer books to Twitter :)

In just a month, my usage patterns changed slightly, and I could say that smart device features are getting into my life. However, I still miss the fast contact search with fewer clicks. I hope it will be fixed in “Mango”.

The next thing in my plans related to the phone – Visual Studio :)

Posted March 23, 2011 / Posted in WP7

After party or summary of presentation @ Powered by MVP

Today I gave a presentation at the “Powered by MVP” event in Lithuania.

The topic: what to do when programs leak, crash and misbehave in many other ways.

The summary: there are tools that can help you find the cause of the problems and solve them. They only require some time to learn and some more time to master, but otherwise it is a much better option than automatic IIS pool recycling because of a faulty app.

I would suggest the following way of mastering this set of skills.

Familiarize yourself with the subject matter, such as memory management.

Equip yourself with tools

  • Grab the Sysinternals suite here. Familiarize yourself with the tools. I mean, even if just your PC startup is too long, check Autoruns – I bet you’ll be amazed. Process Explorer and ProcDump are two developer tools you must be familiar with.
  • Windows SDK, where WinDbg lies hidden :)
  • Visual Studio
  • CLR Profiler
  • Any other third-party tool, like: .NET Memory Profiler from SciTech, Red Gate’s ANTS Memory Profiler, the JetBrains profiler

Learn the performance counters related to memory and performance. Don’t just say “slow” – that is not enough. It is important to know why it is slow. Performance counters can often help answer that question.
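Several of the numbers that the “.NET CLR Memory” and “Process” counters expose can also be read directly from code, which is handy for a quick sanity check. A small sketch using only the standard Process and GC APIs:

```csharp
using System;
using System.Diagnostics;
using System.Text;

class MemoryStats
{
    // Collects a few memory figures for the current process - the same kinds of
    // numbers the performance counters expose, read from the runtime instead.
    public static string Report()
    {
        var proc = Process.GetCurrentProcess();
        var sb = new StringBuilder();
        sb.AppendLine("Working set:       " + (proc.WorkingSet64 / 1024) + " KB");
        sb.AppendLine("GC heap size:      " + (GC.GetTotalMemory(false) / 1024) + " KB");
        sb.AppendLine("Gen 0 collections: " + GC.CollectionCount(0));
        sb.AppendLine("Gen 1 collections: " + GC.CollectionCount(1));
        sb.AppendLine("Gen 2 collections: " + GC.CollectionCount(2));
        return sb.ToString();
    }

    static void Main()
    {
        Console.WriteLine(Report());
    }
}
```

Watching the Gen 2 collection count climb, for example, is often the first hint of a memory problem worth profiling properly.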

Try some samples with those tools, and read the help materials and additional information on them, like:

Learn further

Read books like: “Windows Internals” and “Advanced Windows Debugging” (Recommended by @alejacma).

Start reading really interesting blogs on the subject, like:

I guess that’s it for today’s session. I hope you liked it.

If you didn’t understand everything, well – don’t say that I didn’t warn you about the session being at the 400 level :). Anyway, don’t worry – step by step you can learn these things fairly quickly. And sometimes they can help a lot.

Why I will NOT upgrade to WHS 2011

I saw this post (via @seandaniel) called “Why I plan on using the new Windows Home Server 2011” (a.k.a. “Vail”).

I quickly ran my eyes over the blog post and, well – I do NOT agree, and I cannot accept the decision given the little I know.

Surely, I’m a slightly different kind of persona: I use WHS slightly differently and my priorities are different. For me it is a backup and storage machine that is used mostly at home. I access it very seldom from remote locations, just in case I need to grab some personal file.

So let’s look at the so-called “just awesome stuff”.

  • Improved remote access

I rarely use it. It is improved so that I can modify it? How nice, but it is so unimportant to me. With two small children at home I really don’t have time to “modify” it, and I consider such time wasted. I mean, really, it’s not related to anything this box is supposed to do – keep your files available and safe.

  • Silverlight video streaming

I cannot imagine myself watching a Blu-ray rip streamed via the Internet and a wireless connection while I’m at a hotel. Do you? One more nice-to-have that will never be used in my case. Most of the streaming happens from the WHS to the Xbox at home, when my kids are watching their content, or I am watching our personal stuff.

  • Silverlight photo streaming

I would consider this a nice feature if there were an easy way to share photos with multiple relatives or the casual “Oh, please…” type of friends. I’m considering writing an add-in myself that would allow anonymous (ticketed) access to a defined set of photos on the WHS. Silverlight? Why not – it’s not that difficult when you have the appropriate data on the photos. So is the photo streaming a “wow”? It would be most appreciated if it allowed “anonymous/ticketed” access, but otherwise – thanks, but no thanks. I know my photos well.

  • Moving files via remote access

Again, a nice feature which will never be used by me. If I’m about to touch any file – especially to delete one – I will look at it (view it) carefully to check that it is really the content I want to (re)move. I’ve done that locally over the shared folder, and I’ll continue doing that.

  • DLNA media streaming

It would be much more appreciated if it could become a full Media Center with data storage in one box. DLNA itself is nice; however, the only devices at home that support DLNA are my PCs, which happen to play the content nicely without DLNA.

Regarding data protection, the talk is slightly different.

“What about the data on the server? Servers hold data; isn’t that data safe? Yes, of course it is. You can back up that data to an internal, or better yet, external hard drive!” At this point, I have only one question – so why do I need WHS then? I can have a regular ***ux box with backup to an external drive. Also, considering theft/robbery scenarios, I prefer backing up critical data (photos and some videos) to Amazon (probably the cheapest #cloud option) to keep the data out of the home.

My main interest here is flexibility in hard drive management (I have replaced one dead drive, added some additional ones, and replaced a smaller one with a bigger one) and data availability, without having to deal with the limitations of RAID configurations or a “copy-paste wizard”.

Sean, you say “I ultimately agree with the decision given what I know”. Well, maybe you agree, but you didn’t convince me and, it seems, many others around the globe.

The question here is: who is more important – you or the customers?

IIS responds with 403 using certificate based authentication

From time to time I have to deal with certificate-based authentication when developing WCF services, and from time to time I fall into the same pit.

Today I was configuring a WCF service to use certificates for authentication (via AD certificate mapping). After configuring IIS and WCF, I tried to access the SVC help page/metadata, but kept getting “403.7 Forbidden: Client certificate required” from IIS. The IIS logs contained something like this:

<date time> W3SVC1 <IP> GET /site/service.svc – 443 – <IP> <Browser> 403 7 64

Bing came up with the support KB article on this issue, but all the possible causes had already been dealt with: the CA was trusted, and the certs were not revoked or expired. And then it hit me – “not expired”… yes, of course – how does IIS check the revocation of the certificate? Simply by looking at the certificate’s CRL distribution points information (if it is present there), and the CRL must be accessible and reachable from the IIS machine that hosts the service. To check that everything is OK, just copy the CRL’s URL from the certificate and try to open it in a browser on the IIS machine hosting the service.
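A revocation check similar to the one IIS performs can also be approximated in code with X509Chain, which is a convenient way to test a certificate outside of IIS (the certificate file path is whatever you supply on the command line; this is a diagnostic sketch, not the exact check IIS runs):

```csharp
using System;
using System.IO;
using System.Security.Cryptography.X509Certificates;
using System.Text;

class CrlCheck
{
    // Builds the chain with online revocation checking, so the CRL
    // distribution points have to be reachable for it to succeed.
    public static string CheckChain(X509Certificate2 cert)
    {
        var chain = new X509Chain();
        chain.ChainPolicy.RevocationMode = X509RevocationMode.Online;
        chain.ChainPolicy.RevocationFlag = X509RevocationFlag.EntireChain;
        bool ok = chain.Build(cert);

        var sb = new StringBuilder();
        sb.AppendLine(ok ? "Chain valid, revocation check passed" : "Chain build failed:");
        foreach (var status in chain.ChainStatus)
            sb.AppendLine("  " + status.Status + ": " + status.StatusInformation.Trim());
        return sb.ToString();
    }

    static void Main(string[] args)
    {
        if (args.Length == 0 || !File.Exists(args[0]))
        {
            Console.WriteLine("usage: CrlCheck <certificate-file>");
            return;
        }
        Console.WriteLine(CheckChain(new X509Certificate2(args[0])));
    }
}
```

If the chain fails with a RevocationStatusUnknown or OfflineRevocation status, an unreachable CRL distribution point is the likely culprit.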

In my situation that was the problem, and it was easily fixed by adding the appropriate DNS records.

Of course, it is possible to switch off certificate revocation checking in IIS, but that is definitely NOT recommended.