ADO.NET Connection Pool Performance Counters

I’m currently focusing on the performance of my client’s most-used APIs. A few core services get hit a lot in their architecture, so they must be optimised for reads and writes, yet we are seeing some strange results from our apps when deployed. One theory is that not enough connections are being opened through to the database. We’ve also noticed lots of task switching in the app – could this be caused by threads waiting for a free connection? This post walks through a demo app that shows how connections and connection pools behave on the local machine, using the ADO.NET SQL performance counters.

I’ve written a sample app to demonstrate this. There’s a JMeter runner which hits a simple controller. The controller uses the database correctly and should give us the results I want to demo.

Here’s the core code from the app. Note that I don’t have to open the connection explicitly – Dapper opens it internally.

public class ValuesController : ApiController
{
    public async Task<IHttpActionResult> Get(string id)
    {
        using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["database"].ConnectionString))
        {
            //await connection.OpenAsync();
            var result = await connection.QueryAsync("SELECT [Data] FROM Example WHERE [Id]=@id", new { id });

            if (!result.Any())
                return NotFound();

            return Ok(result);
        }
    }
}

Connections take resources to establish between the client and server. Keeping these connections open and pooled lets them be shared between executions within the same process. .NET has connection pooling built in, which can be controlled via code or the connection string.
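For example, pooling can be tuned directly in the connection string using the documented SqlClient keywords. As a minimal sketch – the name "database" matches the controller code above, while the server and database values are placeholders:

```xml
<!-- app.config: the connection string name "database" matches the controller code;
     the server/database values are placeholders -->
<connectionStrings>
  <add name="database"
       connectionString="Server=.;Database=Example;Integrated Security=true;Pooling=true;Min Pool Size=5;Max Pool Size=100" />
</connectionStrings>
```

Min Pool Size keeps a number of connections warm so the first requests under load don’t pay the hard-connect cost, and Max Pool Size caps how many connections the pool will open before callers queue for a free one.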

This app performed well on my machine, with JMeter reporting sub-5-millisecond response times and connect times below 1 millisecond.

Performance Counters

If we want to see all the counter values, we must raise the level of performance counter detail. This switch goes in the app’s configuration file under the system.diagnostics element:

  <system.diagnostics>
    <switches>
      <add name="ConnectionPoolPerformanceCounterDetail" value="4"/>
    </switches>
  </system.diagnostics>


HardConnectsPerSecond

How many actual connections to the database server are being opened each second. You would expect one connection to be opened per database, or more when there aren’t enough connections in the connection pool. If you see this counter rising while the app is under load, after the initial connections have been made, then your app is closing connections and you should find a way to pool them – basically, don’t close them.


HardDisconnectsPerSecond

The same as HardConnectsPerSecond, except this counts how many actual connections to the server are being closed each second. I wouldn’t expect to see any values whilst the app is running under load in the majority of applications, since the connections should be pooled. If you see this, your app is being disconnected from the server or you’re closing connections yourself.


SoftConnectsPerSecond

This counter shows how many connection opens were intercepted and satisfied by the connection pool. If there was already an unused pooled connection open, then the open call does not open another connection to the server; instead, the pool hands back an existing open connection. If there wasn’t an open connection and the pool is allowed to open one, you’ll see the HardConnectsPerSecond counter increase instead. If you’re calling connection.Open() anywhere (e.g. inside Dapper code, or your own ADO.NET SqlConnection using statements) then you should see soft connects. Microsoft’s advice is to always Open() the connection and not worry about the cost of closing it.


SoftDisconnectsPerSecond

The same as above, except for soft disconnects. The value should be similar or identical to SoftConnectsPerSecond. If your app is not soft-disconnecting, then you are holding connections open, which might lead to locking. .NET normally takes care of this for you.


NumberOfNonPooledConnections

An early warning count of connections that don’t take advantage of connection pooling. I’d expect this to be zero.


NumberOfPooledConnections

This is an important graph. It shows how many connections are open within the pools. It is a count across all pools, so it may be misleading. If you’re putting a lot of traffic through your app and this value is low, then you may be bottlenecked on connections, as the app has to share the few connections in its pool (you may see task switching in your profiling). Increase the minimum pool size or check your connection-opening code. A value of 1 here is bad. Under load, my app with one connection pool (for my one database connection) has 5 pooled connections – they may be active or idle, but there are 5 connected to the SQL Server instance on my machine.


NumberOfActiveConnectionPoolGroups

Shows the number of active connection pool groups, which is different from the number of active connection pools; generally there should be one group per distinct connection string. If this is zero, or lower than your number of different connection strings, then perhaps you’re not utilising connection pooling. See NumberOfPooledConnections above.


NumberOfInactiveConnectionPoolGroups

I’d expect this value to be zero unless your connections explicitly don’t use connection pooling. ADO.NET uses connection pooling by default. See also NumberOfActiveConnectionPools.


NumberOfActiveConnectionPools

Each connection belongs to one connection pool. If you’re using connection pooling, you will likely have one connection pool per database connection string, with multiple active connections within it.


NumberOfInactiveConnectionPools

The number of connection pools that have been created but whose connections aren’t currently being used.


NumberOfActiveConnections

This is a good one to watch. In my example application I saw a range of values here; the lowest was 0 and the highest was 4. If this is stuck at 1, you may somehow be locking your database code so that only one task or thread can use it at once. When NumberOfActiveConnections increases, NumberOfFreeConnections generally decreases, since free connections are being used up.


NumberOfFreeConnections

The number of open connections that are not currently being used. When NumberOfFreeConnections increases, connections are still alive in the pool but not currently being used by any of your code.


NumberOfStasisConnections

I’ve no idea what this means, and it had a value of 0 throughout. Google didn’t help me either, except for some Oracle documentation where it’s not used – so this is probably a legacy performance counter.


NumberOfReclaimedConnections

This value was always zero under load. A value here indicates that .NET garbage collection reclaimed a connection that you should have disposed or closed in your code.
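To keep this counter at zero, dispose connections deterministically rather than leaving them for the garbage collector. A minimal sketch, reusing the same "database" connection string name as the controller code above:

```csharp
using System.Configuration;
using System.Data.SqlClient;

public class ExampleRepository
{
    public void DoWork()
    {
        // The using block guarantees Dispose() runs, which performs a soft
        // disconnect and returns the connection to the pool, instead of
        // leaving it to be reclaimed later by the garbage collector.
        using (var connection = new SqlConnection(
            ConfigurationManager.ConnectionStrings["database"].ConnectionString))
        {
            connection.Open();
            // ... run commands here ...
        }
    }
}
```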

Publishing and subscribing to RabbitMQ using the .NET Client

Recently I was struggling to understand how a customer’s RabbitMq project was wired up, so I created an example .NET app to play around with it. It turned out their setup code didn’t create bindings and routes out of the box – no wonder it didn’t work when I ran the acceptance tests!

The demo app has two console apps: one publishing messages to a RabbitMq exchange and one subscribed to a RabbitMq queue. We create a RabbitMq exchange and bind the queue so it receives all messages sent to the exchange.

Installing RabbitMq on the local machine

RabbitMq is pretty easy to get started with on Windows, but as with everything, it’s harder at scale. RabbitMq runs atop Erlang, which has a simple Windows installer. After installing Erlang, install the RabbitMq service using the RabbitMq installer, then make sure the service is started and running.

To gain visibility of messages entering RabbitMq and the routes they take, you can enable the HTTP management plugin – a web-based administration interface that runs on your local machine (by default at http://localhost:15672).

Enabling the HTTP management utility
rabbitmq-plugins enable rabbitmq_management

You’ll need an exchange to send messages to (think of it as a postbox). The exchange then routes the messages on to consumers (via bindings to queues, or to further exchanges).
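The demo below uses a fanout exchange, which copies every message to every bound queue regardless of routing key. As a contrasting sketch – the "orders" exchange and queue names here are hypothetical – a direct exchange only delivers messages whose routing key exactly matches a binding:

```csharp
// Sketch only: the "orders" names are hypothetical, for illustration.
var model = connection.CreateModel();

// a direct exchange routes on an exact routing-key match
model.ExchangeDeclare("orders", ExchangeType.Direct);
model.QueueDeclare("orders-created", false, false, false, null);

// only messages published with routing key "created" reach this queue
model.QueueBind("orders-created", "orders", "created", null);
```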

Getting the RabbitMq .NET client

I used the RabbitMq NuGet Client package for this.

Install-Package RabbitMQ.Client
Authenticating RabbitMq using the .NET client
var factory = new ConnectionFactory();
factory.UserName = "guest";
factory.Password = "guest";

// create a connection using default ports
var connection = factory.CreateConnection(Settings.Ports);
Creating a RabbitMq exchange and routing messages to a RabbitMq queue using the .NET Client

Actions here are idempotent, meaning you can run them more than once and get the same result. If the exchange or queue already exists with the same settings, you shouldn’t run into any problems.

// a way to manipulate the design of the system
var model = connection.CreateModel();

// create the exchange if it does not exist
// when a message is received, send a copy of the message to each subscriber
model.ExchangeDeclare(Settings.ExchangeName, ExchangeType.Fanout);

// create a queue if it does not exist
model.QueueDeclare(Settings.QueueName, false, false, false, null);

// bind the queue to the exchange
// Settings.RoutingKey is blank, so no filtering occurs
model.QueueBind(Settings.QueueName, Settings.ExchangeName, Settings.RoutingKey, null);
Send a message to the RabbitMq exchange using the .NET Client

Remember that the rules in the exchange define where the messages go.

var model = connection.CreateModel();

// messages are bytes
var body = Encoding.UTF8.GetBytes("hello world");
model.BasicPublish(Settings.ExchangeName, Settings.RoutingKey, true, new BasicProperties(), body);

Console.WriteLine("Message sent");
Subscribing to a RabbitMq queue and handling messages using the .NET Client
var subscription = new Subscription(connection.CreateModel(), Settings.QueueName);

// grab the message bytes and convert using the same encoding
foreach(BasicDeliverEventArgs e in subscription)
 Console.WriteLine("Received Message: {0}", Encoding.UTF8.GetString(e.Body));

Github RabbitMq Example

Check out my Github RabbitMq Example to see this in action and let me know how you get on!

Adding Specflow to a Visual Studio project

I’m using Visual Studio Community 2015 and want to write some acceptance tests – here are the quick steps to get started. Most of these can be found on the SpecFlow getting started page.

High level steps

  1. Install a Visual Studio Plugin so you don’t have to write too much SpecFlow glue code
  2. Write your features and steps
  3. Run your tests in your favourite test runner

Detailed steps:

  1. Install SpecFlow for Visual Studio
    This is a .vsix extension for Visual Studio 2015. There should be a version available for your copy of Visual Studio. It adds shortcuts such as feature file templates and custom tools, which means that when you change your feature files, the .feature.cs files are regenerated automatically on each build. Without it, you’re in for a long process of creating those files manually.
  2. Restart Visual Studio after the install so you get all the tools from the SpecFlow extension
  3. Create a test project and install the SpecFlow nuget
    Install-Package SpecFlow.NUnit

  4. Add a new .feature file to describe your feature in Cucumber language syntax. Right click and choose Add Item, then choose “SpecFlow Feature”. These templates were added by the .vsix Visual Studio Extension. When you add the item, you’ll see the feature file has been added, along with its glue – the .feature.cs file. This file is generated automatically by SpecFlow and joins the feature file to your step definitions. It is regenerated at each build (or when you click Run Custom Tool on the .feature file), so don’t put the code for your steps in here; instead, create another file. Here’s my example .feature file:
    Feature: InstancesFromTemplates
    	In order to run new flows
    	As a user
    	I can create a new flow instance from a flow template
    Scenario: Do something example
    	Given I setup something
    	And I setup something else
    	When I do an action
    	Then I check the result
  5. Create a file for your steps. This is going to be a near-empty C# file with a [Binding] attribute on top so SpecFlow can discover it.
    using TechTalk.SpecFlow;

    namespace Flow.WebApi.Tests
    {
        [Binding]
        public class Steps
        {
            // steps will go here
        }
    }
  6. Write a few steps in your feature file and generate the steps by right clicking the feature lines in the .feature file and clicking Generate Steps.
  7. Copy the generated steps into the clipboard and paste them into the Steps.cs file.
  8. Your steps file will look like this:
    using TechTalk.SpecFlow;

    namespace Flow.WebApi.Tests
    {
        [Binding]
        public class Steps
        {
            [Given(@"I setup something")]
            public void GivenISetupSomething()
            {
                // setup
            }

            [Given(@"I setup something else")]
            public void GivenISetupSomethingElse()
            {
                // setup something else
            }

            [When(@"I do an action")]
            public void WhenIDoAnAction()
            {
                // act
            }

            [Then(@"I check the result")]
            public void ThenICheckTheResult()
            {
                // assert
            }
        }
    }
  9. Now when you rebuild, the SpecFlow [Binding] attribute on that class makes your steps discoverable, so they will be bound and your tests will run in your test runner.
    Rebuild your project, and run your tests.


Running Powershell as an Administrator

From Powershell:

Start a new Powershell process as an Administrator from an existing Powershell command:

Start-Process powershell -Verb runAs

Or with arguments:

Start-Process powershell -Verb runAs -ArgumentList <Arguments>

From the command line:

Run the following from a non-powershell command line:

powershell -Command "Start-Process PowerShell -Verb RunAs"


Performance Counters with NLog

I spotted that NLog has a performance counter target and decided to give it a whirl.

I’m currently working heavily with Windows performance counters using a framework called PerfIt! and wanted to see if there was an easier (to configure) way.

Grab the source code here:

Install-Package NLog
Install-Package NLog.Config
Install-Package Topshelf

Using the Topshelf getting started guide, I created a quick example app which writes an INFO-level log every second.

public class Program
{
  public static void Main()
  {
    HostFactory.Run(x =>
    {
      x.Service<Service>(s =>
      {
        s.ConstructUsing(name => new Service());
        s.WhenStarted(tc => tc.Start());
        s.WhenStopped(tc => tc.Stop());
      });

      x.SetDescription("Ex Topshelf Host");
    });
  }
}

public class Service
{
  readonly Timer _timer; // System.Timers.Timer
  private readonly ILogger _logger = LogManager.GetCurrentClassLogger();

  public Service()
  {
    _timer = new Timer(1000) { AutoReset = true };
    _timer.Elapsed += (sender, eventArgs) => _logger.Info("This is an info log message");
  }

  public void Start() { _timer.Start(); }
  public void Stop() { _timer.Stop(); }
}

The real magic happens in the NLog configuration file. This is where the counters are created, using the autoCreate flag on the target.

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" autoReload="true" throwExceptions="true" internalLogLevel="Info" internalLogFile=".\nlog-internal.log">

  <targets>
    <target xsi:type="PerfCounter" name="perf" autoCreate="true"
            categoryName="ExampleCategoryName" counterName="Alexw.Example" instanceName="logwritten"
            counterHelp="This was created by NLog"
            incrementValue="1" />

    <target xsi:type="Console" name="console" detectConsoleAvailable="true" />
  </targets>

  <rules>
    <logger name="*" minlevel="Debug" writeTo="perf" />
    <logger name="*" minlevel="Debug" writeTo="console" />
  </rules>
</nlog>

When you put this all together, you get automatically created Performance Counters (available after you’ve restarted your console of course).

PS> Get-Counter -ListSet *example*

CounterSetName     : ExampleCategoryName
MachineName        : .
CounterSetType     : SingleInstance
Description        : Category created by NLog
Paths              : {\ExampleCategoryName(*)\Alexw.Example}
PathsWithInstances : {\ExampleCategoryName(logwritten)\Alexw.Example}
Counter            : {\ExampleCategoryName(*)\Alexw.Example}

Change Visual Studio 2015 TFS Associate or Resolve work item settings


[Screenshot: Visual Studio check-in screen with an arrow pointing to the work item resolution option]

The default action in VS 2015 is to resolve the work item(s) you’re currently working on. In teams that check in frequently this causes pain, because they might make several check-ins before a story is complete.

Visual Studio 2015

You can change the default action in Visual Studio settings:

  • Navigate to Options > Source Control > Visual Studio Team Foundation Server
    • Or search for “Resolve associated work items” in the find box at the top right of Visual Studio 2015
  • Change the “Resolve associated work items on check-in”
    • Check this box to set the default action to resolve
    • Uncheck this box to set the default action to associate

[Screenshot: Visual Studio 2015 VSTS options with an arrow pointing at the resolve option checkbox]

Other Visual Studio Versions

That’ll be a registry edit I’m afraid. See this Stack Overflow for more information:

Finding and Deleting Windows Performance Counters

Golden Rule #1: Restart your command process and any other running apps when you modify performance counters. Performance counter definitions are installed into the registry and are loaded only once, when a process starts, so changes are not picked up by running processes. This causes much frustration.

Golden Rule #2: Don’t mess about with the registry directly; it can cause performance counters to vanish yet still exist, so you can’t create them again.

Listing performance counters

Remember: After modifying counters, close your console session. Performance counters are only loaded when a process is started.

To see all the performance counters installed on a Windows machine and save them in the counters.txt text file, run the following in either cmd or powershell:

typeperf.exe -q > counters.txt

To find a performance counter set by name using wildcards in Powershell:

Get-Counter -ListSet "*My.Counter.Name*"

Deleting Performance Counters

To delete an existing performance counter category using Powershell – the name has to be exact:

[Diagnostics.PerformanceCounterCategory]::Delete( "My.Counter.Name" )
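The same check-then-delete can be done from C# via System.Diagnostics, where "My.Counter.Name" remains the placeholder category name from above. PerformanceCounterCategory.Exists avoids the exception that Delete throws when the category is missing:

```csharp
using System;
using System.Diagnostics;

class DeleteCounterCategory
{
    static void Main()
    {
        const string category = "My.Counter.Name"; // placeholder name from above

        // Delete() throws if the category does not exist, so check first
        if (PerformanceCounterCategory.Exists(category))
        {
            PerformanceCounterCategory.Delete(category);
            Console.WriteLine("Deleted " + category);
        }
    }
}
```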

Wiremock on Windows Azure

Wiremock is a great tool which lets you mock responses from an endpoint. If you set up your app to proxy through it, you can serve your own responses to matching requests – great for testing.

I’m going to try this out next week. It should be simple enough to invoke the app using the scripts in the links above, which essentially starts a web app which routes all web requests through to the jar.

Edit: I still need to come back to this, the last time I tried this it didn’t work and then I got distracted by work!

Powershell Prettify XML

Got some XML you need to prettify in powershell? No problem!

Example xml:

<first><second><third>hello world</third></second></first>

Load the XML and preview it in the console prettified:

PS> [xml]$xml = Get-Content C:\temp\input.xml
PS> $xml.Save([System.Console]::Out)

<first>
  <second>
    <third>hello world</third>
  </second>
</first>

Load a file, prettify it, and save it as another file:

PS> [xml]$xml = Get-Content C:\temp\input.xml
PS> $xml.Save("c:\temp\output.xml")

