[Solved] Windows 10 May update causes slow startup after sign-in

Last week my laptop was updated to the Windows 10 May update. It's an update I was waiting for, because I wanted to use the new WSL 2 features with Docker. So after the update I immediately activated WSL 2 using the command

wsl --set-default-version 2

After booting and signing in it always takes a little time to start up, so the first time I did not notice any real problems. But in the days after that, every time I started the laptop and signed in I just got a black wallpaper without a start menu or taskbar. After waiting for a long time (approximately 2 or 3 minutes) the start menu and taskbar suddenly appeared.

First I thought it was the WSL 2 feature, so I disabled it again, but that did not make any difference.

After searching on the internet I found this website: https://appuals.com/file-explorer-not-loading-or-loading-slowly-after-windows-10-upgrade/ 

Solution 1 on that website is to disable Windows Search, and that turned out to be the correct solution for me. After disabling the Windows Search service, signing in was fast again. Personally I have not found any other problems caused by disabling this feature.
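If you prefer the command line over the Services window, disabling it can also be done from an elevated command prompt. WSearch is the service name behind Windows Search on my machine; verify the name on yours before running this:

sc stop WSearch
sc config WSearch start= disabled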

TypeScript Mocking Library (ts-mocks) now supports generic methods better

Over the last few years I've been working on a library for mocking objects in TypeScript. A long time ago I wrote an article about it:

http://www.ilove-it.com/post/2016/07/28/mock-your-typescript-classes-and-interfaces-the-easy-way

For a long time there was really one thing that I wanted to fix, and that is mocking generic methods:

https://github.com/jjherscheid/ts-mocks/issues/9.

So what is the problem?

TypeScript can determine the return value for the mocked methods/properties. But with generic methods this does not work, because there is no way to specify the type that will actually be used in the code. I will try to explain this using an example.

Let's say you have a service with the following interface:

export interface SomeService {
    get<T>(index: number): Observable<T>;
}

In your application you use the service to return users:

someService.get<User>(10).subscribe((user) => /* do something with user */ );

If you would like to mock this in your unit test, normally you would write:

mockSomeService
    .setup(ss => ss.get)
    .is((value) => of(someUser));
 
// or
 
mockSomeService
    .extend({ get: (value) => of(someUser)});

Unfortunately TypeScript will complain that of(someUser) is not of type Observable<T>:
Type 'Observable<User>' is not assignable to type 'Observable<T>'.
Type 'User' is not assignable to type 'T'.

Now the as<T>() method comes to the rescue. With this method you can override the return type that is automatically determined by the setup() method.
Note: this can conflict with the real code if not used appropriately, so use at your own risk.

mockSomeService
    .setup(ss => ss.get)
    .as<(number) => Observable<User>>()
    .is((value) => of(someUser));

With this, TypeScript will not complain anymore. Great! ;-)

So don't wait any longer and update your solution to ts-mocks 2.6.0!
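If you are using npm, updating should be as simple as something like this (assuming ts-mocks is already a devDependency of your project):

npm install ts-mocks@2.6.0 --save-dev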

Safari Unauthorized 401 on loading Angular 8 application [Solved]

Today I was struggling with an Angular 8 application. The application is hosted on an internal network without authentication and on an external network with authentication. Before upgrading to Angular 8 the application worked in all browsers, both internally and externally.

After upgrading to Angular 8, the application suddenly stopped working externally in Safari (both on Mac and iPhone). Only a white screen is displayed. After some debugging, we found that the server returns 401 Unauthorized for some of the JavaScript files loaded by the Angular application.

On the internal network the application works as expected.

"I'm going crazy"

 

Luckily I found the solution on stackoverflow:  https://stackoverflow.com/questions/56777096/angular-8-differential-loading-failing-due-to-auth-issues-with-dotnet-core

":D, I'm not going crazy"

So what is the real problem?

Due to Angular 8 differential loading the script tags are added like this:

<script src="polyfills-es2015.00ce1f051b27efe483ef.js" type="module"></script>

And for some reason Safari does not send the authentication credentials for scripts with type="module".
The way to solve this is to add crossorigin="use-credentials" to the script tag.

When using Angular 8 you can let Angular add this attribute to the script tags by setting the crossOrigin option in the angular.json file (it goes in the options of the build target):

{
   ...,
   "build": {
     "builder": ...,
     "options": {
        ...
        "crossOrigin": "use-credentials"
     }
   }
}

After this the script tags look like this:

<script src="polyfills-es2015.00ce1f051b27efe483ef.js" crossorigin="use-credentials" type="module"></script>

The application with authentication now also works in Safari. Great!

 

UnitTest: Type is not resolved for member 'log4net.Util.PropertiesDictionary'

I am using log4net as the logging framework in the projects I am working on. Everything seemed fine until I used the LogicalThreadContext of log4net. During the unit tests I received this message:

An exception occurred while invoking executor 'executor://mstestadapter/v1': Type is not resolved for member 'log4net.Util.PropertiesDictionary,log4net, Version=1.2.13.0, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a'

What's happening here? Well, my code looks something like this:

    public class CreateLogRequestId: ActionFilterAttribute
    {
        public override void OnActionExecuting(HttpActionContext actionContext)
        {
            log4net.LogicalThreadContext.Properties["requestid"] = Guid.NewGuid();
        }
    }

This action filter adds a Guid to every request, which can then be used during logging like so:

    <appender name="EventLogAppender" type="log4net.Appender.EventLogAppender">
      <applicationName value="MyApplicationName" />
      <eventId value="1" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date %-5level [%property{requestid}] %logger - %message%newline" />
      </layout>
      <threshold value="ERROR" />
    </appender>

In the conversionPattern the %property{requestid} gets filled with the value that was set during the request.
Note: in my project I use different values, but this is just an example.
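The code that writes the log entries does not have to know about this property at all; the appender picks it up through the conversionPattern. Purely as an illustration (SomeController and the message are made-up names, not from my project):

    using log4net;

    public class SomeController
    {
        private static readonly ILog Log = LogManager.GetLogger(typeof(SomeController));

        public void DoSomething()
        {
            // The EventLogAppender renders this message together with the requestid that was
            // set by the action filter, via %property{requestid} in the conversionPattern.
            Log.Error("Something went wrong");
        }
    }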

I think it's super nice to use features like this to add some more context to the logging.

But when I created unit tests for the ActionFilterAttribute above, the test run failed with the exception shown at the top of this post.
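For reference, the failing test was not doing anything special. A minimal sketch of it (the class and method names here are my own, not from the real project) looks roughly like this:

    using System.Web.Http.Controllers;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class CreateLogRequestIdTests
    {
        [TestMethod]
        public void OnActionExecuting_SetsRequestIdProperty()
        {
            var filter = new CreateLogRequestId();

            filter.OnActionExecuting(new HttpActionContext());

            // The filter should have stored a Guid in the logical thread context.
            Assert.IsNotNull(log4net.LogicalThreadContext.Properties["requestid"]);
        }
    }

As far as I understand, running a test like this is enough to trigger the exception, because the value stored in the logical call context travels across the MSTest AppDomain boundary, where the log4net type cannot be resolved.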

Apparently this can be fixed with the following cleanup code in the unit test class that uses the LogicalThreadContext:

        [TestCleanup]
        public void Cleanup()
        {
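            // CallContext comes from the System.Runtime.Remoting.Messaging namespace.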
            CallContext.FreeNamedDataSlot("log4net.Util.LogicalThreadContextProperties");
        }

Happy coding ;-).

ConEmu: The DOS command window, but better

If you are a front-end developer you may have worked with several command windows open. I always have a minimum of 3 or 4 command windows open: one for Git, one for npm start (building projects), one for npm test (testing projects) and most of the time also one for some other stuff I am doing. So most of the time one of my desktops looks something like this.

I like to have all the console windows visible to be able to see live updates when watching files. That's why I like to use the Windows feature for snapping windows with WIN + Arrow Key.

One very annoying thing about having several windows is that when you temporarily have some other window (like a browser) on the desktop with the consoles, Alt + Tab only brings one of the command windows back into focus. The other thing I don't like is having multiple console windows in the taskbar.

Fortunately, there is a solution for that. It's called ConEmu (https://conemu.github.io/). It's a Windows console emulator with tabs and split panes! Yes... that's what I need. By default it looks like below.

With Ctrl+Shift+E or Ctrl+Shift+O you can split the command window into vertical or horizontal panes. It is also possible to have stacked tabs with Win+W.

The console will then look something like this:

Great!! So now I can use Alt+Tab to bring all the consoles to the front at once. If you also want just one item in the taskbar (like I do) you can set this in the settings window:

So Yes!! I finally have one window that contains all the consoles I need in my project.

But things get better and better!!!....

It is possible to start ConEmu with command-line parameters, so you can start the window with all your consoles in place and in the directories you want. To start with multiple consoles, you can do the following:

start conemu64.exe 
        -title "GIT-SRC-TESTS" 
        -runlist -new_console:d:"C:\YourGitRepo\":t:GIT 
        ^|^|^| -cur_console:s1T70V:d:"C:\YourGitRepo\src\":t:SRC 
        ^|^|^| -cur_console:s2T50H:d:"C:\YourGitRepo\tests\":t:TESTS

The line breaks in the sample above should be removed; everything should be on one line.

The command above sets the title for the ConEmu window. Then it uses the -runlist parameter to be able to run multiple commands. The first command opens a new console (-new_console) in the directory C:\YourGitRepo and gives the tab the name 'GIT'. Then the ^|^|^| separators split the commands, and a second console is added with -cur_console; it is attached to the first console (s1T) and takes 70 percent of the size in a vertical split (70V).

The output of the above command will result in a window like below:

More about the various command-line arguments can be found on the ConEmu website.

But.... again!! It can get even better...

ConEmu contains 'Tasks' which are predefined command groups:

These tasks can be requested from the command line (or batch file) by using the following command:

ConEmu64.exe -run {Bash::Git bash}

In the sample above the task '{Bash::Git bash}' is triggered, which opens a new console tab running Bash in the current directory.
The best part is that you can also create your own tasks. Let's take the example from earlier and create a task that opens one ConEmu window with 3 split tabs in the correct directories. I created a new task called '{cmd::MyProject}' and in the commands box of the task I added the following code (beware, the line breaks are important here).

-cur_console:d:"C:\YourGitRepo" -cur_console:t:GIT2 cmd

-cur_console:s1T70V -cur_console:d:"C:\YourGitRepo\src" -cur_console:t:SRC cmd

> -cur_console:s2T50H -cur_console:d:"C:\YourGitRepo\tests" -cur_console:t:TEST cmd

You may be wondering what the '>' sign means: it marks the tab that will be active. In the settings window it looks like this:

Tip: if you have set up the tabs by hand, you can press the 'Active tabs' button in the bottom right corner. This will generate the correct text for you.

So now it is possible to start the same tabs with just:

ConEmu64.exe -run {cmd::MyProject}

I really like ConEmu and there are a lot of settings you can change to customize it to your wishes. Personally I like my tabs at the bottom and some different colors. There is also another console emulator, Cmder (http://cmder.net/), which is based on ConEmu. I like the colors used by Cmder, but I could not easily find out how to start Cmder with tasks. In the Cmder directory you can find the folder that contains ConEmu; I use that ConEmu64.exe to get the default colors of Cmder. But of course it's totally up to you to decide what you like!

Happy Coding!

Debugging C# Windows Services the easy way

Have you ever created a C# Windows service? I found it really easy to create. But what if something goes wrong? I could never really find a good way of debugging Windows services, so most of the time I was writing verbose logs to be able to find the problem. Another way was to run the code not as a Windows service, but as a simple console application. But somehow there are situations where the console app works and the Windows service doesn't.

Recently I found an easy way of debugging Windows services, so in this post I will try to explain how to create a simple Windows service and how to debug it.

First, start Visual Studio as Administrator (administrator rights are needed for debugging the Windows service).

Secondly, create a Windows service project. This will create a Program class that looks like the code below:

    static class Program
    {
        /// <summary>
        /// The main entry point for the application.
        /// </summary>
        static void Main()
        {
            ServiceBase[] ServicesToRun;
            ServicesToRun = new ServiceBase[]
            {
                new YourWindowsService()
            };
            ServiceBase.Run(ServicesToRun);
        }
    }

In 'YourWindowsService.cs' (where YourWindowsService must be replaced with the name you have chosen) add the lines shown below to the OnStart() method.

    protected override void OnStart(string[] args)
    {
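        // Debugger lives in System.Diagnostics; the Contains() extension method requires System.Linq.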
        if (args != null && args.Contains("-d"))
        {
            Debugger.Launch();
        }

        // Do Your Things
    }

The code added above checks whether one of the arguments is '-d' and, if so, asks the debugger to launch. More about this will be explained later in this post.

Build your solution and start a console window to register the service you just created.

Commands for registering windows services are:

sc create YourWindowsServiceName binPath= "full-path-to-service.exe"

# start without debugger
sc start YourWindowsServiceName

Your Windows service should now be running, without a debugger attached.

So how do you start the debugger? Follow these steps:

  1. Open the Windows 'Services' window
  2. Right-click 'YourWindowsService'
  3. Open the properties
  4. Stop the service (if running)
  5. Type '-d' in the 'Start parameters' edit box
  6. Press Start.

 

Now the Visual Studio Just-In-Time Debugger will ask to debug:
'Yes, debug YourWindowsService.exe'

Hit yes to allow the Just-In-Time Debugger to start.

The Just-In-Time debugger shows a window to choose the Visual Studio instance to use. If you started Visual Studio as Administrator in the first step, it will be listed here as well. Otherwise choose the version you want (if multiple versions are installed) and start debugging.

Visual Studio will open and break at the Debugger.Launch() line.

You have successfully attached a debugger to your Windows service.

I never thought it would be as simple as this.
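By the way: when you are done experimenting, the test service can be unregistered again from an elevated console. This is standard sc usage, nothing specific to this example:

sc stop YourWindowsServiceName
sc delete YourWindowsServiceName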

GIT Simplified removal of 'stale' remote branches and their local branches

As you may have noticed in my previous posts, I use Git in a lot of the projects I am working on.

Previous posts:

Git commands I keep forgetting

Update local develop branch without checkout

Git commands I keep forgetting part 2

In the last one I already talked about "Remove local branches when remote branch is already removed". I have used this way of removing branches a lot over the last couple of months. But even with these simple commands it is still a lot of work to keep your local copy clean.

While searching on the internet I found another great way to make it even simpler to clean up the local repository.

List branches that have no remote branch anymore.

git branch -vv

The above command lists the branches in a verbose way, and the output will look something like this:

bugfix/BF_Bug1           6a64538 [origin/bugfix/BF_Bug1: gone] Fixed bug1
bugfix/BF_Bug2           b7cf3ea [origin/bugfix/BF_Bug2] Fixed bug2
* develop                ca60d674 [origin/develop] Merged PR 91: Added Feature
feature/SomeGreatFeature ccb8b3f [origin/feature/SomeGreatFeature: gone] Some great feature added
master                   10ed942 [origin/master] Updated README.md

As you can see, 2 branches are marked 'gone', which means the remote branch does not exist anymore. Now you know which branches you can remove from the local repository with the commands:

git branch -d bugfix/BF_Bug1

git branch -d feature/SomeGreatFeature

The above commands are not hard to learn, but if you have a lot of branches to remove it can be time consuming to delete every branch manually. To make your life a little bit easier, do the following:

  1. Find your .gitconfig file, which contains the configuration of Git.
    On my machine the .gitconfig file can be found at: %USERPROFILE%\.gitconfig
  2. It is possible to add alias commands in this configuration file, and that's exactly what we are going to do. Add the following lines to the .gitconfig file:
    [alias]
    prune-branches = !git remote prune origin && git branch -vv | grep ': gone]' | awk '{print $1}' | xargs -r git branch -d
  3. You can now remove all local branches for which the remote branch has been removed. Open a command prompt in your source directory and type the following command:
    git prune-branches
  4. The output will be something like:
    Deleted branch bugfix/BF_Bug1 (was 6a64538).
    
    Deleted branch feature/SomeGreatFeature (was ccb8b3f).

I really like this command, because now I can execute one command that cleans my local repository. Of course you can name the alias whatever you like; personally I renamed my alias to 'git pb', which is faster to type than 'git prune-branches'.

I hope I have made your life a little easier by writing this blog!

Happy Coding!

Creating X509Certificates from an Azure Web Service results in 'The system cannot find the file specified'

In a project I worked on, we wanted to create in-memory certificates to use in our web application.

There are a lot of reasons why you don't want to enable this kind of functionality, but let's keep it simple and only talk about this scenario without arguing.

In our code we used X509Certificate2 to create the in-memory certificate from a byte array (byte[] certificateData) and a password (string securePassword).

return new X509Certificate2(certificateData, securePassword);

To make our lives as developers a little simpler, we created both a Web project (IIS Express) and a Console (self-host) project in Visual Studio. During development I mainly use the self-host, but the solution above works nicely in both the self-host and the Web project. So I created a pull request and my colleagues reviewed and completed it. The nice thing about our build/release cycle is that in about 5 minutes the code above was deployed to the Azure development environment.

WHAT???? It's not working......

After looking in the Error Log I found this error:

"The system cannot find the file specified"

WHAT?? I don't have a file.. so.. which file can't be found...
All the code is only using in memory variables... Help!!!

Luckily I found the answer on Stack Overflow.

Because the Azure Web Service does not have a user profile, it cannot create the certificate for you. To fix this problem, construct the X509Certificate2 with the following X509KeyStorageFlags:

return new X509Certificate2(
    certificateData, 
    securePassword,
    // MachineKeySet and Exportable are needed to be able to create Certificates inside
    // Azure Web Services
    X509KeyStorageFlags.MachineKeySet | 
    X509KeyStorageFlags.Exportable);

With these settings the certificate is created successfully and my app also works in Azure.

Happy Coding ;-)

Git commands I keep forgetting (part 2)

One year ago I wrote a blog about the Git commands I keep forgetting and about updating your local develop branch without checkout. Those blogs contain some of the commands that I frequently use during development. After writing the first blog post about Git commands I learned some other Git commands as well that I find interesting enough to share with you.

Remove local branches when remote branch is already removed

In the projects I currently work on we use Git Flow a lot. This means we use a lot of branches, and when a remote branch is removed by someone else (for example by git flow feature finish), my own local Git repo must be updated. You can delete the removed remote branches from the local cache using the fetch prune command:

git fetch -p

But if you have pulled the branch once, you also have a local branch, and this command does not delete it. For deleting local branches use the following command:

git branch -d some_feature_branch

If the local branch has changes that were not merged to the master or develop branch, Git will show a warning. To force the branch deletion use:

git branch -D some_feature_branch

*note: the uppercase -D instead of -d

Remove remote branch

Sometimes it happens that the remote branch is not correctly removed from the remote repository during 'git flow feature finish'. If you want to delete the remote branch from the command line, use the following command:

!!!!CAUTION NO UNDO, USE AT OWN RISK!!!

git push origin --delete some_feature_branch

Temporarily store changes without commit

What if you have made changes, but suddenly see that you are working on the wrong branch? Oops... What to do now? Stash comes to the rescue.
With stash you can put your changes in a temporary storage area of Git and pull them back out when needed. So when you have changes on some branch and want to move them to another branch, you can use the following commands:

# Stash the uncommited changes
git stash

# Switch to the other branch
git checkout some_other_branch

# Get the changes from the stash
git stash pop

There is a lot more you can do with stashing, so for the full documentation go to https://git-scm.com/docs/git-stash

 

Happy Coding!

 

Autofac with SignalR and WebApi in Owin registration solution (without using container.Update)

In one of my Owin projects I want to use SignalR and WebApi. Autofac is used for dependency injection, but somehow this results in a lot of registration problems.

A lot of samples I've found were using

GlobalHost.ConnectionManager.GetHubContext<>

Do not use GlobalHost in Owin

But the SignalR Autofac documentation clearly states that the GlobalHost static class cannot be used in an Owin project; you should use the IConnectionManager interface instead.

So I created a generic HubContextProvider<THubClient> that can be used to get the typed HubContext from the IConnectionManager.

public interface IHubContextProvider<THubClient>
        where THubClient : class
{
    IHubContext<THubClient> GetHubContext<THub>()
        where THub : Hub<THubClient>;
}

public class HubContextProvider<THubClient> : IHubContextProvider<THubClient>
        where THubClient : class
{
    private readonly IConnectionManager connectionManager;
    public HubContextProvider(IConnectionManager connectionManager)
    {
        this.connectionManager = connectionManager;
    }

    public IHubContext<THubClient> GetHubContext<THub>()
        where THub : Hub<THubClient>
    {
        return this.connectionManager.GetHubContext<THub, THubClient>();
    }
}

This class is used in singleton HubHandlers, which are classes that are used to communicate back to the subscribed clients. In the generic base class, the HubContext property is set using the HubContextProvider.
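The HubHandler base class itself is not part of this post; purely as an illustration (the name HubHandlerBase and its exact shape are my own, not from the real project), it boils down to something like this:

public abstract class HubHandlerBase<THub, THubClient>
    where THub : Hub<THubClient>
    where THubClient : class
{
    protected HubHandlerBase(IHubContextProvider<THubClient> hubContextProvider)
    {
        // Resolve the typed HubContext once; derived handlers use it to call back to clients.
        this.HubContext = hubContextProvider.GetHubContext<THub>();
    }

    protected IHubContext<THubClient> HubContext { get; }
}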

Registration

In the Startup class of the application I created a ContainerBuilder and registered all the types of the application.

var builder = new ContainerBuilder();

// Register all types needed in the application
// ....
builder.RegisterHubs(Assembly.GetExecutingAssembly());

var container = builder.Build();

Create Dependency Resolvers

For using WebApi with Autofac you need to create an AutofacWebApiDependencyResolver:

HttpConfiguration httpConfig = new HttpConfiguration();
var webApiResolver = new AutofacWebApiDependencyResolver(container);
httpConfig.DependencyResolver = webApiResolver;

And for using SignalR with Autofac you need to create an AutofacDependencyResolver:

HubConfiguration hubConfig = new HubConfiguration();
var signalRResolver = new AutofacDependencyResolver(container);
hubConfig.Resolver = signalRResolver;

Error resolving

I followed the Autofac instructions to create the correct registrations for SignalR and WebApi. But when the application started, I got the following error:

None of the constructors found with 'Autofac.Core.Activators.Reflection.DefaultConstructorFinder'
on type 'HubContextProvider`1[IStateHubClient]' can be invoked with the available services and parameters:
Cannot resolve parameter 'Microsoft.AspNet.SignalR.Infrastructure.IConnectionManager connectionManager'
of constructor 'Void .ctor(Microsoft.AspNet.SignalR.Infrastructure.IConnectionManager)'.

OK... so somehow I am missing the IConnectionManager. So who is responsible for creating/registering this interface?

This interface is created by the AutofacDependencyResolver from Autofac.SignalR, but it is not registered in the container that is used by the application.

Register IConnectionManager

There seems to be a simple solution for the error: just register the IConnectionManager in the container yourself. But... where can I get the IConnectionManager? From the HubConfiguration!

builder.RegisterInstance(hubConfig.Resolver.Resolve<IConnectionManager>());

So the registration looks like this:

HttpConfiguration httpConfig = new HttpConfiguration();
HubConfiguration hubConfig = new HubConfiguration();

var builder = new ContainerBuilder();

// Register all types needed in the application
// ....
builder.RegisterInstance(hubConfig.Resolver.Resolve<IConnectionManager>());
builder.RegisterHubs(Assembly.GetExecutingAssembly());

var container = builder.Build();

var webApiResolver = new AutofacWebApiDependencyResolver(container);
httpConfig.DependencyResolver = webApiResolver;

var signalRResolver = new AutofacDependencyResolver(container);
hubConfig.Resolver = signalRResolver;

After adding the registration of the IConnectionManager everything seems to be working during startup. But the SignalR hubs and the handlers are not using the same IConnectionManager. It seems that 'new AutofacDependencyResolver(..)' overrides the IConnectionManager implementation internally. So when the hubs are created they use the internal IConnectionManager, while the HubHandlers are using the just-registered IConnectionManager.

So now we need to have a solution for this.

Solution 1 [builder.Update(..)]

After consulting our big friend Google I found one solution, which uses the ContainerBuilder.Update() method: first register all your own registrations, then create the dependency resolvers, and after that add the IConnectionManager to the already-built container.

HttpConfiguration httpConfig = new HttpConfiguration();
HubConfiguration hubConfig = new HubConfiguration();

var builder = new ContainerBuilder();

// Register all types needed in the application
// ....
builder.RegisterHubs(Assembly.GetExecutingAssembly());

var container = builder.Build();

var webApiResolver = new AutofacWebApiDependencyResolver(container);
httpConfig.DependencyResolver = webApiResolver;

var signalRResolver = new AutofacDependencyResolver(container);
hubConfig.Resolver = signalRResolver;

var signalRBuilder = new ContainerBuilder();
signalRBuilder.RegisterInstance(hubConfig.Resolver.Resolve<IConnectionManager>());
signalRBuilder.Update(container);

Solution 2 [without builder.Update(...)]

Although solution 1 seems to work, it is best practice not to update the registrations after building the container. See 'Consider a container as immutable' in the Autofac documentation.

This is also visible in Visual Studio with the warning:

Warning	CS0618	'ContainerBuilder.Update(IContainer)' is obsolete: 
'Containers should generally be considered immutable. Register all of your dependencies 
before building/resolving. If you need to change the contents of a container, you
technically should rebuild the container. This method may be removed in a future major release.

Mmm... we should fix this... before a major release of Autofac drops the Update() method.

I've seen some complex solutions for registering the IConnectionManager, using all kinds of convoluted constructions, like registering the SignalR dependency resolver itself and using that resolver again to get the connection manager. I also tried registering it with a lambda:

builder.Register(context => hubConfig.Resolver.Resolve<IConnectionManager>());

The idea behind this was to register the IConnectionManager, but only ask the hubConfig for it when the Resolve() actually occurs. But this results in a StackOverflowException... Nice :(...

So the solution that works for me, without using the ContainerBuilder.Update() method, is to change the HubContextProvider from the beginning of this post to use the HubConfiguration instead of the IConnectionManager.

public class HubContextProvider<THubClient> : IHubContextProvider<THubClient>
    where THubClient : class
{
    private readonly IConnectionManager connectionManager;

    public HubContextProvider(HubConfiguration hubConfiguration)
    {
        this.connectionManager = hubConfiguration.Resolver.Resolve<IConnectionManager>();
    }

    public IHubContext<THubClient> GetHubContext<THub>()
        where THub : Hub<THubClient>
    {
        return this.connectionManager.GetHubContext<THub, THubClient>();
    }
}

For this to work, the HubConfiguration must be available in the container, but that is just one simple extra registration call:

HttpConfiguration httpConfig = new HttpConfiguration();
HubConfiguration hubConfig = new HubConfiguration();

var builder = new ContainerBuilder();

// Register all types needed in the application
// ....
builder.RegisterHubs(Assembly.GetExecutingAssembly());
builder.RegisterInstance(hubConfig);

var container = builder.Build();

var webApiResolver = new AutofacWebApiDependencyResolver(container);
httpConfig.DependencyResolver = webApiResolver;

var signalRResolver = new AutofacDependencyResolver(container);
hubConfig.Resolver = signalRResolver;

There is no need for registering the IConnectionManager anymore.

Yes... Now I have a solution that works with Owin, SignalR, WebApi and Autofac, without using ContainerBuilder.Update().

Update:

I got some questions about the registration of the HubContextProvider<>. This is the registration I use:

builder.RegisterGeneric(typeof(HubContextProvider<>)).As(typeof(IHubContextProvider<>));

With this single generic registration, all the closed HubContextProvider<> types can be resolved automatically.