Azure Storage Emulator with Docker on Windows

If you are on the road a lot, this might be a great option for you: keep working with Azure Tables even when traveling (or on a flight) with Docker on Windows. I will walk you through the process of setting up your system to use the Azure Storage Emulator from your local Docker containers.

If you have not yet started with Docker on Windows, you can download it here.
To get the latest Azure Storage Emulator, click here.

I am using the product versions below, but this guide should work for other versions too. (If not, give me a shout out at @rgwork and I will update this guide.)

Windows 10 Version
Docker and Azure Storage Emulator Version

Also, download the Azure Storage Explorer from here so that you can easily verify and view the changes made from your code in the container.

Once all of the above are installed, start Docker on Windows (Linux containers) and the Azure Storage Emulator from the Start menu.

Open a command prompt and then follow the steps below to make sure that Docker and the Azure Storage Emulator are running.
For Docker:
Type in

docker version

— If the output lists the version details of the server and client then it is up and running.
For the Azure Storage Emulator:

cd "C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator"
AzureStorageEmulator.exe status

— The output should list the endpoints and status of the emulator.

The output from the above commands should be similar to the image above, showing versions for both.

Once the services are up, go to Docker settings from the notification area in the task bar.

Docker Settings

Switch to the Network tab and verify the subnet address configured for the Internal Virtual Switch.

Docker Network Settings

It should be 10.0.75.0/24 by default but can be different if there was a network conflict. Moving forward I will assume the subnet is 10.0.75.0/24, but please update it with your subnet as needed.

Before moving to the next step, I will outline the process by which we will connect the Docker container with the storage emulator on the localhost.

The internal virtual switch created by Docker is the network on which it creates the Linux virtual machine that hosts the containers, assigning it the IP 10.0.75.2. It also creates a network interface on the Windows machine and assigns it the IP 10.0.75.1; this is the network that allows you to manage the Docker server running on the Linux virtual machine and interact with it.

The storage emulator endpoints listen on 127.0.0.1 (ports 10000-10002) and are not accessible by default from external networks, including the Docker network.

The plan is to provide a bridge from the Docker network to the localhost IP so that the containers have uninhibited access to the storage emulator. We will do this using the port proxy option of the netsh command, which acts as a proxy and maps the ports and IPs as required.

We will create v4tov4 mappings so that the Docker network adapter (10.0.75.1) listens for the requests to the storage emulator and forwards them to 127.0.0.1, and vice versa.

Before we do the mapping, open a command prompt as administrator and run:

netstat -a

The output will be similar to this.

The Azure Storage Emulator should also be in this list:

Azure Storage Emulator Endpoints

Choose 3 adjacent ports which are not listed in the output (second column from the left) to use for the mapping. On my system I identified 40000-40002 as unused, so I will use these 3 ports to map from 10.0.75.1:40000-40002 to 127.0.0.1:10000-10002.

In the same command prompt (run as administrator), run the below commands to configure the network:

netsh interface portproxy add v4tov4 listenport=40000 listenaddress=10.0.75.1 connectaddress=127.0.0.1 connectport=10000 protocol=tcp
netsh interface portproxy add v4tov4 listenport=40001 listenaddress=10.0.75.1 connectaddress=127.0.0.1 connectport=10001 protocol=tcp
netsh interface portproxy add v4tov4 listenport=40002 listenaddress=10.0.75.1 connectaddress=127.0.0.1 connectport=10002 protocol=tcp

To confirm the changes run netstat again and confirm that the system is now listening on the configured ports.

netstat -a

It should look like this for 10.0.75.1:

Port Proxy Mapping

The first one is created by Docker for the SMB mapping (445) and the rest are our mappings. Now the only thing between the containers and the storage emulator is the Windows Firewall.

Run “wf.msc” from Run to directly open Windows Firewall with Advanced Security. Browse to Inbound Rules and select the rule DockerSmbMount (created by Docker).

Firewall Inbound Rules (DockerSmbMount)

Right-click and copy the rule, then paste it back. Now you should have 2 rules.

Right-click the newly created rule (the top one) and click Properties. Rename the rule to DockerAzureStorage.

Switch to the Protocols and Ports tab and update the Local Port from 445 to “40000-40002”.

The required configuration is done. Before you click Apply, check out the Scope tab, as it will give you an idea of what the rule does: it opens up the network ports only from the Linux VM to the localhost. Once you are done looking around, click Apply and close the rule.
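As an alternative to copying the rule in the GUI, the same inbound rule can be created from an administrator command prompt with netsh. This is a sketch assuming the rule name, port range, and default 10.0.75.0/24 subnet used in this guide:

```shell
netsh advfirewall firewall add rule name="DockerAzureStorage" dir=in action=allow protocol=TCP localport=40000-40002 remoteip=10.0.75.0/24
```

The remoteip scope restricts the rule to traffic coming from the Docker subnet, matching what you see on the Scope tab of the copied rule.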

We are all set with the configuration, so now to test it. I will use an ASP.NET Core (MVC 6) web app to run a test, but you should be able to use an application or language of your choice as long as you use the connection string listed in the steps below.

Open Visual Studio 2017. I am using it and recommend getting it for its integrated support for Docker. If you are not using VS 2017, the steps to build and run the Docker container image on your system will differ from the ones I am taking.

In VS 2017, create a new project and choose the ASP.NET Core Web App:

Make sure support for Docker is enabled.

Go to Manage NuGet Packages for the web project and install the package WindowsAzure.Storage:

Add a new class to the project – NewEntity – and update its contents as below:

using Microsoft.WindowsAzure.Storage.Table;

namespace TestCoreEmulator.Model
{
    public class NewEntity : TableEntity
    {
        public string RandomText { get; set; }
    }
}

We will use this test class to enter data in the local Azure Table.

Add the below function to the HomeController and resolve the references so there are no more errors:

private static void TestStorage()
{
    // Endpoints assume the 10.0.75.1 host IP and the 40000-40002 port proxy
    // mappings configured earlier; update them to match your setup.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://10.0.75.1:40000/devstoreaccount1;TableEndpoint=http://10.0.75.1:40002/devstoreaccount1;QueueEndpoint=http://10.0.75.1:40001/devstoreaccount1;");

    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();

    // Retrieve a reference to the table.
    CloudTable table = tableClient.GetTableReference("myTable");

    // Create the table if it doesn't exist.
    table.CreateIfNotExistsAsync().GetAwaiter().GetResult();

    NewEntity en = new NewEntity();
    en.RowKey = "Check" + Guid.NewGuid();
    en.PartitionKey = "P";
    en.RandomText = DateTime.Now.ToString();

    TableOperation insertOperation = TableOperation.Insert(en);

    // Execute the insert operation.
    table.ExecuteAsync(insertOperation).GetAwaiter().GetResult();
}

I have just embedded the connection string in the code itself as this is a test, but ideally it should reside in either user secrets or the config file.

The connection string is derived from the default connection string for the Azure Storage Emulator, referenced here. I have only updated the endpoints so that they point to the right IP and ports.
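To make the endpoint layout explicit, here is a small sketch (in Node.js, with the 10.0.75.1 host IP and 40000-40002 ports assumed throughout this guide) that assembles the same connection string from its parts:

```javascript
// Assemble the emulator connection string with the proxied endpoints.
// AccountName/AccountKey are the well-known devstoreaccount1 defaults;
// the host IP and ports are the values assumed in this guide.
const hostIp = '10.0.75.1';
const parts = {
  DefaultEndpointsProtocol: 'http',
  AccountName: 'devstoreaccount1',
  AccountKey: 'Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==',
  BlobEndpoint: `http://${hostIp}:40000/devstoreaccount1`,   // emulator blob port 10000
  QueueEndpoint: `http://${hostIp}:40001/devstoreaccount1`,  // emulator queue port 10001
  TableEndpoint: `http://${hostIp}:40002/devstoreaccount1`,  // emulator table port 10002
};
const connectionString = Object.entries(parts).map(([k, v]) => `${k}=${v}`).join(';');
console.log(connectionString);
```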


The function above adds a random entity to myTable, which we can verify easily with the Azure Storage Explorer. Add the function call to the Contact action of the HomeController so it is called each time you visit the Contact page. It should look like this:

public IActionResult Contact()
{
    TestStorage();

    ViewData["Message"] = "Your contact page.";
    return View();
}
Now let’s run the application in VS. Once the site is up and running, click on the Contact link in the header. Once the contact page loads, open the Azure Storage Explorer and browse to the local Tables.

There should be a new table containing the entity.

All done, you are all set to use the storage emulator from docker 🙂

NOTE: the netsh port proxy mappings are removed if the system is rebooted. My advice is to store the commands in a bat file and either run it as needed or automate it via Task Scheduler.
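A sketch of such a bat file, assuming the default 10.0.75.1 host IP and the 40000-40002 ports used in this guide (adjust to your subnet and ports):

```shell
@echo off
rem Re-create the storage emulator port proxy mappings after a reboot.
rem Run from an administrator prompt or an elevated scheduled task.
netsh interface portproxy add v4tov4 listenport=40000 listenaddress=10.0.75.1 connectaddress=127.0.0.1 connectport=10000 protocol=tcp
netsh interface portproxy add v4tov4 listenport=40001 listenaddress=10.0.75.1 connectaddress=127.0.0.1 connectport=10001 protocol=tcp
netsh interface portproxy add v4tov4 listenport=40002 listenaddress=10.0.75.1 connectaddress=127.0.0.1 connectport=10002 protocol=tcp
rem List the active mappings to confirm.
netsh interface portproxy show v4tov4
```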

People Picker customization in SharePoint 2010

If you are still stuck in 2010 land, this might help you. In one of my recent projects I had to customize the out-of-the-box people picker (multi user) in a custom visual web part, and the old faithful jQuery helped me out.

The requirement was to change the people picker from this :

To this:

The list of changes is really simple:

  1. Hide the check names image.
  2. Move the browse image above the text box.
  3. Add help text before the browse image.

To start with, here is the high-level HTML structure rendered by SharePoint for a multi user people picker:

<SPAN ....>
 <TABLE class=ms-usereditor ....>
  <TR>//Row to display errors</TR>
  <TR>//Row with buttons</TR>
 </TABLE>
</SPAN>

I have removed all the metadata to highlight the structure. You can also access the complete structure here.

Also, if you have not looked at the complete structure, here is the highlight from the row with buttons, showing how the images are rendered.

Once you have gone through the structure, you will realize that hiding the check names image is easy, as it can be selected by the img title attribute and then hidden with jQuery.

 $("img[title='Check Names']").hide();

Remember to limit the selector’s range by providing the div/table id which will contain the people picker to be changed as the script above will hide all images with title of ‘Check Names’ on the page.

For our second requirement, adding the text is also straightforward, as SharePoint adds an empty cell with a span before the check names and browse buttons. We can update the text in the span with our required help text and apply the CSS we want.

var row = $("img[title='Check Names']").closest('tr');
row.find('td:first').find('span').text('<Custom Text>');
row.find('td:first').find('span').addClass('<Custom CSS Class>');

For our third requirement, we need to move the last row containing buttons to the top of the text box i.e. move the last row to the top.

What we currently have from the above script is a row object inside a nested table in the last row. So the first order of business is to get the parent row object to work on, which can be done via:

 <1> var topRow = $(row).parentsUntil('tr');
 <2> topRow = topRow[topRow.length - 1];
 <3> topRow = $(topRow);
 <4> topRow = $(topRow.parent());

In the above script, line 1 gets all the parent objects up to the top-level row (the last of which will be a TD, since parentsUntil gets ancestors up to but not including the matched parent), line 2 gets the topmost object of this collection, and in line 3 we wrap it as a jQuery object. In line 4 we get the parent of that cell, which is the row we need to push to the top of the parent table.

Now we move this row to the top using:

topRow.insertBefore(topRow.prev().prev());

This selects the two previous rows (Row 2 and then Row 1) and moves our row before the first row in the table.

The final structure changes to this:

<SPAN ....>
 <TABLE class=ms-usereditor ....>
  <TR>//Row with buttons</TR>
  <TR>//Error Row</TR>
 </TABLE>
</SPAN>

You can download the complete script here : peoplepicker2010.js