Azure API Management acts as a security proxy in front of one or more web services (hosted separately). The intention is that developers request resources via Azure API Management, which forwards the request on to the appropriate web API given appropriate permissions. It is important that the underlying web service cannot be accessed directly by an end user (thereby bypassing the API Management security). To achieve this we use a client certificate to validate that the request has come from the API Management site.
This post describes how to:
- Create a self-signed certificate
- Configure certificates in Azure API Management
- Configure the Azure Web App to enable client certificates
- Add code to validate a certificate has been provided
1) Create a self-signed certificate
Run the following example commands to create a self-signed certificate, tweaking the values as required:
makecert.exe -n "CN=Your Issuer Name" -r -sv TempCA.pvk TempCA.cer
makecert.exe -pe -ss My -sr CurrentUser -a sha1 -sky exchange -n "CN=Your subject Name" -eku 1.3.6.1.5.5.7.3.2 -sk SignedByCA -ic TempCA.cer -iv TempCA.pvk
2) Configure certificates in Azure API Management
-> Open the Azure API management portal
-> Click APIs –> Choose the appropriate API that you want to secure -> Security -> Manage Certificates
-> Upload the certificate
A policy should automatically have been added that intercepts requests and appends the appropriate certificate information before forwarding the request to that web API. Check the policies section to confirm it has been added. The following screenshot shows the expected policy definition.
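If the policy was not added automatically, it can be defined by hand. A minimal sketch of the inbound policy section (the thumbprint value is a placeholder for your uploaded certificate's thumbprint):

```xml
<inbound>
    <base />
    <!-- Attach the uploaded client certificate to the forwarded request -->
    <authentication-certificate thumbprint="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" />
</inbound>
```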
3) Configure the Azure Web App to enable client certificates
Because the Web API is deployed as an Azure Web App, there is no direct access to IIS to enable client certificate security. Instead, configuration must be done either using the Azure REST API or using the Azure Resource Explorer (preview).
A description of using the REST api is here.
To update it via Resource Explorer, follow these steps:
- Go to https://resources.azure.com/, and log in as you would to the Azure portal
- Find the relevant site, either using the search box or by navigating the tree
- Switch mode from ‘Read Only’ to ‘Read/Write’
- Click the Edit button
- Set "clientCertEnabled": true
- Click the PUT button at the top
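For reference, this is the fragment of the site's configuration JSON that changes (other properties omitted):

```json
{
  "properties": {
    "clientCertEnabled": true
  }
}
```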
4) Add some code to the web api to check the client certificate
This can be done in a number of ways. However, the following code will perform these checks:
- Check time validity of the certificate
- Check subject name of the certificate
- Check issuer name of the certificate
- Check thumbprint of the certificate
To check on each request to the Web API, add a custom DelegatingHandler: derive a class from System.Net.Http.DelegatingHandler and override the SendAsync method. To access the certificate information you can query the HttpRequestMessage.
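A minimal sketch of such a handler; the expected subject, issuer and thumbprint values are placeholders you would swap for your own certificate's details:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using System.Threading;
using System.Threading.Tasks;

public class ClientCertificateHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // GetClientCertificate() is the Web API extension method on HttpRequestMessage.
        X509Certificate2 cert = request.GetClientCertificate();
        if (cert == null || !IsValid(cert))
        {
            // Reject requests that did not come via API Management.
            return Task.FromResult(request.CreateResponse(HttpStatusCode.Forbidden));
        }
        return base.SendAsync(request, cancellationToken);
    }

    private static bool IsValid(X509Certificate2 cert)
    {
        return DateTime.Now >= cert.NotBefore                  // time validity
            && DateTime.Now <= cert.NotAfter
            && cert.Subject.Contains("CN=Your subject Name")   // subject name
            && cert.Issuer.Contains("CN=Your Issuer Name")     // issuer name
            && cert.Thumbprint == "EXPECTED-THUMBPRINT";       // thumbprint
    }
}
```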
To add the custom message handler to all new requests, add the following code to App_Start/WebApiConfig.cs:
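A sketch of the registration, assuming the handler is named ClientCertificateHandler:

```csharp
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Run the certificate check on every request before it reaches a controller.
        config.MessageHandlers.Add(new ClientCertificateHandler());

        config.MapHttpAttributeRoutes();
    }
}
```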
The output window in Visual Studio shows lots of useful information, but it doesn't tie that information to the time.
SharePoint deployments take ages, and I'm quite often distracted and forget whether I have triggered the deploy or not.
I want to quickly check the output window and see when I last triggered the deploy. I found 2 solutions:
1) Add a pre-build (pre-deployment) step.
Open project properties and add the following command:
ECHO ------ %TIME% ------
Now the output window displays the timestamp, e.g. ------ 14:32:05.12 ------
2) Increase the MSBuild diagnostic level in options
By default Visual Studio shows ‘Minimal’ diagnostic information when compiling a project. Increasing this level shows more detail (including timestamps at the start and end, and the elapsed build time).
- Click Tools –> Options
- Under Projects and Solutions –> Build and Run
- Change the output verbosity to ‘Normal’
It was good to see an updated framework for Prism that works with WinRT become available. We've used it now on a couple of Win8 apps and it has worked reasonably well.
Its focus is on rapid app delivery, and it has useful features to cater for some Win8-specific concerns, such as assembling Flyouts and handling SessionState. These are useful features that save a modicum of time when starting a new app.
Its major flaw (in my opinion) is that it lacks support for other .NET variants. I was hoping to see more abstractions moved into a Portable Class Library, which would allow for code re-use across classic .NET, Windows 8 apps and Windows Phone apps. Features can only be used within the realms of a Windows 8 app (with the exception of the EventAggregator).
This also has a knock-on effect with unit testing.
Unit testing a Win 8 app is severely limited. There are challenges with build integration and a lack of a decent mocking framework. A common technique is to architect the app in such a way that the UI logic (ViewModels) lives in a separate PCL assembly. The logic can then be tested using classic .NET assemblies to take advantage of all the great mocking frameworks available.
Unfortunately, the Prism downloaded from Codeplex targets .Net for WinRT and makes this technique impossible.
It's possible to modify the Prism framework to cater for this unit-testing technique. I did so on a recent project and had good success… However, it's a shame it didn't come out of the box this way.
I blogged previously about the difference when accessing resources within a .NET for Windows Store app compared with a classic .NET application.
Recently I included a 3rd party assembly into my app and received the exception:
‘ResourceMap Not Found’
This indicates that a request for a particular resource has failed because it cannot be found. It would be nice if the error message told you which resource it was unable to find. However, in my case it was due to the 3rd party library using string resources that it could not find.
To debug this I took a look at the resource file that the app was using. This is found in the \bin\debug\ folder of the app and is named ‘resources.pri’. This file is binary and cannot be read directly. Instead you must use the command line tool ‘makepri.exe’ to dump a human-readable version of the file.
- Open developer command line
- Navigate to the ‘~\project\bin\debug\’ folder
- Run the command ‘makepri.exe dump’
This will output an XML version of the resources used by the application – including any that were associated with my 3rd party assembly.
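For reference, the dump command sketched in full; the /if and /of switches name the input .pri file and the output XML file, and the paths here are examples:

```shell
cd /d C:\project\bin\debug
makepri.exe dump /if resources.pri /of resources.pri.xml
```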
Effectively, all resources (including 3rd party ones) should be merged into this one file at compile time. However, to do the merge, the 3rd party resources need to be available and located relative to the 3rd party assembly (or in the bin folder). If the compiler does not find them then they do not get merged into resources.pri and your app breaks at run-time.
So check that you have the .pri files for 3rd party assemblies and that you have put them on disk alongside the referenced assembly (.dll). The .pri file should have the same name as the assembly. So, for instance, if you reference an assembly called ‘Prism.dll’ then you should have a ‘Prism.pri’ in the same folder as it.
Code coverage with TFS Build can be enabled by editing the build definition and modifying the Automated Tests settings. Unfortunately this may instrument more assemblies than you want it to. For example, Unit Test projects will appear as part of the code coverage results.
To solve this there is a filter mechanism that determines which assemblies will be instrumented; however it can be non-obvious to configure.
The trick is to include a .runsettings file within your solution that contains rules on the assemblies to include/exclude. It is an XML structure, and full details of the schema can be found here. (Do not confuse this with .testsettings.) You can also install a Visual Studio template to add a basic .runsettings file directly from Add New Item.
Then edit the XML to exclude or include specific assemblies (or alternatively use regex to match paths). For instance, if you want to exclude all assemblies whose name ends in ‘Tests.dll’ then you can add the following to the Exclude section:
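A sketch of the relevant fragment of the .runsettings file, using the standard Code Coverage data collector schema:

```xml
<RunSettings>
  <DataCollectionRunSettings>
    <DataCollectors>
      <DataCollector friendlyName="Code Coverage">
        <Configuration>
          <CodeCoverage>
            <ModulePaths>
              <Exclude>
                <!-- Skip instrumentation for any assembly ending in Tests.dll -->
                <ModulePath>.*Tests\.dll$</ModulePath>
              </Exclude>
            </ModulePaths>
          </CodeCoverage>
        </Configuration>
      </DataCollector>
    </DataCollectors>
  </DataCollectionRunSettings>
</RunSettings>
```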
Note that this matches on full path names, so you need to bear this in mind when forming your regex.
Check the .runsettings file into TFS.
Now edit the build definition –> Process –> Automated Tests… and instead of selecting ‘Enable Code Coverage’, select ‘Custom’ and provide the source control path to the .runsettings file.
That's it.
Save the build definition and start a new build. It should run tests and code coverage and use the .runsettings file to determine which assemblies to instrument.
Note: one gotcha to watch out for is that the options drop-down shows different choices depending on where you attempt to configure it. If you use the Automated Tests dialog then the option is ‘Custom’ as above. If you edit without opening the dialog then the choice is ‘UserSpecified’. Go figure.
On a recent TFS consultancy job, I was asked to monitor how long some builds spent waiting in the Build queue before Starting.
My plan was to use the TFS API to query all builds with a status of ‘Queued’ and monitor the wait times.
I wrote the code and everything seemed to work fine. However, after capturing a number of wait times and comparing them to the overall build times, I noticed that the times did not match up.
In fact, a build that was estimated to complete in 2 hours took more than 6 hours and did not spend any time ‘Queued’.
The build controller distributes builds across multiple agents and will start one build per build agent (given you haven't changed the default MaxConcurrentBuilds setting). I.e. if you have 3 build agents and you start 3 builds, the controller will set all 3 builds to ‘InProgress’. If you start a 4th build, then this build will be ‘Queued’.
This works fine given that any build agent can run any build definition
Unfortunately it does not take into account ‘Tags’ that may force certain builds onto specific agents.
Given the same conditions of 3 build agents:-
If you tag a build agent so only certain builds can use it, and then start 3 builds that should only run on this tagged build agent, you would expect that only one build would be set ‘InProgress’ and the other 2 builds would remain ‘Queued’ until the build agent finished the 1st build.
However, the actual behaviour is that all 3 builds change to ‘InProgress’ at the same time (one per the MaxConcurrentBuilds setting on the build controller), but only the first build is actually doing anything. The other two builds are stuck waiting to be allocated an agent.
You look at your dashboard and see a list of builds ‘In Progress’ that are actually blocked waiting for a build agent.
On the above screen-shot, only 213 is actually running on Build Agent 1. (214 and 215 are blocked waiting for agent 1 to become available)
Worse than that is 217, which can run on any build agent, yet is blocked in a ‘Queued’ state while there are 2 idle build agents that could be running it. It cannot start because the MaxConcurrentBuilds value of 3 has been reached.
Be very careful with the use of tags. In future I will try to avoid tags where they could introduce the above bottleneck.
Additionally, when attempting to use the TFS API to capture metrics on wait times, you cannot rely on ‘Queued’ builds alone. Instead, I'll query all builds assigned to a controller and then filter out those that have been assigned a build agent. This gives an accurate list of builds pending.
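A hedged sketch of that query using the TFS client object model (Microsoft.TeamFoundation.Build.Client); the collection URL and project name are placeholders:

```csharp
using System;
using System.Linq;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;

class BuildQueueMonitor
{
    static void Main()
    {
        var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfs:8080/tfs/DefaultCollection"));
        var buildServer = collection.GetService<IBuildServer>();

        // Query everything on the queue for the project, whatever its status.
        IQueuedBuildSpec spec = buildServer.CreateBuildQueuedDetailSpec("MyTeamProject");
        spec.Status = QueueStatus.All;
        IQueuedBuildQueryResult result = buildServer.QueryQueuedBuilds(spec);

        // 'InProgress' alone is misleading: treat anything not yet completed
        // as pending and inspect it further (e.g. whether an agent was allocated).
        var pending = result.QueuedBuilds
            .Where(qb => qb.Status != QueueStatus.Completed)
            .ToList();

        foreach (IQueuedBuild qb in pending)
        {
            Console.WriteLine("{0} queued at {1} ({2})",
                qb.BuildDefinition.Name, qb.QueueTime, qb.Status);
        }
    }
}
```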
The new Team Explorer in Visual Studio 2012 has taken me some getting used to.
When working with source control in VS2012, there is no longer a ‘Pending Changes’ view independent of Team Explorer. I missed it, because now I must navigate 3 menus into Team Explorer to find out which files I have modified.
That was until I found the new Solution Explorer filters. Pressing Ctrl+[, Ctrl+P will filter the solution explorer to show only checked out files. Tap again to remove the filter. Sweet!
Another interesting shortcut is Ctrl+[, Ctrl+O, which filters Solution Explorer to show only the files you have open.
I installed Visual Studio 2012 release candidate yesterday. My quick n dirty conversion of Prism failed to compile.
I’ve fixed the issue and released a new version on Codeplex http://metroprism.codeplex.com/releases/view/87705
During my original conversion, I used ‘CoreWindow.Current.Dispatcher’ to try to dispatch a command on the UI context. In 2012 RC, the ‘Current’ property is no longer present, so I have replaced it with ‘CoreApplication.MainView.CoreWindow.Dispatcher’.
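The change in context, as a sketch (RunOnUIThread is a hypothetical helper name, not part of Prism):

```csharp
using System;
using Windows.ApplicationModel.Core;
using Windows.UI.Core;

static class UiDispatcher
{
    public static async void RunOnUIThread(Action action)
    {
        // Before (no longer compiles in 2012 RC):
        //   CoreDispatcher dispatcher = CoreWindow.Current.Dispatcher;
        // After:
        CoreDispatcher dispatcher = CoreApplication.MainView.CoreWindow.Dispatcher;
        await dispatcher.RunAsync(CoreDispatcherPriority.Normal, () => action());
    }
}
```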
So Visual Studio 2012 RC is out …time to open up all our projects and see what no longer works.
Hit an issue with my unit test projects, the namespace in these projects, ‘Microsoft.VisualStudio.TestTools.UnitTesting’ has been renamed to ‘Microsoft.VisualStudio.TestPlatform.UnitTestFramework’
A quick ‘Find & Replace’ updated all the references …and I thought the job was done until the compiler reported:
‘The type or namespace name 'ExpectedExceptionAttribute' could not be found (are you missing a using directive or an assembly reference?)’
It appears this attribute has been either moved or removed from the framework.
Possibly because there is an improved mechanism for asserting exceptions now. It's possible to replace the [ExpectedException] attribute with the following test code:
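A sketch of the replacement, using Assert.ThrowsException&lt;T&gt; from the new test framework (the Divide method under test is a hypothetical example):

```csharp
using System;
using Microsoft.VisualStudio.TestPlatform.UnitTestFramework;

[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Divide_ByZero_Throws()
    {
        // Replaces [ExpectedException(typeof(DivideByZeroException))]
        Assert.ThrowsException<DivideByZeroException>(() => Divide(1, 0));
    }

    private static int Divide(int a, int b)
    {
        return a / b;
    }
}
```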
Although this is a fantastic improvement over the attributed version, it leaves me in the predicament of having lots and lots of unit tests to fix up <groan>
If anyone discovers that this attribute does exist somewhere please let me know
Quick answer: it has a hidden file attribute applied. Changing the folder options to ‘Show Hidden Files and Folders’ revealed the SUO file adjacent to the solution file.
When you open a Visual Studio Solution file, i.e. MySolution.sln, a corresponding MySolution.suo file is generated. This file is constantly updated with the current state of Visual Studio (i.e. which windows you have open). When you close and reopen Visual Studio, the previous state is reloaded from this file.
Unfortunately, from time to time this can lead to problems. For example, it is common for Visual Studio to crash when too many designer files are opened simultaneously. Because the state is remembered in the SUO file, it can then become impossible to reopen the Visual Studio solution, because doing so results in a recurring crash. When this occurs I usually just delete the SUO file, and Visual Studio will reopen the solution afresh, without any retained state.
However, in Visual Studio 11 Beta I could not find the SUO file. It would normally be found in the directory adjacent to the solution file, yet I could not see it. I ran a search for SUO files on my C: drive and found none.
Thoroughly confused, I opened ProcMon.exe and watched which files Visual Studio was loading as I opened my solution. A quick search found the SUO file where I had expected it, adjacent to the solution. However, the file had a hidden attribute applied to it, so I couldn't see it with the default folder options in Windows 8.
Changing the folder options to ‘Show Hidden Files and Folders’ revealed the SUO file.