5 Tips for using Azure Web Jobs

1. Use public on the main program class. In order for web jobs to initialise correctly, the main class that contains the web jobs needs to be public. Once this has been done, the individual jobs can be discovered and should be visible in the output when running locally.
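To illustrate, here is a minimal sketch of what the SDK expects. The attribute and namespace names follow the early WebJobs SDK (later releases renamed them, e.g. QueueInput became QueueTrigger) and the queue name "orders" is just an example.

using System;
using Microsoft.WindowsAzure.Jobs; // early SDK namespace; later versions use Microsoft.Azure.WebJobs

// The class containing the job methods must be public or the host cannot discover them.
public class Program
{
    public static void Main()
    {
        // The JobHost scans public classes for methods decorated with job attributes.
        var host = new JobHost();
        host.RunAndBlock();
    }

    // Example job: runs whenever a message lands on the "orders" storage queue.
    public static void ProcessOrder([QueueInput("orders")] string message)
    {
        Console.WriteLine("Received: " + message);
    }
}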


2. Configure AzureWebJobsDashboard. In order to store and view the invocation details for each web job, you need to set the AzureWebJobsDashboard connection string in the Configure tab of the website you have deployed the web job to, even if you have already configured it in your app.config file.


If this is not configured on the website then you will receive an error when you try to view the web jobs dashboard.


3. Debug using Visual Studio. One of the nice features of the web jobs SDK is the ability to run and debug the web job locally in Visual Studio. Following the Getting Started guide, you create a console application which you can debug in Visual Studio before deploying it to Azure.

4. Use TextWriter for debugging. The Azure Web Jobs SDK (see the logging section) provides a mechanism to log information that can be viewed through the Azure Web Jobs dashboard. By adding a TextWriter as an input parameter to your web job method, you can use WriteLine to output the information you wish to log.
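As a rough sketch (the queue name and method are illustrative), the TextWriter parameter is simply added alongside the trigger parameter and written to like any other TextWriter:

using System.IO;
using Microsoft.WindowsAzure.Jobs;

public class Functions
{
    // The SDK supplies the TextWriter; anything written to it shows up
    // against this invocation in the web jobs dashboard.
    public static void ProcessOrder([QueueInput("orders")] string message, TextWriter log)
    {
        log.WriteLine("Started processing message: {0}", message);
        // ... do the actual work here ...
        log.WriteLine("Finished processing message");
    }
}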

5. Make your blob triggers more responsive by driving them with BlobOutput. The mechanism that the BlobInput trigger uses can have a 10-20 minute lag before the trigger fires, but each time BlobOutput is used it triggers a rescan for blob input.

“There is an optimization where any blob written via a [BlobOutput] (as opposed to being written by some external source) will optimistically check for any matching [BlobInputs]” – see How does [BlobInput] work?. Storage queues and Service Bus topics and queues are generally processed within seconds, so if you can use a queue to trigger a BlobOutput, use this to trigger any subsequent BlobInputs.
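A sketch of that pattern might look like the following; the attribute names are those from the early SDK (later releases renamed BlobInput/BlobOutput to BlobTrigger/Blob), and the queue and container names are made up for the example:

using System.IO;
using Microsoft.WindowsAzure.Jobs;

public class BlobJobs
{
    // Queue messages are picked up within seconds; writing the blob through
    // [BlobOutput] then optimistically checks for matching [BlobInput] jobs
    // instead of waiting for the slow blob container scan.
    public static void CreateBlob(
        [QueueInput("incoming")] string message,
        [BlobOutput("processed/latest.txt")] TextWriter blobOutput)
    {
        blobOutput.WriteLine(message);
    }

    // Fires for blobs written into the "processed" container.
    public static void HandleNewBlob(
        [BlobInput("processed/{name}")] TextReader input,
        string name,
        TextWriter log)
    {
        log.WriteLine("Blob {0} contained: {1}", name, input.ReadToEnd());
    }
}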

Azure Service Bus Event Hub Firewall Port

I’m investigating the Azure Service Bus Event Hub using the getting started tutorial and I didn’t seem to be able to receive any data. It turned out that our firewall was blocking an outbound port. After some investigation I found a post which hinted at a port used by the on-premises Service Bus. Our IT guys kindly enabled outbound port 5671 and I can now receive data from the event hub.

For completeness, the following site has details of the other firewall ports required for Service Bus: http://msdn.microsoft.com/en-us/library/ee732535.aspx

Internet of Things (IoT): Gadgeteer and Service Bus

Internet of Things seems to bring together two of my favourite topics: Gadgeteer and Service Bus. Whilst researching IoT I came across an article in MSDN magazine written by Clemens Vasters (http://msdn.microsoft.com/en-us/magazine/jj190807.aspx). This article is from July 2012 and things have moved on a little since then, but the fact that he has Gadgeteer talking to Service Bus meant that I had to give it a go myself. The first port of call was the previous article (http://msdn.microsoft.com/en-us/magazine/jj133819.aspx – note the link is wrong in the current article). This explains the architecture that the sample is based upon, using Service Bus topics to send commands to the device and a different topic to allow the device to send data. There is also a provisioning service that allows the devices to be initialised with the correct configuration. This provisioning service also configures the Service Bus Access Control Service (ACS) so that each device has its own security key, which means you can turn off devices using ACS.

Before you start take a look at the Service Bus Explorer as this is a useful tool when you are trying to diagnose why things aren’t working.

As I’m using a GHI Electronics Fez Spider mainboard I am using the .NET Micro Framework 4.2. Upgrading the project to 4.2 produced a couple of errors which needed resolving. Firstly, you will need to change GetJoystickPosition to GetPosition; secondly, change ConvertBase.ToBase64String to Convert.ToBase64String. This allowed me to run the project on my Gadgeteer board. However, whenever I tried to call the provisioning service I kept getting Bad Request. I immediately assumed that my configuration was wrong, but after a bit of searching and then turning WCF tracing on I found that the service could not load the Service Bus assembly, so I removed and then re-added it to solve the problem. As I mention configuration, it’s probably a good idea to explain what each of the settings in the provisioning service is used for:

sharedSignature: Go to https://manage.windowsazure.com/ and log in. Click on Service Bus and then select the service bus namespace you are using. On the bottom menu click the Connection Information button. This will pop up a configuration window containing two keys. The first is part of the connection string and is under the SAS section. Copy the connection string and find the key; this is the sharedSignature for this configuration setting.

servicebusNamespace: This is the name of the service bus namespace as it appears in the management portal, i.e. sb://<servicebusNamespace>.servicebus.windows.net

managementKey: In the same connection information popup where you found the shared signature there is a section at the bottom labelled ACS. The managementKey is the Default Key.

Microsoft.ServiceBus.ConnectionString: I used the connection string that appears in the SAS section of the connection information popup.

The other configuration change you need to make is to the URL for the provisioning service. This is hard coded on the Gadgeteer board in the serverAddress variable in Program.cs in the ServiceBusApp project.

The provisioning service should now be ready to go. However, I had problems connecting to it from the Gadgeteer board as I kept receiving a NotSupportedException each time I called GetRequestStream. This was due to an issue with the Ethernet configuration when trying to connect over HTTPS. It can be solved by updating the SSL seed using the Fez Config tool (https://www.ghielectronics.com/community/forum/topic?id=13927): click the Deployment (Advanced) button and then click Update SSL Seed.


Once complete I could then connect to the provisioning service. The provisioning service should only be called once per device and it is up to the device to store its configuration in a persistent store. This did not appear to be working on my device: some of the settings were being persisted but the topic URLs were not. I changed the type from a Uri to a string and the persistence then seemed to work, so I only needed to provision once. Each time the provisioning service is called a new subscription is created and a new Access Control identity and rule are also created.

With all this fixed I could now send messages, but I could not see them. This was because I didn’t have a subscriber to the topic where the data was published. This is easily resolved by creating one, but it will only receive new messages; any messages sent before the subscription is created will be lost.

The provisioning service also has a web page that allows you to send commands to each device. It broadcasts a message to all devices by putting a single message into the devices topic with the Broadcast property set to true. During provisioning, the subscription that is created has a SQL filter applied so that it only receives messages that are targeted specifically at the device or are broadcast (a sketch of such a filter follows below). The web page puts a message into the topic to tell the device to set its temperature to a specific value. The device should be listening for messages on its subscription and will act on the command once it is received.
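To give an idea of how such a filter might be set up, here is a sketch; the topic, subscription and property names are my own illustrations rather than those used in the sample, and connectionString is assumed to hold the namespace connection string.

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

// Create a per-device subscription that only sees messages addressed to this
// device or flagged as a broadcast.
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
var filter = new SqlFilter("DeviceId = 'device-001' OR Broadcast = TRUE");

if (!namespaceManager.SubscriptionExists("devicecommands", "device-001"))
{
    namespaceManager.CreateSubscription("devicecommands", "device-001", filter);
}

// The sender then targets one device, or all of them, via message properties.
var command = new BrokeredMessage("SetTemperature");
command.Properties["DeviceId"] = "device-001";
command.Properties["Broadcast"] = false;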

The device never seemed to receive the message even though the Service Bus Explorer showed that the message was waiting in the queue. Whenever we tried to connect to the subscription, “Bad Request” was returned. After investigation it turns out that the sample only ever sets the event topic URI and not the devices topic URI, so when we try to retrieve the device commands we are actually connecting to the events topic, which has no such subscription. The sample needs modifying in the MessagingClient class in the Microsoft.ServiceBus.Micro project: I added an extra Uri to the constructor and modified the CreateReceiveRequest and CreateLockRequest methods to use this Uri.

The final thing I changed was the command that is sent from the web page and how it was received:

The sender code in Default.aspx.cs in the BackEndWebrole project:

deviceSender.Broadcast(new Dictionary<string, object> { { "Temperature", this.TextBox1.Text } }, "SetTemperature");

And the receiver code in Program.cs in the ServiceBusApp project:

switch (commandType)
{
    case "SetTemperature":
        if (cmd.Properties.Contains("Temperature"))
        {
            this.settings.TargetTemperature = double.Parse((string)cmd.Properties["Temperature"]);
            StoreSettings(this.settings);
        }
        break;
}

I now have a Gadgeteer device talking to the service bus with the ability to send data and receive commands. My next steps are to create a webjob to process the event data (see my previous post) and also look into event hubs.

Windows Store App Notifications, the Notification Hub and Background tasks

This article aims to talk about Windows Store notifications and the Windows Azure Notification Hub, and it attempts to collate the various articles in a single place to help you build notifications into your app.

To get an understanding of Windows notifications, look at the following article:

Introduction to Push Notifications – http://msdn.microsoft.com/en-us/library/windows/apps/hh913756.aspx. This provides a good overview of how push notifications work. To summarise the important bits:

1. Your store app needs to register with the Windows Notification Service (WNS) to retrieve a unique URI for your instance of the app. Ideally you do this each time the app starts.

2. If the URI has changed then you need to notify your service of the new URI. Note: the URI expires every 30 days, so your app needs to keep your service up to date with the latest URI (a sketch of this registration flow follows the list below).

3. Your service sends notifications to this unique URI
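A rough sketch of steps 1 and 2 is shown below; the local settings key and the registration endpoint are hypothetical and stand in for whatever your own service exposes.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Windows.Networking.PushNotifications;
using Windows.Storage;

public static class PushRegistration
{
    public static async Task RegisterAsync()
    {
        // Step 1: ask WNS for the channel URI for this installation of the app.
        var channel = await PushNotificationChannelManager
            .CreatePushNotificationChannelForApplicationAsync();

        // Step 2: only tell the backend when the URI has actually changed.
        var settings = ApplicationData.Current.LocalSettings;
        var previousUri = settings.Values["ChannelUri"] as string;

        if (previousUri != channel.Uri)
        {
            using (var client = new HttpClient())
            {
                // Hypothetical endpoint on your service that stores the URI.
                await client.PostAsync(new Uri("https://example.com/api/register"),
                    new StringContent(channel.Uri));
            }

            settings.Values["ChannelUri"] = channel.Uri;
        }
    }
}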

You may have noticed above that I mentioned “your service”. This is a critical piece of the notification mechanism and there are a number of ways to build it. If you are not comfortable building backend services, or you want something up and running quickly, then Mobile Services might be the way to go for you. Here’s a tutorial that gets you started with Mobile Services: http://www.windowsazure.com/en-us/develop/mobile/tutorials/get-started/

If, like me, you already have a source of data and a service, then you will probably want to wire notifications into your existing service. The number of devices using your app may dictate the method you use to get notifications onto the user’s device. There are a number of options:

  1. Local updates
  2. Push Notifications
  3. Periodic Notifications

Local updates require the creation of a background task that Windows runs periodically. The task calls into your data service, retrieves the data to put on the tiles and sends out tile notifications using the Windows Store app SDK.

Updating live tiles from a background task – http://msdn.microsoft.com/en-us/library/windows/apps/jj991805.aspx provides a tutorial on building a background task for your Windows Store app. The tutorial is for timer tasks but it can easily be adapted for push notification tasks; the bits that are likely to change are the details of the Run method, the task registration and the package manifest.
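For example, the body of such a background task might update the application tile along these lines; this is only a sketch, with the template choice and text as placeholders, and the data assumed to have already been fetched from your service.

using Windows.Data.Xml.Dom;
using Windows.UI.Notifications;

// Called from the background task's Run method once the data has been retrieved.
private static void UpdateTile(string latestValue)
{
    // Start from one of the templates listed in the tile template catalogue.
    XmlDocument tileXml = TileUpdateManager.GetTemplateContent(TileTemplateType.TileSquareText01);
    tileXml.GetElementsByTagName("text")[0].InnerText = latestValue;

    // Send the notification to the app's tile.
    TileUpdateManager.CreateTileUpdaterForApplication().Update(new TileNotification(tileXml));
}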

Two more important links that you will require when you are dealing with notifications:

Tile template catalogue http://msdn.microsoft.com/en-us/library/windows/apps/hh761491.aspx

Toast template catalogue http://msdn.microsoft.com/en-us/library/windows/apps/hh761494.aspx

These two catalogues are important as they provide details of the XML you need for each type of notification.

Push notifications are sent through the Windows Notification Service to your device.

You can send notifications to your device from your service by creating a notification and sending it to each of the devices registered to your service via the Windows Notification Service.

If you have a large number of devices running your app then you will probably want to use the Windows Azure Notification Hub. This is the simplest way to manage notifications for your application, as the notification hub handles scaling, manages the device registrations and iterates over each device to send the notifications out. The notification hub will also allow you to send notifications to Windows Phone, Apple and Android devices. To get started with notification hubs follow this tutorial: http://www.windowsazure.com/en-us/manage/services/notification-hubs/getting-started-windows-dotnet/

The nice feature of the notification hub is that it makes the code needed to send notifications simple.

 

NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString("<your notification hub connection string>", "<your hub name>");

var toast = @"<toast><visual><binding template=""ToastText01""><text id=""1"">Hello from a .NET App!</text></binding></visual></toast>";

await hub.SendWindowsNativeNotificationAsync(toast);

Compare this to the code to send the notification without the hub:

 

byte[] contentInBytes = Encoding.UTF8.GetBytes(xml);

HttpWebRequest request = HttpWebRequest.Create(uri) as HttpWebRequest;
request.Method = "POST";
request.Headers.Add("X-WNS-Type", notificationType);
request.ContentType = contentType;
request.Headers.Add("Authorization", String.Format("Bearer {0}", accessToken.AccessToken));

using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(contentInBytes, 0, contentInBytes.Length);
}

 

 

In addition, you will need to retrieve the list of devices that are registered for push notifications and iterate over the list to send the notification to each device. You will also require a service that receives the registrations and stores them in a data store, and you need to manage the scalability of these services yourself. On the down side, the notification hub is charged per message, which means the more often you send notifications the greater the cost, whereas hosting your own service is load based: the notifications will be sent out more slowly as the number of devices increases, but this would generally be a lower cost. You should also take into account that you will need to send out a notification for each tile size, which increases the activity count on the notification hub for each tile size (currently 3).

[Update: You can send out a single notification for all tile sizes rather than 3 separate notifications by adding a binding for each tile size in your XML; see http://msdn.microsoft.com/en-us/library/windows/apps/hh465439.aspx for more details]

It is possible to send custom notifications to your app which can be received directly in the app or by a background task. These are called raw notifications. In order to receive raw notifications in a background task your app needs to be configured to display on the start screen; however, raw notifications can be received in your running app even when it is not configured to display on the start screen. A raw notification is a block of data up to 5KB in size and can be anything you want.

The following code sends a raw notification using the notification hub:

 

string rawNotification = prepareRAWPayload();

Notification notification = new Microsoft.ServiceBus.Notifications.WindowsNotification(rawNotification);
notification.Headers.Add("X-WNS-Cache-Policy", "cache");
notification.Headers.Add("X-WNS-Type", "wns/raw");
notification.ContentType = "application/octet-stream";

var outcome = await hub.SendNotificationAsync(notification);

In order to receive raw notifications in your app you need to add an event handler to the channel you retrieve from the Windows Notification Service:

 

var channel = await PushNotificationChannelManager.CreatePushNotificationChannelForApplicationAsync();

channel.PushNotificationReceived += channel_PushNotificationReceived;

 

And then handle the notification received:

 

private void channel_PushNotificationReceived(PushNotificationChannel sender, PushNotificationReceivedEventArgs args)
{
    switch (args.NotificationType)
    {
        case PushNotificationType.Raw:
            ReceiveNotification(args.RawNotification.Content);
            break;
    }
}

 

Note: the content of the notification is the block of data that you sent out.

Sample background task for Raw Notifications is here: http://msdn.microsoft.com/en-us/library/windows/apps/jj709906.aspx

Guidelines for Raw Notifications can be found here: http://msdn.microsoft.com/en-us/library/windows/apps/hh761463.aspx

Periodic notifications also require a service, but the application periodically calls into the service to retrieve the tile notifications without needing to process the source data and create the notifications locally. Details about how to use periodic notifications can be found here: http://msdn.microsoft.com/en-US/library/windows/apps/jj150587

In summary, Windows Store application notifications can be sent to the app in a variety of ways, and the mechanism you choose will depend upon how quickly and how many notifications are required. Push notifications allow notifications to be sent whenever they are ready. Periodic and local updates are pull notifications and require a service to be available to pull the data from. All of these will require some sort of service and all have associated costs. The notification hub is a useful tool to assist with notifications: it can manage the device connections as well as send out notifications to multiple device types. It does however come at a cost, and you need to work out whether it is a cost-effective mechanism for your solution.

Gadgeteer, Signal R, WebAPI & Windows Azure

After a good night in Hereford at the Smart Devs User Group and my presentation at DDDNorth, here are the links from my presentation and some from questions asked:

Gadgeteer: http://www.netmf.com/gadgeteer/

Signal-R: http://www.asp.net/signalr/

Web API: http://www.asp.net/web-api

The Signal-R chat example can be found at: http://www.asp.net/signalr/overview/getting-started/tutorial-getting-started-with-signalr

Windows Azure Pricing Calculator : http://www.windowsazure.com/en-us/pricing/calculator/?scenario=full

Signal-R Scaleout using Service bus, SQL Server or Redis: http://www.asp.net/signalr/overview/performance-and-scaling/scaleout-in-signalr

The Windows Azure Training Kit: http://www.windowsazure.com/en-us/develop/net/other-resources/training-kit/

Gadgeteer Modules: http://proto-pic.co.uk/categories/development-boards/net.html

Fez Spider Starter Kit: http://proto-pic.co.uk/fez-spider-starter-kit/

 

In addition to these links, I have more from my presentation at the DareDevs user group in Warrington:

It is possible to drive a larger display from Gadgeteer using a VGA adapter. You use it in the same way that the Display-T35 works, for example through the SimpleGraphics interface.

VB eBook – Learn to Program with Visual Basic and Gadgeteer

Fez Cerberus Tinker Kit: https://www.ghielectronics.com/catalog/product/455 

Enabling Modern Apps

I’ve just finished presenting my talk on “Successfully Adopting the Cloud: TfGM Case Study” and there were a couple of questions that I said I would clarify.

1. What is the limit on the number of subscriptions per Service Bus topic? The answer is 2000. Further details can be found at: http://msdn.microsoft.com/en-us/library/windowsazure/ee732538.aspx

2. What are the differences between Windows Azure SQL Database and SQL Server 2012? The following pages provide the details:

Supported T-SQL: http://msdn.microsoft.com/en-us/library/ee336270.aspx

Partially supported T-SQL: http://msdn.microsoft.com/en-us/library/ee336267.aspx

Unsupported T-SQL: http://msdn.microsoft.com/en-us/library/ee336253.aspx

Guidelines and Limitations: http://msdn.microsoft.com/en-us/library/ff394102.aspx

3. Accessing the TfGM open data site requires you to register as a developer at: http://developer.tfgm.com

Thanks to everyone who attended I hope you found it useful.

Handling A Topic Dead Letter Queue in Windows Azure Service Bus

Whilst working on a project in which we were using topics on the Windows Azure Service Bus, we noticed that our subscription queues (when viewed from the Windows Azure management portal) didn’t seem to be empty even though our subscription queue processing code was working correctly. On closer inspection we found that our subscription queue was empty and the numbers in the management portal against the subscription were messages that had automatically faulted and had been moved into the dead letter queue.

The dead letter queue is a separate queue that allows messages that fail to be processed to be stored and analysed. The address of the dead letter queue is slightly different from your subscription queue and is of the form:

YourTopic/Subscriptions/YourSubscription/$DeadLetterQueue

for a subscription and

YourQueue/$DeadLetterQueue for a queue

Luckily you don’t have to remember this as there are helpful methods to retrieve the address for you:

SubscriptionClient.FormatDeadLetterPath(subscriptionClient.TopicPath, messagesSubscription.Name);

To create a subscription client for the dead letter queue you append /$DeadLetterQueue to the subscription name when you create the subscription client.

Once you have this address you can connect to the dead letter queue in the same way you would connect to the subscription queue. When a dead lettered brokered message is received, the properties of the message should contain error information highlighting why it failed, and the message should also contain the message body from the original message. By default the subscription will move a faulty message to the dead letter queue after 10 delivery attempts. You can also move the message yourself, and put sensible data in the properties if it fails to be processed, by calling the DeadLetter method on the BrokeredMessage; the DeadLetter method allows you to pass in your own data to explain why the message has failed.
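As a brief sketch (the processing step is a placeholder for your own logic), explicitly dead lettering a message with your own reason looks like this:

// Receive from the normal subscription in PeekLock mode.
BrokeredMessage message = subscriptionClient.Receive();
if (message != null)
{
    try
    {
        ProcessMessage(message);   // hypothetical processing step
        message.Complete();
    }
    catch (Exception ex)
    {
        // Move the message to the dead letter queue with custom error information,
        // which will appear in the dead lettered message's properties.
        message.DeadLetter("ProcessingFailed", ex.Message);
    }
}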

A dead lettered message can be deleted in the same way as a normal message by calling the Complete() method on the received dead letter message.

Here is an example of retrieving a dead lettered message from a subscription queue

var baseAddress = Properties.Settings.Default.ServiceBusNamespace;
var issuerName = Properties.Settings.Default.ServiceBusUser;
var issuerKey = Properties.Settings.Default.ServiceBusKey;

Uri namespaceAddress = ServiceBusEnvironment.CreateServiceUri("sb", baseAddress, string.Empty);

this.namespaceManager = new NamespaceManager(namespaceAddress,
                            TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey));
this.messagingFactory = MessagingFactory.Create(namespaceAddress,
                            TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey));

var topic = this.namespaceManager.GetTopic(Properties.Settings.Default.TopicName);
if (topic != null)
{
    if (!namespaceManager.SubscriptionExists(topic.Path,
                                  Properties.Settings.Default.SubscriptionName))
    {
        messagesSubscription = this.namespaceManager.CreateSubscription(topic.Path,
                                             Properties.Settings.Default.SubscriptionName);
    }
    else
    {
        messagesSubscription = namespaceManager.GetSubscription(topic.Path,
                                             Properties.Settings.Default.SubscriptionName);
    }
}

if (messagesSubscription != null)
{
    SubscriptionClient subscriptionClient = this.messagingFactory.CreateSubscriptionClient(
                                            messagesSubscription.TopicPath,
                                            messagesSubscription.Name, ReceiveMode.PeekLock);

    // Get the Dead Letter queue path for this subscription
    var dlQueueName = SubscriptionClient.FormatDeadLetterPath(subscriptionClient.TopicPath,
                                            messagesSubscription.Name);

    // Create a subscription client to the deadletter queue
    SubscriptionClient deadletterSubscriptionClient = messagingFactory.CreateSubscriptionClient(
                                            subscriptionClient.TopicPath,
                                            messagesSubscription.Name + "/$DeadLetterQueue");

    // Get the dead letter message
    BrokeredMessage dl = deadletterSubscriptionClient.Receive(new TimeSpan(0, 0, 300));

    // Get the properties
    StringBuilder sb = new StringBuilder();
    sb.AppendLine(string.Format("Enqueue Time {0}", dl.EnqueuedTimeUtc));
    foreach (var props in dl.Properties)
    {
        sb.AppendLine(string.Format("{0}:{1}", props.Key, props.Value));
    }
    dl.Complete();
}

Windows Azure Queues vs Service Bus Queues

If you have been wondering whether to use the Windows Azure queues that are part of the storage service or the queues that are part of the Service Bus, then the following MSDN article will give you full details.

http://msdn.microsoft.com/en-us/library/hh767287(VS.103).aspx

Recent changes in pricing make the choice even harder. There are two specific areas I like that make the Service Bus queue a better offering than the storage queues:

  1. Long connection timeout
  2. Topics

The long connection timeout means that I don’t have to keep polling the Service Bus queue for messages. I can make a connection for, say, 20 minutes, and when a message is added to the queue my application immediately receives it, processes it and then reconnects to the queue to get the next message. After 20 minutes without a message the connection closes in the same way it does when a message is received, except that the message is null; you then just reconnect for another 20 minutes. This makes your application event driven rather than polling based, and it should be more responsive. You can make multiple connections to the queue this way and load balance in the same way as you would when polling queues.

The following code shows how you can connect with a long poll.

NamespaceManager namespaceManager;
MessagingFactory messagingFactory;
Uri namespaceAddress = ServiceBusEnvironment.CreateServiceUri("sb", "yournamespace", string.Empty);

namespaceManager = new NamespaceManager(namespaceAddress, TokenProvider.CreateSharedSecretTokenProvider("yourIssuerName", "yourIssuerKey"));
messagingFactory = MessagingFactory.Create(namespaceAddress, TokenProvider.CreateSharedSecretTokenProvider("yourIssuerName", "yourIssuerKey"));

WaitTimeInMinutes = 20;

// check to see if the queue exists. If not then create it
if (!namespaceManager.QueueExists(queueName))
{
    namespaceManager.CreateQueue(queueName);
}

QueueClient queueClient = messagingFactory.CreateQueueClient(queueName, ReceiveMode.PeekLock);

queueClient.BeginReceive(new TimeSpan(0, WaitTimeInMinutes, 0), this.ReceiveCompleted, messageCount);

When a message is received or the 20 minute timeout expires, the ReceiveCompleted delegate is called and a check is made to see if the message is not null before processing it. Once processed, another long poll connection is made and the process repeats. The beauty of this method is that you don’t have to manage timers or separate threads to manage the queue.
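A sketch of what the ReceiveCompleted handler might look like is below; it carries on from the snippet above (queueClient and WaitTimeInMinutes are the same fields), and ProcessMessage stands in for your own logic.

private void ReceiveCompleted(IAsyncResult result)
{
    // Completes the long poll; returns null if the 20 minute wait timed out.
    BrokeredMessage message = queueClient.EndReceive(result);

    if (message != null)
    {
        ProcessMessage(message);   // hypothetical processing step
        message.Complete();        // remove the message (PeekLock mode)
    }

    // Reconnect so the next message (or timeout) triggers this handler again.
    queueClient.BeginReceive(new TimeSpan(0, WaitTimeInMinutes, 0), this.ReceiveCompleted, null);
}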

Topics are effectively private queues that consumers subscribe to; each subscription receives its own copy of the messages put into the original topic and is managed individually. Subscriptions can also apply filters to the messages so that they only receive the messages they are interested in.
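A small sketch of that fan-out, assuming connectionString holds the namespace connection string and using made-up topic and subscription names:

var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

if (!namespaceManager.TopicExists("orders"))
{
    namespaceManager.CreateTopic("orders");
}

// Each subscription receives its own copy of every message sent to the topic.
if (!namespaceManager.SubscriptionExists("orders", "billing"))
{
    namespaceManager.CreateSubscription("orders", "billing");
}
if (!namespaceManager.SubscriptionExists("orders", "auditing"))
{
    namespaceManager.CreateSubscription("orders", "auditing");
}

// One send fans out to both subscriptions.
var topicClient = TopicClient.CreateFromConnectionString(connectionString, "orders");
topicClient.Send(new BrokeredMessage("new order"));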

Further details of Service Bus topics and queues:

http://www.windowsazure.com/en-us/develop/net/how-to-guides/service-bus-topics/

Windows Azure Training Kit–June 2012 Release

The Windows Azure Training Kit June 2012 release is out now with the following features:

  • 12 new hands-on labs for Windows Azure Virtual Machines
  • 11 new hands-on labs for Windows Azure Web Sites
  • 2 new hands-on labs demonstrating Windows Azure with Windows 8 Metro-style applications
  • Several new hands-on labs for Node.js and PHP using Mac OS X
  • Updated content for the latest Windows Azure SDKs, tools, and new Windows Azure Management Portal
  • New and updated presentations designed to support anything from individual sessions up to a full 3-day training workshop

How Windows Azure Service Bus helped pin point a configuration error

This week we had a very useful side effect of using the Windows Azure Service Bus. We have an Azure hosted website that connects to a CRM backend, using the Service Bus in relay mode to communicate between the two systems. We had a test system that worked fine, but when we moved to a Live system we had a configuration error in one of the systems which was difficult to identify.

The way the Service Bus works means that the server can easily be moved (as long as the server has an outgoing internet connection). Our service bus host on the server side is a Windows service, but we also have a console application to help us with debugging as all the traces are logged to the console window. We turned off the service on the Live system and started it on the test system. As the Azure hosted website connects to the service bus rather than to a specific server, the website now connected to our test system. By running a successful connection test on the Azure hosted site we could prove that the Azure website configuration was correct.

The next change was to configure the test system to point to the Live CRM system. This would prove whether our data was correct or not. Running the same test as before proved that our data migration to the Live CRM system was fine.

This left us with the service bus and the business logic web service running on the test system, so we reconfigured the Live service bus service to point to the Test web service (which we had previously configured to connect to the live CRM system) and this also worked, thus proving we had an issue with the business logic service.

What we were able to do then was to move the service bus console application onto a developer's machine and run it in Visual Studio, so that we could debug and break on the calls to the business logic service, which helped us easily identify the problem. All this was done without needing to reconfigure or redeploy our Azure hosted website.

I wish I could say that this ease of debugging was one of the reasons we chose to use the service bus, but I would be lying. The fact that it has made our debugging so much easier will now have an influence on its future use.