Tonight I’m going to follow up on my previous post, where I promised to show you how to react to someone/something uploading a Blob.
Please read http://azure.microsoft.com/blog/2014/06/18/announcing-the-0-3-0-beta-preview-of-microsoft-azure-webjobs-sdk/ first: there is a lot of old v0.2.0 information lying about on the web which will not work in v0.3.0. I’m not ashamed to say it’s now the early hours of the morning before I finally managed to get this working, as most of the documentation I was reading was for v0.2.0.
Let’s get started by creating a console application.
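A minimal sketch of the kind of triggered function described below, using the v0.3.0 attribute names; the class/method names and the TextReader/TextWriter bindings are my choices, not necessarily what the original code used:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs; // v0.3.0 namespace (NuGet: Microsoft.Azure.WebJobs)

public class Functions
{
    // BlobTrigger fires when a blob appears in the reuters-input container;
    // the Blob attribute binds the output blob of the same name in reuters-gdmx.
    public static void ProcessBlob(
        [BlobTrigger("reuters-input/{name}")] TextReader input,
        [Blob("reuters-gdmx/{name}")] TextWriter output)
    {
        // Copy the input blob's content and append "worked..." to it.
        output.Write(input.ReadToEnd() + " worked...");
    }
}
```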
In this code you can see that I’m just appending worked… to the input file. The important parts to consider are the BlobTrigger and Blob attributes: the BlobTrigger is what starts processing when a blob updates in the reuters-input container, while the Blob attribute marks the output, with reuters-gdmx identified as the container for it.
There are a few options for a job: scheduled, on demand, or continuous.
For the automatic trigger I’m setting the job to be On Demand; however, I know that in the current version of WebJobs, blobs for on-demand jobs are polled every 10 minutes (I hope this can be more real-time once WebJobs exits preview).
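Wiring the host up in the console application’s Main is short; this is the standard v0.3.0 pattern (the JobHost picks up its connection strings from configuration):

```csharp
using Microsoft.Azure.WebJobs; // v0.3.0 namespace

class Program
{
    static void Main()
    {
        // Discovers the triggered functions in this assembly
        // and blocks while listening for blobs.
        var host = new JobHost();
        host.RunAndBlock();
    }
}
```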
You need to add two connection strings for your blob storage account (get the connection string with the Visual Studio Azure explorer); I’ve set both to the same storage account. The last two are from v0.3.0.
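For reference, the v0.3.0 connection string names are AzureWebJobsStorage and AzureWebJobsDashboard, set in App.config like this (account name and key are placeholders):

```xml
<connectionStrings>
  <add name="AzureWebJobsDashboard"
       connectionString="DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT;AccountKey=YOUR_KEY" />
  <add name="AzureWebJobsStorage"
       connectionString="DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT;AccountKey=YOUR_KEY" />
</connectionStrings>
```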
I’m going to use Azure Storage Explorer to upload a file; once this file is uploaded, the WebJob will run and create the associated blob in reuters-gdmx.
Here you can see the result of the WebJob (appending worked…).
In the picture above you see a storage account in Azure, in the storage account we have an ecbfx (European Central Bank FX Rates) container. Now let’s see how to upload some data to this container using a C# console application.
Given we are going to work with C#, the best option is to use the .NET library, which can be retrieved from NuGet (Install-Package WindowsAzure.Storage).
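A sketch of what the upload code looks like with the WindowsAzure.Storage client library; the blob name, local path, and connection string below are placeholders of my own, not the original post’s values:

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;      // NuGet: WindowsAzure.Storage
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static void Main()
    {
        // Connection string copied from Visual Studio's Azure explorer.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT;AccountKey=YOUR_KEY");

        var client = account.CreateCloudBlobClient();

        // The ecbfx container was pre-created in the portal;
        // CreateIfNotExists is a harmless safety net.
        var container = client.GetContainerReference("ecbfx");
        container.CreateIfNotExists();

        // Upload a local file as a block blob (file name is a placeholder).
        var blob = container.GetBlockBlobReference("eurofxref-daily.xml");
        using (var stream = File.OpenRead(@"C:\data\eurofxref-daily.xml"))
        {
            blob.UploadFromStream(stream);
        }
    }
}
```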
The code above connects to the pre-created container. Notice that my container has built-in geo-redundancy (primary storage in Dublin, secondary in Amsterdam), so after running there will be six copies of this blob: three in Dublin and three in Amsterdam. This is the storage package I’ve chosen.
The easiest way to view the newly uploaded blob is to use the Windows Azure Server Explorer in Visual Studio; it’s also the easiest way of getting the connection string to the storage account.
In the next post I’m going to show you how to react to someone uploading a Blob with an automatic trigger.
This is a post that I’ve just not got time to explain in full yet…
But if you are stuck on 1–2 Mbps ADSL and have a bit of time on your hands, then it’s possible to get 20+ Mbps broadband.
I achieved this by load-balancing a fixed wireless connection of 8 Mbps with an HSDPA 12+ Mbps connection. The HSDPA connection took most of the work, and involved installing a fixed external Yagi antenna for starters.
So far so good! PM me for more details.