Azure Arc Server Registration Error


Today while adding a new server to an Azure subscription I encountered the following error:

The subscription is not registered to use namespace 'Microsoft.HybridCompute'

In this video I show you how to register the Hybrid Compute provider in your subscription to overcome this obstacle.
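If you just want the fix: registering the provider is a one-liner. With the Azure CLI it's az provider register --namespace Microsoft.HybridCompute, or in Azure PowerShell it's Register-AzResourceProvider -ProviderNamespace Microsoft.HybridCompute. Note that registration can take a few minutes to complete.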

Azure Key Vault References

In this video I show you how to move application secrets into Azure Key Vault without any code changes. I do this by using a vault access policy.

Note: You'll have to ignore my managed identity references in this video; I didn't use them.
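For anyone who just wants the gist, the usual no-code-change mechanism here is App Service Key Vault references: you swap the secret value in your app settings for a reference of the form @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/MySecret/) (vault and secret names here are placeholders), and App Service resolves it at runtime once the web app has been granted Get access to secrets via the vault access policy.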

Azure Managed Identities

In this video I show you how to leverage Azure Managed Identities to allow access between Azure resources.

(excuse the audio quality... I need to improve on this)
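If you'd rather read than watch, here's the shape of it: a minimal C# sketch using the Azure.Identity and Azure.Storage.Blobs SDKs (the storage account name is a placeholder). When this runs on an Azure resource with a managed identity, no key or connection string ever appears in code or config.

    using System;
    using Azure.Identity;
    using Azure.Storage.Blobs;

    // DefaultAzureCredential resolves to the managed identity when running on Azure,
    // and falls back to your developer login when running locally.
    var credential = new DefaultAzureCredential();

    // Hypothetical storage account; access comes from an RBAC role assignment,
    // not a secret in configuration.
    var blobService = new BlobServiceClient(
        new Uri("https://mystorageaccount.blob.core.windows.net"),
        credential);

    foreach (var container in blobService.GetBlobContainers())
        Console.WriteLine(container.Name);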

Serverless on my server

So I’ve been looking for a serverless framework that can run on-prem and in the cloud. I’ve been leaning towards OpenFaaS as it appears to be gaining more traction, however I love Azure Functions and thought let’s see if this is a viable solution.

I downloaded what is still a preview, so I wasn’t expecting miracles. I’m sharing the reasons why I can’t use it for my own requirements below.

It might save some of you the effort. I must reiterate that this is still a preview, so some of the stuff I say here will be out of date really quickly!

I decided against Azure Functions on-prem in March 2017 because:

  • It needs SQL Server, and I can’t rely on having this, at least not for some brownfield projects I want to use serverless for.
  • It needs IIS, and I have to run on Linux (might be a solved problem… especially as it’s using the new .NET Core runtime).
  • It only has JavaScript and C# language support in preview; I need Java, and Go and Python would be nice-to-haves.
  • The packaging was a Windows installer; I was hoping for some Docker images. I expect this will be solved eventually, and for now the MSI is a quick win for the developers.

Next it’s down the rabbit’s burrow with OpenFaaS on Kubernetes, cross your fingers for me!


Aside from the above, which are mostly external limitations, it’s nice to see Azure Functions running locally:

image
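For context, a function in this preview is just a script file; the stock C# HTTP trigger of the era looked roughly like this (a sketch of the v1 default template, not code from my setup):

    // run.csx - Functions v1-era C# script with an HTTP trigger
    using System.Net;

    public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
    {
        // Log to the local runtime console and echo a greeting back.
        log.Info("C# HTTP trigger processed a request.");
        return req.CreateResponse(HttpStatusCode.OK, "Hello from a local Azure Function");
    }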

Out with the old in with the new(er)

With 2016 drawing to a close and 2017 already in full swing for me, I thought this was a good opportunity to reflect on how 2016 went and what 2017 has in store for me from a technological point of view.

2016

If asked how 2016 was from a professional perspective I’d probably try to sum it up as follows: “Technology continued to roll out at an ever increasing pace; not only was new technology appearing faster than ever before, existing technology stacks started to iterate and churn under our feet!”

Nearing the end of 2016 I finally admitted defeat and realized that I can’t keep up with everything. While I sure am greedy and want to know everything about everything, it was getting to the point that I was becoming a ‘Jack of all trades and master of none’, dare I say a full stack developer! I’d actually like to think I’m master of some, but it was certainly a big effort to stay on top of everything.

What did I get up to?

Azure: I got certified in Azure; this was without doubt my most prized professional achievement of 2016. I’ve been using Azure for years and I feel quite confident in acclaiming it to be the best public cloud in the world today.
I’ve also started work on a state of the art data distribution network using serverless architecture. I finally got down and dirty with Swagger/API Apps/Logic Apps/Azure Functions.
I got a lot better at networking, load balancing and resiliency; Azure/AWS causes a devops inner persona to ooze its way to the top.
I listened with bated breath to the weekly Azure Podcast to see what was new (and always scratched my head when Cale got excited about blockchains; perhaps next year I’ll look back and kick myself for not being an early adopter, it does seem to be an area that’s heating up).

AWS: I got certified as an AWS Solutions Architect; it was great to get a better understanding of AWS, and indeed for a few offerings I’d choose them over Azure. Got heavily involved in AWS CloudFormation and helped regain some control over AWS madness.

Google Cloud: I spent a few weeks playing with it just to see how it’s coming along; at least now I’m somewhat informed, but I’d only consider myself a beginner (I’d consider Google Cloud a beginner also; unless they put a massive investment into the portal and services, they simply can’t compete with Azure and AWS).

Docker: I can create images, start and stop them, and understand volumes. I didn’t get as far as any of the clustering techniques such as Swarm, but I see huge value in Docker!

AngularJS: Architected and delivered a cutting edge data visualization system based on Angular 1.x, TypeScript, SCSS and gulp.
Introduced AngularJS into multiple smaller projects.

TypeScript: This is a fantastic language, especially now with all the bells and whistles in v2.1 (not least async/await for ES5 targets). If you are writing any JavaScript you need to learn this; no one will ever convince me that a dynamically typed language is better than a statically typed one. With all the new standards-based features now baked in, it’s certainly taking the industry by storm, and I can’t see how Babel will continue to fight for its place in the world alongside it.

Ionic 2: I wrote another mobile app. I’ve done this in many languages to date: I started out with iPhones and Xamarin C#, moved to Objective-C and Java, and finally settled back on the TypeScript/Angular 2 based Ionic 2 framework. It’s a pleasure to deal with, and with my other investments in the underlying stack it has become a natural fit.

Java 8: Finally spent some time getting up to speed on the new JDK and its offerings. While not strictly Java 8, I’m including Spring Boot, the WildFly 10 application server, CDI, JAX-RS etc. in this section.

Camel: Gained a basic understanding and working knowledge of the Camel EAI framework.

ActiveMQ: I debated about putting this one on here; all queues fulfil the same core requirements to pass messages, right? But I did approach ActiveMQ from three different sides, Camel/C#/Java, so that was interesting.

.NET 2017: I’m now informed about what’s coming down the line. Some interesting things like C# 7 (which I will admit I had to read twice before I saw the value in the language changes), and better support for the web stacks (although I’ll admit with a tear in my eye that I’ve moved to JetBrains software and am unlikely to come back to Visual Studio unless it’s an ASP-based backend).

Client Products: It’s not only the development stacks that have been changing; products in use by my clients have been moving at a rapid pace also, and given they pay the bills, I dedicate a fair amount of time to understanding them in depth.

Resource Consumption:
DNR - Listened to nearly every episode of DotNetRocks.
Hanselminutes - Funnily enough I found DotNetRocks because I used to subscribe to Hanselminutes; I say used to, as I’ve finally given up on Hanselminutes, it appears to have moved in a different direction in the last year or so. Don’t get me wrong, Scott is a great guy, one of the best technical speakers in MS if you ask me, and I even follow the weekly ASP.NET stand-ups he’s in; it’s only the podcast that I gave up on in December.

Other recommended podcasts:
Angular In Action
Javascript Jabber
RunAs Radio
Azure Podcast

2017

As you can imagine it takes a lot of time to get proficient in any of the above stacks I’ve mentioned. I’ve been trying to stay on top of them all, and I’ve now reached the point of realization that I need to let some go (think of Kate Winslet prying Leonardo DiCaprio’s icy fingers off that board she was on, it’ll be oh so sad). I’m going to narrow down the field. I’ll still keep in touch with them, and if I encounter anything I don’t understand then I’ll make it my business to understand it; I simply won’t actively go pursuing them all. I’ve been burnt before with that approach: I learnt Silverlight after all. It wasn’t all bad, as I wrote a Windows Phone app and many WPF apps around the same time, so the experience transferred nicely; it’s just that I’m not writing much WPF these days, so I’ll put effort back in that direction only if and when needed.

Q: So the question remains: Where am I going to put my extended effort this year? 
A: An Azure-first approach. Azure will be the primary topic of my blogs; whether the implementation is in C#/JavaScript/TypeScript/Java I don’t really care. If the backend is .NET or Java, again I don’t really care, but I do intend on blogging on practical use cases for Azure services, and I may even create a video or two!

Happy new year!

Azure vs. AWS Text to Blob with SDKs

This demonstrates what is involved in writing and reading some text to an Azure and an AWS blob.

Use case

What I set out to achieve was to demonstrate how to read and write some text to a blob with the SDKs. Just to make it a little more interesting I decided to use .NET for the reading and Java for the writing.

Obtaining the SDKs

Adding the SDKs was a seamless process: NuGet was used for .NET and Maven for Java.


[screenshots: adding the NuGet package in .NET and the Maven dependency in Java]
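(If memory serves, the packages of the day were WindowsAzure.Storage and AWSSDK.S3 on NuGet, and com.microsoft.azure:azure-storage and com.amazonaws:aws-java-sdk-s3 on Maven.)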


Write

Azure

image
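In code terms there’s very little to it. The post used Java for the writes, but to keep these sketches in one language here’s a minimal C# version with the WindowsAzure.Storage client of the day (the connection string, container and blob names are placeholders of mine):

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    string connectionString = "<storage connection string>";
    var account = CloudStorageAccount.Parse(connectionString);
    CloudBlobContainer container = account.CreateCloudBlobClient()
        .GetContainerReference("demo");
    container.CreateIfNotExists();

    // No temp file needed - the text goes straight to the blob.
    CloudBlockBlob blob = container.GetBlockBlobReference("hello.txt");
    blob.UploadText("Hello from the Azure SDK");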


AWS

image
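The AWS side is similar (again sketched in C#, with the AWSSDK.S3 package; the bucket name is a placeholder, and note that the region must be specified up front):

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    var s3 = new AmazonS3Client(RegionEndpoint.EUWest1); // region must be specified

    // ContentBody lets us write text without touching the file system.
    await s3.PutObjectAsync(new PutObjectRequest
    {
        BucketName = "my-demo-bucket",
        Key = "hello.txt",
        ContentBody = "Hello from the AWS SDK"
    });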


Read

Azure

image
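Reading it back, with the same placeholder names:

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    string connectionString = "<storage connection string>";
    CloudBlockBlob blob = CloudStorageAccount.Parse(connectionString)
        .CreateCloudBlobClient()
        .GetContainerReference("demo")
        .GetBlockBlobReference("hello.txt");

    // DownloadText hands the blob contents straight back as a string.
    Console.WriteLine(blob.DownloadText());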


AWS

image
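And the AWS read, where the SDK hands back a stream rather than a string:

    using System;
    using System.IO;
    using Amazon;
    using Amazon.S3;

    var s3 = new AmazonS3Client(RegionEndpoint.EUWest1);

    using var response = await s3.GetObjectAsync("my-demo-bucket", "hello.txt");
    using var reader = new StreamReader(response.ResponseStream);
    Console.WriteLine(await reader.ReadToEndAsync());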

Conclusions

Both SDKs were trivial to install and use; the Azure SDKs suited my use case a little better in that they didn’t need me to deal with files in my application code (I expect text is not a mainstream use case).

AWS as always relies on the region being specified, which I can’t say I like that much.

Media Indexing In the Cloud

So out of the blue I found myself giving Azure Media Indexing a trial run, for no reason other than I could. This is why I love cloud tech so much: it brings something that would have been very difficult 5-10 years ago within reach of anyone with a public cloud account.

AWS vs Azure

Both AWS and Azure have media services, typically used to manage digital media and serve it up to consumer playback devices at scale.

AWS has Elastic Transcoder and Azure has Azure Media Services; however, only Azure has the ability to dig into audio or video files and extract the text within.

Azure Media Indexer

Azure Media Indexer enables you to make the content of your media files searchable and to generate a full-text transcript for closed captioning and keywords. You can process one media file or multiple media files in a batch. Have a look at this post for some details on how to do it from code: https://azure.microsoft.com/en-us/documentation/articles/media-services-index-content/

The code uploads a file, starts an indexing job, then downloads the results:
Note: The linked source code has a typo; I’ve submitted a pull request so hopefully this will be fixed, but it’s easy to spot.
Also, the download part failed with an exception for me, so I just pulled the results down with a little bit of code on a second pass.

image
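That sample, with the era’s Microsoft.WindowsAzure.MediaServices.Client SDK, boils down to roughly the following sketch (account credentials, file names and the indexer configuration are placeholders; treat this as the shape of the sample rather than the exact code):

    using System;
    using System.IO;
    using System.Linq;
    using System.Threading;
    using Microsoft.WindowsAzure.MediaServices.Client;

    var context = new CloudMediaContext(
        new MediaServicesCredentials("<account name>", "<account key>"));

    // Upload the media file as a new asset.
    IAsset asset = context.Assets.CreateFromFile("podcast.mp3", AssetCreationOptions.None);

    // Find the latest version of the indexer processor.
    IMediaProcessor indexer = context.MediaProcessors
        .Where(p => p.Name == "Azure Media Indexer")
        .ToList()
        .OrderBy(p => new Version(p.Version))
        .Last();

    // Create a job with a single indexing task and wait for it to finish.
    IJob job = context.Jobs.Create("Indexing job");
    ITask task = job.Tasks.AddNew("Indexing task",
        indexer, File.ReadAllText("indexerConfig.xml"), TaskOptions.None);
    task.InputAssets.Add(asset);
    task.OutputAssets.AddNew("Indexing output", AssetCreationOptions.None);
    job.Submit();
    job.GetExecutionProgressTask(CancellationToken.None).Wait();
    // ...then download the output asset's files (the part that failed for me).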


Instead of the code above, you can also upload content and start the indexing job manually with the old portal (though you’ll still need a little code to download the results).

Here’s how:

On your media account upload some content

image
image


Once the content is uploaded, start the indexer process and set a good title, as Azure will reach out to the interweb and use it to seed the language extraction.

image
image


There is no way to download the output from the portal, so use the code I shared above to download the content.

I processed the latest podcast at time of publishing from https://www.dotnetrocks.com/
https://s3.amazonaws.com/dnr/dotnetrocks_1276_news_from_build.mp3 

In hindsight it was possibly not the best podcast to index, as it was recorded live @Build (I expect; I’m two episodes behind on DNR this week so have not listened to it yet). The DNR guys typically have exceedingly good audio, so at some stage it might be worth indexing another episode.

Results

You can find the results here. Initially my knee-jerk reaction was “ah, this is poor”, but after reflecting on it I’m blown away by what was done, and so, sooo easily!

With a bit of editing this can be thrown into Azure Search / SQL Server etc. for full-text search and direct-seek media playback.

See for yourself:

image


For sure it needs some editing, e.g.
I release the eleventh music decode by
should in fact be
I released the eleventh music to code by

but what a great start!!!

Cloud costs: Shut those VMs down

The public cloud is fantastic for numerous reasons. If you’re not faced with some restriction, such as where your data lives or other factors, then my advice is to get away from private clouds and get to the public clouds as fast as your legs can carry you!

However once you’re there it’s not all plain sailing; if you let a team of people loose to play with all these new toys, on the back of your company’s credit card, then costs can start to accumulate very quickly!

Sometimes your VMs are not being used for production, and what invariably happens is that these machines get forgotten about or are left running for no good reason. While there are a few ways to capture such scenarios, what I’ll show you now is a very quick way of scheduling those known VMs to shut down (or start up) on a predefined schedule.

AWS

For AWS the easiest way of scheduling a single standalone VM to shut down is to use the AWS Data Pipeline service.

image

Let’s quickly walk through the workflow:

1) Create new Pipeline with CLI Command

image


2) Enter the Stop EC2 CLI commands

image

Note: This field only shows as one vertical line of text in Chrome, so I modified the styles to show the full command.


You can see that I have two different stop commands. I could combine these into one command with the two IDs, however if one fails then they both fail, which can be problematic if, for example, an instance gets terminated.
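For reference, these are plain AWS CLI calls of the form aws ec2 stop-instances --instance-ids i-0123456789abcdef0 (the instance ID here is a placeholder), one per instance.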

3) Schedule

image


4) Set log file bucket

image


5) Select role

image

Choose custom and then select the two defaults.
Security Note: Roles need to be configured to allow Data Pipeline access to your VMs; please see here: https://aws.amazon.com/premiumsupport/knowledge-center/stop-start-ec2-instances/

6) Done

image

That’s it, you now have a scheduled task that will switch off your VMs nightly. It should be noted that this will spin up a micro EC2 instance for the Data Pipeline run, with a default run time of 50 minutes, so you need to ensure the end justifies the means; better yet, reduce the run time by editing the workflow to e.g. 15 minutes.

image


Azure

In order to achieve the same results with Azure we are going to use Azure Automation.

image
If you’re familiar with Azure you will know that there are currently two ways of creating VMs: the classic approach and the RM (Resource Manager) approach. In this post I’ll show you the RM approach, but feel free to substitute classic in its place with a nearly identical approach.

1) Open or create an Azure Automation account.

image


2) Edit Assets

image
image

Add a variable for the AzureSubscriptionId you’ll be using.
Select your service principal account; you’ll have to search for it to appear.

3) Runbook

We have two options now: we can either use some PowerShell or a graphically defined workflow. Let’s do this with the graphical version; we don’t need to create it, we simply import it from the gallery.
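(If you prefer the PowerShell route, it boils down to authenticating with the service principal credential and then calling the era’s Stop-AzureRmVM cmdlet for each VM in the resource group.)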

image
image

After importing, choose Edit on the runbook

image
image

4) Set inputs

image

Then we set the two Assets we provided earlier, and optionally a ResourceGroupName (to stop all VMs in a resource group) or a VMName. The “Auto” you see above isn’t a keyword; it’s my badly named resource group.

5) Publish

image


6) Set schedule

Go back to the Runbook and choose schedule

image

With the schedule you can specify any of the input parameters and override the defaults if you so wish.

Security Note: Much the same as with AWS, you’ll need to ensure you have permission to access the VMs from Azure Automation; the best option is to create a service principal application. See: https://azure.microsoft.com/en-us/documentation/articles/resource-group-create-service-principal-portal/


Conclusion:

While it does look like the Azure approach is much more convoluted, it is much more powerful. For example, it is very easy to extend the Azure runbook to check all VMs for a “Production” tag and only shut down VMs that are not production (because that would be bad, right!); with AWS, we are simply relying on a feature of Data Pipeline that allows us to run simple CLI commands.

Pricing is much of a muchness between the two; with Azure you can run for free (up to a limit)

image

With AWS, the 15 minutes with a micro instance is not even worth worrying about.

Web App deployment to AWS and Azure

As promised, here is the first instalment of the AWS vs Azure blog post saga; again, I’m trying to remain impartial throughout.

What I intend to outline at this stage is how to get started deploying a new application to AWS and to Azure from within Visual Studio. I’m sure there are those of you that are shouting, “.NET, Visual Studio, Azure? Of course Azure will do it better!!!” However, rest assured this is only the first of a few posts related to Azure App Service and AWS Elastic Beanstalk, and AWS doesn’t fare all that badly.

Sample Application

The sample application in this case is just a File/New ASP.NET MVC 5 project using .NET 4.6.1. I’m only hitting the home page as a test and not worrying about databases for now (databases will make another interesting series of blog posts!).

AWS Elastic Beanstalk

AWS has an AWS Toolkit plugin for Visual Studio; this allows you to view and manipulate AWS resources.

image

It also lets you publish applications to AWS by right-clicking on the solution and choosing “Publish to AWS”.

image


Once you choose this option you’ll be presented with a dialog that lets you choose your environment or create a new one.
image


If you don’t already have one, let’s create one; you will choose a name for the environment.

image


Next you choose your instance size (the underlying VM size, or any custom Amazon Machine Image you’ve created previously). Another option of interest is to use a non-default VPC; this is basically the network you’ll be running on, and all AWS accounts get a default VPC per region (and if you delete it you’ll need to contact AWS to get it back!). The option of single instance environment is selected here as this is just a test. If I wasn’t running in single instance mode, I would be able to Enable Rolling Deployment to keep my app running while it gets updated (more about that here: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.rollingupdates.html)

image


Lastly we choose the application settings; I’m just deploying a .NET 4 runtime debug application.

image

Once you review and finish, you can see your application start deploying in the portal.

image

Once it’s finished, which can take a few minutes after the upload, you should see the Health go Green and you can access your application.

image
image


Note: If you’re following along and wish to stop this Elastic Beanstalk environment to minimize costs/free tier bandwidth, then please ensure to terminate it from the Elastic Beanstalk section of the console. Stopping the underlying EC2 instance will only serve to signal the auto scaling group it belongs to, to start a new instance and restore the health of this application.

Azure App Service

Now let’s deploy this same application to Azure. Right-click on the solution in Solution Explorer and choose Publish.

image


Choose to publish to Azure.

image


Like AWS, where we chose a server environment, with Azure we need to choose an app hosting plan. You can sign up for a free trial; if you have a subscription you can choose to deploy a free web app (you get 10 free per region, with some limitations which we are not concerned with just now).

image


After creating this new hosting plan we arrive back at the publish dialog

image
image

Visual Studio then starts the publish task and opens the application in the default web browser you’ve specified in Visual Studio.

image

You can also see your new application coming to life in the Azure portal: http://portal.azure.com

image


Summary

So in this blog post I’ve run through how to deploy applications to PaaS offerings on AWS and Azure. In the next post I’m going to drill down and do some more comparing and contrasting of these two platforms. Stay tuned!