Microsoft OpenHack DevOps – Stockholm

Last week I participated in OpenHack DevOps, organized by Microsoft in Stockholm. It was a three-day event from 5th to 7th February 2019, focused on Azure DevOps. The format of the event was, of course, that of a hackathon.

Participants were divided into teams (4-6 people each). The content was set up as challenges (9 in total). Every team (or pair of teams) had a coach, someone from Microsoft, who helped and guided us through the challenges. We, as teams, were expected to find our own way to solve them. There was no single right way; we were free to take whatever decisions and paths we deemed fit.

Here is the schedule and agenda for the event if you are curious: Microsoft OpenHack DevOps

What I really liked about this event was that we were focused on the task at hand. It was hands-on, rather than attending a tech talk about a specific topic. The tasks were set, the challenges were well organized, the environment was prepared, and the code was almost ready (needing only some changes), so that we could focus on learning how to use Azure DevOps as a tool to ensure zero downtime for a production-ready application. Kubernetes was chosen as the orchestration framework.

Microsoft OpenHack is a developer focused event where a wide variety of participants (Open) learn through hands-on experimentation (Hack) using challenges based on real-world customer engagements designed to mimic the developer journey.

For every challenge, links to documentation and resources were provided to help us understand the relevant topics and areas at hand.

Besides the actual work, it was a great opportunity to network and discuss broader topics with fellow participants and Microsoft employees. They also had what they called envisioning sessions, which were basically one-on-one discussions with Microsoft folks about the challenges one is facing at work.

Overall, I think it was a great learning experience with a strong focus on getting things done. I will definitely keep an eye out for such events in the future. By the way, if you happen to be in Stockholm in April, there is an upcoming Microsoft Ignite Tour that you should check out.

Cheers

Accessing Azure Analysis Services Models using .NET Core

Azure Analysis Services is a fully managed platform as a service (PaaS) that provides enterprise-grade data models in the cloud. Use advanced mashup and modeling features to combine data from multiple data sources, define metrics, and secure your data in a single, trusted tabular semantic data model. The data model provides an easier and faster way for users to browse massive amounts of data for ad-hoc data analysis.

Refer to the official Microsoft documentation to read more about Azure Analysis Services.

Programming against Analysis Services is nothing new; we have been doing it for a long time with the full .NET Framework, most commonly using ADOMD.NET. In this blog post, I will go through the process of getting the same task done with .NET Core. For this sample, I'm using .NET Core 2.1.

Look at my GitHub repository for the entire source code for this blog post.

The important thing to note here is that there is no official NuGet package from Microsoft for ADOMD.NET yet, but I found an unofficial package (Unofficial.Microsoft.AnalysisServices.AdomdClientNetCore) and it seems to work for my quick test (you have to make the call whether you want to use it in production or not). I couldn't find any official word on this anywhere I looked. Apart from this NuGet package for .NET Core, the rest of the code should work the same on the full framework (with the official NuGet package for ADOMD.NET).

I have divided this into several steps so that it is easy to follow. So let’s get started!

Step 1: Create an Azure Analysis Services resource

The very first thing we need is the Analysis Services server and a model in Azure. Follow this quickstart to create the server.

Next is to create a model which we will query. You can create a model with sample data (Adventure Works) right from within your Analysis Services server.

Click ‘Manage’ in the blade and click ‘New Model’. Select ‘Sample data’ from the drop-down and press ‘Add’. It should add the model for you.

Creating a model in Analysis Services
Model successfully created

Step 2: Create an App Principal

There are many ways to access Analysis Services. The simplest is using a connection string that contains a username and password. But this is not the recommended approach, and it works only with the full .NET Framework, not .NET Core (bdebaere pointed this out to me in his GitHub repo), so we want to authenticate with other OAuth flows. For this post, we will use token-based authentication, for which we will need an app principal (an Azure AD App).

  1. Sign in to the Azure Portal.
  2. Navigate to Azure Active Directory -> App Registrations and click New application registration.
  3. Register an app with the following settings:
    • Name: any name
    • Application type: Web app/API
    • Sign-on URL: https://westeurope.asazure.windows.net (not really important here; you can provide any valid URL)
  4. Once the app is created, navigate to ‘Keys’ and add a new key:
    • provide a description, select a duration and press the Save button
    • after that you will see the key; it appears only once, so take note of it as we will use it later on
  5. Also take note of the Application ID from the main page of the application.
Setting access key for Azure AD App

Step 3: Assign your user as Service Admin in order to connect from SSMS

Registering an app is not enough. We need to give this app access to the Analysis Services model (the adventureworks model we created in the previous step). To grant this access, we will need SQL Server Management Studio (SSMS).

Before we can do that, we need a way to connect to this Analysis Services instance via SSMS. For this, we need to set up our account as Service Admin. Navigate to the Analysis Services resource that we created in the first step and click ‘Analysis Services Admin’. Normally your subscription account is set as the admin (this is what I will be using), but you are free to set up any account you deem appropriate.

Setting Service Admin for Analysis Services

Step 4: Grant permissions to the app principal on the model

  1. Connect to the Analysis Services server using SSMS 2017 with the account that you assigned as Service Admin in the previous step
    • You will need the server name (from Step 1)
  2. Select the database model and, under Roles, add a new role
    • Choose any name
    • Select the ‘Read’ database permission for the role
  3. Add the service principal as a member of the role in the format below (search for the app name)
Connect to Azure Analysis Server with Service Admin account
Adding App principal as a member in newly defined role

This will add the user with the following convention: app:<appid>@<tenantid>

appid – the Application ID of the app you created in Step 2.

tenantid – the ID of your Azure Active Directory tenant (you can find this in the Properties of your Azure Active Directory).

This didn’t work for me when I tried to use an Azure subscription with my personal (hotmail) account, so I had to use my company account’s subscription to make this work.

Step 5: Write the code to access the data

Now we are all set to write our code that reads from the model. Please refer to my GitHub repository for the entire source code.

The important method here is GetAccessToken. I’m using ADAL.NET (NuGet: Microsoft.IdentityModel.Clients.ActiveDirectory) to grab a token for the service principal from Azure AD.

Method to acquire token from Azure AD to access analysis services
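In case the screenshot is hard to read, here is a minimal sketch of what such a method can look like with ADAL.NET; the tenant ID, application ID, key and resource endpoint below are placeholders for the values you noted down in Step 2:

using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

//a sketch of acquiring a token for the app principal from Azure AD
//replace the placeholders with your own values from Step 2
private static async Task<string> GetAccessToken()
{
    const string tenantId = "<your-tenant-id>";
    const string appId = "<your-application-id>";  //Application ID from Step 2
    const string appKey = "<your-app-key>";        //key created in Step 2
    //the resource to request a token for: your regional Analysis Services endpoint
    const string resource = "https://westeurope.asazure.windows.net";

    var authContext = new AuthenticationContext($"https://login.microsoftonline.com/{tenantId}");
    var credential = new ClientCredential(appId, appKey);

    AuthenticationResult result = await authContext.AcquireTokenAsync(resource, credential);
    return result.AccessToken;
}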

Once we have the token, we are good to access data from the model. Here I’m using the unofficial NuGet package for ADOMD.NET that I mentioned previously. The correct connection string format is:

Provider=MSOLAP;Data Source=<url of the Azure Analysis Server>;Initial Catalog=<modelname>;User ID=;Password=<access token here>;Persist Security Info=True;Impersonation Level=Impersonate

User ID is left empty and Password is the access token which we get from Azure AD. 

Method to read data from Analysis Services Model adventureworks
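The reading code itself boils down to something like the following sketch, using the unofficial ADOMD.NET package mentioned above; the server URL and the DAX query are placeholders you would adapt to your own server and model:

using System;
using Microsoft.AnalysisServices.AdomdClient;

//a sketch of querying the model, passing the access token as the password
//adapt the Data Source and the DAX query to your own server and model
private static void ReadModelData(string accessToken)
{
    var connectionString =
        "Provider=MSOLAP;" +
        "Data Source=asazure://westeurope.asazure.windows.net/<yourserver>;" +
        "Initial Catalog=adventureworks;" +
        $"User ID=;Password={accessToken};" +
        "Persist Security Info=True;Impersonation Level=Impersonate";

    using (var connection = new AdomdConnection(connectionString))
    {
        connection.Open();
        using (var command = new AdomdCommand("EVALUATE TOPN(10, 'Customer')", connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(reader.GetValue(0));
            }
        }
    }
}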

If you run this, you will see the output in the console:

Final output of the program

Have you tried to work with Azure Analysis Services in .NET Core? How was your experience? I would be very interested in hearing about your experiences and challenges.

Cheers

Are you keeping your secrets secret?

As developers, we are all guilty of leaking sensitive information about our applications and systems more than we would perhaps like to admit. Don’t get me wrong, I’m not talking about breaking NDAs with our clients, but about those little connection strings to our databases and keys to our storage accounts that hold all the information we want to protect from the outside world. As developers, we understand that this information is sensitive, and still we check it in along with the rest of our code base, which eventually ends up in a version control system hosted in the cloud somewhere. And we firmly believe that this is going to be fine. This is akin to watching a dreadful accident on the news and saying it can’t happen to us.

“The Cloud” is all beautiful and powerful, but with power comes responsibility. Among all people, we developers should be well aware of, and feel responsible and accountable for, handling the sensitive information of our applications. We must ensure that it doesn’t leak under any circumstances that could jeopardize our systems (and also our positions). And believe it or not, the cloud has made these things easier, and we have all the tools we need to make it happen. It is not complicated anymore; you just need to be a bit thoughtful and set up the habit when starting a project (or, even better, introduce these practices if you end up in a project where they are lacking).

OK, enough talking, let’s get to the meat of this post. In this blog post, I will create a web application (hosted in Microsoft Azure) which lists the URIs of blobs from a container in an Azure Storage account. For the sake of simplicity, I’m creating the container and a dummy blob at runtime. The web app is a standard ASP.NET Core web app. I will present two implementations of this scenario.

In the first implementation, the connection string of the Storage account is stored directly in the Application Settings of the web application. In the second part, I will move that connection string out of the web application, keep it somewhere safe, and fetch information from the Storage account as before. Both applications are deployed using ARM templates and are completely automated (you can create these resources manually if you so wish).

Part I

Here is the source code for this part. I will refer to some bits and pieces of this below:

Here is what the setup looks like:

Web App reading from Storage Account directly using Connection String

If you look at the source code (here), in my WebAppWithSecrets solution, I have two projects:

Solution structure – WebAppWithSecrets

The deployment project contains the ARM template for provisioning the resources. The ARM template does the following:

  • provisions a Storage Account
  • provisions the web application
  • adds ‘StorageAccountConnectionString’ as a Connection String in the Web Application

You will find instructions to deploy this template in the GitHub repository at the above URL (if you want to try it out yourself). Once the resources are provisioned, this is how my resource group looks in the Azure Portal.

Three resources provisioned by ARM template in Azure subscription

If I navigate to my Web Application and look at the Application Settings, this is what I see: a connection string with the name ‘StorageAccountConnectionString’ 

Connection String set in Web Application after deployment

Once the resources are provisioned and in place, we can push the code to our web application. You can push this in a variety of ways but for now, I will just right-click and publish (WebAppsWithSecrets project (don’t do this in production). Once the code is pushed successfully, the web application will open up in my browser (otherwise navigate to the web application). This is how it looks for me (the red box is my custom code that lists the URIs):

At this point, the app is fully functional and there is nothing wrong here. If I navigate to the appsettings.json file in my source code, I will find this: a connection string to my storage account, which my code picks up and uses to fetch content from the Storage account (you will find the code that renders these URIs in the Index.cshtml.cs file).

Connection String in appsettings.json in the solution
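For reference, the blob-listing logic in Index.cshtml.cs boils down to something like this minimal sketch (assuming the WindowsAzure.Storage package; the container name is just an illustration):

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

//a sketch: parse the connection string, enumerate the blobs in a container
//and collect their URIs for rendering on the page
public static async Task<List<Uri>> GetBlobUrisAsync(string connectionString)
{
    CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient client = account.CreateCloudBlobClient();
    CloudBlobContainer container = client.GetContainerReference("mycontainer");

    var uris = new List<Uri>();
    BlobContinuationToken token = null;
    do
    {
        BlobResultSegment segment = await container.ListBlobsSegmentedAsync(token);
        token = segment.ContinuationToken;
        foreach (IListBlobItem item in segment.Results)
        {
            uris.Add(item.Uri);
        }
    } while (token != null);

    return uris;
}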

These details are now visible not only to me; they are checked in to my source control.

We can all argue that it is not that bad after all, since this is hosted in GitHub (or Azure DevOps, or whatever version control system you are using). These are secure systems and only our team members have access to them, so why bother?

This all holds true until someone gets hold of your GitHub account, or you haven’t configured the permissions correctly and someone who shouldn’t have the details gets access. Put simply, we are increasing the attack surface and we might have to deal with some unwanted situations later on. The good news is we don’t necessarily have to do it this way; there is a better alternative, which I will demonstrate in the next part.

Part II

Here is the source code for this part. I will refer to some bits and pieces of this below:

The alternative approach is to keep such sensitive information in some sort of secured, encrypted vault that we can rely on; our application simply asks that vault for the information without ever holding the details itself. We essentially want to delegate the responsibility of handling our application’s sensitive information to another party, without worrying about the implications. By doing this, we not only simplify our lives but also gain further benefits: it becomes easier to govern these secrets, and we can update keys, connection strings and certificates without changing anything in our application. The good news is that Microsoft Azure has exactly such an offering, called ‘Azure KeyVault’. You can read more about it here.

This is how the redesign of our application looks after introducing Azure KeyVault:

Solution design after introducing Azure Key Vault

The web application only knows about the KeyVault and delegates the responsibility of retrieving the Storage account connection string to the KeyVault.

If you look at the source code (here), in the WebAppWithoutSecrets solution I have two projects, same as before.

I have updated my ARM template for this. The template now does the following:

  • provisions a Storage Account
  • provisions the KeyVault
  • provisions the web application
  • adds ‘StorageAccountConnectionString’ as a Secret in the KeyVault
  • sets the name of the KeyVault in application settings

There is one catch, however. You might say: wait a second! Fine, I moved my storage account’s connection string to the KeyVault, but I still need details of the vault in my web application to read anything from it. Even though I moved the storage account’s secret, my web application now holds the keys to even more sensitive information: all the secrets! I feel you, I’m with you. But don’t worry, this is a solved problem. Microsoft realized this early on and provided a very nice solution for this very problem: a while back they introduced something called ‘Managed Service Identity’, aka MSI (now being renamed to just Managed Identities). You can read more about it here, but in short, it abstracts this detail away so that you don’t need to keep any ClientId/ClientSecret or key for the vault in your web application. You simply enable Managed Identity on your web application (this creates an app principal which gets permissions to read from the vault). Here is a really good primer on MSI if you want a deeper understanding of the feature.

If you navigate to the Program.cs file in the solution, here is the code that hooks the KeyVault into our application (the BuildWebHost method):

public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        //this is where the KeyVault magic happens - we are setting up configuration from
        //Azure KeyVault using Managed Service Identity, without specifying any details of
        //the Azure KeyVault itself (except the URL of the vault)
        .ConfigureAppConfiguration((context, config) =>
        {
            var builtConfig = config.Build();
            var keyVaultUrl = $"https://{builtConfig["KeyVaultName"]}.vault.azure.net";

            //this overload comes with .NET Core 2.1
            config.AddAzureKeyVault(keyVaultUrl);

            //if using 2.0, you should use this approach instead:
            //AzureServiceTokenProvider is the magic piece that makes it seamless to work with MSI
            //var azureServiceTokenProvider = new AzureServiceTokenProvider();
            //var keyVaultClient = new KeyVaultClient(
            //    new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
            //config.AddAzureKeyVault(keyVaultUrl, keyVaultClient, new DefaultKeyVaultSecretManager());
        })
        .UseStartup<Startup>()
        .Build();

If you look closely, all the information we have about the Azure KeyVault in our web application is the name (or URL, if you will) of the KeyVault (KeyVaultName), stored in our appsettings.json (or the Application Settings of the published web app). Everything else is managed by MSI for us.

In the ARM template, we enable MSI for the web application with this property:

"identity": {
     "type": "SystemAssigned"
   }	

This enables the following setting for the web application:

So, as we did in Part I, let’s deploy the template to provision a new set of resources (you will find instructions in the GitHub repository for this). Once the resources are provisioned, this is how it looks for me in the Azure Portal:

Four resources provisioned by ARM template in Azure Account

If I navigate to the Application Settings of my web application, this is what I see:

Notice that there is no connection string as we had earlier; instead there is an app setting ‘KeyVaultName’ which contains the name of the vault. That’s the only information this web application has. Let’s deploy the code for the web application (like earlier, right-click and publish). After a successful deployment, this is what you should see:

Web Application without Secrets after successful deployment

This looks exactly the same as the previous web application, since we haven’t changed the logic. The only change in this web app is that it reads the connection string from the KeyVault.
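Because AddAzureKeyVault merges the vault’s secrets into the regular configuration, reading the connection string looks just like reading any other setting; here is a minimal sketch (assuming the secret is named StorageAccountConnectionString, as before):

using Microsoft.AspNetCore.Mvc.RazorPages;
using Microsoft.Extensions.Configuration;

public class IndexModel : PageModel
{
    private readonly IConfiguration _configuration;

    public IndexModel(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public void OnGet()
    {
        //the key looks like a plain app setting, but the value is resolved
        //from Azure KeyVault by the configuration provider at startup
        var connectionString = _configuration["StorageAccountConnectionString"];
        //...use the connection string to list blob URIs exactly as in Part I
    }
}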

Now let’s uncover some more details. If you navigate to the Azure KeyVault resource and then head over to Secrets, you will see this: 

If you pay attention, you don’t see anything; instead, it tells you that “You are unauthorized to view these contents.” But the web application lists the URIs just fine! Let’s go to the Access Policies tab, where you will see that there is one policy, created for the web application:

Access Policy for Web Application

This was created by the ARM template after the web application was provisioned. This is the section in the ARM template which defines the policy for the web application:

"accessPolicies": [
          {
            "tenantId": "[reference(concat(resourceId('Microsoft.Web/sites', variables('webAppName')), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').tenantId]",
            "objectId": "[reference(concat(resourceId('Microsoft.Web/sites', variables('webAppName')), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').principalId]",
            "permissions": {
              "keys": [ "all" ],
              "secrets": [ "all" ]
            }
          }
        ],
        "tenantId": null,
        "tenantId": "[reference(concat(resourceId('Microsoft.Web/sites', variables('webAppName')), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').tenantId]",
        "sku": {
          "name": "Standard",
          "family": "A"
        }

Since, at the moment, the only allowed access is for the web application, you are not able to see any secrets when you go to the Secrets tab. You can grant access to your own account if you want to see these secrets. Remember to click the Save button after adding the policy, otherwise the settings will not be saved.

Adding Access Policy in Azure Key Vault

Now, when I navigate to Secrets again, I’m able to see them.

Secrets visible after adding access policy for the logged in user in Azure Portal

Remember: if you want to use secrets from the KeyVault during development, you will have to add an access policy for your user in the Azure KeyVault. Otherwise, you will get an access denied exception.

Conclusion:

As we saw, it’s fairly straightforward to keep sensitive information in Azure KeyVault and configure our web application to read these secrets from the vault. Moreover, the Managed Service Identity feature lets us keep even the details of the KeyVault itself out of our web application. This can be applied to many scenarios; it is not bound to web applications only. You can keep connection strings, keys, certificates and all sorts of information you don’t want to hold in your consumer application.

So, how do you handle your secrets? What is your opinion on this topic? I would love to hear more from you; please leave your comments.

Cheers