Microsoft OpenHack DevOps – Stockholm

Last week I participated in OpenHack DevOps, organized by Microsoft in Stockholm. It was a three-day event, 5th–7th February 2019, focused on Azure DevOps and run, as the name suggests, in a hackathon format.

Participants were divided into teams of 4–6, and the content was organized as nine challenges. Every team (or pair of teams) had a coach – someone from Microsoft – who helped and guided us through the challenges. As teams, we were expected to find our own way to solve them; there was no single right answer, and we were free to take whatever decisions and paths we saw fit.

Here is the schedule and agenda for the event if you are curious: Microsoft OpenHack DevOps

What I really liked about this event was the focus on the task at hand. It was hands-on, rather than sitting through tech talks on a specific topic. The tasks were set, the challenges were well organized, the environment was prepared, and the code was almost complete (with some changes needed), so that we could focus on learning how to use Azure DevOps as a tool to ensure zero downtime for a production-ready application. Kubernetes was chosen as the orchestration framework.

Microsoft OpenHack is a developer focused event where a wide variety of participants (Open) learn through hands-on experimentation (Hack) using challenges based on real-world customer engagements designed to mimic the developer journey.

For every challenge, links to documentation and resources were provided to help us understand the relevant topics and areas at hand.

Besides the actual work, it was a great opportunity to network and discuss broader topics with fellow participants and Microsoft employees. They also had what they called envisioning sessions, which were essentially one-on-one discussions with Microsoft folks about the challenges you are facing in your own work.

Overall, I think it was a great learning experience with a strong focus on getting things done. I will definitely keep an eye out for such events in the future. By the way, if you happen to be in Stockholm in April, there is an upcoming Microsoft Ignite Tour that you should check out.

Cheers

Using Postman for Integration Tests

In my recent project, I’m providing some integration services (APIs) that are used by the web application to fetch data from source systems. This is a legacy project that I inherited. To be honest, it was a PoC, and we all know how PoCs are developed and how often they continue to be used in production. That is why I have been rewriting/refactoring this codebase every sprint (I even tweeted about this a while ago) to improve the code quality and clean up a lot of code. Since I don’t have proper code coverage for all the source code, I’m always concerned when continuously developing and refactoring it. I often worry (rightfully so) that I’ll break my API and eventually bring down the system due to some stupid mistake. Until I have good enough code coverage, I want to be confident that I’m not breaking my APIs whenever I push a new release to production. I’ve taken quite a few steps over the past few months to clean up the codebase and keep my services up and running (from setting up a proper CI/CD pipeline to using deployment slots for safer releases), but one thing I’m particularly excited about is Postman.

If you are involved in RESTful API development, chances are you are already aware of Postman, or at least have heard of it. But in case this is the first time you’ve heard of the tool, I’ll give you a very quick intro.

“Postman Makes API Development Simple”

Postman website

Postman is a powerful HTTP client for testing web services. It makes API development faster, easier and better. You can read more about it on their official website.

I’ve been using Postman during development (testing APIs etc.), but in the last few weeks I’ve started writing integration tests for my APIs, so that whenever I make changes or create a new release (for any environment), I can quickly run the collection and make sure everything still functions as expected.

This is not a replacement for proper unit tests (with good enough code coverage), but I find it very useful for quickly verifying that I haven’t broken the core functionality of my solution.

Here are some features I’m using with Postman for my integration tests:

  • Test automation
    • Collections
      • One big collection to run tests for all services in one go
      • Separate collections for each service (API), with calls for each operation the API exposes/supports
      • Extended tests to cover different scenarios
  • Debugging
    • Pre-request scripts and tests for individual calls
  • Environment variables – so I can test the APIs against whichever environment I want

Even though I’ve used quite a few features so far, it doesn’t end here. Here is how I plan to take advantage of other features as I go along:

  • Integrating Postman into my CI/CD pipeline (via newman) so that I can automatically swap the slots for my web application if everything is green
  • Mocking
  • Continue to extend my tests cases to increase coverage

How are you using Postman in your projects? Have you found anything interesting/useful that you would like to share?

Cheers

Accessing WCF Service via Azure Service Bus Relay with .NET Core

Check out my GitHub repository if you want to dig into the source code directly.

WCF (Windows Communication Foundation) has served us for a long time when it comes to talking to many LOB systems (SAP etc.). You might have a love or hate relationship with WCF (though if I had to guess, it would be on the hate side), but it works, and at times we don’t have any other choice. If you happen to be talking to legacy systems via WCF, you have most likely built an API layer on top of it to serve your clients.

I was recently looking into consuming WCF services with .NET Core (with the intention of migrating my web application to .NET Core). Very quickly I came across this GitHub repository, which was awesome. However, this .NET Core implementation is not at parity with the full framework yet. I tested it, and it worked just fine with standard bindings (HTTP etc.). Shayne Boyer has a good blog post on this topic if you want to see how to use it.

As far as my needs (this particular scenario) are concerned, however, I couldn’t get it to work. This is how my setup looks:

The web application (hosted in Azure Web App) talks to WCF Service (running on-premises, hosted in IIS) via Azure Service Bus Relay (in Azure)

So basically, there is no equivalent of basicHttpRelayBinding in the .NET Core implementation yet, and that is what my WCF service needs to enable communication via Azure Service Bus Relay.

After discussing with a friend of mine, I got the suggestion to try it with plain SOAP instead, and it seems to work fine! I know it sounds a bit ironic that on the one hand I’m moving to the latest (and greatest) framework for my web application, while at the same time I’m going one step back and using the SOAP protocol to make it work – well, such is life. I would very much appreciate it if anyone could point me to a better solution, but until then, let’s continue forward.

So, in this blog post, I will go through how I managed to call a WCF service from my .NET Core web application. Again, here is the entire source code.

So this is what I’m going to do: 

  • Create Azure Service Bus Relay (read here how to create this)
  • Create a WCF Service (with Azure Service Bus Relay Binding) and publish this locally to run it on IIS
  • An ASP.NET Core web application calling the WCF service using an HTTP POST SOAP request

Follow the link I provided above to create the service bus relay. Let’s start with the WCF service. Here is the sample WCF project I built for this demo. It’s a very basic WCF service which has an Azure Service Bus Relay binding besides the standard HTTP binding (check out the web.config for binding details). I have deployed it in IIS on my local machine. You can clone the repository and add the details of your relay if you want to try it out yourself. It exposes the following contract, which is what I’m interested in:

[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    string GetCustomerData(string customerId);
}

When you browse this service from IIS you should see the service like this. At this point, a Service Bus WCF listener should also be registered in your namespace (you can check it in Azure Portal)

WCF Service deployed in IIS with Azure Service Relay Binding
WCF Relay Registered from WCF Service

Now, let’s start with the web application. I have created a vanilla ASP.NET Core Web API project (2.1). You can grab the source code from here (and clone it if you want to try it out yourself). Make sure you update the settings in the appsettings.json file for the Azure Service Bus Relay. If you wish to publish this to an Azure web app, add these settings as Application Settings.

After creating the project, I added the following NuGet package:

  • Microsoft.Azure.ServiceBus

In ValuesController, I have created a GetCustomerData method that takes in customerId as a string. Here, I make a call to the WCF service and return the response I receive (the WCF service returns a dummy response):

[HttpGet("{customerId}")]
public async Task<string> GetCustomerData(string customerId)

Then, moving forward, I generate the token for the Azure Service Bus Relay (a minimal sketch follows below):

Generating Token for Service Bus Relay
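The token is a standard Service Bus shared access signature. Here is a minimal sketch of what the generation can look like, assuming the relay URL, key name and key come from appsettings.json (the class and parameter names here are mine, not the exact repository code):

using System;
using System.Globalization;
using System.Net;
using System.Security.Cryptography;
using System.Text;

public static class RelayTokenHelper
{
    //Builds a Service Bus shared access signature for the relay endpoint,
    //e.g. resourceUri = "https://<namespace>.servicebus.windows.net/<relayName>"
    public static string GetSasToken(string resourceUri, string keyName, string key)
    {
        //token is valid for one hour from now
        var expiry = DateTimeOffset.UtcNow.ToUnixTimeSeconds() + 3600;
        var stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;

        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key)))
        {
            var signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

            return string.Format(CultureInfo.InvariantCulture,
                "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
                WebUtility.UrlEncode(resourceUri),
                WebUtility.UrlEncode(signature),
                expiry,
                keyName);
        }
    }
}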

With this information, I construct the HTTP POST request. We are basically making an HTTP POST request to the Service Bus Relay endpoint over SOAP. For this, we need to set up a few headers (see the sketch after this list):

  • ServiceBusAuthorization – the relay token retrieved earlier
  • SOAPAction – the operation you want to call on the WCF service
  • Accept header – text/xml

Lastly, we need to set the body for the request – since this is a SOAP call, we need a SOAP message.

SOAP message body
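Putting the headers and the body together, the call can look roughly like the sketch below. The http://tempuri.org/ namespace and the SOAPAction value are assumptions based on the default WCF contract namespace; check the WSDL of your own service for the exact values:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class RelaySoapClient
{
    //Minimal SOAP 1.1 envelope for the GetCustomerData operation
    private const string SoapEnvelope =
@"<s:Envelope xmlns:s=""http://schemas.xmlsoap.org/soap/envelope/"">
  <s:Body>
    <GetCustomerData xmlns=""http://tempuri.org/"">
      <customerId>{0}</customerId>
    </GetCustomerData>
  </s:Body>
</s:Envelope>";

    public static async Task<string> GetCustomerDataAsync(
        HttpClient client, string relayEndpoint, string sasToken, string customerId)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, relayEndpoint)
        {
            //Content-Type is text/xml for a SOAP 1.1 call
            Content = new StringContent(
                string.Format(SoapEnvelope, customerId), Encoding.UTF8, "text/xml")
        };

        //ServiceBusAuthorization - the relay token generated earlier
        request.Headers.Add("ServiceBusAuthorization", sasToken);
        //SOAPAction - the operation to invoke on the WCF service
        request.Headers.Add("SOAPAction",
            "\"http://tempuri.org/ICustomerService/GetCustomerData\"");
        request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("text/xml"));

        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();

        //the response is a SOAP envelope wrapping the service's result
        return await response.Content.ReadAsStringAsync();
    }
}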

Press F5 to run the web application, pass in the customerId parameter, and you should get a response from the service:

Successful response from WCF Service

Have you tried to consume a WCF service from a .NET Core application? How did it go for you? I would love to hear from you, and I would appreciate it if you could suggest a better way of doing this as of today. I guess I don’t need to tell you that I don’t like using SOAP 🙂

Cheers.

Accessing Azure Analysis Services Models using .NET Core

Azure Analysis Services is a fully managed platform as a service (PaaS) that provides enterprise-grade data models in the cloud. Use advanced mashup and modeling features to combine data from multiple data sources, define metrics, and secure your data in a single, trusted tabular semantic data model. The data model provides an easier and faster way for users to browse massive amounts of data for ad-hoc data analysis.

Refer to Microsoft official documentation to read more about Azure Analysis Services

Programming against Analysis Services is nothing new; we have been doing it for a long time with the full .NET Framework, most commonly using ADOMD.NET. In this blog post, I will go through the process of getting the same task done with .NET Core. For this sample, I’m using .NET Core 2.1.

Look at my GitHub repository for the entire source code for this blog post.

The important thing to note here is that there is no official NuGet package from Microsoft for ADOMD.NET on .NET Core yet, but I found an unofficial package (Unofficial.Microsoft.AnalysisServices.AdomdClientNetCore), and it seems to work for my quick test (you will have to make the call on whether to use it in production or not). I couldn’t find any official word on this anywhere I looked. Apart from this .NET Core package, the rest of the code should work the same on the full framework (with the official ADOMD.NET NuGet package).

I have divided this into several steps so that it is easy to follow. So let’s get started!

Step 1: Create Azure Analysis Service resource

The very first thing we need is the Analysis Server and model in Azure. Follow this quick starter to create the server.

Next, create the model that we will query. You can create a model with sample data (Adventure Works) right from within your Analysis Services server.

Click ‘Manage’ in the blade and click ‘New Model’. Select ‘Sample data’ from the drop-down and press ‘Add’. It should add the model for you.

Create a model Analysis Services
Model successfully created

Step 2: Create App Service Principal

There are many ways to access Analysis Services. The simplest is using a connection string that contains a username and password, but this is not the recommended approach, and it only works with the full .NET Framework, not .NET Core (bdebaere pointed this out to me in his GitHub repo), so we want to authenticate with another OAuth flow. For this post, we will use token-based authentication, for which we need an app principal (an Azure AD app).

  1. Sign in to Azure Portal.
  2. Navigate to Azure Active Directory -> App Registrations and Click New application registration.
  3. Register an app with the following settings:
    • Name: any name
    • Application type: Web app/API
    • Sign-on URL: https://westeurope.asazure.windows.net (not really important here; you can provide any valid URL)
  4. Once the app is created, navigate to ‘Keys’ and add a new key:
    • provide a description, select a duration, and press the Save button
    • the key will then be shown – it appears only once, so take note of it; we will use it later on
  5. Also take note of the Application Id from the main page of the application
Setting access key for Azure AD App

Step 3: Assign your user as Service Admin in order to connect from SSMS

Registering an app is not enough. We need to give this app access to the Analysis Services model (the adventureworks model we created in the previous step). To grant this access, we will need SQL Server Management Studio (SSMS).

Before we can do that, we need a way to connect to this Analysis Services instance via SSMS. For this, we need to set up our account as Service Admin. Navigate to the Analysis Services resource we created in the first step and click ‘Analysis Services Admin’. Normally your subscription account is set as the admin (this is what I will be using), but you are free to set up any account you find appropriate.

Setting Service Admin for Analysis Services

Step 4: Grant permissions to app principal on the model

  1. Connect to the Analysis Services server using SSMS 2017 with the account you assigned as Service Admin in the previous step
    • You will need the server name (from Step 1)
  2. Select the database model, click on Roles, and add a new role
    • Choose any name
    • Select the ‘Read’ database permission for the role
  3. Add the service principal as a member of the role in the format below (search for the app name)
Connect to Azure Analysis Server with Service Admin account
Adding App principal as a member in newly defined role

This will add the user with the following convention: app:<appid>@<tenantid>

appid – the application id of the app you created in Step 2.

tenantid – the id of your Azure AD tenant (you can find it under Properties of your Azure Active Directory).

This didn’t work for me when I tried to use an Azure subscription tied to my personal (Hotmail) account, so I had to use my company account’s subscription to make it work.

Step 5: Write the code to access data

Now we are all set to write the code that reads from the model. Please refer to the entire source code in my GitHub repository.

The important method here is GetAccessToken. I’m using ADAL.NET (NuGet: Microsoft.IdentityModel.Clients.ActiveDirectory) to grab a token for the service principal from Azure AD (a sketch follows below).

Method to acquire token from Azure AD to access analysis services
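This is a standard client-credentials flow. A minimal sketch, assuming the tenant id, application (client) id and the key from Step 2 are passed in (the method shape here is mine, not the exact repository code):

using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class TokenHelper
{
    //Resource identifier for Azure Analysis Services; match your server's region.
    //Some samples use the wildcard form https://*.asazure.windows.net instead.
    private const string AnalysisServicesResource = "https://westeurope.asazure.windows.net";

    public static async Task<string> GetAccessTokenAsync(
        string tenantId, string clientId, string clientSecret)
    {
        var authority = $"https://login.microsoftonline.com/{tenantId}";
        var authContext = new AuthenticationContext(authority);
        var credential = new ClientCredential(clientId, clientSecret);

        //authenticate as the app principal itself (client credentials flow)
        AuthenticationResult result =
            await authContext.AcquireTokenAsync(AnalysisServicesResource, credential);

        return result.AccessToken;
    }
}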

Once we have the token, we are good to access data from the model. Here I’m using the unofficial NuGet package for ADOMD.NET that I mentioned previously. The correct connection string format is:

Provider=MSOLAP;Data Source=<url of the Azure Analysis Server>;Initial Catalog=<modelname>;User ID=;Password=<access token here>;Persist Security Info=True;Impersonation Level=Impersonate

User ID is left empty, and Password is the access token we get from Azure AD (a sketch of the reading method follows below).

Method to read data from Analysis Services Model adventureworks
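A minimal sketch of the reading method, using the connection string format above. The DAX query and the 'Customer' table name are illustrative only; adjust them to your model:

using System;
using Microsoft.AnalysisServices.AdomdClient;

public static class ModelReader
{
    public static void ReadFromModel(string serverUrl, string accessToken)
    {
        //User ID stays empty; the access token goes in as the password
        var connectionString =
            $"Provider=MSOLAP;Data Source={serverUrl};Initial Catalog=adventureworks;" +
            $"User ID=;Password={accessToken};Persist Security Info=True;Impersonation Level=Impersonate";

        using (var connection = new AdomdConnection(connectionString))
        {
            connection.Open();

            //run a simple DAX query against the sample model
            using (var command = new AdomdCommand("EVALUATE TOPN(10, 'Customer')", connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    //print the first column of every row
                    Console.WriteLine(reader[0]);
                }
            }
        }
    }
}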

If you run this, you will see the output in the console:

Final output of the program

Have you tried working with Azure Analysis Services in .NET Core? How was your experience? I would be very interested in hearing about your experiences and challenges.

Cheers

Are you keeping your secrets secret?

As developers, we are all guilty of leaking sensitive information about our applications and systems more often than we would perhaps like to admit. Don’t get me wrong, I’m not talking about breaking NDAs with our clients, but about those little connection strings to our databases and keys to our storage accounts that hold all the information we want to protect from the outside world. As developers, we understand that this information is sensitive, and still we check it in along with the rest of our codebase, which eventually ends up in a version control system hosted in the cloud somewhere. And we firmly believe that this is going to be fine. This is akin to watching a dreadful accident on the news and saying it can’t happen to us.

“The Cloud” is all beautiful and powerful, but with power comes responsibility. Among all people, we developers should be well aware of, and feel responsible and accountable for, handling the sensitive information of our applications. We must ensure that it doesn’t leak under any circumstances that could jeopardize our systems (and also our positions). And believe it or not, the cloud has made these things easier; we have all the tools we need to make it happen. It is not complicated anymore – you just need to be a bit thoughtful and set up the habit when starting a project (or, even better, introduce it if you end up in a project where it is lacking).

OK, enough talking, let’s get to the meat of this post. In this blog post, I will create a web application (hosted in Microsoft Azure) which lists the URIs of blobs from a container in an Azure Storage account. For the sake of simplicity, I’m creating the container and a dummy blob at runtime. The web app is a standard ASP.NET Core web app. I will present two implementations of this scenario.

In the first implementation, the storage account’s connection string is stored directly in the Application Settings of the web application. In the second, I move that connection string out of the web application, keep it somewhere safe, and fetch information from the storage account as before. Both applications are deployed using ARM templates and are completely automated (you can create these resources manually if you so wish).

Part I

Here is the source code for this part. I will refer to some bits and pieces of this below:

Here is how the setup looks:

Web App reading from Storage Account directly using Connection String

If you look at the source code (here), my WebAppWithSecrets solution has two projects:

Solution structure – WebAppWithSecrets

The deployment project contains the ARM template for provisioning the resources. The ARM template does the following:

  • provisions a Storage Account
  • provisions the web application
  • adds ‘StorageAccountConnectionString’ as a Connection String in the Web Application

There are instructions for deploying this template in the GitHub repository at the above URL (if you want to try it out yourself). Once the resources are provisioned, this is how my resource group looks in the Azure Portal.

Three resources provisioned by ARM template in Azure subscription

If I navigate to my Web Application and look at the Application Settings, this is what I see: a connection string with the name ‘StorageAccountConnectionString’ 

Connection String set in Web Application after deployment

Once the resources are provisioned and in place, we can push the code to our web application. You can do this in a variety of ways, but for now I will just right-click and publish the WebAppWithSecrets project (don’t do this in production). Once the code is pushed successfully, the web application opens up in my browser (otherwise, navigate to the web application). This is how it looks for me (the red box is my custom code that lists the URIs):

At this point, the app is fully functional and there is nothing wrong here. If I navigate to the appsettings.json file in my source code, I find this: a connection string to my storage account, which my code picks up to fetch content from the storage account (you will find the code that renders these URIs in the Index.cshtml.cs file; a sketch follows below).

Connection String in appsettings.json in the solution
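For reference, here is a minimal sketch of what that Part I code can look like (not the exact repository code): the connection string is read straight from configuration and used to list blob URIs. The container name is illustrative, and the WindowsAzure.Storage package is assumed:

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Extensions.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobUriReader
{
    private readonly IConfiguration _configuration;

    public BlobUriReader(IConfiguration configuration) => _configuration = configuration;

    public async Task<List<string>> GetBlobUrisAsync()
    {
        //the connection string comes straight from configuration
        //(appsettings.json locally, Application Settings in Azure)
        var account = CloudStorageAccount.Parse(
            _configuration.GetConnectionString("StorageAccountConnectionString"));

        var container = account.CreateCloudBlobClient().GetContainerReference("democontainer");
        await container.CreateIfNotExistsAsync();

        var uris = new List<string>();
        BlobContinuationToken continuationToken = null;
        do
        {
            var segment = await container.ListBlobsSegmentedAsync(continuationToken);
            continuationToken = segment.ContinuationToken;
            uris.AddRange(segment.Results.Select(blob => blob.Uri.ToString()));
        } while (continuationToken != null);

        return uris;
    }
}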

These details are now known not only to me – they are checked in to my source control.

We can all argue that it is not that bad after all, since this is hosted in GitHub (or Azure DevOps, or whatever version control system you are using). These are secure systems, and only our team members have access to them – so why bother?

This all holds true until someone gets hold of your GitHub account, or you haven’t configured the permissions correctly and someone who shouldn’t have the details gets access. Put simply, we are increasing the attack surface, and we might have to deal with some unwanted situations later on. The good news is that we don’t necessarily have to do it this way; there is a better alternative, which I will demonstrate in the next part.

Part II

Here is the source code for this part. I will refer to some bits and pieces of this below:

The alternative approach is to keep such sensitive information in some sort of secure, encrypted vault that we can rely on, and have our application simply ask the vault for this information without ever knowing the details itself. We essentially want to delegate the responsibility for handling our application’s sensitive information to another party, without worrying about the implications. By doing this, we not only simplify our lives but also gain further benefits: the secrets become easier to govern, and we can update keys, connection strings and certificates without changing anything in our application. The good news is that Microsoft Azure has exactly such an offering, called Azure Key Vault. You can read more about it here.

This is how the redesigned application looks after introducing Azure Key Vault:

Solution design after introducing Azure Key Vault

The web application only knows about the Key Vault, and delegates to it the responsibility of retrieving the connection string to the storage account.

If you look at the source code (here), the WebAppWithoutSecrets solution has two projects, same as before.

I have updated my ARM template for this. The template now does the following:

  • provisions a Storage Account
  • provisions the KeyVault
  • provisions the web application
  • adds ‘StorageAccountConnectionString’ as a Secret in the KeyVault
  • sets the name of the KeyVault in application settings

There is one catch, however. You might ask: wait a second! Fine, I moved my storage account’s connection string to the Key Vault, but I still need the details of the vault in my web application to read anything from it. I may have moved the storage account’s secret, but my web application now guards the key to even more sensitive information – all the secrets! I feel you, I’m with you. But don’t worry, this is a solved problem. Microsoft quickly realized this and provided a very nice solution for this very problem: a while back, they introduced something called ‘Managed Service Identity’, aka MSI (now being renamed to just ‘managed identities’). You can read more about it here, but in short, it abstracts this detail away so that you don’t need to keep any client id/client secret or vault key in your web application. You simply enable a managed identity on your web application (this creates an app principal which is granted permission to read from the vault). Here is a really good primer on MSI if you want a deeper understanding of the feature.

If you navigate to the Program.cs file in the solution, here is the code that hooks the Key Vault into our application (the BuildWebHost method):

public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        //this is where the KeyVault magic happens - we are pulling configuration from Azure KeyVault
        //using Managed Service Identity, without specifying any details of the vault itself (except its URL)
        .ConfigureAppConfiguration((context, config) =>
        {
            var builtConfig = config.Build();
            var keyVaultUrl = $"https://{builtConfig["KeyVaultName"]}.vault.azure.net";

            //this overload comes with .NET Core 2.1
            config.AddAzureKeyVault(keyVaultUrl);

            //if you are on 2.0, use this approach instead
            //AzureServiceTokenProvider is the piece that makes working with MSI seamless
            //var azureServiceTokenProvider = new AzureServiceTokenProvider();
            //var keyVaultClient = new KeyVaultClient(
            //    new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
            //config.AddAzureKeyVault(keyVaultUrl, keyVaultClient, new DefaultKeyVaultSecretManager());
        })
        .UseStartup<Startup>()
        .Build();

If you look closely, the only information about the Azure Key Vault that our web application holds is the name (or URL, if you will) of the vault (KeyVaultName), stored in appsettings.json (or Application Settings in the published web app). Everything else is managed for us by MSI.

In the ARM template, we enable MSI for the web application with this property:

"identity": {
     "type": "SystemAssigned"
   }	

This enables the setting for the web application.

So, as we did in Part I, let’s deploy the template to provision a new set of resources (you will find instructions in the GitHub repository). Once the resources are provisioned, this is how it looks for me in the Azure Portal:

Four resources provisioned by ARM template in Azure Account

If I navigate to the application settings of my web application, this is what I see:

Notice that there is no connection string as we had earlier; instead there is an app setting ‘KeyVaultName’ which contains the name of the vault. That’s the only information this web application has. Let’s deploy the code for the web application (like earlier, right-click and publish). After a successful deployment, this is what you should see:

Web Application without Secrets after successful deployment

This looks exactly the same as the previous web application, since we haven’t changed the logic. The only change in this web app is that the connection string is now read from the Key Vault (a sketch follows below).
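For illustration, once AddAzureKeyVault is in place, the secret surfaces as a regular configuration key, so the consuming code barely changes. A minimal sketch, assuming the secret in the vault is named StorageAccountConnectionString (note that it now shows up as a plain key, not under the ConnectionStrings section):

using Microsoft.Extensions.Configuration;

public class StorageConnectionProvider
{
    private readonly IConfiguration _configuration;

    public StorageConnectionProvider(IConfiguration configuration) => _configuration = configuration;

    public string GetStorageConnectionString()
    {
        //the Key Vault secret 'StorageAccountConnectionString' appears
        //as a plain configuration key - no vault-specific code needed here
        return _configuration["StorageAccountConnectionString"];
    }
}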

Now let’s uncover some more details. If you navigate to the Azure KeyVault resource and then head over to Secrets, you will see this: 

If you pay attention, you don’t see anything; instead, it tells you “You are unauthorized to view these contents.” But the web application lists the URIs just fine! Let’s go to the Access Policies tab, where you will see that there is one policy, created for the web application:

Access Policy for Web Application

This was created by the ARM template after the web application was provisioned. This is the section of the ARM template that defines the policy for the web application:

"accessPolicies": [
          {
            "tenantId": "[reference(concat(resourceId('Microsoft.Web/sites', variables('webAppName')), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').tenantId]",
            "objectId": "[reference(concat(resourceId('Microsoft.Web/sites', variables('webAppName')), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').principalId]",
            "permissions": {
              "keys": [ "all" ],
              "secrets": [ "all" ]
            }
          }
        ],
        "tenantId": null,
        "tenantId": "[reference(concat(resourceId('Microsoft.Web/sites', variables('webAppName')), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').tenantId]",
        "sku": {
          "name": "Standard",
          "family": "A"
        }

Since, at the moment, the only access granted is to the web application, you are not able to see any secrets when you go to the Secrets tab. You can give access to your own account if you want to see these secrets. Remember to click the Save button after adding the policy, otherwise the settings will not be saved.

Adding Access Policy in Azure Key Vault

Now, when I navigate to Secrets again, I’m able to see them.

Secrets visible after adding access policy for the logged in user in Azure Portal

Remember: if you want to use secrets from the Key Vault during development, you will have to add an access policy for your own user in Azure Key Vault. Otherwise, you will get an access denied exception.

Conclusion:

So, as we saw, it’s fairly straightforward to keep sensitive information in Azure Key Vault and configure our web application to read these secrets from the vault. Moreover, the Managed Service Identity feature further helps us keep the details of the Key Vault itself out of our web application. This can be applied to many scenarios – it is not bound to web applications only. You can store connection strings, keys, certificates and all sorts of information you don’t want to keep in your consuming application.

So, how do you handle your secrets? What is your opinion on this topic? I would love to hear more from you, please leave your comments. 

Cheers

Using HttpHandler to deal with cross-domain requests from AngularJS app for SharePoint on-premises

Scenario:

When you go to the newsfeed on your personal site (the mysite newsfeed), you can write a post and target it at different sites you are following, or at your personal feed.

Recently, I had to build the same functionality for one of my projects. The idea was to build the UI using SharePoint’s REST APIs and AngularJS. I am not using Office 365, so I still have SharePoint on-premises.

Problem:

Retrieving the list of sites a user is following is as simple as you would expect: you call the REST endpoint “../_api/social.following/my/followed(types=4)” and you have the result, no-brainer. But when you dig into the results, you discover that even though the list is correct, not every site returned has the newsfeed enabled. As a result, you also get sites that have no newsfeed, and those are the sites you don’t want to display. So, how do you eliminate them?

The problem is that this information is not in the JSON response the REST endpoint returns. The only option you have is to actually check for the SiteFeed feature on each web: you get the site, open the web, and see if the feature is enabled. This is all fine on the server side, but if you implement your functionality on the client side, you end up dealing with cross-domain issues and get an exception, as the browser treats it as a cross-domain request.

Now, if you were dealing with Office 365, this could have been resolved by using Microsoft Azure’s ACS to authenticate your app with SharePoint, and all would have worked just fine. But that’s not the case when you are building your client app with jQuery or Angular against the REST APIs of SharePoint on-premises.

Solution:

So, after struggling quite a bit with this, I ended up creating an HttpHandler in which I retrieve the list of sites the current user is following, iterate through that collection to check for the SiteFeed feature, and return only the sites that have it enabled. With that in place, I could continue building my AngularJS app as I was before hitting this issue.

I am not putting up the entire code for my app here, just some snippets demonstrating the solution. Here is the markup in my view. It’s simply a drop-down that lists the sites I’m following, just like the one you get in SharePoint by default.

<!-- Markup from the view (html file) of my AngularJS app. Only the markup for rendering the dropdown is shown; the rest of the code is omitted for demonstration purposes. -->
<!-- vm.followedSites is the model set in my controller; the select element is bound to it. -->
<div>
    <select ng-model="vm.selectedValue" ng-options="value.Name for value in vm.followedSites">
        <option value="">everyone</option>
    </select>
</div>

Here is my controller that calls the handler to retrieve that data and set the model that my view is bound to.

/* To simplify things, I am only showing the method (inside my controller)
   that calls the HttpHandler to get the desired result. */

function loadFollowedSites(currentUser) {
    //construct the URL of the httphandler that returns the sites with the newsfeed activated
    var url = "/_layouts/15/myapp/Handlers/ClientHandler.ashx?operation=getFollowedSitesWithNewsFeed&username=" + currentUser;

    //make a request to the url and return the result
    return $http.get(url).then(function (response) {
        return response.data;
    });
}

And finally, here is the code in my HttpHandler that retrieves the list of sites I’m following.

/**
Here is the code snippet from my HttpHandler
 
**/  
  public void ProcessRequest(HttpContext context)
  {
      context.Response.ContentType = "application/json";
      context.Response.ContentEncoding = Encoding.UTF8;
      
      string operation = context.Request.QueryString["operation"];
 
      if (!String.IsNullOrEmpty(operation))
      {
        /* Code removed */ 
        if (operation == "getFollowedSitesWithNewsFeed")
        {
            string userName = (context.Request.QueryString["username"] != null) ? context.Request.QueryString["username"] : String.Empty;
            List<SPSocialActor> newsFeedFollowedSites = GetFollowedSitesWithNewsFeed(context, userName);
            context.Response.Write(JsonSerializer.Serialize(newsFeedFollowedSites));
        }
        
        /* code removed */
      }
 
  }
  
  //Retrieves the list of all the sites that specific user is following that has SiteFeed web feature enabled
  private List<SPSocialActor> GetFollowedSitesWithNewsFeed(HttpContext cxt, string username)
  {
      List<SPSocialActor> followedSitesWithNewsFeed = new List<SPSocialActor>();            
      SPServiceContext context = SPServiceContext.GetContext(cxt);
      UserProfileManager profileManager = new UserProfileManager(context);
 
      if (profileManager.UserExists(username))
      {
          UserProfile userProfile = profileManager.GetUserProfile(username);
          SPSocialFollowingManager followingManager = new SPSocialFollowingManager(userProfile, context);
 
          SPSocialActor[] followedSites = followingManager.GetFollowed(SPSocialActorTypes.Sites);
 
          foreach (var actor in followedSites)
          {
              try
              {
                  using (SPSite site = new SPSite(actor.Uri.AbsoluteUri))
                  {
                      SPWeb rootWeb = site.RootWeb;
                      
                      //Check for SiteFeed feature (web)
                      Guid featureGuid = new Guid("15a572c6-e545-4d32-897a-bab6f5846e18");
                      if (rootWeb.Features[featureGuid] != null)
                      {
                          followedSitesWithNewsFeed.Add(actor);
                      }
                  }
              }
              catch 
              { 
                //Handle exception
              }
          }
      }
      else
      {
        //Log - user doesn't exist
      }            
      
      
      return followedSitesWithNewsFeed;
  }

Please let me know if there’s something I have missed or if there is any better way of doing this. I will really appreciate it.

Thanks! Cheers

The specified program requires a new version of Windows. (Exception from HRESULT: 0x8007047E)

Scenario:
I have a custom list definition from which I want to create some lists and associate the OOB approval workflow with them for content approval.

In the schema.xml file for my custom list definition (based on the documents library, i.e., BaseType=1), I set the ModeratedList="TRUE" attribute among others (see the code snippet below):

<List xmlns:ows="Microsoft SharePoint" Title="TestList" Direction="$Resources:Direction;" Url="TestList" BaseType="1"
      xmlns="http://schemas.microsoft.com/sharepoint/" EnableContentTypes="TRUE" EnableMinorVersions="TRUE" VersioningEnabled="TRUE" DraftVersionVisibility="2" ModeratedList="TRUE">

Now, I added a ListAdded event receiver that listens for my custom list definition, so that whenever a list is created based on it, the OOB approval workflow gets associated with the list.

Here is the code snippet that associates the OOB approval workflow with the list


/// <summary>
/// Associate the OOB Approval workflow with the list
/// </summary>
/// <param name="list">the list to associate the workflow with</param>
public static void AssociateApprovalWorklowWithList(SPList list)
{
  //variables
  Guid listId = list.ID;
  Guid webId = list.ParentWeb.ID;
  Guid siteId = list.ParentWeb.Site.ID;
  SPSite site = null;
  SPWeb web = null;

  try
  {
    site = new SPSite(siteId);
    web = site.OpenWeb(webId);
    bool allowUnsafeCurrent = web.AllowUnsafeUpdates;
    web.AllowUnsafeUpdates = true;
    list = web.Lists[listId];

    //Get Approval Workflow Template Base Id
    Guid workflowBaseId = new Guid("8ad4d8f0-93a7-4941-9657-cf3706f00"+ web.Language.ToString("X"));

    //If workflow is already associated, don't re-associate
    if (list.WorkflowAssociations.GetAssociationByBaseID(workflowBaseId) != null)
       return;

    //check if the OOB Workflows site collection feature is activated; if not, activate it
    //(WorkflowId is a Guid constant for that feature, defined elsewhere in the class)
    SPFeature workflowsFeature = web.Site.Features[WorkflowId];

    if (workflowsFeature == null)
      web.Site.Features.Add(WorkflowId);

    // Get workflow history and task list (EnsureHistoryList/EnsureTasksList are sketched after this snippet)
    SPList historyList = EnsureHistoryList(web);
    SPList taskList = EnsureTasksList(web);
    string workflowAssociationName = string.Format("{0} - Approval", list.Title);
    SPWorkflowTemplate workflowTemplate = web.WorkflowTemplates.GetTemplateByBaseID(workflowBaseId);
    if (workflowTemplate == null)
    {
     //Log exception
     return;
    }

    workflowTemplate.AllowManual = false;

    // Create workflow association
    SPWorkflowAssociation workflowAssociation = SPWorkflowAssociation.CreateListAssociation(workflowTemplate,
    workflowAssociationName, taskList, historyList);

    var associationDataXml = XElement.Parse(workflowAssociation.AssociationData);
    // Add workflow association to my list
    list.WorkflowAssociations.Add(workflowAssociation);

    //Set workflow association data
    workflowAssociation.AssociationData = Add_Association_Data(web, associationDataXml);

    // Enable workflow
    if (!workflowAssociation.Enabled)
        workflowAssociation.Enabled = true;

    if (list.DraftVersionVisibility != DraftVisibilityType.Approver)
       list.DraftVersionVisibility = DraftVisibilityType.Approver;

    list.WorkflowAssociations.Update(workflowAssociation);
    list.DefaultContentApprovalWorkflowId = workflowAssociation.Id;
    list.Update();
    web.AllowUnsafeUpdates = allowUnsafeCurrent;
  }
  catch (Exception ex)
  {
    //Log Exception
  }
  finally
  {
    //Dispose the objects
    if (web != null)
       web.Dispose();
    if (site != null)
       site.Dispose();
   }
}
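The EnsureHistoryList and EnsureTasksList helpers referenced above are not shown in the snippet. A minimal sketch of what they can look like (the list titles are illustrative; the real implementation may differ), assuming the usual Microsoft.SharePoint namespaces:

//Hypothetical sketch of the helpers used above: return the web's workflow
//history and task lists, creating them from the OOB templates if missing
private static SPList EnsureHistoryList(SPWeb web)
{
    SPList list = web.Lists.TryGetList("Workflow History");
    if (list == null)
    {
        Guid id = web.Lists.Add("Workflow History", "Workflow history list",
            SPListTemplateType.WorkflowHistory);
        list = web.Lists[id];
        list.Hidden = true;
        list.Update();
    }
    return list;
}

private static SPList EnsureTasksList(SPWeb web)
{
    SPList list = web.Lists.TryGetList("Workflow Tasks");
    if (list == null)
    {
        Guid id = web.Lists.Add("Workflow Tasks", "Workflow tasks list",
            SPListTemplateType.Tasks);
        list = web.Lists[id];
        list.Update();
    }
    return list;
}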

For an explanation of how the workflow template id is constructed (the line that builds workflowBaseId above), please refer to my previous post: Associating OOB Approval workflow with custom list for different Locale (LCIDs)

Problem:
Once I had deployed the solution and everything was hooked up, I created a list based on my custom list definition, and the process ended with the following exception:

Error dialog: “The specified program requires a new version of Windows. (Exception from HRESULT: 0x8007047E)”

But when I go to the site, the list has been created and the workflow is associated properly.

Solution:
After narrowing down the problem, I removed the ModeratedList attribute from the List element in the schema.xml file of the custom list definition (shown in the first code snippet), enabled content approval for the list in code instead, redeployed the solution, and everything worked!

Here is the code for enabling content approval through code (add this to the function defined above, after setting the DraftVersionVisibility property):


  if(!list.EnableModeration)
    list.EnableModeration = true;

Cheers

Associating OOB Approval workflow with custom list for different Locale (LCIDs)

Scenario: I have a custom web template that is used to provision sites. Among other things (branding etc.), a custom documents library (based on my custom documents library template) is provisioned whenever a new site is created. Besides creating the custom library, I need to associate the OOB approval workflow with the library, to be used as a publishing workflow: whenever a user publishes a major version of a document, it goes through the approval process.

Problem/Limitation: Associating a workflow with any list or library is fairly straightforward. Here are the steps:

  1. Fetch the workflow template from parent web where list exist
  2. Get the target list
  3. Create the workflow association
  4. Check if workflow is already associated with the list
  5. If workflow is not associated, add the workflow association to the list

Here is the code snippet for performing this task.

private void AssociateApprovalWorklowWithList(SPList list)
{
    SPWeb web = list.ParentWeb;
    SPSite site = web.Site;
    try
    {
        bool allowUnsafeCurrent = web.AllowUnsafeUpdates;
        web.AllowUnsafeUpdates = true;

        //Associate the approval workflow with libraries
        //based on Custom Documents template only
        if (list != null && list.BaseTemplate.ToString() == "10050")
        {
            //Get Approval Workflow Template Base Id
            Guid workflowBaseId = new Guid("8ad4d8f0-93a7-4941-9657-cf3706f00409");

            //If workflow is already associated, don't re-associate
            if (list.WorkflowAssociations.GetAssociationByBaseID(workflowBaseId) != null)
                return;

            // Get workflow history and task list
            SPList historyList = EnsureHistoryList(web);
            SPList taskList = EnsureTasksList(web);

            //set the name of the workflow association
            string workflowAssociationName = string.Format("{0} - Approval", list.Title);

            //Get workflow template by Base Id
            SPWorkflowTemplate workflowTemplate = web.WorkflowTemplates.GetTemplateByBaseID(workflowBaseId);
            if (workflowTemplate == null)
            {
                //Log - no template found and return
                return;
            }

            // Create workflow association
            SPWorkflowAssociation workflowAssociation =
                SPWorkflowAssociation.CreateListAssociation(workflowTemplate, workflowAssociationName,
                    taskList, historyList);

            // Add workflow association to custom list
            list.WorkflowAssociations.Add(workflowAssociation);

            workflowAssociation.AssociationData = String.Empty;

            //.. details omitted
            list.WorkflowAssociations.Update(workflowAssociation);
            list.DefaultContentApprovalWorkflowId = workflowAssociation.Id;
            list.Update();

            web.AllowUnsafeUpdates = allowUnsafeCurrent;
        }
    }
    catch (Exception ex)
    {
        //Log exception
    }
    finally
    {
        //Dispose objects
    }
}

The above code snippet will work just fine. But there is a little problem.

What if the site is created in a language other than English? One could ask why there would be any problem, since we are fetching the workflow template by base id. Well, it turned out that the base id is not the same across different languages – it varies. So the code failed for sites with a different locale.

There is also another method, GetTemplateByName, but that didn’t work out for me either, even though I set the right culture. So neither of the two methods could be used.

Solution: Since neither method for getting the template worked, and I didn’t find anything by googling either, the only way out of this problem was the following.

Looking at the BaseIds of different sites, a pattern emerged. Let’s examine the BaseIds from English and Swedish sites, as we saw previously:

8ad4d8f0-93a7-4941-9657-cf3706f00409 (English, LCID: 1033)
8ad4d8f0-93a7-4941-9657-cf3706f0041D (Swedish, LCID: 1053)

If you look closely, almost the entire id is the same, except the last three digits. Let’s take some more examples:

8ad4d8f0-93a7-4941-9657-cf3706f00415 (Polish, LCID: 1045)
8ad4d8f0-93a7-4941-9657-cf3706f00406 (Danish, LCID: 1030)

As you can see, only the last three digits vary from language to language.

Now take the LCID and convert it to its hexadecimal representation (thanks to my colleague Matthias Einig for cracking this code 😉). Let’s say, for:

LCID = 1033 => hex = 409

LCID = 1053 => hex = 41D

LCID = 1045 => hex = 415

So it became clear that the last three digits that differ are simply the hexadecimal representation of the LCID. Having identified this pattern, I just convert the LCID to its hex value and there we have it: the BaseId of the approval workflow template for the current language.

So, if we change the line in our code snippet above that constructs workflowBaseId to the following:

Guid workflowBaseId = new Guid("8ad4d8f0-93a7-4941-9657-cf3706f00"+ web.Language.ToString("X"));

With this change, no matter which locale the site has, the code constructs the right workflow template id and fetches the correct workflow template for the association.

Hope it helps some folks out there 🙂