Streaming your Application Insights telemetry to Power BI

Yesterday I got a question from my team about visualizing Azure Application Insights telemetry in Power BI. Although Application Insights provides an extremely intuitive dashboard for a variety of metrics, the main reasons for using Power BI are, first, to avoid having to grant Azure Portal access, and second, to combine the telemetry dataset with other datasets.

The question drove me to research Application Insights and an approach to sending its telemetry to Power BI. In this article, I'd like to introduce that approach and quickly walk you through the steps to build a demonstration.

Scenario

Azure Application Insights aims to help you monitor different aspects of your application regardless of where it is hosted. As part of performance management, Application Insights lets you monitor request rate, response time, failure rate, and load performance. As part of service management (if your application is critical to the business), it lets you monitor availability, exceptions, and so on. It can also monitor custom events in your application; for example, if you build some APIs, you can trace each of their operations. I'm not going to introduce Azure Application Insights in depth here. Instead, I strongly suggest you read this article to better understand how helpful Azure Application Insights is to your application performance management strategy.

In my scenario, I'd like to show my team web application requests by country in Power BI. In other words, from the dashboard, the team knows where people access my web application from. Below is the anonymously published dashboard (link)

In a real-world scenario, the dashboard would have more tiles (snapshots of your data), such as number of users, top five countries, page views, and top search keywords. If you are familiar with Google Analytics, you know what such a dashboard looks like, right?

Creating Azure Web App on Azure

There are several ways to create a new web application on Azure App Service. Doing it via the Azure Portal is the easiest. With PowerShell, you use New-AzureRmAppServicePlan to create an Azure App Service plan to host your web application. If you like practicing with PowerShell, below is a sample script that creates a Standard S1 App Service plan with two instances.
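Here is a sketch of that script. The resource group, plan name, and location are placeholders I made up for this walkthrough; adjust them to your environment.

```powershell
# Log in first if you have not already: Login-AzureRmAccount
# Placeholder names — change to match your environment.
New-AzureRmResourceGroup -Name "my-demo-rg" -Location "Southeast Asia"

# Standard tier, size S1 (Small), two instances
New-AzureRmAppServicePlan -ResourceGroupName "my-demo-rg" `
    -Name "my-demo-plan" `
    -Location "Southeast Asia" `
    -Tier "Standard" `
    -WorkerSize "Small" `
    -NumberofWorkers 2
```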

Next, you could create a default website with New-AzureRmWebApp, but that only gives you the empty default site generated by Azure. Let's use Visual Studio to create the web application instead, because the remaining work needs to be done in Visual Studio anyway. If you deploy from scratch, you can also skip the PowerShell script above; the Azure publishing wizard in Visual Studio even lets you create a new resource group. If you have never deployed a web application to Azure, follow these articles:
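For reference, the PowerShell equivalent is a one-liner (again, the names are placeholders matching the plan created earlier):

```powershell
# Creates an empty default website in the App Service plan created earlier.
New-AzureRmWebApp -ResourceGroupName "my-demo-rg" `
    -Name "my-demo-webapp" `
    -Location "Southeast Asia" `
    -AppServicePlan "my-demo-plan"
```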

In my setup, I deployed the sample ToDoList app with an Azure SQL database. Make sure the Web App Name field is globally unique; this name is also part of your website's domain if you don't use a custom domain.

After the deployment, verify your website in a browser from the Internet. The URL follows the pattern <your_web_app_name>.azurewebsites.net.

Configuring Application Insights

There is no dedicated cmdlet for creating an Application Insights resource (no New-AzureRmAppInsights, for example). Instead, Application Insights can be created in PowerShell with New-AzureRmResource, with the -ResourceType parameter set to Microsoft.Insights/components. This is similar to the Azure Active Directory Domain Services provisioning in this article.
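A minimal sketch, with placeholder names (note that Application Insights is only available in certain regions, so pick a supported location):

```powershell
# Creates an Application Insights resource via the generic resource cmdlet.
New-AzureRmResource -ResourceName "my-demo-appinsights" `
    -ResourceGroupName "my-demo-rg" `
    -ResourceType "Microsoft.Insights/components" `
    -Location "Southeast Asia" `
    -PropertyObject @{ "Application_Type" = "web" } `
    -Force
```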

This article gives you a full PowerShell script to create a new Application Insights resource.

In Visual Studio, the steps are easier. Right-click the web application project and select Application Insights > Configure Application Insights.

Click Start Free. Note that the Application Insights SDK is already added because my template is ASP.NET (.NET Framework). If you create an ASP.NET Core project, the SDK is not added by default. You shouldn't worry about this: after you click Start Free and complete the configuration, Visual Studio does the rest for you, including adding the SDK and connecting your web application to the new Application Insights instance.

When you are asked for the specs of the new Application Insights resource, click Configure settings to specify the location and resource group, and choose an Application Insights plan. For testing purposes, let's go with the free plan, which gives you up to 1 GB of collected data per month. Wait a few minutes for Visual Studio to create and configure Application Insights.

From Solution Explorer, open the ApplicationInsights.config file to check that the InstrumentationKey element has been added. During configuration, Application Insights also adds a bunch of telemetry initializers to the config file.

As mentioned earlier, my scenario is to capture requests by country, so I need to add a client-side script to my master page or target page. Application Insights provides the following JavaScript to capture client-side telemetry.
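The snippet evolves over time, so copy the current version from the Azure Portal (or the Application Insights documentation) rather than from here; in outline, it looks like this:

```html
<script type="text/javascript">
    // Abbreviated outline — get the full, current snippet from the Azure Portal.
    // The loader function downloads the Application Insights JavaScript SDK
    // asynchronously and queues any track* calls made before it has loaded.
    var appInsights = window.appInsights || function (config) {
        /* ...loader code from the portal snippet... */
    }({
        instrumentationKey: "<your_instrumentation_key>"
    });
    window.appInsights = appInsights;
    appInsights.trackPageView();
</script>
```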

Add the script to your web application's master page. The Application Insights configuration does not add this script automatically, so you must add it yourself. If you don't, telemetry is still sent to the Azure Portal, but only the telemetry generated when you run the website locally.

This article provides a great explanation of what the script does.

Replace the instrumentation key with your Application Insights resource's key, which you can copy from the ApplicationInsights.config file or from the Azure Portal.

It’s time for generating dummy data.

Generating dummy data

It does not make much sense to test by just browsing the website from your own computer. Asking your friends helps, but the data is still not rich. Since we want accesses from more countries, we need a way to generate dummy data. There are several options, but I'd suggest the following:

  • Use ShotSherpa to browse the website from different locations. This site uses live public proxies to send requests to the target URL. Locations are configurable, with a maximum of 8 per run.
  • Use Availability feature in Azure Application Insights.

Both work very well and generate plenty of data. If you use ShotSherpa, just copy the website URL into the box and click ADVENTURE TIME. You can pick custom locations or randomize them.

If you use the Azure Application Insights feature, click Availability > Add test.

From the blade, configure the new test. Pay attention to the URL: it should be your website's URL, which by default follows the pattern <your_web_app_name>.azurewebsites.net. In the Test locations setting, select all.

Wait around 10 to 15 minutes for Application Insights to fetch your website.

To verify that your Application Insights resource has data, go to the Analytics portal at https://analytics.applicationinsights.io. The Analytics portal is where you can query your Application Insights telemetry, using the Analytics query language.

Create and run a sample query to see a pie chart of all the countries visiting the website.
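A minimal query along these lines works; client_CountryOrRegion is the country dimension Application Insights records for each request:

```
requests
| summarize count() by client_CountryOrRegion
| render piechart
```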

The chart shows you all the countries from which users visit the website. Of course, this data is powered by both ShotSherpa and the Availability feature.

Configuring Continuous Export

Continuous Export is an Application Insights feature. The idea is to retain Application Insights telemetry by storing it in Azure Blob storage. From Blob storage, you could query the telemetry directly by calling the Blob REST API, or use Power BI to query it. In this article, I'd like to use Stream Analytics instead.

Before configuring Continuous Export, let's create a storage account and container, because Continuous Export writes telemetry as blob files into a pre-defined container. Use the following PowerShell
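Here is a sketch; the account name is a placeholder and must be globally unique (lowercase letters and numbers only):

```powershell
# Creates a general-purpose storage account for the exported telemetry.
New-AzureRmStorageAccount -ResourceGroupName "my-demo-rg" `
    -Name "mydemotelemetry" `
    -Location "Southeast Asia" `
    -SkuName "Standard_LRS"
```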

…then create a new container
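Something like the following, where mydemotelemetry and telemetry are placeholder account and container names:

```powershell
# Build a storage context from the account key, then create the container.
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName "my-demo-rg" `
    -Name "mydemotelemetry")[0].Value
$ctx = New-AzureStorageContext -StorageAccountName "mydemotelemetry" `
    -StorageAccountKey $key
New-AzureStorageContainer -Name "telemetry" -Context $ctx -Permission Off
```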

Now go to your Application Insights resource and click Continuous Export, then click Add. If the Add button is grayed out, your existing subscription cannot use Continuous Export: the subscription must not be a free tier, so that Microsoft can charge you $0.50 per GB if you use the Application Insights Basic plan.

Click the Data types to export setting to select the data types you want Continuous Export to write to Azure Blob storage. In this demonstration, I just need Availability and Requests. On the Storage location blade, select the storage account and container you just created with the PowerShell scripts above.

If you are unsure whether your dummy data has been written to Azure Blob storage, generate it again and check the container. The Continuous Export job creates one virtual directory per configured data type (two in our case). The path follows the pattern below:
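In my container it looks approximately like this (the resource name and instrumentation key are placeholders):

```
telemetry/<app_insights_name>_<instrumentation_key>/Requests/2018-02-20/09/<guid>_<n>.blob
telemetry/<app_insights_name>_<instrumentation_key>/Availability/2018-02-20/09/<guid>_<n>.blob
```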

Each blob contains request records in JSON format. Note that a blob can contain more than one request, one JSON document per line, which makes the full blob content invalid if you paste it into a JSON validator as a whole. Below is a sample I extracted from a blob.
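The real record is much larger; trimmed down to the fields this article cares about, its shape is roughly as follows (all values here are made up):

```json
{
  "request": [
    {
      "name": "GET /Home/Index",
      "count": 1,
      "responseCode": "200"
    }
  ],
  "context": {
    "data": {
      "eventTime": "2018-02-20T09:00:00.000Z"
    },
    "location": {
      "country": "United States",
      "city": "Seattle"
    }
  }
}
```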

By now, we can confidently say we have data. Next, we need to let Stream Analytics query it.

There are a couple of things to note:

  • Continuous Export does not help you generate data. Instead, it connects to your Application Insights tenant database to retrieve telemetry and write it to the target blobs. The default retention of Application Insights is 90 days; with Continuous Export, you get longer retention because the data is stored in Blob storage.
  • You cannot selectively delete telemetry from Application Insights. If you want to clear all data, you must delete the existing Application Insights resource and create a new one.

Configuring Azure Stream Analytics

Azure Stream Analytics is designed for real-time analytics on streaming data. As the name suggests, streaming data often comes in high volumes. Such data is commonly seen in Internet of Things (IoT) deployments, but not only there: it may come from social feeds, weather information, or intensive monitoring systems. Imagine your web application is an e-commerce system and an awesome promotion has just launched: your website receives high volumes of traffic, and you need to stream the requests to Power BI. That is just an example, but this article covers essentially such a case.

Stream Analytics does not just 'throw' your data into the destination repository; it gives you the ability to define a query that manipulates your data before it is streamed. For example, in my scenario, I will query to extract only the country (and maybe city) field from the Application Insights telemetry dataset, which is now stored in my Azure Blob storage.

Alright, go to the Azure Portal and create a new Stream Analytics job. Give your job a name, and choose a resource group, location, and hosting environment. Select Cloud to host the job in Azure. Edge is only used for Azure Stream Analytics in IoT scenarios; when you choose it, you can configure an on-premises IoT gateway.

In your newly created Stream Analytics job, scroll down to find Input. Click Add stream input > Blob storage.

On the Blob storage blade, enter a name for the new input, e.g. telemetryRequest. Because we already created a storage account and container, we don't need to configure Blob storage again; just choose the storage account and container you configured in Continuous Export.

Now, pay careful attention to the Path pattern setting. This is the path the Stream Analytics job uses to look up blobs in the given container. In other words, if the path is wrong, the input returns nothing. The path we see in the container follows the structure below:
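Roughly (names are placeholders):

```
<container>/<app_insights_name>_<instrumentation_key>/<DataType>/<yyyy-MM-dd>/<HH>/<guid>_<n>.blob
```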

This hints that the Stream Analytics job looks up the same pattern. The only thing the job does not support is a parameter for the data type; you have to use the data type's name explicitly, e.g. Requests. So the pattern in my case is
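With my placeholder names, and the {date} and {time} tokens that Stream Analytics substitutes at run time:

```
<app_insights_name>_<instrumentation_key>/Requests/{date}/{time}
```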

The date format must be YYYY-MM-DD to match the format used by Continuous Export. Keep the other settings at their defaults. I would like to keep the same format as the telemetry in the blobs, so JSON is chosen.

Click Save. The Stream Analytics job also performs an input test to check that the path pattern is valid. It is much better to test your input before moving on to create an output. Click [...] and select Sample data from input.

On the Sample data blade, choose a start time earlier than the last-modified date of any blob. If you need all the data, move the start time even further back. Duration is effectively the end of the range. Say my blobs were generated two days ago; my range is as follows:

Click Sample and wait a few minutes. If data exists, you can download it for review; otherwise, Stream Analytics returns a "No events found..." message because the lookup returned nothing. Click Download and open the file in Notepad to verify the body content.

Now we need to create an output targeting Power BI. In the list, you can see the services Stream Analytics supports as outputs.

On the Power BI blade, enter the output name and click Authorize to authorize your Power BI account. The account does not necessarily have to be the same as your subscription's account. After you are authorized, you can select an existing workspace in your Power BI account. Under the Dataset name setting, give the dataset a name; it will be created automatically once Stream Analytics runs the pre-defined query. Also give a name for your table in Power BI. If a dataset with the same name already exists, it will be overwritten.

Now we need to write a query to get things done. The query is written in a SQL-like language, and it needs to handle JSON (because JSON was configured in our input). Let's look at the JSON input again, focusing on the country field we need.

To retrieve the nested country field in the location element, we can just write
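Assuming telemetryRequest is the input alias and powerbi-output the output alias, a minimal sketch is:

```sql
SELECT
    context.location.country AS Country
INTO
    [powerbi-output]
FROM
    [telemetryRequest]
```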

Make sure the input and output names are valid; these are the aliases you configured. Because context is a record type, we can access its nested properties this way. But what about request, which is in array format? Say I want to retrieve the request name.

We can use the GetArrayElement function to retrieve a property of an array. To use it, the field must be an array.

After getting a reference to the array, we can access its properties using requests.<property>. The full query is as follows:
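Here is my full query as a sketch, with the same assumed aliases: the WITH clause flattens the request array via GetArrayElement, then the outer SELECT pushes the three fields to Power BI.

```sql
WITH flat AS (
    SELECT
        GetArrayElement(A.request, 0) AS requests,
        A.context.location.country AS Country,
        A.context.location.city AS City
    FROM
        [telemetryRequest] A
)
SELECT
    Country,
    City,
    requests.name AS RequestName
INTO
    [powerbi-output]
FROM
    flat
```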

To test the query, you must load your data into the input. The steps are similar to the sampling you did when creating the input. Click Test and see what is returned.

Now save the query and go back to the blade. Click Overview, then click Start. On the Start job blade, choose Custom so the job starts earlier than the last-modified date of the telemetry blobs. If you choose Now, you will have to generate dummy data again.

If the job's status changes to Running, you have successfully completed the configuration.

Visualizing your data in Power BI

We have gone through many steps to get data pushed to Power BI. It's time to visualize your telemetry. Log in to Power BI and you will see the dataset that was automatically created under your workspace.

Click the edit icon and start visualizing the telemetry. Note that in the FIELDS panel there are three fields, which Stream Analytics queries and pushes to Power BI. Choose Country and select the Map type in VISUALIZATIONS. Then format the dashboard: increase the title size, highlight the background, or turn on the category label. This is all formatting, for which you need to pick up some Power BI formatting skills.

Once it looks nice, click Save in the corner and name your report. From your report, you can share a web view by clicking File > Publish to web. Follow the further instructions to generate the link.

How does Refresh work?

This is not real-time streaming, despite using Azure Stream Analytics. Telemetry first needs to be written to Azure Blob storage by Continuous Export; the Stream Analytics job then queries each blob file and pushes updates to the Power BI dataset. Data in Power BI needs at least an hour to reach your dashboard; otherwise, you must manually refresh the data in the report.

Real-time streaming can be done with other approaches, such as pure API calls, or using the third-party service PubNub.

Conclusion

This article has given you every step needed to stream telemetry from Application Insights to Power BI. The query I used is simple enough for you to understand what the query language supports; real-world scenarios require more complex SQL queries, sometimes even JavaScript user-defined functions (UDFs). All of this is supported by Stream Analytics.

Finally, again, this is not the only way to show Application Insights data in Power BI. There are more ways to retrieve telemetry:

  • Use Application Insights Content Pack in Power BI
  • Use Power BI Query to query Azure Blob storage
  • Use Application Insights REST API and Power BI REST API


© 2018 The Soldier of Fortune.