How to automatically analyze PowerBI log data using the REST API with Logic Apps and Azure Log Analytics Workspace

Lukas Fuchs
11 min read · Dec 23, 2020

To log, monitor, and alert on events of your PowerBI Service, such as the last dataset refresh, you can use the PowerBI REST API. It turned out to be not quite as straightforward to integrate this data into the Azure ecosystem as I had hoped, so I did some research and hands-on trial and error.

In this post, I put together a detailed step-by-step guide with my learnings, as I was not able to find such documentation anywhere so far :-)

Overview of the components used
This is the result we want to achieve: PowerBI log data automatically sent to an Azure Log Analytics Workspace, where we can set up monitoring and alerting or gain other insights such as dataset processing durations.

Please note: There is also an alternative approach using Azure Functions in PowerShell mode to query the PowerBI Service REST API; it might be an interesting alternative to Logic Apps for you. It would be especially interesting (meaning “cheap”) if you want to implement a high-frequency monitoring solution.

Preconditions

  1. You need an Azure Subscription and Azure AD (AAD) admin privileges.
  2. You need a PowerBI Service subscription and PowerBI admin privileges.

Step 1: Register PowerBI App within Azure AD

There are two possibilities to register your PowerBI app within Azure AD. The first is to use the native PowerBI GUI. The other is to use the Azure AD admin portal (described in the following subchapter), where you have more control over what is happening.

Please note: The following steps require Azure AD admin privileges. Make sure you have those permissions, or hand these instructions to an AAD admin within your organization.

Go to the “App registrations” page in Azure Active Directory.

Select “New registration”

Register the app with a name that fits for you. In my case I use “Azure Power BI Integration App”.

Choose the supported account types that will be allowed to access the app. In most cases the first option applies. In some scenarios, e.g. if you are a consultant coming from a trusted AD tenant, you would probably select the second option (“Multitenant”) instead.

We leave the Redirect URI empty for now. We will get this information later from the custom connector configuration.

Now you are redirected to the app settings page. At the top of the page you can see important information such as:

  • Application (Client) ID -> note the ID, we will use it later
  • Directory (Tenant) ID -> note the ID, we will use it later

Head to “Certificates and Secrets” in this settings page.

Click “+ New client secret”.

Enter a description that allows you to remember what the secret is for.

Enter an expiry date that fits your case. (In my case I don’t want any surprises in one or two years.)

Now you have generated a new secret that can be used. Make sure to note down the secret value; we will need it later on.

You should now have the following values in hand:

  • Application (Client) ID
  • Directory (Tenant) ID
  • App Secret Value

Now we head to “API permissions”. Click “+ Add a permission”.

Scroll down to where “Power BI Service” is listed and select it.

See the PowerBI Service URL at the top? Please note it down, we will use it later!

https://analysis.windows.net/powerbi/api/

Select “Delegated Permissions”.

Select your needed permissions (depending on the log file topics you want to investigate).

In my instance I just want to read everything and don’t need any write permissions.

Please note: there are special permissions on tenant level, as shown in this picture. If you select one of those, admin consent is required.

Confirm by clicking “Add permissions”, and you should now see the selected permissions in the list:

Please note: In the following steps I experienced some problems accessing PowerBI datasets from Logic Apps, which can be fixed at this point. For that, I needed to click “Grant admin consent”. You can leave it as is for now and come back to this point if you experience the same issues, or fix it right away by clicking “Grant admin consent”.
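
Before we move on: if you want to sanity-check the app registration outside of Logic Apps, the following Python sketch acquires a token for the PowerBI REST API with the values noted above. It uses the client-credentials flow via the msal package. Note that a token obtained this way only works against PowerBI if service principal access is allowed in the PowerBI admin portal; the Logic App connector we build below instead signs in interactively with the delegated permissions we just configured.

```python
# Minimal sketch (not part of the Logic App setup): acquire an access token
# for the PowerBI REST API with the values noted above. Requires "pip install msal".
import msal

TENANT_ID = "<directory-tenant-id>"
CLIENT_ID = "<application-client-id>"
CLIENT_SECRET = "<app-secret-value>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# ".default" requests the permissions configured on the app registration
# for the PowerBI resource URL noted above.
result = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)

if "access_token" in result:
    print("Token acquired, expires in", result["expires_in"], "seconds")
else:
    print("Error:", result.get("error"), result.get("error_description"))
```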

Now we leave this configuration area to create the custom connector. Please use a separate window, since we need to return here later to enter the redirect URL that will be generated by the custom connector.

Step 2: Create an Azure Logic App Custom Connector

In this step, we create and deploy a new custom connector via the Azure portal that can later be used to connect to the PowerBI REST web services.

To start, we add a new resource in the Azure portal, search for “Logic Apps Custom Connector”, select it, and hit the “Create” button.

Here we use the defaults, give it a meaningful name like “Logic_App_Custom_Connector_PBI”, click “Review + create” and wait for the deployment.

After the deployment, we open the custom connector in edit mode. In the “1. General” tab we stick with the default settings (REST, OpenAPI file, and the standard Swagger JSON file).

In the part just below, you can optionally upload an icon. Give a meaningful description and use “HTTPS” as the scheme. The most important point here is to enter “api.powerbi.com” as the host. You can stick with the standard base URL “/”.

Now head to the “2. Security” tab…

In the “2. Security” tab, make sure you enter the following information:

  • Authentication Type: OAuth 2.0
  • Identity Provider: Azure Active Directory
  • Client ID: Here we use our App (Client) ID we noted down in the step above…
  • Secret: Here we use our Secret Value we noted down in the step above…
  • Tenant ID: common
  • Resource URL: Here we use the resource URL we noted down in the step above (https://analysis.windows.net/powerbi/api)
  • Scope: openid
  • Redirect URL: This will be generated once you hit the “Update connector” button (we will need it later to configure the app in the AAD app registration menu).

Now we hit the “Update connector” button, which generates the redirect URL for us:

https://logic-apis-westeurope.consent.azure-apim.net/redirect

This value should be copied, since we need it later on to finalize the AAD app registration.
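
To see how these security settings fit together: when you click “Sign in” later in Step 4, the connector redirects you to an AAD authorize request assembled from exactly these values, which is also why the redirect URL has to be registered on the app in Step 3. Roughly illustrated below (a sketch of the idea, not the literal request the connector builds):

```python
# Illustration only: roughly the authorize request the custom connector
# issues on "Sign in", assembled from the security settings above.
from urllib.parse import urlencode

params = {
    "client_id": "<application-client-id>",                   # Client ID field
    "response_type": "code",
    "redirect_uri": "https://logic-apis-westeurope.consent.azure-apim.net/redirect",
    "resource": "https://analysis.windows.net/powerbi/api",   # Resource URL field
    "scope": "openid",                                        # Scope field
}

# "common" is what we entered in the Tenant ID field.
print("https://login.microsoftonline.com/common/oauth2/authorize?" + urlencode(params))
```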

Now we head to the next section, “3. Definition”.

Here we add a new Action and define some general information about the API we want to use.

Here you can find the API documentation if you have a specific API in mind: https://docs.microsoft.com/en-us/rest/api/power-bi/

In my case I want to get the history of data set refreshes to see if everything is refreshed without any problems. So I navigate to https://docs.microsoft.com/en-us/rest/api/power-bi/datasets/getrefreshhistory.

Here you will find the reference needed to get the latest (top 1) refresh of a dataset:

All that needs to be done now is to click “+ Import from sample”, set the verb to “GET” and enter the URL as defined in the documentation, then click “Import”.

Finally, we get the URL parameters mapped directly to fields. We can set those later in the Logic App that calls this connector.
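
For reference, the action we just defined corresponds to the documented request GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/refreshes?$top={$top}; the datasetId and $top placeholders are exactly the parameters mentioned above. As a sketch, here is the same call made directly in Python (reusing a token acquired as in the Step 1 snippet; the dataset ID is a placeholder, and for datasets in a specific workspace there is a corresponding /groups/{groupId}/… variant):

```python
# Sketch: call the refresh-history endpoint directly, equivalent to the
# action defined above. "result" is the token response from the Step 1 snippet.
import requests

dataset_id = "<your-dataset-id>"
url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/refreshes"

resp = requests.get(
    url,
    params={"$top": 1},  # only the latest refresh, as in this example
    headers={"Authorization": f"Bearer {result['access_token']}"},
)
resp.raise_for_status()

# The response contains a "value" array with entries such as
# refreshType, startTime, endTime and status.
print(resp.json()["value"])
```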

Don’t forget to click “Update connector” so that your changes are saved.

Step 3: Add the Redirection URL to your AAD App Registration

In the last step we created and noted down the redirect URL, which was https://logic-apis-westeurope.consent.azure-apim.net/redirect (in my case).

Now it’s time to head back to the “App registrations” section in Azure Active Directory.

Please note: Here again, Azure Active Directory admin privileges are required.

We go to the “Authentication” section, click “+ Add a platform” and select “Web”.

Here we enter the copied redirect URL and click “Configure” at the bottom of the page. That’s it.

Step 4: Logic App to Get the PowerBI Dataset Refreshes

Now we create a “standard” Logic App, just like we created the custom connector before.

In my case I name it “Logic_App_Monitor_PBI”

Once deployed, I go to the Logic App Designer and create a blank Logic App.

I’m interested in monitoring the status once every 60 minutes, so I add the “Schedule” / “Recurrence” trigger.

As a next step, we go to “Custom”, where we should see the custom connector we defined in the steps before, and select it.

Now it is time to Sign In.

Please note: In my case I experienced sign-in issues at this point and needed to go back to Azure Active Directory -> App registrations -> API permissions and click “Grant admin consent” to get this issue fixed.

… Once signed in, we can define the relevant dataset ID and (optionally) the parameters of the web service. If you don’t know how to get the dataset ID of your report, see the next subsection.

How to get the Dataset ID from a PowerBI Report?

Log in to your powerbi.com tenant, go to your dataset in the “Datasets + dataflows” tab, click the expand button and select “Settings”.

In your browser URL bar, you can now see the dataset ID at the end, which is for instance: https://app.powerbi.com/groups/*******/settings/datasets/98adda85-********dee96749
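
Alternatively, you can look up the ID via the REST API itself by listing the datasets you have access to (again just a sketch, reusing the token from the Step 1 snippet):

```python
# Sketch: list datasets via the REST API to find the dataset ID without
# digging through the browser URL. "result" comes from the Step 1 snippet.
import requests

headers = {"Authorization": f"Bearer {result['access_token']}"}

# Datasets in "My workspace"; for a specific workspace use
# https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets instead.
resp = requests.get("https://api.powerbi.com/v1.0/myorg/datasets", headers=headers)
resp.raise_for_status()

for ds in resp.json()["value"]:
    print(ds["id"], ds["name"])
```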

… Back in your Logic App, you can now enter the dataset ID and define the API-specific parameters (for my example, the top 1 record).

Now we are able to get the data we need from the PowerBI REST API and are ready to send it to any Logic App destination.

Step 5: Use the Data from PowerBI REST API

We now have the data from the REST API and want to store it every 60 minutes in a cheap analytical store that we can use for monitoring and alerting purposes. In our scenario, we use an Azure Log Analytics Workspace (LAW) for this.

We add a new step using the Azure Log Analytics Data Collector connector and select the “Send Data” action.

Then we need to configure a Log Analytics Workspace. If you don’t already have one, set it up as a new Azure Resource.

Once done, we go to “Agents management” in the menu of the LAW service and copy the workspace ID and key.

Back in the Logic App, we type in a Connection Name and enter the copied ID and the Key.

Now we select the Body from the preceding step as the JSON request body. We define the custom log name under which we will later find the data stored in the Log Analytics Workspace. Optionally, use the UTC timestamp as additional metadata. Don’t forget to save your Logic App definition.
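
The Data Collector action handles the API call for us. If you are curious what happens under the hood (or ever need to send the same data from a script), it is the Azure Monitor HTTP Data Collector API, which looks roughly like this (a sketch; workspace ID, key, log name and the sample record are placeholders):

```python
# Sketch of the HTTP Data Collector API call that the "Send Data" action
# performs for us. Workspace ID/key, log type and the record are placeholders.
import base64, datetime, hashlib, hmac, json
import requests

WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<workspace-primary-key>"
LOG_TYPE = "PBIRefreshHistory"  # shows up as PBIRefreshHistory_CL in the workspace

def build_signature(date: str, content_length: int) -> str:
    # Signature format documented for the Data Collector API.
    string_to_hash = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(SHARED_KEY)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return f"SharedKey {WORKSPACE_ID}:{encoded_hash}"

body = json.dumps([{"status": "Completed", "endTime": "2020-12-23T06:00:00Z"}])
rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")

resp = requests.post(
    f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    },
)
print(resp.status_code)  # 200 means the data was accepted
```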

That’s it. You can give it a try and hit the “Run” button.

In my case, I had an error because of whitespace characters. If you avoid those, you should be fine.

That’s it :-)!

Step 6: Add Monitoring and Alerting Capabilities

Now we want to access the log data in the Azure Log Analytics Workspace and set up a monitoring and alerting query. If you are new to Azure Log Analytics Workspace, you can find an introduction here.

In the Azure portal, we go to the Log Analytics Workspace and open “Logs”.

About 2–3 minutes after the initial Logic App execution, we can see the “Custom Logs” section containing our new table, which we can now query with a double click.

Now a query window opens. You can just run the query and see the results. The query language used is called Kusto (KQL). Here you can find the documentation.

I made a query that parses the JSON body element so that I can use the timestamps in it, and that has a where condition which only returns results if the dataset refresh is lagging by more than 24 hours.
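
If you prefer to iterate on such a query from a script rather than the portal, here is a sketch using the azure-monitor-query package. The table name (PBIRefreshHistory_CL) and the column holding the refresh payload (value_s) are assumptions based on a custom log name like the one defined in Step 5; check the schema in your own workspace and adapt the names accordingly:

```python
# Sketch: run the refresh-lag query from a script instead of the portal.
# Requires "pip install azure-monitor-query azure-identity".
# Table and column names are assumptions; adapt them to your custom log schema.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

QUERY = """
PBIRefreshHistory_CL
| extend refreshes = parse_json(value_s)        // column holding the refresh-history JSON
| mv-expand refreshes
| extend endTime = todatetime(refreshes.endTime)
| summarize lastRefresh = max(endTime)
| where lastRefresh < ago(24h)                  // only return a row when refreshes are lagging
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace("<workspace-id>", QUERY, timespan=timedelta(days=2))

for table in response.tables:
    for row in table.rows:
        print(row)
```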

Once you are happy with the result, you can set up a new alert rule based on this query by clicking “+ New alert rule”:

Now hit the “Condition” link and configure the conditions to your needs.

In my instance, I used “Number of results” as the basis, with the operator “Greater than or equal to” and a threshold value of “1”, and set both the evaluation period and the frequency to 60 minutes.

Now you just need to define the missing action groups and the alert rule details, and hit the “Create alert rule” button.

*** Finished :-) ! ***


Lukas Fuchs

I work as a Transformation Architect at Trivadis. My professional passion topics are Data Driven Business, Data Analytics, and Information Design.