You have a shiny new Analysis Services model containing all the data you need for your Power BI reports and dashboards.
It has been added into Analysis Services because: (a) there is a lot of data, possibly too much for a Power BI import; (b) you want to do incremental processing without needing Power BI Premium; and (c) you want to refresh more than eight times a day.
Everything is all set up, but the final piece of the puzzle is how to schedule the refresh of the Analysis Services Tabular model.
It's easy enough in Power BI: you simply set the schedule in the Service. Let's have a look at how to do this using one of the options: Logic Apps.
Create a Service Principal (SPN)
First we need to create a new Azure Active Directory application and service principal to use with role-based access control. Whenever you have code that needs to access or modify resources (the Logic App will refresh the Analysis Services data), you need to create an identity for the app.
Sign into Azure
Go to Azure Active Directory

Next go to App Registrations and + New registration


The web URL can simply be set like the example in the documentation
Then click Register
Assign the Application to a role
Before the app can access resources, it must be assigned a role. For this example we are going to assign a role at the subscription scope, using our Proof of Concept subscription
Go to All Services and then Subscriptions
I'm going to select the Proof of Concept subscription

Next select Access control (IAM) and add a Role assignment

Note I have selected the new analysisservicesapp
Click Save to add the role assignment
The service principal is set up. Now we need to get the values so we can sign in as the app
Get Sign in Values
Go to Azure Active Directory

Go to App Registrations again and select the new application
Copy the Tenant ID for later use
Copy the Application ID for later use
Set up your Credentials
I was going to create a certificate, but I found it an extremely complicated process and in the end I couldn't export the created certificate in PowerShell

Because I haven't been able to create and export a certificate, I am going to use a client secret instead
Click on New Client Secret

Once added, you need to copy the secret's value for use later. You won't be able to access this information again, so DON'T LOSE IT
Give the Service Principal Authority to Administer the Analysis Services
Still in your app, click on API permissions

Next View API Permissions


+ Add a permission and find Azure Analysis Services in APIs my organisation uses


Ensure that the permission allows you to read and write all models (tick the box),
then click Add permissions

Note that although the permissions have changed, we still need to grant admin consent
Grant Admin Consent
Granting admin consent requires you to sign in as a global administrator, an application administrator, or a cloud application administrator.
From Azure Active Directory go to App registrations and select the app
Within the app, go to API permissions and click Grant admin consent

Configure Permissions in Azure Analysis Services
Next we need to ensure the new service principal has server administrator permissions in Analysis Services
Open SQL Server Management Studio by right-clicking it and choosing Run as administrator, then connect to the Analysis Services server
Add in your user name (in my case the email address) but not the password.
You add your Office 365 password when you sign into your account (next)
Right-click on the Analysis Services name and go to Properties, then Security

Click Add, then search for the app that has been created. This can then be added as a server administrator (allowing it to add models, amend models, refresh data, etc.). If the search doesn't find the service principal, it can usually be entered manually in the form app:<application id>@<tenant id>
Click OK
Create the Logic App
Finally, we can create the Logic App in Azure
The Logic App will be triggered by an HTTP request, which in turn will be called from Data Factory, the Azure orchestration tool
In Azure Go to Logic Apps and Add


Then New step, search for HTTP, and select HTTP




Method
POST. POST sends the form data inside the body of the HTTP request
URI (Uniform Resource Identifier)
URI = https://<your server region>.asazure.windows.net/servers/<aas server name>/models/<your database name>/refreshes
Here is my example:
https://ukwest.asazure.windows.net/servers/adventureworksas/models/AdventureWorksV2/refreshes
The server is the Analysis Services instance created in Visual Studio.
The model is the model that we want to refresh. There may be multiple models on the server
Headers
As per the example, I have set the Content-Type header to application/json
Queries
Nothing set
Body
This is where you set up all the processing information:
{
  "Type": "Full",
  "CommitMode": "transactional",
  "MaxParallelism": 2,
  "RetryCount": 2,
  "Objects": [
    { "table": "DimCustomer" },
    { "table": "DimDate" },
    { "table": "DimEmployee" },
    { "table": "DimProduct" },
    { "table": "DimProductCategory" },
    { "table": "DimProductSubCategory" },
    { "table": "DimSalesTerritory" },
    { "table": "FactResellerSales" }
  ]
}
Type: Full
The type of processing to perform. Refresh command types can be (see the sketch after this list):
- full – process the entire model and recalculate all the dependents (measures, columns, etc.)
- clearValues – clear values in the objects and their dependents
- calculate – recalculate your formulas
- dataOnly – refresh the data in the objects; does not force recalculation
- automatic – if the object needs refreshing and recalculating, then do so
- add – append data to the partition and recalculate dependents (measures, columns, etc.)
- defragment – clean up values in dictionaries that are no longer used
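As a sketch of how these types can be combined (this two-step pattern is my own illustration, not something the API requires): load the data first with dataOnly, then recalculate with calculate. Expressed as Python dictionaries ready to be sent as request bodies:

# Hypothetical two-step refresh: load data first, recalculate afterwards.
# Each dictionary is a request body for the refreshes endpoint.
data_only_body = {
    "Type": "dataOnly",
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "RetryCount": 2,
}
calculate_body = {
    "Type": "calculate",
    "CommitMode": "transactional",
}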
CommitMode: transactional
Determines whether objects are committed in batches or when complete. Modes include:
- default
- transactional
- partialBatch
It's always a good idea to set up partitions within your data, especially on the fact table, so you can control exactly what gets processed.
However, the above is an example specifying what to process and when. You can simplify the code if you wish:
{
  "CommitMode": "transactional",
  "MaxParallelism": 2,
  "RetryCount": 2,
  "Type": "Full"
}
With no Objects list, this refreshes the entire model.
The following is an example of adding partitions into your code. In this example, DimCustomer is one partition:
{
  "table": "DimCustomer",
  "partition": "DimCustomer"
}
The fact table can be made up of many partitions. For example, you could have one for each year. Then you can decide which partitions to process; you may, for instance, only want to process the current year's data, as in the sketch below.
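As a sketch, if the fact table were partitioned by year (the partition names below are hypothetical; use whatever yours are actually called), the body for refreshing only the current year could be built like this:

from datetime import date

# Hypothetical naming convention: one partition per year, e.g.
# "FactResellerSales 2020".
current_year = date.today().year

body = {
    "Type": "Full",
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "RetryCount": 2,
    "Objects": [
        {
            "table": "FactResellerSales",
            "partition": f"FactResellerSales {current_year}",
        }
    ],
}
# This body is then POSTed to the same refreshes endpoint as before.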
Authentication
Active Directory OAuth
Tenant
Use the Tenant ID that we collected previously
Audience
https://*.asazure.windows.net
Client ID
Use the Client ID that we collected previously
Credential Type
Secret (Remember that we added the Secret to the App)
Secret
The secret value that we saved (remember, for this one you can't go back to Azure and look at the secret; it's shown one time only)
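To see how all of these settings fit together, here is a minimal sketch of the same call in Python, assuming the msal and requests packages. The scope string and response handling reflect my reading of the Analysis Services REST API, and every credential below is a placeholder:

import msal
import requests

TENANT_ID = "<tenant id>"
CLIENT_ID = "<application (client) id>"
CLIENT_SECRET = "<client secret value>"

# Acquire a token for Azure Analysis Services with the client-credentials
# flow, using the same Tenant/Client/Secret values as the Logic App.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://*.asazure.windows.net/.default"]
)

url = (
    "https://ukwest.asazure.windows.net/servers/adventureworksas"
    "/models/AdventureWorksV2/refreshes"
)
body = {
    "Type": "Full",
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "RetryCount": 2,
    "Objects": [{"table": "DimCustomer"}, {"table": "FactResellerSales"}],
}

# POST starts an asynchronous refresh; a 202 response means it was
# accepted, and the Location header points at the new refresh operation.
response = requests.post(
    url,
    json=body,
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
print(response.status_code, response.headers.get("Location"))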
Save and test the Logic App by clicking Run

Consume Logic App with Azure Data Factory
We now have a Logic App that will incrementally process the data. We now need to schedule this by creating a Data Factory to orchestrate the processing
First we need to copy the HTTP POST URL from the When a HTTP request is received trigger within the Logic App

And then create a new Data Factory in Azure

For the time being I am not enabling Git
In Author & Monitor, create a pipeline and drag a Web activity across to the design pane. Point the activity at the HTTP POST URL we copied from the Logic App, with the method set to POST (a rough sketch of the equivalent call is below).
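The Web activity is doing nothing more exotic than POSTing to the URL we copied. As an illustration of the equivalent call (the URL below is a placeholder for the one from your trigger):

import requests

# The HTTP POST URL copied from the "When a HTTP request is received"
# trigger carries its own access signature (the sig parameter), so no
# extra authentication header is needed.
LOGIC_APP_URL = "<HTTP POST URL copied from the Logic App trigger>"

response = requests.post(LOGIC_APP_URL)
print(response.status_code)  # 200 or 202 indicates the run was accepted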

And then set up the schedule by adding a trigger

This should now be enough to process all your data into the Analysis Services model, which has a live connection into Power BI.
What we need to do now is check that this is doing its job. I will be looking at this in a later blog post, but the refreshes endpoint already gives us a starting point, as sketched below.
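One way to make that check is the refreshes endpoint itself: a GET on the same URL used for the POST lists recent refresh operations. A rough sketch (the field names are my reading of the API and may differ slightly):

import msal
import requests

TENANT_ID = "<tenant id>"
CLIENT_ID = "<application (client) id>"
CLIENT_SECRET = "<client secret value>"

# Same client-credentials token acquisition as in the earlier sketch.
token = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
).acquire_token_for_client(scopes=["https://*.asazure.windows.net/.default"])

url = (
    "https://ukwest.asazure.windows.net/servers/adventureworksas"
    "/models/AdventureWorksV2/refreshes"
)

# GET on the refreshes endpoint returns the recent refresh history.
history = requests.get(
    url,
    headers={"Authorization": f"Bearer {token['access_token']}"},
).json()
for refresh in history:
    print(refresh.get("refreshId"), refresh.get("status"), refresh.get("startTime"))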