
Quick Incremental Refresh Checklist

Incremental refresh became available for Power BI Pro a few months ago, but when tested there was an issue: the error "Resource Name and Location Name Need to Match". This should have been fixed in April, so here is a quick checklist of how to approach incremental refresh.

Define your Incremental Refresh Policy

  • What is your data source?
  • Are new rows simply added to the dataset in Power BI?
  • Are records deleted?
  • Can old rows be updated?
  • If rows can be updated, how far back does this go?
  • How many years' worth of data do you want to retain?
  • Which tables in your dataset need incremental refresh?
  • Can you define the static date within the table that will be used for the incremental refresh?
  • Have you published to P

Each of these points is important and will establish what you need to do to set up incremental refresh, from your data source through to Power BI Desktop and the Service.

Set up Incremental Refresh in Power Query Editor: Create Parameters

Go to Transform data to open the Power Query Editor (you can either be in Desktop or creating a dataflow in the Service).

The two parameters that need setting up for incremental loading are RangeStart and RangeEnd.

RangeStart and RangeEnd are set in the background when Power BI runs the refresh. They are used to partition the data.
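Create the two parameters via Manage Parameters > New Parameter, with the type set to Date/Time. Behind the scenes each parameter is just a small M query carrying parameter metadata; a minimal sketch looks like the following (the default dates are placeholders and are only used while developing in Desktop):

    // RangeStart parameter (type Date/Time)
    #datetime(2019, 1, 1, 0, 0, 0) meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true]

    // RangeEnd parameter (type Date/Time)
    #datetime(2021, 1, 1, 0, 0, 0) meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true]

The names must be exactly RangeStart and RangeEnd and the type must be Date/Time, otherwise the incremental refresh settings will not recognise them.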

Query Folding

The RangeStart and RangeEnd filters will be pushed to the source system. It's not recommended to run incremental refresh on data sources that can't query fold (flat files, web feeds). You do get a warning message if the query can't fold.

You can't query fold over a spreadsheet. It is possible to query fold over a SharePoint list, but the recommendation is to set incremental refresh up over a relational data store.

For Desktop, allow yourself a good slice of the data to work with; for example, one or two years' worth.

Filter the data in the Model

Still in Power Query Editor.

Add your parameters as a filter to every table in your dataset that requires an incremental load.

Find your static date, e.g. Order Date, Received Date, etc., and filter it between RangeStart and RangeEnd, as in the sketch below.
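As a sketch of what the resulting query might look like in M (the server, database, table and OrderDate column names here are hypothetical), the filter is just a row filter on the static date. Keep one boundary exclusive so rows that land exactly on a partition edge are not loaded twice; against a relational source this filter folds down to a WHERE clause.

    let
        // Hypothetical SQL source; any relational source that folds will push the filter down
        Source   = Sql.Database("myserver.database.windows.net", "ReportingDB"),
        Orders   = Source{[Schema = "dbo", Item = "Orders"]}[Data],
        // RangeStart inclusive, RangeEnd exclusive so the monthly partitions don't overlap
        Filtered = Table.SelectRows(Orders, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
    in
        Filtered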

Close and Apply

Define your Incremental Refresh policy in Power BI Desktop

Go to your first table and choose Incremental refresh.

Example screenshot of an incremental refresh policy

Store Rows

In the above example we are storing everything for 5 years. The granularity is set to months so that the partitions are smaller.

Refresh Rows

Our refresh policy is very simple: data is inserted, not deleted or updated.

If this was running every single day you would only need to refresh rows from the last 1 day. However, 1 month has been used as a safety margin, in case the job is suspended or doesn't run for any reason.

Detect Data Changes

Detect Data Changes has been used. A month's data will only be refreshed if the ImportDate for a record has changed (or there are new records).

No records are deleted, so we don't need to worry about that here.

If you want to use Detect Data Changes you must have an update date column on your source data. This may impact your data source.

  • Are you running straight from source into Power BI and there is no Update Date available?

Then you will need to decide whether to add a reporting database layer, where you can add UpdateDate logic to your tables.

  • Is there a possibility that records are also deleted?

You need to deal with this slightly differently

Set up a soft delete in your reporting data.

Add an IsDeleted column in the warehouse; when a record is deleted at source, set IsDeleted to 1 and update LastUpdateTime.

This will come through to your model as an update, and you can filter out all the IsDeleted records, as sketched below.
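A minimal sketch of that extra filter step, assuming the flag arrives as an IsDeleted column (hypothetical name) on the same query shown earlier:

    // Exclude soft-deleted rows before they reach the model; this filter also folds to the source
    NoDeleted = Table.SelectRows(Filtered, each [IsDeleted] <> 1)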

  • Publish the new Power BI Report and Data Flow

You might be thinking at this point: "But I don't want the filters I set in Desktop to be applied in the Service. I want to see all my data in the Service."

Don't worry: in the Service, RangeStart and RangeEnd don't keep the dates you specified for the filters in Desktop.

They are set by your incremental refresh policy, as the boundaries of the partitions for our 60 months. Instead of one RangeStart and one RangeEnd for the whole 5 years, you get a RangeStart and RangeEnd for month 1, a RangeStart and RangeEnd for month 2, and so on, breaking your 5 years down into much smaller partitions to work with.

Testing Your Incremental Refresh

Before you add incremental refresh, make sure you have done a full refresh of your data. It's good to get an idea of the baseline time you are working with.

Azure Data Studio

One way of testing is with Azure Data Studio.

https://docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio?view=sql-server-ver15

Once installed, connect to the SQL Database that is your data source

So long as you have the profiler extension installed, you can use Launch Profiler.

If you don’t have it, you can download the extension

Once launched, start a Profiler session.

Go into the Power BI Service and open Datasets.

Click Refresh now, then go back to Azure Data Studio to see what's happening.

From logon to logout during the run, it took 20 minutes because the entire model is refreshed. Obviously it would be really good if we could get that time down using incremental refresh.

Before you set up incremental refresh, ensure that the services preceding the Power BI load have been well tested and signed off.

Once incremental refresh is in place, you can refresh again and compare your findings in Azure Data Studio.

Due to all the possible data source amendments and issues, the current recommendation is to start with a relational database as your reporting layer. Then you can query fold, and add IsDeleted and UpdateDate columns and logic to this data source.
