ISLOGICAL in DAX

ISLOGICAL checks whether a value is a logical (Boolean) value, TRUE or FALSE, and returns TRUE or FALSE accordingly.

I just looked at a question that asks: what does this DAX expression do?

is logical measure = IF(ISLOGICAL("true"),"Is boolean or Logical","Is different type")

Reading it plainly: if “true” is logical, return “Is boolean or Logical”; else return “Is different type”.

Well, we have written “true” within quotation marks, which makes it text, not Boolean/logical.

As expected, the card shows “Is different type”. Let’s change the DAX:

is logical measure = IF(ISLOGICAL(true),"Is boolean or Logical","Is different type")

TRUE is Boolean: a logical value can only be TRUE or FALSE.

is logical measure = IF(ISLOGICAL(false),"Is boolean or Logical","Is different type")

FALSE is still logical because it’s Boolean, so the card stays the same.

So why would you use ISLOGICAL?

You would mostly use it to test data: it’s great for making sure that true/false columns actually contain Boolean values.

You always need to test how good your data is. Add an ISLOGICAL check to true/false columns and you can report back on the validity of the data.

SalariedFlag in AdventureWorks is a Boolean column.

Here is where the confusion is: you can’t feed SalariedFlag into the measure above, because a measure needs a single value and SalariedFlag is a column, not an aggregation. You can’t aggregate TRUE/FALSE values.

So instead I create a calculated column on the Employee table:

Is Salary flag true or false? = IF(ISLOGICAL(DimEmployee[SalariedFlag]),"Is boolean or Logical","Is different type")

Great, Salaried flag has passed the test.

So: ISLOGICAL is great for testing data types within a calculated column.

Buying and Pausing a Power BI A SKU

I want to set up Power BI Dedicated Capacity specifically for development and testing Premium features.

The Power BI Premium (P) SKUs should only be used in production. We want something we can pause when not in use, which lets us research all the Premium capabilities.

Thankfully we can go for an A SKU (A1 through A6):

  • A4 provides the same functionality as the P1 SKU
  • A5 is the same as P2
  • A6 is the same as P3

The main difference is that you don’t get free-user access like you do with Premium: only users with Pro licences can access the content (which makes sense, because we haven’t moved to production).

And you can pause the capacity when it is not in use. This is fantastic and will save a lot of money in the development and testing phases.

The question is: which SKU do we require? I need to test Paginated Reports, which are only available from A4 upwards, so clearly we need the SKU that corresponds to P1.

Buying A4 SKU

Use an account that has at least capacity admin permissions in Power BI.

Log into Azure and search for “Power BI Embedded”.

Here I have decided to choose our training subscription and training resource group.

The capacity admin must be a member user or a service principal in your Azure AD tenant. I have selected myself for this role for this development Power BI area.

At this point you need to choose the resource size and I know that we need the A4

If we left this on the whole time it would cost £4,467.41 for the month, but we only need it when testing Premium functionality, so it will be paused most of the time.

It’s extremely important that the capacity is paused after use.

I have added a few tags to this service before creating

Go to the resource, because we want to pause it immediately.

And Pause. I will only unpause the resource when I want to set up an App Workspace on this test capacity to check the Paginated report and other functionality.

The A4 SKU costs around £6 an hour, so ensure you plan your development time effectively.

What if Parameters in Power BI

I wanted to have a look at What if parameters in Power BI. This feature had passed me by, so I used AdventureWorks to get a handle on it.

I have a simple problem. I have a sales goal I’d like to achieve over the months, and I’d like to see how adding percentages onto sales would help me hit that goal.

For example, If I add 1% onto sales, would that help hit the targets?

After analysing current sales, management have decided on a big sales target: we need to be selling £170M each month.

The first thing we do is add this into the report using a constant line

We want to know what we need to do with sales quotas to push towards the target. From the Modeling ribbon, select New Parameter (What if).

This is all fairly standard stuff. We want to increase sales by percentages for the forecast, so we use a Fixed decimal number. Then we want to be able to see changes from 1% to 150%.

We want to go up in increments of half a percent (0.005).
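With those settings (1% to 150%, in steps of half a percent), the What if dialog generates a series along these lines (the table name depends on what you typed into the dialog):

```dax
% Sales Forecast = GENERATESERIES ( 0.01, 1.5, 0.005 )
```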

Now we have a slicer which we can move to choose percentage increases and a new table called % Sales Forecast.

Within the new table are a calculated column and a measure. The column is built with GENERATESERIES, which returns a single-column table of values given a start, an end, and an increment:

GENERATESERIES(<startValue>, <endValue>[, <incrementValue>])

If you go into Data view you can see the single-column calculated table that has been created for the forecast.

% Sales Forecast Value = SELECTEDVALUE('% Sales Forecast'[% Sales Forecast], 1)

This is the measure that was created automatically (being a measure, it doesn’t appear as a column in the table structure). It returns the value of the % Sales Forecast column when the filter context reduces it to one distinct value; otherwise it returns the default of 1.

Next, create a measure in the new % Sales Forecast table that takes sales and multiplies it by that measure:

Total Sales Forecast = [Total Sales]* '% Sales Forecast'[% Sales Forecast Value]

Total Sales itself was created earlier as this measure:

Total Sales = SUM(FactResellerSalesXL_CCI[Sales Amount])

I decided to add this measure here because it makes sense to hold this in the same area as all the other forecast information.

When you add this new forecast measure you get the following:

Because we are only multiplying by 1, it is exactly the same as our sales, but let’s change the forecast.

Great. With the forecast multiplier set to 1.15 (15% more sales), March should exceed our target. So if we can hit this next March, we have achieved our targets.

Migrating Adventureworks Data Warehouse from Local Machine (SQL Express) to Azure SQL Server Database

For training courses and general “let’s have a look at updates to Power BI” sessions, an AdventureWorks database instance has been set up on my machine, complete with 10 years of data.

However, this database needs moving from its local machine to Azure

Previously, bacpac files were created to import into Azure via SQL Server Management Studio (SSMS), but they always errored.

It’s time to have a look at the Azure data migration options.

Source Database

  • Product – Microsoft SQL Server Express (64-bit)
  • Operating System – Windows 10 Pro
  • Version – 14.0.2027.2
  • Release – SQL Server 2017

Attempt 1

Microsoft Data Migration Assistant

The Microsoft Data Migration Assistant helps you upgrade to Azure by detecting compatibility issues.

The first thing to do is download the msi package, and then you can get started.

Open the Data Migration Assistant

Click New to create a new Project

Then click create

The server is connected to using Windows Authentication. However, this errors: the credentials must have CONTROL SERVER permission.

In SSMS go to Security and Logins to find your own user Login.

Switch to the master database and grant the permission with something like the following (substitute your own login):

GRANT CONTROL SERVER TO [YourLogin]
Unfortunately the error comes back: “Securable class ‘server’ not supported in this version of SQL Server.”

So the Data Migration Assistant can’t be used with an Express database: SQL Server Express does not support granting CONTROL SERVER.

Attempt 2


Move to your Azure connection in SSMS, right-click Databases, and choose Import Data-tier Application.

Go through the Import Data-tier Application wizard.

Currently we are moving to an S2 tier (DTU purchase model), which is around £54 a month. When it’s not being used I will scale it down to S0, around $10 a month.

There are compatibility issues between the Azure database and the SQL Express database.

The first is COLUMNSTORE: columnstore support is only available in S3 and above.

S3 costs around £109 a month, twice the cost of S2.

The Solution

Simply import as S3 and then scale down to another Standard option by clicking on the pricing tier.

You could also move to the vCore purchasing model, which is the preferred route as it offers more flexibility.

In the DTU-based purchase model, tiers are differentiated by a range of compute sizes with a fixed amount of included storage, a fixed retention period for backups, and a fixed price.

The virtual core (vCore) model provides several benefits:

  • Higher compute, memory, IO, and storage limits.
  • Control over the hardware generation to better match compute and memory requirements of the workload.
  • Pricing discounts for Azure Hybrid Benefit (AHB) and Reserved Instance (RI).
  • Greater transparency in the hardware details that power the compute; facilitates planning for migrations from on-premises deployments.


I can finally move my Express database into Azure and manage the scaling. All sorted.

However, you should be aware that below S3 the columnstore index gets disabled, and Power BI can’t use the table in this state.

So the question is: if you are importing the data into Power BI, will you be happy to keep the database at S3 level when it’s in use, or would you be OK with removing the columnstore index?

This is something to think about.

Setting up a Service Principal for Azure Data Lake Gen 2 (Storage) to use with Data Factory

An Azure service principal is a security identity used by user-created apps, services, and automation tools to access specific Azure resources. Think of it as a “user identity” (login and password, or certificate) with a specific role and tightly controlled permissions to access your resources.

Azure Service Principal

I am constantly having to remind myself how to set up a service principal for access to things like Azure Data Lake Gen 2 when I am setting up a Data Factory (or using the storage with another web app).

So I wanted to write a blog post specifically on this.

As an example, imagine you are moving data from an Azure SQL Database to files in Azure Data Lake Gen 2 using Azure Data Factory.

You attempt to add a Data Lake Connection but you need a Service Principal account to get everything Authorised.

You need this so the Data Factory will be authorised to read and add data into your data lake

An application (e.g. Data Factory) must be able to participate in a flow that requires authentication, so it needs to establish secure credentials. The default method for this is a client ID and a secret key.

There are two types of permissions:

  • Application permissions – no user context is required; the app (e.g. Data Factory) accesses the Web API by itself.
  • Delegated permissions – the client application (e.g. Data Factory) accesses the Web API as a signed-in user.

Create an App

In Azure choose App Registrations

Here you can create an app – New Registration

Provide a name for your app. e.g. DataFactoryDataLakeApp

Grant your Registered App permissions to Azure Storage

This will enable your app to authorise requests to the storage account with Azure Active Directory (AD).

You can get to your app by going to Azure Active Directory

Then App Registrations and choose the App

In your new App, go to Overview and View API Permissions

Next go to Add a permission

Go to Azure Storage API which contains Data Lake Gen 2

Notice that we are setting up Delegated Permissions for Azure Storage

You are warned that Permissions have been changed and you need to wait a few minutes to grant admin consent.

I am not an admin so I always get my admin to go into Azure Active Directory and Grant Admin Consent for Peak Indicators

Note that your app now has configured permissions for Azure Active Directory Graph and Azure Storage

Assign your new app to a subscription

Now that you have an app, you need to assign it Contributor status at the level you require in Azure: subscription level, resource group level, or resource level.

For this app I am going to set it up against the subscription. First go to the Subscription you want to add it to and then Access Control (IAM)

I have added the app as a contributor

Creating a Key Vault

We will be creating IDs and secrets in the next steps, and instead of simply remembering your secret, why not store it in a Key Vault? A Key Vault lets you:

  • Centralise Application Secrets
  • Store Secrets and Keys Securely
  • Monitor Access And Use

Let’s set one up in our proof-of-concept area.

Create a Key vault if you don’t have one already

Remember to add any tags you need before Review + Create.

Once the deployment completes you can go to the resource, but for the time being that is all you need to do.

Application ID and Tenant ID

You can now go into your new app in Azure (App registrations) to get more details for Data Factory (When you set up the connection)

Tenant from Data Factory will be mapped to Directory (Tenant ID) from the App Overview

Service Principal ID from Data Factory will be mapped to Application (Client) ID From the App Overview

Create a Client Secret

Next, create your Client Secret.

In your App go to Certificates and Secrets

Click New Client Secret

I’m going to allow this secret to expire in a year (at that point anything using the app will start to fail, so you would need to set a new secret and re-authorise).

We can add the secret value into the Key Vault so we don’t lose it, because once you leave this screen you can’t see it again.

Open a new Azure Window and Go to your new Key Vault

Go to Secrets

Click + Generate/Import.

Notice I have set the expiration date to match the expiry date of the app

Ensuring the Access is set for the Data Lake Storage

For this you need to have a Data Lake Gen 2 set up and Microsoft Azure Storage Explorer downloaded

In Microsoft Azure Storage Explorer, navigate to the storage

Then right-click on the file system (in this case factresellersales), go to Manage Access, and add the app.

Notice that we have set Read, Write, and Execute for the app on the file system, and all the files will inherit these permissions.

Adding The Data Lake Gen 2 Connector in Data Factory (Test)

I have a Data Lake Gen 2 with some files and I want to move them into a SQL database.

To test, Open or create a Data Factory

Go into Author and Monitor. Then Author

Go to Connections, +New and Choose Azure Data Lake Gen 2

Tenant = Directory (Tenant ID) from the App Overview

Service Principal ID = Application (Client) ID From the App Overview

Service Principal Key = the client secret. You can get it from Azure Key Vault: click on Secrets, then the secret name, then the current version.

You can then copy the secret value and add it into Data Factory

Test your Connection

Create the Data Lake Data Set

Here is where you find out whether all your efforts were worthwhile.

Create a new dataset of type Azure Data Lake Storage Gen 2.

This is great. I have access to the files in the data lake. Achievement unlocked.

Power BI December 2019 Updates – KPI Updates (Goal)

Finally, something we have been waiting for, for a long time.

It’s amazing how such a little thing can cause so much extra complexity.

Previously you could only show this on the KPI

The Goal states 1,749, but this isn’t a goal: it is last month’s measure, not a goal.

This basically means you could only use the KPI visual when it made sense for your data to have a goal. Otherwise you might have to add more information about the “Goal” in the visual header tooltip icon.

However, for my report I need it stated that this is last month’s figure compared with this month’s. I have lots of KPIs based on the current month and the same time last month, so I had to create them like this:

Instead of one visual I had two: a KPI and a card, hiding the goal in the KPI. This creates a lot more complexity in the reports, but I had to do it to avoid confusion.

Now, we can relabel the Goal to whatever we want.

Finally I can reset all my reports so I can reduce the amount of visuals shown.

Such a small change but it makes a big difference.

I give this update 10/10

Creating an Open Account Measure using DAX

Thank goodness for the amazing people on the Power BI forums helping with this. I have the solution, but I wanted to write a post to explain the why.

The Problem

I have accounts, and events occurring on each account. What I want to know is the time period each activity is open for.

Just for this example I have taken one account with 11 activities.

If you run the visual against start date, all you get is a measure against the start time period.

We are more interested in creating a measure to show how long the activity went on for.

For this example I am going to work through the activity starting on Jan 18th 2017. The end date for this activity is March 21st 2017, so it should span three months.

The Date Dimension

To do this you must have a date dimension. For the purposes of this example I’m going to create one in DAX (as a calculated table) using the following:
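A minimal sketch of such a calculated date table, assuming a fact table called AccountEvents with Start Date and End Date columns (hypothetical names; substitute your own):

```dax
Date Dim =
ADDCOLUMNS (
    CALENDAR (
        -- span from the first year of any start date...
        DATE ( YEAR ( MIN ( AccountEvents[Start Date] ) ), 1, 1 ),
        -- ...to the last year of any end date
        DATE ( YEAR ( MAX ( AccountEvents[End Date] ) ), 12, 31 )
    ),
    "Year", YEAR ( [Date] ),
    "Month", FORMAT ( [Date], "MMM YYYY" )
)
```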

Notice that I have used the Start and End Dates to set the min and max years. You should already have a date dimension to work with, though.

Set up the relationship between the Fact table and Date dim

Start date is the Active join and End Date is the Inactive Join

You could create a role playing dimension for the End date and have that as an active join but in this case I’m going to go for the above design.

Account Start flag (New Measure)

  • CALCULATE – lets you modify your expression using filters.
  • ALLSELECTED – if you have a slicer set to, say, Year = 2018, ALLSELECTED keeps that context from the slicer.
  • The filter keeps dates less than or equal to the max date in the filter context.
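Putting those ingredients together, a sketch of what the measure could look like (AccountEvents and Date Dim are assumed names, not necessarily the originals):

```dax
Account Start Flag =
CALCULATE (
    COUNTROWS ( AccountEvents ),                      -- count activities via the active Start Date join
    FILTER (
        ALLSELECTED ( 'Date Dim'[Date] ),             -- respect any slicer selection
        'Date Dim'[Date] <= MAX ( 'Date Dim'[Date] )  -- dates up to the current period's max date
    )
)
```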

Account End Flag (New Measure)

  • ENDOFMONTH – returns the last date of the month, in the current context, for the specified column of dates.
  • The filter keeps rows where the date at the end of the month is less than the max date in the filter context.
  • USERELATIONSHIP – because End Date is the inactive relationship, we need to specify that it, not Start Date, is used.
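Again as a sketch, with the same assumed table names, the End flag could look like:

```dax
Account End Flag =
CALCULATE (
    COUNTROWS ( AccountEvents ),
    FILTER (
        ALLSELECTED ( 'Date Dim'[Date] ),
        -- the end-of-month date must fall before the current period's max date
        ENDOFMONTH ( 'Date Dim'[Date] ) < MAX ( 'Date Dim'[Date] )
    ),
    -- switch to the inactive End Date join
    USERELATIONSHIP ( AccountEvents[End Date], 'Date Dim'[Date] )
)
```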

Add an Open Measure

To understand the results, let’s walk through how the new measures behave month by month.
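The Open measure itself is the difference between the two flag measures; a minimal sketch, assuming the measure names used here:

```dax
Open = [Account Start Flag] - [Account End Flag]  -- 1 - blank = 1 (open), 1 - 1 = 0 (closed)
```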


For January 2017:

  • Start Flag = 1, because 18th Jan is less than or equal to Jan 31st
  • End Flag = blank, because March 21st (end of month: March 31st) is NOT less than Jan 31st
  • Open = 1 - blank = 1

For February 2017:

  • Start Flag = 1, because 18th Jan is less than or equal to Feb 28th
  • End Flag = blank, because March 21st (end of month: March 31st) is NOT less than Feb 28th
  • Open = 1 - blank = 1

For March 2017:

  • Start Flag = 1, because 18th Jan is less than or equal to Mar 31st
  • End Flag = blank, because March 21st (end of month: March 31st) is NOT less than March 31st
  • Open = 1 - blank = 1

For April 2017:

  • Start Flag = 1, because 18th Jan is less than or equal to April 30th
  • End Flag = 1, because March 21st (end of month: March 31st) IS less than April 30th
  • Open = 1 - 1 = 0

There are usually lots of ways to answer one question, and this is one of many, but the Open flag now allows me to pull up all my activities and show in a matrix where they are open.
