
How does Power BI cope with Source Data Changes?

You have a data source that you are connected to, with some nice reports and DAX.

However, at some point your data source changes. In this example I have moved from this

To

The question is, what happens to our report when we load in this updated data set?


Straight away when we attempt to Refresh the pbix file we get an error

This would also happen if the dataset was being automatically refreshed in Power BI Service. We need to go into Power Query to get everything sorted.

Also, we don’t want to have to go and change the DAX. We want to get this resolved with the least amount of effort possible.

Click on Transform Data

When I click on Refresh All I get the same error

The new columns will need updating in your code so Power BI understands that they are the same columns.

It starts to go wrong here at Changed Type

You can click on this code and update it to the new names. From

= Table.TransformColumnTypes(#"Split Column by Position",{{"street", type text}, {"city", type text}, {"zip", Int64.Type}, {"state", type text}, {"beds", Int64.Type}, {"baths", Int64.Type}, {"sq__ft", Int64.Type}, {"type", type text}, {"sale_date", type text}, {"price", Int64.Type}, {"latitude", type number}, {"longitude", type number}, {"sale_date - Copy.1", type text}, {"sale_date - Copy.2", Int64.Type}})

To

= Table.TransformColumnTypes(#"Split Column by Position",{{"street", type text}, {"city", type text}, {"zip", Int64.Type}, {"state", type text}, {"Number of beds", Int64.Type}, {"Number of baths", Int64.Type}, {"Square Feet", Int64.Type}, {"Type of House", type text}, {"sale_date", type text}, {"House Price", Int64.Type}, {"latitude", type number}, {"longitude", type number}, {"sale_date - Copy.1", type text}, {"sale_date - Copy.2", Int64.Type}})

This section now works

Click on each applied step just to check. In this case, I have resolved all the issues in this one step. Time to refresh, then Close and Apply.

Annoyingly the DAX is still referencing the original column names

The hope was that the DAX would recognise the column changes

Attempt 2

Let’s go back to the Power Query Editor and restore the original code of

= Table.TransformColumnTypes(#"Split Column by Position",{{"street", type text}, {"city", type text}, {"zip", Int64.Type}, {"state", type text}, {"beds", Int64.Type}, {"baths", Int64.Type}, {"sq__ft", Int64.Type}, {"type", type text}, {"sale_date", type text}, {"price", Int64.Type}, {"latitude", type number}, {"longitude", type number}, {"sale_date - Copy.1", type text}, {"sale_date - Copy.2", Int64.Type}})

This sets up the error again.

Add a new applied step before Changed Type. What we are going to do is rename the new columns back to the old columns. (Bear with me, this seems odd because we don’t want to use the old column names, but it will make sense a little later.)

I click on the applied step previous to the one that is erroring and rename the column headers in the table.
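
For illustration, the generated rename step might look something like this (a minimal sketch using this example’s column names; yours will differ):

= Table.RenameColumns(#"Split Column by Position",{{"Number of beds", "beds"}, {"Number of baths", "baths"}, {"Square Feet", "sq__ft"}, {"Type of House", "type"}, {"House Price", "price"}})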

Once done, move to your next step and check they now all work.

If you Close and Apply, your DAX should work again, because your column names now match the old ones.

But we don’t want to stop there. Let’s go back to Transform Data.

Go to the end of your applied steps and rename your columns to the new column names (by clicking on each column header and renaming).
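
Again as a sketch (the previous step name here is a placeholder), the final generated step maps the old names back to the new ones:

= Table.RenameColumns(#"Renamed Columns",{{"beds", "Number of beds"}, {"baths", "Number of baths"}, {"sq__ft", "Square Feet"}, {"type", "Type of House"}, {"price", "House Price"}})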

And this time when you refresh the data, DAX will accept your changes

So, it’s a little long winded, but make sure you know exactly what your old column names and new column names are:

  • Before the erroring step, rename the new columns back to the old names in Power Query
  • At the end of the applied steps, rename them back to the new column names
  • And you should be in a position where all your reports have accepted the column name changes

Azure Subscriptions

The World of Azure Subscriptions and setting up a good Azure architecture can seem quite daunting (because it is).  Here is a guide to Azure Subscriptions and how they work.

Free Subscription

When you first go into Azure you can start for free.

This is probably the most used subscription starting point. There are services that you can use free for 12 months.  For example:

  • 5GB Blob Storage
  • 5 GB File Storage
  • 250 GB SQL Database
  • 750 hours of Windows VMs
  • 30,000 transactions of AI and machine learning

The list goes on.

You also get £150 of credit to use on other services. And in many cases, some services are always free, like Functions and App Services.

Monthly Azure Credits for Visual Studio Subscribers

If you are a Microsoft Partner you can get credits through the Microsoft Partner Portal: https://partner.microsoft.com/en-GB/

You can either get Visual Studio Enterprise (MPN), or Visual Studio Enterprise if you have a subscription but you are not a Microsoft Partner.

You get access to your Visual Studio benefits https://my.visualstudio.com/

MPN: Microsoft Partner Network

Note that the Azure service states “Your own personal sandbox for dev/test”

In regards to running Production environments, the guidance states

“Monthly Azure credits for Visual Studio subscribers are specifically for individual dev/test usage only. To run production workloads, you’ll need to upgrade to a paid Azure subscription, such as pay-as-you-go pricing.”

The question is, what does Microsoft consider a Production environment?

And if these are supposed to be for individual use, what should you do when you have multiple developers working on a development project?

Answers coming up later…

A Visual Studio Professional subscription gives you £40 a month in credits. Professional costs around £920 for the first year and then yearly renewal is around £613.

An MSDN Platforms subscription gives you £75 a month in credits.

This is a newer subscription that provides IT departments with a cost-effective way to license Microsoft software for individuals who don’t require the full suite of Visual Studio development tools. Pricing is more complex: https://visualstudio.microsoft.com/msdn-platforms/

Create Subscription in Azure

When you go into Azure, search for Subscriptions and click + Add Subscription, you are taken to the following screen. (These are the offers we have available to us; depending upon your own environment you may have more or less.)

Pay as you go

As advertised, this is simply the pay as you go model that many people move to after their free subscription has ended.

Pay as you go is billed at the standard Pay-As-You-Go rates, and service level agreements (SLAs) are offered for each of the Azure platform services.

Pay as you go Dev/Test

Meets the needs of teams of Visual Studio subscribers and is specifically for development and testing only.

Visual Studio Enterprise costs around £4,602 for the first year and then £1,970 for every subsequent year.

Visual Studio Enterprise Solution – MPN

Again, this is specifically for Visual Studio Enterprise Subscribers in the Microsoft Partner Network. This relates to the Monthly Azure Credits for Visual Studio Subscribers section above

Within this service, you get low dev/test rates on VMs (you pay the Linux rate). You also get low rates on HDInsight and Web Apps.

As mentioned, this is specifically for personal dev/test environments. Usage within the subscription does not carry a financially backed SLA.

Azure in Open

You can buy Azure through open licensing programmes from your Microsoft Reseller; this is how many small and medium sized businesses purchase. Simply contact your Microsoft Partner or find one to work with: https://www.microsoft.com/en-gb/solution-providers/home

To go for this option, you purchase Azure Monetary Commitment credits from your reseller in the form of an Online Service Activation (OSA) Key. You can use these credits for 12 months following activation.

Other Subscription Offerings

Enterprise Dev/Test

Similar to Pay as you go Dev/Test, but for customers with a Microsoft Enterprise Agreement.

EA/SA (Enterprise Agreement/Software Assurance) is a licensing package targeting large organisations with 500 or more computers. The EA is a three-year contract which covers software licensing and updates. Customers can renew for 1 year or 3 years.

Software under EA includes Windows 10, Microsoft Office, Windows Server, Exchange, SharePoint, System Center, Client Access Licenses etc.

To set up a subscription under this offer, you will have an Enterprise Administrator. They can add Account owners to their agreement via Azure Enterprise portal. Then the Account owner can create Subscriptions via the Azure Account portal.

Once in place, active Visual Studio subscribers can be added as co-administrators to manage and use resources.

Visual Studio Enterprise (Bizspark)

In February 2018 BizSpark was replaced by the Microsoft for Startups program, which has the same functionality as the Visual Studio Enterprise subscription with £115 of free credits.

Azure Hybrid Benefit

Azure Hybrid Use Benefit (HUB) is a discount program for Azure users with Windows Server licences. It can save up to 40% of the normal cost of running services.

Windows Server is a line of operating systems that Microsoft creates specifically for use on servers, and it is used in business settings. It includes enterprise software because it’s intended for businesses.

Microsoft Windows Server 2019 pricing is complex so I won’t go into detail here, but moving to HUB does provide savings for VMs and databases.

Azure provides a great calculator for this: https://azure.microsoft.com/en-gb/pricing/hybrid-benefit/ If this is an option, you really want to speak to a Microsoft advisor to understand the best route.

Difference between Dev/ Test and Production Environments

Dev/Test environments and Visual Studio credits are exclusively for developing and testing applications. These subscriptions don’t carry financially backed SLAs.

Any subscription on top of a Visual Studio subscription is for dev/test and demonstration only.

It should also be noted that training is NOT considered a development activity, so training resources should be on the appropriate subscription.

MSDN licensing is now Visual Studio licensing, but the following still applies:

“The single most important concept when discussing MSDN licensing is to understand the difference between production systems (those which are used to actually make money or carry out the business of an organisation) and development systems (those which actually create the programs and applications used in the production systems)”

IT Asset Management .net (2011)

And there is also the fact that Dev/Test subscriptions are for specific individual usage. For actual project work where teams are involved, services are cheaper but you don’t get credits.

Is this policed by Microsoft? It is certainly a violation of the terms of use.

c. Suspension. We may suspend your use of the Online Services if: (1) it is reasonably needed to prevent unauthorized access to Customer Data; (2) you fail to respond to a claim of alleged infringement under Section 5 within a reasonable time; (3) you do not pay amounts due under this agreement; (4) you do not abide by the Acceptable Use Policy or you violate other terms of this agreement; or (5) for Limited Offerings, the Subscription becomes inactive from your failure to access the Online Services as described in the Offer Details. If one or more of these conditions occurs, then:

Production subscriptions can be purchased via several means:

PAYGO, as specified above.

EA. Again, this is specific to having an EA/SA (Enterprise Agreement/Software Assurance). See above for more information.

CSP (Microsoft Cloud Solution Providers). See Azure in Open above for more information.

Management Groups, Subscriptions and Resource Groups

As an example, imagine you have an Azure environment with just Visual Studio Enterprise accounts, complete with credits for each developer.

So far, no thought has been put into creating Management groups

Let’s have a look at this so far. The decision has been made to split the resource groups up by environment, in this case Proof of Concept. However, because we know that this subscription can only be for Dev/Test and POC, maybe we need to think about the hierarchy in a little more detail, bringing in the Management Group.

Management Group by Environment

Management groups are containers that help you manage access, policy and compliance across multiple subscriptions. You can use them with Azure Policy and Azure role-based access control.

Here we have Management Groups split by environment. So under Production we may have all the Pay as you go subscriptions, complete with SLA backing, and the Dev/Test subscriptions under the other Management Groups when required.

The Resource Groups here are set at app level.

You could add a Resource Group for all your SQL Servers and Databases for example. And another one for all your machine learning requirements.

Management Group by Business Area

This is a different way of looking at your hierarchy: by business area. Management groups can be up to 6 levels deep, which allows you to set up whatever meets your needs. If you work with other companies you could have a management group for each company, for example.

You could then have management groups under this for environment, next subscriptions, and finally the resource groups, which bring together resources that are related.

Don’t forget that Tagging is also an important part of the architecture process

Hopefully that gives a little more structure to what you need to do when thinking about subscriptions, which then leads into your Management Groups, Resource Groups and resources.

Right at the top of your hierarchy is your Tenant, which represents your organisation. Your tenant is your dedicated Azure AD service instance.

Understanding what licensing you have to work with, what hierarchies will be the most logical, and who you work with within other tenants will really help you get to grips with Azure.

The Dos and Don’ts of Power BI relationships and Modelling

Bad Modelling 1: Single flat file table e.g. Salesextract

When people first start out using Power BI as their analytics platform, there is a tendency to say: let’s import all the data in one big flat file, like an Excel worksheet.

This way of working is just not well organised and doesn’t give you a friendly analytics structure.

Avoid Wide Tables

Narrow tables are much better to work with in Power BI. As the data volume grows, a wide table will affect performance, bloat your model and become inefficient. Then, when you create measures, things start getting even more complex in the one long, wide table.

Not to mention the point when you have to add another table and create joins. You may be faced with a many-to-many join because of your wide table.

STAR SCHEMAS are the recommended approach to modelling in Power BI.

Stars with a few Snowflaked dimensions are also ok.

If you have a flat, wide table it’s always important to convert it to a data model like the one above, with narrow dimension tables and a fact table in the middle holding all your measures.

Remember, Chaos is a flat file.

Model Relationships propagate filters to other tables.

In this example the ProductID propagates down to the sales table. 1 Product can be sold many times. (1 to many)

With a snowflake you can add another level

CategoryA propagates down to the Sales fact table.

Deliver the right number of tables with the right relationships in place.

Power BI was designed for people who never had to think about the design of data warehouses. Originally, this self-service tool would allow anyone with little or no knowledge of best practice to import data from their own sources, Excel spreadsheets, databases etc. without any knowledge of how they were set up.

This becomes an issue when the recommended Power BI model is the fact and dimension schemas as above.

Understanding OLAP models goes a long way to helping you set up Power BI:

  • Dimensions – filter and group
  • Facts – summarise measures

Bad Modelling 2: Direct Query your Transactional Database

When you connect up to an OLTP database and drag in all your tables (there may be hundreds of them) using Direct Query, there are lots of things to consider.

The overall performance depends on the underlying data source.

When you have lots of users opening shared reports, lots of visuals are refreshed and queries are sent to the underlying source. This means that the source MUST handle these query loads for all your users AND maintain reasonable performance for those using the OLTP as they enter data.

You are not the most important person in this scenario. The person(s) using the database to add data is the most important person

OLTP is designed for speedy data input. OLAP is designed for speedy retrieval of data for analytics. These are two very different things.

With OLTP, you have row-store indexes (clustered index, non-clustered index) and these are slow for data analysis. They are perfect for OLTP-style workloads. Data warehouse queries consume a huge amount of data, which is another reason why using OLTP as your Direct Query data source isn’t the best approach.

Also, with Direct Query you lose a fair amount of DAX functionality: time-based DAX calculations, What-if parameters, etc.

I was chatting to someone about this on the forums and they gave me a fantastic analogy

When you connect into a transactional database with Direct Query, it’s like being in a busy restaurant and getting all the customers to go and get their food from the kitchen.

It slows down the customers because of the layout of the kitchen. They don’t know where anything is, and other customers are also milling around trying to find where their starter is.

The kitchen staff, who are now trying to prepare the food, are having to fight for physical space. Look at the pastry chef, trying to work around 10 customers asking where their various desserts are.

So you set up a reporting area. This is where the food gets placed, someone shouts service and a waiter will go and speedily deliver the course to the correct table.

No one needs to go into the kitchen unless they are in food prep. Everything works in the most efficient way.

Model relationships Dos

  • Only one ID to one ID. If you have composite keys, they need to be merged (see the sketch after this list)
  • No recursive relationships (relationships that go back to the same table; the example always used for this is the ManagerID in the Employee table)
  • The cardinality is one to many, one to one, or many to one (many to many needs a specific approach in Power BI)
  • Cardinality determines whether it has filter/group behaviour or summarise behaviour
  • There can only be one active path (relationship) between two tables. All your other paths will be inactive (but you can set up DAX to use them)
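
As a minimal sketch of merging a composite key in Power Query (the Sales step name and the SalesOrderID/SalesOrderDetailID columns are assumptions for illustration):

= Table.AddColumn(Sales, "SalesKey", each Text.From([SalesOrderID]) & "-" & Text.From([SalesOrderDetailID]), type text)

The combined text column can then act as the single key on both sides of the relationship.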

In this example OrderDateKey is the active relationship, because we use it the most, and it joins to DateKey.

ShipDateKey and DueDateKey also join to DateKey in the date table and are inactive, but as the sketch below shows, DAX can still use them.
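
A minimal sketch of reporting over one of those inactive paths (table and column names assumed, matching the USERELATIONSHIP example later in this section):

Sales by Ship Date = CALCULATE(SUM(InternetSales[SalesAmount]), USERELATIONSHIP(InternetSales[ShipDateKey], DateTime[DateKey]))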

DAX Functions for Relationships to help with modelling decisions

RELATED

When creating calculated columns you can only include fields from the same table, unless you use RELATED.

For example, I’m adding the column Colour into the SalesOrderDetail table, which has a many-to-one join to Products:

Colour = RELATED(Products[Color])

RELATED allows you to use data from the one side of the join on the many side.

RELATEDTABLE

RELATEDTABLE uses data from the many side of the join.

TotalSales = SUMX(RELATEDTABLE(SalesOrderDetail),SalesOrderDetail[LineTotal])

USERELATIONSHIP

Forces the use of a specific relationship instead of the active relationship.

=CALCULATE(SUM(InternetSales[SalesAmount]), USERELATIONSHIP(InternetSales[DueDate], DateTime[Date]))

CROSSFILTER

Modifies the filter direction or disables propagation. You can actually do this in the model by changing the filter to both directions instead of single, OR you can do it for a specific DAX query using CROSSFILTER.
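
A minimal sketch, assuming InternetSales and Products are joined on a ProductKey column:

Sales Both Directions = CALCULATE(SUM(InternetSales[SalesAmount]), CROSSFILTER(InternetSales[ProductKey], Products[ProductKey], Both))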

TREATAS

Creates virtual relationships between tables:

CALCULATE([Total Budget], TREATAS(VALUES(Dates[Year]), 'Unconnected Budgeted Data'[Year]))

Our unconnected budget data is at Year level only, and it’s not joined to our main model.

Here we connect it up to Year in Dates. Then we can create a visual with Date from the Date dimension, Total Sales from our connected data (which is at daily level) and Total Budget from our unconnected budget data at a different level of granularity.

PATH

Naturalise a recursive relationship with the PATH function
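
A minimal sketch, added as a calculated column on a hypothetical Employee table with EmployeeKey and ParentEmployeeKey columns:

ManagerPath = PATH(Employee[EmployeeKey], Employee[ParentEmployeeKey])

This flattens the hierarchy into a delimited string (for example 112|23|189), which functions like PATHITEM can then unpack.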

Getting your model right and understanding your data sources is the most important thing to get right with Power BI. Make sure you don’t have lots of headaches six months into your project. It’s better to spend the time now than to have to start again later.

Moving Power BI to different Environments (Dev Test Production etc)

Currently we are in development and I only have a development database to connect to. However, very soon there will be multiple environments to connect to:

  • Development
  • Q&A
  • Production
  • Test

Currently the development area is simply hardcoded

If you need to change to a different environment to test the Power BI Reports against this environment, this becomes quite a task.

Transform Data to get back to Power Query Editor

Add in the new environment, remove the old one, and then check everything is still working.

Change to Parameters

Go to Options and Settings, and then Options.

Tick Always allow parameterization in data source and transformation dialogs

Server and Database can now be created by text or parameters.

Click on Server and New Parameter

Each environment is on a different server

Next comes the database (Environment)
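
As a sketch of where this ends up, the Source step in the query now references the two parameters instead of hardcoded names (ServerName and DatabaseName are the parameter names assumed here):

let
    Source = Sql.Database(ServerName, DatabaseName)
in
    Source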

Transform Data in Power Query Editor

You can see your environment parameters shown under Queries.

You can simply click on your parameter and change your current value

Then Publish to the correct Workspace

When you log into the App Workspace for that environment

Go to the data set and check the credentials

You can’t change your environment parameters here, but it’s a good way of double-checking that you have published to the correct App Workspace.

Personally I would like to see more here. It would be nice to associate an App Workspace with an environment.

Introduction to DAX

I have gone through a fair few training courses on DAX now, because it’s an important part of Microsoft analytics. I thought it would be nice to include all my notes from one of the first DAX courses I attempted.

You don’t have to be an expert on DAX to get started with Power BI. You can start with a few of the basics and get lots of help along the way in the forums.

DAX (Data Analysis Expressions)  is an expression language for slicing and dicing analytical data

Where is DAX Used?

  • Power BI
  • Power Pivot (Excel)           
  • SSAS Tabular Mode (You can’t use it in Multidimensional mode)
  • Azure Analysis Services (This is only available in Tabular Mode)

DAX IS USED HEAVILY IN THE MICROSOFT BI STACK

Where Does DAX Shine?

  • Aggregations and filtering. It’s optimized for both

What you need

  • An easy way of defining key metrics
  • You need to be able to slice and dice
  • You need to be able to do historical analysis

DAX meets all of these needs.

What is DAX not good at?

  • Operational reporting – detail-heavy, line-by-line reports used for day-to-day operations
  • Wide tables – tables with a lot of columns
  • Many-to-many relationships – there are ways around this, but it can be difficult to resolve

It’s worth noting that this visual was provided over a year ago, when Azure Analysis Services was the only way of creating the centralised back-end model. We are now at December 2020 and Power BI has moved on to become not only self-service, but a really great way to implement a standard centralised model for enterprise reporting. You do this by creating dataflows in the Service, which establish your transformations in one specific area for reuse. Then, over this, datasets can be created that can be reused by other developers once you have promoted them.

Thinking in columns not rows

  • OLTP – Online Transactional Processing (normalised schemas for frequent updates)
  • OLAP – Online Analytical Processing (facts and dimensions, star schemas etc.)
  • Single column aggregations – sales by area etc. (OLAP)
  • Large number of rows – sales by year; with 10 years of data that is lots of rows (OLAP)
  • Repeated values – data is flattened, unlike in an OLTP
  • Need to quickly apply filters – by area, postcode, year etc.

We may only want 3 columns in a table that has 50 columns. We just want the slice of those 3 columns and we don’t want to read the other columns

For this we store data as columns Not Rows

Vertipaq

Also known as xVelocity. It is the engine used to store data as columns. The data is imported into the Power BI data model as columns, not rows.

Direct Query

If you set up your connection as Direct Query, Power BI has to translate DAX formulas into relational SQL queries, so you lose a lot of the functionality, like time intelligence DAX, because it’s too complex for SQL.

Compression and Encoding

This is how Power BI compresses your data. Each column is compressed separately:

  • Value Encoding
  • Dictionary Encoding
  • Run-Length Encoding

Value Encoding

Let’s take the lowest value in the range and then store every other value as the delta (difference) from that value. This makes each stored value shorter in length, compressing the data. For example, with the values 100, 101 and 105, we store the base 100 and the deltas 0, 1 and 5.

Dictionary Encoding

Each distinct value gets assigned a number, which is then used in place of the actual data item. This takes up far less space. For example, the column values Red, Blue, Red can be stored as the dictionary 0 = Red, 1 = Blue and the encoded values 0, 1, 0.

Run-length encoding

The data is sorted, and we then keep just each value (the colour here) and the number of times it repeats. Repeating values create excellent compression. Unique values, not so much.

Adding Business Logic with Calculated Columns and Measures

The following examples have been set up using the old faithful AdventureWorks database.

I’m initially bringing through two tables:

  • Sales.SalesOrderDetail
  • Production.Product

and renaming them to Products and SalesOrderDetail

We are now ready to try some examples

Calculated Columns

Where should you be creating calculated columns? In Power Query using the M language, or in DAX?

The speedy answer is: if you can, always create them in Power Query. However, you do need to understand how a calculated column works in DAX in order to understand measures.

  • Expands a table by adding a new column
  • Operational, detailed information. Can only look at the specific row
  • It’s an expression that produces a column. Computed at the time of data refresh and stored within the table
  • Limited by a row context (e.g. Price – Discount). They don’t take advantage of the columnar database

In Power BI we are going to set up a line total in SalesOrderDetail, because where possible you need to put the new column in the right table, with the data that is being used to create it.

Click on the … against the table under Fields in Power BI and choose New Column:

LineTotal = SalesOrderDetail[UnitPrice] * SalesOrderDetail[OrderQty] * (1- SalesOrderDetail[UnitPriceDiscount])

It calculates the values for each row

The new column now sits in the model with the other fields (Go to Data Tab)

And you can use this like any other metric. In this table we are filtering by product Colour. Line Total has already been calculated within the row context.

You can now use Line Total in a table visualization, because we can treat this column like any other column, and it’s now being set in the implicit filter context, which in this case is Color.

Implicit context filters are when you add a description column which aggregates your metric within the visual, or when you add a filter to the visual. We will look at explicit filters later.

Related

When you create a column you can only create it from data in the same table, unless you add RELATED or RELATEDTABLE into your DAX.

Adding Color into the SalesOrderDetail table from Products:

1 Product can be sold Many times.

Colour = RELATED(Products[Color])

I have added data from another table into this table. RELATED allows you to pull data across from a related table. Note that in the previous DAX I only used data from one table, so I didn’t need to use RELATED.

RELATED is for the one-to-many relationship, using data from the one side on the many side. Colour (from the one side) now sits in SalesOrderDetail (the many side).

RELATEDTABLE

RELATEDTABLE uses data from the many side on the one side.

We are in Product

TotalSales = SUMX(RELATEDTABLE(SalesOrderDetail),SalesOrderDetail[LineTotal])

SUMX is an iterator: it takes a table and an expression to evaluate. This means we are going through each row in the Products table and running the evaluation. It’s using the related table to go through all of the LineTotals that we created in SalesOrderDetail.

So for example

Row for ProductID 707: sum up the LineTotal in SalesOrderDetail where ProductID is 707 and add that total into Products.

Next iteration, row for ProductID 708: sum up the LineTotal in SalesOrderDetail where ProductID is 708 and add that total into Products.

And repeat until you have iterated through the entire table for each row

Again, a quick rule of thumb is: if you are creating a calculated column using data from the same table, do it in the Power Query Editor. If you have to use data from other tables, use DAX.

Your data will be compressed like any other field if it’s been created with the Power Query Editor, but it won’t be if it’s a calculated column, because the column is created after compression.

Measures

  • Summarises all the data into a single value. Not stored on the table
  • Analytical. Takes a column and brings back a summary
  • Computed at runtime, so only stored temporarily
  • Every time you open a report, your measures are computed
  • Limited by a filter context (rather than a row context)
  • A measure looks at the data minus what has been filtered out by the user at that time
  • They are more loosely associated with a table, so you don’t need to RELATE tables

Again, in Power BI on the SalesOrderDetail table, create a measure:

Minimum Price = MIN(SalesOrderDetail[UnitPrice])

You can’t add your measure to a slicer or use it as a filter, because it’s created after the filters have been set.

Implicit measure – under the hood there is a measure, created by the DAX engine, for anything that you aggregate. For example your Total column. Here we are implicitly filtering UnitPrice by Colour to get the minimum value.

If you go into the Data tab, you won’t see this measure, because it only exists after you create your visual.

Filter data Using Calculate

Implicit

Applied by the user or the layout of the report. The visuals implicitly filter the data. When you drill up and down on a visual for example.

Explicit

Coded into the DAX Expression. Explicit filtering overrides Implicit filtering

Calculate

Calculate changes the current evaluation context

In Power BI, set up a table containing the Products Colour and Unit Price (set to Average rather than Sum).

BlackAveragePrice = CALCULATE(AVERAGE(SalesOrderDetail[UnitPrice]),Products[Color]="Black")

This calculates the average unit price only for the colour Black. It overrides the implicit filter of colour, changing the current evaluation context, e.g. on the White or Yellow rows.

Add a filter and notice that this is also ignored, because the explicit context filter overrides the implicit context filter of Red.

Next create a measure

IsColorFiltered = ISFILTERED(Products[Color])

ISFILTERED returns TRUE when columnName is being filtered directly.

Colour is filtering each of the line items, but is not filtered for the total, because that is the average of everything.

Now the grand total is also filtered by Grey once we have filtered on Grey using the slicer, so both are now TRUE.

Calculate is designed to be fast. For more complicated examples we need to use Filter.

Calculate only lets you compare a single column to a fixed value

Filter

CALCULATE(SUM([Total]), FILTER(Sales, QTY * Price < 100))

FILTER takes in two parameters: the table, and an expression that multiplies quantity by price and applies the filter of less than 100.

Now we want to see the average unit price discount: any time Order Quantity * Unit Price * Unit Price Discount is greater than 10 dollars, we want that row to count towards the average.

As an example, In Power BI we need another table of Color and UnitPriceDiscount (Set as Average)

Create a measure in SalesOrderDetail

DAX is a functional language; it’s built up from functions like SUMX etc.

DAX is executed from the innermost parameter outwards, so don’t write all your DAX on one line.

AverageDiscountGreaterthanTen =
CALCULATE(
    AVERAGE(SalesOrderDetail[UnitPriceDiscount]),
    FILTER(
        SalesOrderDetail,
        SalesOrderDetail[OrderQty] * SalesOrderDetail[UnitPrice] * SalesOrderDetail[UnitPriceDiscount] > 10
    )
)

Filter takes in two parameters. We need to tell it the table and then the expression we want.

FILTER creates a table of values to work with, because it is a table function. It’s always embedded in other functions, in this case CALCULATE.

You can’t have measures inside a filter.

To start to understand this, I’m going to add details for a specific order and product. I’m going in at the lowest level of granularity.

I wanted to see exactly what was going on in the filter, so I created a calculated column working on the row context:

Qty*Price*Discount = SalesOrderDetail[OrderQty]*SalesOrderDetail[UnitPrice]*SalesOrderDetail[UnitPriceDiscount]

So for Full Finger Gloves, L: 44 were bought, at around 17 dollars each with a discount of 15%.

The average of UnitPriceDiscount is the same as the summed UnitPriceDiscount, because the products are all identical with the same discount applied.

44 gloves at $17 would bring in $752. Applying the discount of 15% (0.15) shows that $112.83 has been discounted from the total price.

This is above 10, so we can simply display the average unit price discount.

ALL

Historical Sales = CALCULATE(SUM([Total]), ALL('Date'))

ALL clears any filters in the entire date table

For this example, we want the sum of Order Quantity for the specific filter context, divided by the sum of all order quantities. We now need CALCULATE because we are manipulating filters.

ALL takes in just the Products table, because we want to undo any filters from the Products table. (New measure in the Products table.)

% of total sales = SUM(SalesOrderDetail[OrderQty]) / CALCULATE(SUM(SalesOrderDetail[OrderQty]), ALL(Products))

So for the Black filter context, the order quantity of 81,937 is divided by the total of 274,914 (we have removed the filter for the entire Products table), so we can see that Black is 29.80% of the order total.

With a slicer on the report, we choose one product, and even though it’s the only row in the visual, % of total sales shows .13% instead of 100%, which is just what we want.

We still have 18% of total sales instead of it showing 100%, even though that’s the only row left in the visual. Perfect. Without ALL, as in the version below, it would have shown 100%:

% of total sales = SUM(SalesOrderDetail[OrderQty]) / CALCULATE(SUM(SalesOrderDetail[OrderQty]))

Looping over the Data with Iterators

Allows you to use multiple columns in an aggregation

What is an iterator?

Power BI’s DAX engine is optimized for working with columns not rows.

Iterators can process on a row-by-row basis. This is less performant because it is working with rows.

An iterator takes in a table parameter (what it is looping through), then takes in an expression (for each row, evaluate this expression).

We have used something before that takes in a table and then an expression: the FILTER function. FILTER is an iterator.

Many of them end in X: SUMX, MAXX, CONCATENATEX etc.

Average Gross Sales = AVERAGEX(Category, [Quantity] * [Price])

You could create Quantity * Price as a calculated column, so each row gets this evaluated first, and then average that column separately. The iterator does both in one formula.

Add a new measure into SalesOrderDetail and start typing in the following (type it in so you bring up the IntelliSense):

AverageGrossSales = AVERAGE(SalesOrderDetail[OrderQty] * Uni

If you type in the above, you notice IntelliSense stops working when you get to UnitPrice. This doesn’t work: the AVERAGE function only accepts a single column.

This is where you could create this part as a calculated column but we want everything in one measure

AverageGrossSales with Discount =
AVERAGEX(
    SalesOrderDetail,
    SalesOrderDetail[OrderQty] * SalesOrderDetail[UnitPrice] * (1 - SalesOrderDetail[UnitPriceDiscount])
)

Also, you can’t use AVERAGE because it only accepts one column. AVERAGEX can be used in a measure because it iterates through each row and then aggregates.

CONCATENATEX example

Add the following measure to Colours

Colours Incorrect = CONCATENATEX(Products, Products[Color], " ")

It’s only called Incorrect because this is the wrong way to do it.

This measure should show us what colours we have filtered on, but as you can see below, now I have added a Card, we get the filtered colours for every row, and we don’t want that. The CONCATENATEX is iterating through every row and adding every colour together.

Colours = CONCATENATEX(DISTINCT(Products[Color]), Products[Color], " ")

If you change the table part of the expression to DISTINCT(Products[Color]), this gives you just the distinct colours filtered on.

This time we added a DISTINCT, and if we choose more than one colour, a space will be placed between each colour.

Evaluation Context

Defines what values an expression can see

Nesting Evaluations

If the above is a little confusing (and it will be) here it is again

RANKX is an iterator

It’s iterating through Products, so we are going to start at the first product: Tulip.

It’s passing everything from Internet Sales related to Tulips into, basically, a temporary table.

We are passing this information into the SUM, and we repeat for every product.

And now the RANKX has information to work with
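
Putting that together, a minimal sketch of the pattern as a calculated column on Products (reusing the LineTotal column created earlier; the ProductRank name is an assumption):

ProductRank = RANKX(ALL(Products), SUMX(RELATEDTABLE(SalesOrderDetail), SalesOrderDetail[LineTotal]))

RANKX iterates every row of ALL(Products), evaluates the sales expression for each product, and then ranks the current product within that set.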

Analysing History with Time intelligence

We need to be able to compare periods. We can make use of DAX time intelligence

  • Single date functions – return a single date, e.g. the first date
  • Date table functions – return a table as the output, e.g. year to date: takes in a date column and returns the dates filtered year to date
  • Date aggregation functions – e.g. total year to date

Create a Date Table

Why can’t I just use the dates in the Sales table? Dates need to be contiguous.

How to 1:

  • Get Data > Blank Query
  • Next click on Advanced Editor

let
    Source = #date(2011,1,1),
    Dates = List.Dates(Source, 3650, #duration(1,0,0,0))
in
    Dates

Create the source date, which is 1st Jan 2011. Then take the source and create Dates: I’m creating 3650 rows with a duration of 1 day between each.

Then click on ‘To Table’ to convert the list to a table. Then I renamed the column to Date.

I can then add more date columns using M by duplicating the Date column and changing the duplicates to Year, Month and Day, as sketched below. Make sure Date is typed as a date.
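
As a hedged sketch of one of those steps (the previous step name #"Renamed Columns" is an assumption), a Year column can be added like this:

= Table.AddColumn(#"Renamed Columns", "Year", each Date.Year([Date]), Int64.Type)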

How to 2: You can also make this table using DAX

  • Modelling tab > New Table

DatesDaxCALENDAR = CALENDAR("01/01/2011","01/01/2021")

This is your starting point, but we can amend this to include other date columns

DatesDaxCALENDAR = ADDCOLUMNS(CALENDAR("01/01/2011","01/01/2021"),"Year",YEAR([Date]),"Month",MONTH([Date]),"Day",DAY([Date]))

If you use CALENDARAUTO, it looks at the dates in your data and automatically calculates the start and end based on your data set.

Connect your new date table up to the Sales table

For the Power BI exercise, I am going to use the Date table created with M.

Dates as Surrogate Keys

Sometimes when you work with data warehouses you have an integer as the surrogate key for a date, 20190206 for example.

If you have to join to a surrogate key:

  • Modelling > Mark as Date Table. Say which column is the date, and now this will work. The only reason to mark as a date table is if you are joining to a surrogate key.

If you move everything to Analysis Services, you MUST have a date table marked as a date table.

Year over Year and Year to Date Analysis

First, Add Sales Header to your Power BI Report from the Database

  • Edit Queries > Recent Source > Choose the Adventureworks Source > Sales.SalesOrderHeader > Import
  • Rename to Sales Header and close and apply
  • Go into relationships and set as follows

Sales Header OrderDate to Date in DatesM (ensuring they are both dates).

Disconnect the other Date table. This is for reference only

Now create this year-to-date measure in the DatesM table, which uses Date from the date table:

YTD DatesM = TOTALYTD(SUM('Sales Header'[TotalDue]),DatesM[Date].Date)

As you can see this works. Year to date resets when we get to the next year.

Get the Sales from the Previous Year (Added measure to the Sales Header Table)

Prev Year = CALCULATE(SUM('Sales Header'[TotalDue]),PREVIOUSYEAR(DatesM[Date]))

YoY Growth = ([Prev Year] - SUM('Sales Header'[TotalDue])) / SUM('Sales Header'[TotalDue])

So take the current Total Due away from the previous year’s Total Due, then divide by the current Total Due. For example, if last year was 100 and this year is 120, this formula gives (100 - 120) / 120, i.e. -16.7% with this sign convention.

This is only a starting point really, but it gives you a good idea of some basic DAX, how to write it and why.