Power BI Premium Gen 2 First Look

Currently Premium Gen 2 is in Preview (Jan 2021) but it looks really exciting for those with Premium capacity. Let's have a look at Gen 2 and see how it differs from Gen 1.

Gen 1

With Premium Gen 1 we are bound by the number of vCores and by the memory we have. Let's take the P1 SKU as an example.

P1, P2, P3 and P4 are all Premium SKUs. Each one has more vCores for better performance.

SKU = Stock Keeping Unit

vCores = Virtual Cores, a purchasing model which gives you more control over compute and memory requirements.

When it comes to Premium, reports would slow down if there were too many queries running. And if people were using many models at the same time you would have to wait for your time slot.

As you can see, there is only 25 gig of memory across the datasets. Once a report isn't being run and used any more, that memory gets dropped and is added back into the pool.

Report users and report developers are also fighting with report refreshes.

Premium Gen 2

Gen 2 is not in general availability yet, but if you have Premium you can switch to Gen 2 in preview.

In Power BI Admin (as the Power BI administrator or Global Admin)

Go to the Capacity Settings and switch Gen 2 from disabled to enabled.

You can also go back to Gen 1 if you need to, but if you do, make sure you flag up any issues you are having.

Let's have a look at the model compared to Gen 1.

Autoscaling

End users encounter throttling and performance issues with Gen 1 because they physically only have 4 backend vCores. With Gen 2, autoscaling allows you to deal with spikes. This is not available yet but will be coming, and it is helped by the fact that there are other vCores that can be called on.

If you do come up against the 4 core limitation, it can lend you a vCore so your end users don't see the impact.

Previously our admins had to deal with this kind of problem, so this will really help automate these kinds of issues.

Memory

Datasets can now collectively go over the 25 gig memory capacity. Previously Premium gave you 25 gig shared across all the datasets; now each dataset is gated individually.

This is a fantastic update. We don’t have to worry about the collective size of all our data sets.

Refreshes

Previously there was a maximum of 6 refreshes at any one time; beyond that you could get throttled.

With Gen 2, refreshes get spread out over a 24 hour period and don't impact other queries from users. Refreshes just run.

This looks great. People are seeing refreshes of an hour and a half coming down to 10 minutes.

Capacity Usage Metrics

This is coming soon and will have a breakdown of items.

It's a little annoying when you have set up Gen 2 and want to view the metrics to see how everything is working but currently can't.

With Gen 2 we will also be able to work against a chargeback model. This means that we can spread the costs of Premium between distinct areas of an organisation dependent on their usage.

Workloads

Again, the workload settings aren't fully functional at the moment but more will be coming.

For example, for dataset workloads we can specify minimum refresh intervals and execution intervals, and we can detect changes in our metrics.

We don’t have settings for dataflows and AI yet.

Why go with Premium Gen 2 Checklist

  • Performance benefits
  • End users see faster reports
  • Refreshes, we now don’t have refresh bottlenecks and we remove refresh failures due to throttling
  • Premium per user
  • Improved metrics will be introduced soon
  • Autoscaling
  • Proactive admin notifications

Why it may be worth waiting until Preview becomes GA

It looks like people are having some issues with dataflows and there is already a known issue about this.

It looks like this might be fixed quickly. In the meantime, a workaround is to move your dataflows out into another workspace and then back in, but hopefully this will get much better.

Questions

Is Premium Gen 2 going to be the same price as Gen 1?

Is there any way to find out how many dataflows you have if dataflows are an issue?

Will we still give great functionality to the Power BI Pro users?

Power BI November 2020 release anomaly detection new AI visual

Last month we had smart narratives, which was amazing. This month we get another AI feature: anomaly detection. So let's have a look at how this works.

It's still in preview at the moment, so make sure you go to File > Options and Settings > Options and turn it on to try it out.

First, let's create a line chart for every day's sales (I am using AdventureWorks data).

Make sure the X axis type is continuous rather than categorical. Now you can turn on Find Anomalies from the Analytics pane.

If your data goes outside of the expected boundary it shows as an anomaly.

You can also format the anomaly information

I increased the sensitivity to 75%

Another thing you can do is get an explanation of the anomalies.

Click on the anomaly icon to get possible explanations (If there are any).

Adding data items into Explain by will restrict the explanations to those specific fields. Without anything in Explain by, Power BI will use all the data to explain the anomaly, so this is a good way of removing items that you know you aren't interested in.

Unfortunately there are no explanations for these anomalies, but it will be interesting to try this on other line charts.

Let's switch to our internet sales and try again.

Again, no luck with the explanations, but let's publish to see what happens.

Great. You won't see the explanation in Desktop, but once published, the user can click on the anomaly and see the possible explanations.

Here we have added Explain by categories for Product. It looks like the explanation is based around low value clothing products.

If you are going to use the line chart against your lowest level time series data, this is a great feature to use.

Power BI September 2020 Updates Total Labels for Stacked visuals

We recently had a requirement for a visual for a stacked column chart with a grand total

Our stacked chart contains:

  • Axis – Month year
  • Legend – Category
  • Values – Metric

To get this working we had to create a line and stacked column chart.

The line was hidden so we just had the data points, and we just about got away with it. It did look a little messy though.

For the Line we went to Format

And set the stroke width to 0 so the line wasn’t visible. Only the data points and labels are visible.

Now there is a fix for this specific issue, so we can try it on this report visual.

First the report gets changed back to a stacked column chart

In format there is a new Total labels Option

Total labels are now on and a background colour of grey has been used.

The visual now looks much simpler. This small change will make a great deal of difference. Time to find all those workarounds in the reports we have already created.

Power BI Updates September 2020 – Mini blog – Maintain layer order

To get an understanding of this new feature let's have a quick look at a chiclet.

It currently has a rectangle shape behind it. If you click on the rectangle and it becomes highlighted, the shape is brought to the front so we can't see the chiclet.

So every time you hover over the report with the rectangle selected, it pops up, which is quite annoying.

With the rectangle still selected, go to its Format pane and switch Maintain layer order on.

Now even with the rectangle selected, it won't pop up over the chiclet.

This is great if you have more complex reports with bookmarks and underlying shapes to create a nice report structure. You can set this so your backgrounds won't obscure your visuals when you are using the reports.

It's a pity this doesn't do the same within Desktop. When you are a developer you are still hit with the same issue, so this only applies to the Service.

Power BI September 2020 Updates: Smart Narratives

We have been hearing so much about this, and I am very excited for smart narratives.

Smart narratives offer insights into your visuals, and the generated text is also customisable.

Let's use the AdventureWorks demo report to try it out.

Because it's still in preview you will need to turn it on in Options and settings > Options.

Add Smart narratives to a visual

I have a report that contains a scatter chart. I always find these a bit difficult to read.

You can right click on the visual and choose Summarize, and this will add a smart narrative for that specific visual.

We can now immediately see that 2010 was our best year. We are also told what our best selling product was and our highest earning product.

If I click on a bar in the bar chart for year, the smart narrative doesn't change. Neither does it change if I click on Europe in the treemap.

It does change if you select a point in the scatter chart

I have selected a point in 2014 which also cross filters the other two visuals.

The smart narrative is actually not that helpful now: ‘2014 had the highest total sales’.

This isn't true. Because we have selected 2014, this shouldn't really be in the text.

The smarter version would be to tell us which month had the highest sales in that calendar year, but I only have year in the visual.

Add Smart narratives to a Report

Here is a report created earlier for Sales.

Select Smart narratives from visualisations

We now have some great insights into the report as a whole.

Again there is a problem when, for example, you select a year. For 2009 we get:

‘Sales Amount and Order Quantity diverged the most when the calendar year was 2009.’ Again, this isn't actually true; it's simply because the selection of 2009 means the smart narrative only has 2009 data to go on. I suspect this may need tweaking in a future release for users who are selecting data points.

Edit Existing text

We wanted to know the maximum discount applied, so a sentence was added to the end, then + Value was selected to add in the new value.

You can also use Q&A to add information into the smart narrative

Here we want to add the year to the maximum discount

Each value has a name and we can now find that value and format it

In this instance it's worth making sure you name your values properly so you can get back to them.

Here I set the Maximum discount to English pounds

We can also make our text bold

So far, really good. I can't wait to use this on reports. However, some work needs to be done for when a user selects a value on a visual, because the narrative only works on the selected item and therefore assumes it is the highest value.

It would also be great if the smart narrative set things like values and years in bold automatically.

Other than that, a great start. It would be interesting to see how it does against other tools like Oracle's and Tableau's offerings.

Power BI Updates August 2020 Organisational visuals in Power BI Admin

Power BI report developers can either use the default Power BI visuals, or go to AppSource to pick up visuals.

Some of these visuals are Power BI certified, which means that they have been through rigorous tests and have passed all the requirements.

But there are many other visuals you can use; for some of them you have to set up an additional purchase to get the additional features.

The question is always, as an organisation, do we allow users to use any visual, just the certified visuals, or only default visuals?

Organisational Visuals

The great news is that in Power BI Admin, you can now add visuals that you are happy for everyone to use within organisational visuals.

Let's try it out in the Admin Portal.

You can go straight to AppSource to add in the visuals

For this post, the following visuals are added

These are all popular visuals and now the admin has added them as organisational visuals.

There are a few things we can do at this point

Click on settings and enable the visuals for the entire organisation.

The enabled visuals are available for everyone in the default visuals pane in Power BI Desktop

Admin tenant settings

To add more control over visuals, it's possible to disable every visual that isn't certified.

Visuals can then be added if they are deemed to be required within organisational visuals.

Pros

More knowledge and governance on what is being used across the organisation

Easier to resolve issues. There are cases where visuals have been changed or simply scrapped, which has caused issues with reports because there is no backwards compatibility.

Cons

Users who enjoy spending time testing visuals and seeing what they can do, your power users, may feel that they are being stopped from getting the most out of Power BI.

Bottlenecks. If you need to get your admin to add a visual, this may take time. One of the great things about Power BI self service is the ability to simply play with your data, and this may get lost for certain people without full use of visuals.

Using the organisation visuals

Your Power BI developers can now access visuals on the My Organisation tab knowing that these have been recommended at admin level

It's completely up to you what level of access you require for visuals, but the fact that we can now add selected visuals automatically into every project is great news.

However I do have one question. If visuals can only be used when they are added to the view, how do you discover new visuals or check out new ones to see if they should be added?

Calculation groups with Tabular Editor for Power BI

Calculation groups look like a game changer, but are they? These have been available for Analysis Services tabular models for a while. Now Tabular Editor lets you build calculation groups for Power BI.

Taking one of our current pbix files as an example.

There are metrics for number of events, financial metrics and timing metrics. Each one has to be added to Year to Date, Last Month, Last Year, Month to Date etc. measures.

The same measures are repeated over and over again for the different metrics in the report.

And adding all these measures in Power BI Desktop isn't a fast process. It's actually quite fiddly, so adding the first lot of DAX required always takes a lot of time.
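
To illustrate the repetition, you end up with measures along these lines for every base metric (the measure names here are just illustrative, not the actual report):

YTD Sales = TOTALYTD([Total Sales],DimDate[FullDateAlternateKey])
YTD Events = TOTALYTD([Number of Events],DimDate[FullDateAlternateKey])
YTD Quantity = TOTALYTD([Total Quantity],DimDate[FullDateAlternateKey])

And then the same pattern again for last year, month to date and the rest, for every metric.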

The Solution

As of July 2020 you can reach Tabular Editor via External Tools.

  • You must have the external tools installed at the right version for this to work.
  • You must enable Store datasets using enhanced metadata format in Options > Preview features

You are taken straight to the model you are connected to

Right click on Tables, then choose Create New and Calculation Group.

Time Intelligence

The best example to look at is time intelligence

This example is renamed to Time Intelligence. What has been created is essentially a single column table.

The rows are based on all the calculation items that I already have in this model.

I initially recreate all the metrics from source as SUMs so they can be used within measures. Along with this I may have other base calculations, like a distinct count of IDs to make a number of events, plus minimum, maximum, average and so on.
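
As a rough sketch of those base measures (the table and column names here are placeholders rather than the actual source):

Total Sales = SUM('Sales'[Sales Amount])
Number of Events = DISTINCTCOUNT('Event Metrics'[Event Id])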

Right click Calculation Items and choose New Calculation Item.

In this example, go back to the original pbix and copy a time intelligence DAX measure:

YTD Sales = TOTALYTD([Total Sales],DimDate[FullDateAlternateKey])

Total Sales is a measure created with SUM() using the Sales Amount metric from the data source

The [Total Sales] measure reference is replaced with the scalar function SELECTEDMEASURE().
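
So the YTD calculation item ends up with an expression along these lines (a sketch, assuming the same DimDate table as the measure above):

TOTALYTD(SELECTEDMEASURE(),DimDate[FullDateAlternateKey])

SELECTEDMEASURE() stands in for whichever measure is sitting in the visual, so the same item works for sales, events, quantity and so on.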

Instead of having to recreate this DAX across all your metrics where necessary, you only need to do it once.

Scalar function – a function that returns one single value

The base DAX in this data set can now be added into Tabular Editor

You can imagine starting a prototype model with DAX for a specific metric, then recreating it as calculation groups so you can add all your other metrics without having to copy and paste the same measures over and over again.

Tabular Editor currently has no IntelliSense, so you wouldn't write and test your DAX in here; that would be done in Desktop. If you wrote incorrect DAX in here and saved to Power BI, you would then see an error icon in Tabular Editor.

Adding more time Intelligence from the original report

The rows are based on all the calculation items that already exist in this example model.

However the DAX needed to be amended, because previously the year to date measure was used to create the variance (two separate measures). In a calculation group, each item needs to contain the underlying expression itself, which we do by using variables.

So for example

YTD Sales Var % = DIVIDE([YTD Sales Var],[LY YTD Revenue])

Note that the YTD Sales Var and LY YTD Revenue measures have already been created over the original year to date measure. We can't have that here, so we need to move to:

YTD Sales Var % = VAR CY = [Total Sales] 
VAR TYTD = TOTALYTD([Total Sales],DimDate[FullDateAlternateKey]) 
VAR LYTD = CALCULATE(
                   TOTALYTD([Total Sales],DimDate[FullDateAlternateKey])
           ,SAMEPERIODLASTYEAR(DimDate[FullDateAlternateKey])) 
VAR VYTD = TOTALYTD([Total Sales],DimDate[FullDateAlternateKey])-
          CALCULATE(
                   TOTALYTD([Total Sales],DimDate[FullDateAlternateKey]),
          SAMEPERIODLASTYEAR(DimDate[FullDateAlternateKey])) 
RETURN DIVIDE(VYTD,LYTD)

Within this measure we build the base expressions as variables and create the variance % by calling those variables.

Now these time intelligence measures can be created in Tabular Editor:

current
YTD
Last YTD
YTD Var

Notice that you can't re-filter one variable from inside another: a DAX variable is evaluated once and then behaves as a constant, so wrapping it in CALCULATE or SAMEPERIODLASTYEAR has no effect. That means our total year to date expression has to be re-added within the last year to date item.

If we were doing this without calculation groups in mind we would simply reference the year to date measure we had created

YTD Var %

Again, notice how we can't reuse a variable inside the filter context of another, so LYTD needs the full TOTALYTD DAX inside the SAMEPERIODLASTYEAR call.

And year to date variance requires both last year to date and total year to date. This makes your DAX more complex than simply using a measure already set up.

In the above instance I could take out the TYTD variable but, honestly, I'm leaving it there as a reminder of what the DAX is for year to date.
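
As a sketch, the whole YTD Var % calculation item then looks something like this once [Total Sales] is swapped for SELECTEDMEASURE() (condensing the variance into the RETURN, and assuming the same DimDate table):

VAR TYTD = TOTALYTD(SELECTEDMEASURE(),DimDate[FullDateAlternateKey])
VAR LYTD = CALCULATE(
                   TOTALYTD(SELECTEDMEASURE(),DimDate[FullDateAlternateKey]),
           SAMEPERIODLASTYEAR(DimDate[FullDateAlternateKey]))
RETURN DIVIDE(TYTD-LYTD,LYTD)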

YoY Var %

LY
YoY

QTD

As you can see, each measure uses one base measure

And for the percentages I added the format string expression “0.00%”.

Save the calculation group into the main desktop file

Let's see the calculation group in action using a matrix.

Name from the time Intelligence group has been added to columns on the matrix.

Year and Month are in Rows

Total Sales, the base measure has been added to values

Name has also been added to a Slicer so you can turn off those you aren’t interested in.

To me, the big win of calculation groups is being able to create one measure and use it across multiple metrics instead of creating the same measure over and over again.

And you can quickly switch to quantity by adding that measure to Values.

Issues with using calculation groups

  • Formatting each item is much more difficult in Tabular Editor because you have to know the code. There is no help for creating these expressions.
  • Similarly, you can't develop DAX here; you need to do it first within Power BI Desktop.
  • Your DAX has to become much more complex, rather than creating a measure and then referencing this measure within another measure, like year to date and last year to date.

You can't create the matrix with months in columns.

This is using the measures created in the data set. As you can see, the Months are shown as columns and the measures are also displayed

You can't do this with calculation groups.

Let's have a look at some visuals already created with time intelligence to see if we can recreate them.

Variance Waterfall chart

This is an example of a waterfall variance chart for year to date. If we didn’t have measures outside of a calculation group, could we still pull off the exact same solution?

Table: no, you can't recreate the table. You would have to move to a matrix.

That matrix would need a filter on Name to limit it to the measures you are interested in. It doesn't appear that you can turn off grand totals though, so we have some information there that we don't need: the Total column.

So, ok but not great

Waterfall visual: there doesn't appear to be any way you can recreate this, so to create my variance visuals there has to be a measure that can be used.

Conclusion

Calculation groups seem to work for only one visual, the matrix, where you can add the base measure to Values and the Name to columns.

Even the matrix can't be fully configured to how it would normally be shown, with months in columns and the measures underneath.

In order to use time intelligence for all other visuals, there still need to be measures in your model to use.

It would be a great idea if you could fully get rid of your duplicated measures, but you can't.

Because of this, I still don’t have the results I need to spend time creating calculation groups.

Other people seem to be seriously excited about this though, so maybe I am missing something. Please let me know if I am; I would be very interested to get another point of view.

Power BI July 2020 Mini Blog – Launch External Tools

This is part of the semantic modelling for Power BI programme and it's a big one. Very exciting.

The first thing you should see is External Tools on your ribbon (making sure you are on the July 2020 release of Desktop).

In the first instance, this doesn't appear to be available. Why is this missing?

This is because you may not have the latest releases of the External tools.

These tools are:

  • ALM toolkit
  • DAX Studio
  • Tabular Editor

Currently, I only have DAX Studio, but this isn't enough to give me the External Tools menu item. What's going on?

DAX Studio

My Current release is

Going to the DAX Studio site, there is a new version:

https://daxstudio.org/

This is the release you need for Power BI. Click on the installer to download the new release (Power BI has been closed down during this process).

Do you want to run this file? Click Run

Once installed open up Power BI

External Tools is now an option

However DAX Studio is greyed out

Go into Preview features and make sure Store datasets using enhanced metadata format is turned on.

Check the External Tools Folder

C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools

I now have the DAX Studio JSON file in the External tools folder

ALM Toolkit

http://alm-toolkit.com/

After installing, take another look at your External Tools folder.

Tabular Editor

https://tabulareditor.com/

Again, a new JSON file is added to the folder

So now we have all three tools connected into Power BI. Let's have a look at what they can do for you. The following is a few basics from each tool.

DAX Studio

With a Power BI pbix file open, go to External Tools and click on DAX Studio. It will automatically connect to the dataset you have open.

Go to Advanced and View metrics

This is a great way of using the VertiPaq Analyzer to look at tables, columns, relationships, partitions and a summary of your data model.

You get to see which items have low and high cardinality and which items are taking up too much space in your model, and it really helps you get a feel for what you can do to make your model better.

ALM Toolkit

The ALM Toolkit comes from the Analysis Services space. Because Power BI now encompasses an enterprise architecture along with self service, we can start to use these community tools with Power BI datasets.

It's a schema comparison tool for your datasets.

Use it in many scenarios, such as incremental refresh and metadata-only deployment (now available for Premium and Pro).

This is another big hitter and requires a much more in depth look

Tabular Editor

Again, Tabular Editor will connect to the tabular database you have open within your pbix file.

This is an excellent tool to help with quickly modifying tabular models.

If you do anything in here, like add measures into a folder, you will see the change straight away in Power BI Desktop.

You can also test and create measures in here, which can be a much easier way to do this than within Power BI Desktop.

Basically, using this tool you can automate some of the more tedious tasks and save lots of time.

Like DAX Studio, you can also analyse your entire model to get a feel for any improvements you can make.

There are some great videos out there on how to make the most out of this tool and if you use Power BI heavily this is a must have.

Power BI July 2020 Mini Blog – Gradient Legends

It's always nice to work through the updates by relating them to something you are working on.

Let's have a look at the gradient legends.

Let's start with a simple events chart for 2020.

The measure is:

Number of Events = DISTINCTCOUNT('Event Metrics'[Event Id])

So now let's colour the columns based on the number of events from last year:

LY Events = CALCULATE([Number of Events],SAMEPERIODLASTYEAR('Date'[Date]))

This is great because we now get an idea of how different last year was, but without any legend information it's quite difficult to understand the new information.

Let's add a legend.

Here is where the 2020 updates come into play. We can now see that the colour is related to last year's events and goes from 0.9K up to 1.2K, so we can see that in 2019 July had the highest number of events.

(I have set the colour to red because, in the context of our report, more events isn't actually a good thing.)

Just remember, you can't add a legend data item into your visual if you are going to do this, but this is a really great new feature.

Power BI: why marking a date table is a good thing

We are looking at a Star Schema with a Date table connected to the Fact table by Order date Key

For these examples, the active relationship is the only one that matters.

Let's have a look at all the issues arising from using the Date table without marking it as a date table.

Date Slicer seems to remove the report filter flag for the last 3 years

Rolling 3 year flag = IF(DATEDIFF('Date'[Date].[Date],TODAY(),YEAR)<=3 && DATEDIFF('Date'[Date].[Date],TODAY(),YEAR)>=0,1,0)

This flag has been added to the entire report and set to 1 to make sure there is only information from these 3 years

If I add date as a table I get 2017, 2018, 2019 and 2020, as expected for a rolling flag.

However, as soon as I change that table into a slicer, all the years appear in the data set.

In this case from 2016; the filter stops being applied on the slicer.

DAX FUNCTION SAMEPERIODLASTYEAR and the date hierarchy

We have some DAX to create SAMEPERIODLASTYEAR.

This is the DAX

LY Total = CALCULATE([Total],SAMEPERIODLASTYEAR('Date'[Date].[Date]))

Note the .[Date] at the end. Because we are using the date hierarchy to create the measure, you can choose which level of the hierarchy to use. In this case we are saying we want the date level.

This will come back later

This is used in a visual and we have added Month Year which is a column in the date table.

As you can see, last year displays exactly the same as the current measure. This does not work

It does work using the date hierarchy in the visual, but in this instance the date hierarchy isn't required. We don't want the user to have to drill down to month from year every time.

In this example, the only field you can use for any measure that is date time specific is the date field

Mark as date table

Always make sure you have a date table connected to a fact table

Note that the Date is now not connected to a back end hierarchy table.

Marking it as a date table means that all the data within this table can be used for date measures.

However, let's look at the DAX that was created.

All the last year measures are now erroring. This is because of the .[Date] at the end of the DAX.

The date does not contain a hierarchy any more, so if you have used .[Date] this needs removing. It's specifically related to the hierarchy.
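
For example, the earlier measure just drops the extra .[Date] (same tables as before):

LY Total = CALCULATE([Total],SAMEPERIODLASTYEAR('Date'[Date]))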

A Year to day hierarchy has been created in the date table. This means that you have created a more compact data model and saved space.

And you can now use Month Year on the Axis

Date Table is Marked. How to work with it

This is part of the star schema in Power BI

Now the date table is marked as a date table, the automatic date hierarchy is removed for Date within the date table. This saves space and you can simply create your own date hierarchy to work with.

The Active Join is on Received Date

All my measures so far are based on received date

Each date in the fact table creates an extra hidden date table to produce the hierarchy. So to save space, you should create inactive joins to the other dates and then remove the dates in the fact table, leaving just the keys. The model should then reduce in size.

This works great for the measures. I can create measures based on the inactive relationships and simply choose to use USERELATIONSHIP.

LY Totals by Closed Date = CALCULATE([Total],USERELATIONSHIP('Complaint Metrics'[Closed Date Key],'Date'[Date Key]),SAMEPERIODLASTYEAR('Date'[Date]))

The above is an example of using the inactive Closed Date relationship for last year's totals.

So I can have measures for Received This Year, Received Last Year, Closed This Year and Closed Last Year (for example).

This is all absolutely spot on. However there is more logic we need to think about. What about when the users want to create a drill through?

This visual was created on the active relationship, so it's received date.

However your users may want to drill through to the following

How do we do this?

Currently you may have the active join on received date and inactive joins on start and closed date.

In my example, I have also kept the dates in the fact table along with the Date Keys for each Date.

Because they are in the fact table they all have date hierarchies.

Remove the Date Hierarchies across your entire dataset

You could keep them there and simply remove their hierarchies

File > Options and Settings > Options

You can turn off the auto date/time time intelligence for the specific report or globally; the recommendation is to actually do this globally when you use a date table.

This way you can use the dates in the fact table for drill down without creating unnecessary data with the date hierarchies

Role Play dimensions

Role playing dimensions work well if you want to drill down to all your dates on a regular basis, and there isn't one main date with other dates that are used less often.

In Power Query Editor go to your date table and create a reference table for each date

In this case the Date table has been referenced once and the reference is named Date Created; the original date table has been renamed to Date Engagement.

With this set up, there is only one join to each date key, so there are no inactive relationships.

Any DAX you create can reference the correct date table, and your dates can be removed from the fact table.
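
As a sketch, a measure over the referenced table might look like this (assuming the reference is named Date Created as above, and the [Total] measure from the earlier examples):

LY Total by Created Date = CALCULATE([Total],SAMEPERIODLASTYEAR('Date Created'[Date]))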

The big downside is that you have a much more complex schema with many more dimensions, so only go for this if your dates are all frequently used.

In the example above, the user just wants to drill through and see all the dates, so they can be left in the fact table in a date folder without their hierarchies.

But this has been a really useful bit of research on marking the date table, date hierarchies and role playing dimensions.
