Power BI Premium Gen 2 First Look

Currently Premium Gen 2 is in preview (January 2021), but it looks really exciting for those with Premium capacity. Let's have a look at Gen 2 and see how it differs from Gen 1.

Gen 1

With Premium Gen 1 we are bound by the number of vCores and by the memory we have. Let's take the P1 SKU as an example.

P1, P2, P3 and P4 are all Premium SKUs. Each one up the range has more vCores for better performance.

SKU = Stock Keeping Unit

vCores = virtual cores, a purchasing model which gives you more control over compute and memory requirements.

When it comes to Premium, reports would slow down if there were too many queries running. And if people were using many models at the same time you would have to wait for your time slot.

On a P1 there is only 25 GB of memory across all the datasets. Once a report isn't being run and used any more, that memory gets dropped and is added back into the pool.

Report users and report developers are also competing with report refreshes for the same resources.

Premium Gen 2

Gen 2 is not generally available yet, but if you have Premium you can switch to Gen 2 in preview.

In Power BI Admin (as the Power BI administrator or Global Admin)

Go to the Capacity Settings and switch Gen 2 from disabled to enabled.

You can also go back to Gen 1 if you need to, but if you do, make sure you flag up any issues you were having.

Let's have a look at the model compared to Gen 1.

Autoscaling

End users hit the throttling and performance issues with Gen 1 because a P1 physically only has 4 backend vCores. With Gen 2, autoscaling allows you to deal with spikes. This is not available in the preview yet but will be coming, and it is helped by the fact that there are other vCores that can be called on.

If you do come up against the 4-core limitation, the capacity may lend you a vCore so your end users don't see the impact.

Previously our admins had to deal with this kind of problem manually, so this will really help automate these kinds of issues.

Memory

Datasets can now go over the 25 GB memory capacity. Previously, Premium gave you 25 GB across all the datasets; now each dataset is gated individually.

This is a fantastic update. We don’t have to worry about the collective size of all our data sets.

Refreshes

Previously there was a maximum of 6 refreshes at any one time on a P1; any more than that and you could get throttled.

With Gen 2, refreshes get spread out over a 24-hour period and don't impact other queries from users; refreshes just run.

This looks great. People are seeing refreshes of an hour and a half coming down to 10 minutes.

Capacity Usage Metrics

This is coming soon and will include a breakdown by item.

It's a little annoying when you have set up Gen 2 and want to view the metrics to see how everything is working but currently can't.

With Gen 2 we will also be able to work against a chargeback model. This means that we can spread the costs of Premium between distinct areas of an organisation depending on their usage.

Workloads

Again the workload settings aren’t fully functional at the moment but more will be coming.

For example, for dataset workloads we can specify minimum refresh intervals and minimum execution intervals, and we can detect changes in our measures.

We don’t have settings for dataflows and AI yet.

Why go with Premium Gen 2 Checklist

  • Performance benefits
  • End users see faster reports
  • Refreshes – we no longer have refresh bottlenecks, and refresh failures due to throttling are removed
  • Premium per user
  • Improved metrics will be introduced soon
  • Autoscaling
  • Proactive admin notifications

Why it may be worth waiting until Preview becomes GA

It looks like people are having some issues with dataflows, and there is already a known issue raised about this.

It looks like this might be fixed quickly. In the meantime, a workaround is to move your dataflows out into another workspace and then back in, but hopefully this will get much better.

Questions

Is Premium Gen 2 going to be the same price as Gen 1?

Is there any way to find out how many dataflows you have if dataflows are an issue?

Will we still give great functionality to the Power BI Pro users?

Azure Built-in Tagging Policy: Resource Group to Inherit Subscription Tags

Ensure your Azure resource groups are tagged with the tag from the subscription.

In this example we have the following requirements

  • The costCentre tag has been manually added to the Subscription
  • The Resource Group inherits the tag from the subscription it sits in, but it can be manually overridden

costCentre Tag Configuration

  • Modify the Resource Group to add the costCentre tag from the parent Subscription

Go to Policy

And Click on Definitions

Search for Inherit a tag from the Subscription

Currently this built-in definition only works for resources. In order to make it work for resource groups as well, it needs updating.

Click on the definition and then Duplicate definition.

The Definition location has been set to a subscription but you can also set it to a Management Group

{  "mode": "Indexed",

In the policy JSON, the mode is set to Indexed; this needs changing to All so that the policy also applies to resource groups.

{  "mode": "All",
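For reference, the duplicated definition ends up roughly the shape below. This is a trimmed sketch rather than the exact built-in JSON (the built-in text may differ slightly); tagName is the parameter you supply when assigning, and the role ID shown is the Contributor role the managed identity will use for the modify operation.

{
  "mode": "All",
  "parameters": {
    "tagName": {
      "type": "String",
      "metadata": {
        "displayName": "Tag Name",
        "description": "Name of the tag, such as costCentre"
      }
    }
  },
  "policyRule": {
    "if": {
      "allOf": [
        {
          "field": "[concat('tags[', parameters('tagName'), ']')]",
          "exists": "false"
        },
        {
          "value": "[subscription().tags[parameters('tagName')]]",
          "notEquals": ""
        }
      ]
    },
    "then": {
      "effect": "modify",
      "details": {
        "roleDefinitionIds": [
          "/providers/microsoft.authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c"
        ],
        "operations": [
          {
            "operation": "add",
            "field": "[concat('tags[', parameters('tagName'), ']')]",
            "value": "[subscription().tags[parameters('tagName')]]"
          }
        ]
      }
    }
  }
}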

And Save the new custom Policy

Once you have a new custom definition, we can assign it. Go back to the top of Policy and click on Assignments.

Time to assign a new Policy

No need to select a resource group, because this applies to all resource groups, including those that have yet to be created.

Choose Inherit a tag from the subscription if missing (Resource Group) which is the new custom policy

The Assignment Name has been changed to include the tag

And the tag name is added as a parameter. Ensure that the tag name matches the tag on the subscription.

We need a managed identity because this policy has a modify effect (it changes resources on our behalf).

Then go to Review + Create and Create

Test the New Modify Policy

This is the tag on the Subscription

Create a new Resource Group (and remember to apply any tags required by policies you have already set).

And there is the costCentre tag, which has been inherited from the subscription.

PowerApps in Power BI – Simple Power App to Insert a record into a Table to be used for a report Filter

First of all, I would like to big up Guy in a Cube https://www.youtube.com/channel/UCFp1vaKzpfvoGai0vE5VJ0w for being one of my go-tos for anything Power BI.

I had seen in the August Power BI release that there was a new PowerApps visual in preview, but it's Guy in a Cube that gave me the confidence to try it out.

In the September 2019 release it became fully available, and I have quite a lot of business logic that I could get off the ground and into my projects using PowerApps.

My initial challenge to resolve for the business:

I have worst served customers in my data, flagged in two places:

SELECT [Top 20 Worst Served] FROM [fact].[Fact]

  • Which is a 1 or 0

SELECT [Top 20 Worst Served]  FROM [dim].[Customer]

  • Which is a true or false flag

I can then have reports for worst served customers filtered by the above

However, in some cases there may have been an investigation and it's been deemed that the customer shouldn't be in worst served. Note that this information isn't captured in the source data set.

They want to be able to immediately say this customer is fine and the results will then update accordingly.

Every time the data is refreshed, these data items get fully refreshed in case they are not worst served any more.

In the current Reports, Filters are set on the page for both the above columns in the dim and the fact

  • Where Dim Customer WorstServed = Yes
  • Where Fact WorstServed = 1

Question 1

  • Do the users want to see an immediate change to the report?
  • Or are they happy for it to come through on the next refresh?

The users want to see the report metrics change immediately based on them adding in information

Quick Steps to adding the PowerApp into Power BI (from Guy in a Cube)

  • The tables that are to be updated need to be in Direct Query mode
  • The tables connected to these tables on the 1 side of the relationship should be in Dual mode (which acts as Import or Direct Query)
  • Design your reports in the Desktop
  • Once designed, DO NOT ADD THE POWER APP INTO THE PBIX FILE
  • Publish your report
  • In Power BI Service click on Edit report
  • Then in the Visualisations pane, if you don't have it already, go to the Marketplace and choose PowerApps
  • Add the PowerApp into your report
  • Choose App (or Create New App)
  • Add in the columns required to use the PowerApp
  • In PowerApps, give access to the people who need to be able to update the data

Things to Consider

  • The users want to see the change immediately rather than on the next refresh
  • My report pages are already created and I have many tables (All Imported) and relationships in the model.
  • Dual Storage Mode, I have never used this before. How many tables would need this change in the data source?
  • The PowerApp will be a new enhancement to the reports
  • The PowerApp hasn’t been built yet
  • I am concerned about adding the visual into Power BI Service in Edit mode, because the standard way to create reports is in Desktop. I have many more updates to do after the Power App has been added, and I don't know how adding visuals in the Service only will work with the ongoing amendments

Possible Data Solution 1

In the PowerApp, the user adds the CustID (and an amendment date is assigned automatically), and these get inserted into the table Staging Worst Served Amendments.

Then Dim Customer and the fact table are checked using the CustID and the flags are set to 'No' and 0 as above (this is clearly the most difficult bit because it's an update based on the CustID).

The next refresh will again change customer 153 back to Worst Served = Yes; however, an extra bit of script in the stored procedure that creates the table will check the Worst Served Amendments table and, if the customer is in there, reset the flag to No.

The above Staging table is only used for the overnight data refresh
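As a rough sketch of that extra step (assuming the stored procedure has access to both tables and that CustID exists on the fact as well as the dimension; names are taken from the examples above, and 'No' assumes the dimension flag is stored as Yes/No text):

-- Reset the worst served flags for any customer that has been added to the amendments table
UPDATE c
SET    c.[Top 20 Worst Served] = 'No'
FROM   [dim].[Customer] AS c
INNER JOIN [staging].[WorstServedAmendments] AS a
        ON a.[CustID] = c.[CustID];

UPDATE f
SET    f.[Top 20 Worst Served] = 0
FROM   [fact].[Fact] AS f
INNER JOIN [staging].[WorstServedAmendments] AS a
        ON a.[CustID] = f.[CustID];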

Changing Import to Direct Query

To develop against the above resolution, both the fact table and the customer table need to be changed to Direct Query. What do we lose if we do this?

All my DAX time intelligence functions against the fact table (Year to Date, This Time Last Month etc.) will be lost.

Therefore I really don't want to lose these DAX calculations by changing to Direct Query.

Also, I currently can't seem to change the storage mode from Import to Direct Query in Properties. Unless this is a bug, it seems you would have to start again with the tables and re-import them as Direct Query.

Is there another way that the new logic can be added without setting both these tables to Direct Query?

Possible Data Solution 2

Filters in Power BI

  • Where Dim Customer WorstServed = Yes
  • Where Fact WorstServed = 1
  • And Staging Worst Served Amendments CustID is NULL

Issues with the Above Solution

You can't have a filter for an empty CustID here, because that needs a left outer join rather than a relationship.

There may be other issues. What happens if the user accidentally adds in multiple CustIDs and the relationship changes to many-to-many?

Normally I would deal with this by merging the tables with a left join in Power Query Editor.

As a consequence I need to think about another solution to this problem.

Possible Solution 3

I have already checked that you can merge a Direct Query and an Imported table. See https://debbiesmspowerbiazureblog.home.blog/2019/09/19/what-happens-when-you-merge-a-direct-query-and-an-imported-table-in-a-power-bi-composite-model/

This means that we can left join the tables and filter out the non-null rows in the report.

Creating the Staging table for the Insert

Top tip: within the PowerApp your tables will need a primary key, or else the PowerApp will fail.

First I need an actual table in my Azure SQL database. In SQL Server Management Studio I created the table via SQL

CREATE TABLE [staging].[WorstServedAmendments](
	[CustID] [varchar](255) NOT NULL,
	[AmendmentDate] [datetime] NULL,
 CONSTRAINT [PK_WorstServedAmendments] PRIMARY KEY CLUSTERED 
(
	[CustID] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

Creating the Power App

I work with such a great team. Thanks to Liam Chapman and Happy Wired for giving me a heads up on PowerApps and writing the code for me

Log into https://powerapps.microsoft.com/en-us/

I need an update button, a delete button and a text box to add the customer number into.

It would also be good to have a little message to say that the button has been successful

These objects can be added via Controls.

  • Timer1 Relates to a hidden Timer on the page
  • txtUpdateMsg relates to a hidden text box that will appear on update
  • btnDelete relates to the Delete Button
  • btnUpdate relates to the Update Button
  • txtCustID (hidden in image) relates to the textbox where the user will add the customer number
  • lblForTxtBox relates to the description label

Data Source (Start Amending from here)

I've connected to my Azure SQL database and to my new worst served table.

The CustID is a varchar within the SQL Table

Update Button

Let's start with the basics: the Update button.

OnSelect

Patch('[staging].[WorstServedAmendments]', Defaults('[staging].[WorstServedAmendments]'), {CustID: txtCustID.Text, AmendmentDate: Now()});
UpdateContext({TimerStart: true});

Code

Patch – Modifies or creates one or more records in a data source, or merges records outside of a data source.

Defaults – Use Patch with the Defaults function to create records (here, inserting a new record into the staging table).

Value() – converts a text value to a number. Because CustID is a varchar in the SQL table, the text from txtCustID can be passed straight through; you would only need Value() if the key column were numeric.

UpdateContext – Creates a context variable to temporarily hold a piece of information. In the above case we are calling it TimerStart and setting it to true. We are basically starting the hidden timer on the screen when we click Update.

Timer1

For the Start property we are going to use the context variable TimerStart.

Hide the timer (Visible = false) because we don't need to see it.

On timer start, create a context variable called SuccessMessage and set it to true.

On timer end, reset the context variable SuccessMessage to false.

Also on timer end, set the context variable TimerStart back to false. TimerStart was set to true on Update, and now the timer turns it off again.
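Putting that together, the Timer1 properties look roughly like this (the Duration value is just an example; the variable names match the ones used above):

Start        = TimerStart
Duration     = 2000    // how long the success message stays on screen, in milliseconds (example value)
Visible      = false   // the timer itself stays hidden
OnTimerStart = UpdateContext({SuccessMessage: true})
OnTimerEnd   = UpdateContext({SuccessMessage: false, TimerStart: false})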

Delete Button

OnSelect

RemoveIf('[staging].[WorstServedAmendments]', CustID = txtCustID.Text);
UpdateContext({TimerStart: true});

Code

RemoveIf – Removes records that match the condition (here, the row whose CustID matches the text box).

UpdateContext – Creates a context variable to temporarily hold a piece of information. In the above case we are calling it TimerStart and setting it to true, starting the hidden timer on the screen when we click Delete, in the same way as when we update.

txtUpdateMsg

Visibility relates to the SuccessMessage context variable. It's visible on timer start (true) and disappears on timer end (false).

This is the message that appears while the timer is running.
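In property terms that is simply the following (the message text is just an example):

Visible = SuccessMessage
Text    = "Record updated"   // example message text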

This is a very basic Power App that will now allow the user to add and remove customers from this table. Save and then Publish the App so it can be used in Power BI

Add the table into Power BI

The new worst served table needs to be brought into Power BI as Direct Query, so any changes the user makes are reflected straight away in the reports.

Just to make sure that everything is OK, I'm going to add one customer into the table before it's brought in, just to have a record to work with.

In your Power BI Desktop file, Edit Queries and Choose recent sources if you have already connected to your data source.

Select the Worst Served Table

We can now have composite models where some tables are Import and others are Direct Query. The new table is added in as Direct Query.

Close and Apply

Note the message regarding potential risks when you are querying data at source and have other tables imported in memory

Next go into Edit Queries and choose Merge Queries.

And merge the customer table with the Direct Query table.

Click OK.

This connects the tables, so expand the merge and grab the CustID.

This will be added to your customer dimension

Note that so far, DimCustomer hasn't been automatically changed to Dual mode after being merged with the Direct Query table, so we don't need to worry about Dual mode in order to create our new logic.

Close and Apply

Back in the Desktop, go to the Marketplace and grab the PowerApps visual.

The how-to guide states not to add the Power App within your Desktop report, so publish the report and let's move to the Power BI Service.

Power BI Service, Edit report and Add in PowerApps Visual

On the report you want to update Click Edit Report

The Power App visual will now be available in Visualisations because it was added in the desktop file

Add the Power App in power BI service

In order to test out the new visual I'm adding in a blank page.

Click on the Power App visual and add it to the new blank page.

I want to add the CustID and the AmendmentDate from the new worst served staging table.

Then I can choose App rather than Create New, because I have already published an app.

I'm choosing my worst served app and clicking Add.

I've clicked Go to PowerApps Studio, which opens the PowerApp in the studio, and you also have the PowerApp in Power BI Service.

The very first issue I have is a formatting issue. My Power App is tiny and unreadable. Time to go back to PowerApps Studio and change the settings.

PowerApps studio

App Settings – changing the app to the default 16:9 screen size. For the time being I'm not going to think about making the app look good, I just want to be able to see the text on the Power BI page.

  • Save and publish the PowerApp
  • Go back to the Power BI service

Power BI service

I had to delete and re-add the PowerApp to get it to refresh.

It's also worth noting that if you don't create your visual at the right size before adding your app, the app will have scroll bars etc., and you can't change the size of the PowerApp, only the size of the visual that holds it.

The Power App doesn’t look great but it will do to test.

First of all we want to see if it works so add a table with the Worst served Data items

Add a CustID and click Update

It works. Now delete the item just added. Again it works. That is part one done. Part two is that we want it to update worst served customers from the customer table.

How do the updates affect the pbix file?

Click Save in Power BI Service and go back to the Power BI Desktop file.

The new visuals aren’t there. This is as expected, because they were not created in Desktop.

Imagine that you have created your Power BI App Visual and you still have a list of updates, changes and bug fixes that you want to do within Power BI Desktop

If you update and republish, you lose the PowerApp visual.

Currently this option is only viable if the PowerApp is the very last thing you do and you don't need to update the pbix file at all.

As a consequence I don’t think that Editing your reports within Power BI Service is a good idea.

Having chatted to a few people on the forums, editing reports in the Service is normally done when you need something quickly and you intend to go back and update the pbix file with the amendment.

What happens when you add the PowerApp in Desktop

In Desktop, add the PowerApp and the table to check it's working. Then publish to the Service.

Note the change in how it works: if you Update or Delete, the table doesn't change until you click Refresh, whereas if you add the visual in the Service you don't need to click Refresh.

For the time being I’m going to accept this as it’s the only way to move forward and keep working with the pbix file.

I have created an idea on the Power BI forum to deal with this issue: https://community.powerbi.com/t5/Issues/PowerApp-with-DirectQuery-Mode-Need-to-be-able-to-have-correct/idi-p/791690#M48425

Omitting Worst Served Customers using the new Filter

I have customers 1000000 and 1000001 in my Direct Query table, and they have both been removed from the report.

Great, I now have a Power App that can be used to omit records from a report straight away, without waiting for a refresh.

There are some issues to be ironed out but this really opens up what you can do with Power BI
