Microsoft Business Applications Summit 2020 and what it means for Power BI users

The Microsoft Business Applications Summit was held online this year on the 6th of May, and for a UK user that meant an entertaining evening of Power BI and business applications learning and updates.

https://www.microsoft.com/en-us/businessapplicationssummit

The evening was incredibly well thought out, with 30-minute sessions on each of the Power Platform areas:

  • Power BI
  • Power Apps
  • Power Automate
  • Power Virtual Agents

Our attendance that night was mostly focused on the Power BI sessions. We wanted to understand what to get excited about with Power BI, and when to get excited about it. However, there were also some great overviews of how to use all the applications together, which helped us to understand the Power Platform as a whole.

Power BI is split into key areas to drive data culture in your organisation

Each of these areas contains some fantastic new updates.

Each area will be looked at in a lot more detail in blog posts to follow, but in the first instance, let's take a look at all the exciting updates.

Amazing Data Experiences

There are now over 2 million Power BI Desktop users. 97% of Fortune 500 businesses use Power BI. It is a Leader in the Gartner 2020 Magic Quadrant https://www.informatica.com/gb/magic-quadrant-MDM.html and in the Forrester Wave https://info.microsoft.com/ww-landing-Forrester-Wave-Enterprise-BI-platforms-website.html?LCID=EN-US

All this comes from providing amazing data experiences to customers.

AI Infused Experiences

The very first AI visual for Power BI was Key Influencers. Next came the Decomposition Tree and then the Q&A visual. All these visuals have proved big hits with report consumers, who gain the ability to understand the factors that drive a metric and can ask more and more questions of the data in their own way.

Let's have a look at some of the updates, and, even more exciting, the new Smart Narratives visual that is coming.

Key Influencers Update

Key influencers are fantastic, and we have been using them since the moment they were added to Power BI in preview.

We have used the visual across lots of projects: for example, social media analysis (what influences a negative tweet?). Customer churn is another great use case for Key Influencers.

  • April 2020

Key Influencers now supports continuous analysis for numeric targets

  • May 2020

Binning Support, Formatting options and Mobile Support

  • June 2020

More visual updates go into preview, and the visual will now be usable with Live Connect

  • July 2020

Counts will go into preview

  • August 2020

All the Key Influencers improvements should move to GA (General Availability)

Power BI Decomposition Trees Update

Key Influencers lets you analyse a category within your data and discover its influencers and segments. The Decomposition Tree lets a report consumer analyse a business metric however they want.

  • May 2020

You will soon be able to conditionally format the visual. Using the visual above, the most engaged businesses might be in Nottingham, but conditional formatting on another metric could instead highlight, say, the highest percentage of meeting cancellations.

You will also be able to drill through from the Decomposition Tree visual to more detailed data.

There is a reason why people love this visual and we cannot wait to start implementing these updates into our reports.

  • June 2020

The Decomposition Tree will now be out of Preview and in General Availability

Q&A Visual

We can now include Q&A in reports, not just dashboards, and there are some great new updates here too.

  • April 2020

Add Terms within Q&A allows for better synonym matching, and Suggest Questions will let you tailor ready-made questions for your users.

  • May 2020

New Q&A Visual Updates (TBA)

  • September 2020

DirectQuery support will be coming for the Q&A visual.

New AI Visual – Smart Narratives

  • Available later this year

We got a sneak peek of the new Smart Narratives visual, and it looks very good.

Report authors will be able to add dynamic interactive narratives to reports and visuals. These narratives update when you slice and dice the data.

It automatically performs trend analysis, and the visual calculates growth with no user input required.

You can also add dynamic values as part of the narrative, and even use Q&A to create a value.

This is one development we are really looking forward to.

Power BI End User Personalisation

  • June 2020

Another development that is going to change things for report consumers, in a really good way, is personalisation.


You may love a stacked area chart, but Julie in HR may hate them. Consumers can now click on a visual, go to Personalise, and change the visual to better suit their needs. The change is saved specifically for that user (as a modified view with a personal bookmark), and it's easy to go back to the original visual.

This is currently in preview, so if you want to take advantage of it, make sure you go to Options and Settings > Options > Preview features and turn it on.

PowerPoint for Data – Onboarding and Lessons Learned

Microsoft acknowledge that PowerPoint has really good onboarding features. Lots of people happily use PowerPoint, and they should have the same experience with Power BI.

All the following updates come from lessons learned with PowerPoint:

  • April 2020

Lasso select of visuals and data points. This is great: finally you can lasso (drag a rectangle around) a number of visuals together in Desktop. You can even do this with data points.

  • May 2020

Drop shadows. How do you make a great report look even nicer? Add shadows. Another feature I can't wait to use.

Power BI Templates Experience

  • September 2020

Report authors will get lots of help creating report pages, with pre-made templates similar to PowerPoint layouts. Obviously, templates can already be created for Power BI, but this will make everything much more intuitive and easy to use.

I'm a big fan of storyboarding in PowerPoint. I wonder if we will see this come into play in Power BI?

Modern Enterprise BI

Power BI is no longer just a business-led self-service tool. It can now be used right across a large-scale enterprise, as an enterprise-scale analytics solution bringing together all our insights to drive actions and improve performance.

There are lots of key points to consider within this Microsoft strategy area. For example:

  • Admin and Governance
  • Lifecycle Management
  • Lineage and Impact Analysis

Modern enterprise BI has the most impact when customers are using Power BI Premium capacity nodes. Let's have a look at some of these areas in a little more detail, and specifically understand which Power BI licence you need in order to make use of these new capabilities.

Power BI Lineage and Impact Analysis

  • April 2020

Lineage and Impact Analysis went into public preview in October 2019. We are very much looking forward to exploring it in more detail very soon.

The real excitement is the ability to incorporate more Azure services into the lineage, which will make it much more essential when looking at how your data is structured.

Within the Power BI Service, change the view to Lineage View.

You get small icons showing whether your dataflows or datasets are promoted or certified.

Impact analysis is available from your dataset: clicking Impact Analysis lets you assess the impact of a dataset change. How will your changes affect downstream reports and dashboards?

You can also see your visitors and views and even notify people about upcoming changes.

It appears to be available for Pro as well as Premium, but as yet we aren't aware of any differences between the two.

This will be explored in much more detail in a post coming soon.

Enterprise Semantic Models

Another big game changer for Power BI users.

Again, we are moving away from creating your dataset within a Power BI .pbix file that is only available to a single user. Just as with an Analysis Services tabular model, we can now create a model in Power BI that is available for everyone to use, from business users and analysts to power users.

The enterprise semantic model comes with some great updates:

Shared and certified Datasets

  • April 2020

When you certify a dataset in Power BI, you are stating that this dataset is a single version of the truth. When you connect to a certified dataset, the model may contain a large amount of data, and your specific reporting requirements may only need a few tables from the central model.

XMLA Endpoint

  • May 2020

Power BI Premium Only

The XMLA endpoint allows third parties to connect just as they can with Analysis Services models. This is yet another game changer, as it allows organisations to create the one version of the truth using Power BI.

Previously, this could have been done using Analysis Services, either in the cloud or on premises: your own centralised tabular model that could be connected to from various data visualisation and data management tools, e.g. SQL Server Management Studio, DAX Studio, the ALM Toolkit, etc.

Now, with the XMLA endpoint's open-platform connectivity, the datasets you create in Power BI will be usable from a variety of other data visualisation tools if your users don't want to use Power BI.

This is excellent for IT-led self-service. Your centralised Power BI team can create the dataflows and models, and business users can take those models and run with them. Obviously Power BI is fantastic, but you don't lose out on users who absolutely want to stick with the visualisation tool they know.

This is all about delivering a single version of the truth through one semantic data model.
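To make this concrete, here is a minimal sketch of querying a Premium dataset over the XMLA endpoint from Python. It assumes the pyadomd package (a wrapper over the ADOMD.NET client libraries, so it needs a Windows machine with those libraries installed), and the workspace and dataset names are purely illustrative:

```python
from sys import path

# pyadomd wraps ADOMD.NET, so point Python at the installed client libraries.
# The version folder (150 here) varies by machine - adjust to suit.
path.append("\\Program Files\\Microsoft.NET\\ADOMD.NET\\150")

from pyadomd import Pyadomd

# The XMLA endpoint of a Premium workspace has the documented form:
# powerbi://api.powerbi.com/v1.0/myorg/<workspace name>
conn_str = (
    "Provider=MSOLAP;"
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Sales Workspace;"
    "Initial Catalog=Sales Dataset;"
)

# Run a DAX query against the published dataset, exactly as you would
# against an Analysis Services tabular model.
query = "EVALUATE TOPN(10, 'Customers')"

with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(query) as cur:
        for row in cur.fetchall():
            print(row)
```

The same powerbi:// address is what tools such as DAX Studio, Tabular Editor and the ALM Toolkit use to attach to the dataset.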

Power BI Extensibility

  • Available later this year

This will enable external-tool extensibility to unlock additional semantic modelling capabilities.

External tools (for example Tabular Editor, DAX Studio and the ALM Toolkit, mentioned below) will all be able to get access to the Power BI tabular model (dataset) in the same way as they would an Analysis Services tabular model.

This is due out later this year and, as yet, it is unclear whether it is just for Premium or will be available to Pro users too.

Translations (Available with Power BI Extensibility)

Translations allow you to create multicultural datasets. These metadata translations are a feature of the Analysis Services semantic model, previously locked away in the Analysis Services engine.

The extensibility model for Power BI will soon allow us to finally use translations within Power BI Desktop.

Clicking Tabular Editor allows you to connect to your Power BI dataset and use Analysis Services features, translations being one of the major draws of Analysis Services Tabular.

This should be available later this year, and will be looked at in much more detail in future posts.

Deploy to Workspace: Incremental Metadata-only Deployment

This is a Premium-only feature. Imagine that you have implemented your translations and want to publish your new dataset.

There are no data changes, so you don't want the publish to involve the data. When you publish, you will see the impact analysis.

However, what you actually want is an incremental, metadata-only deployment. So instead of simply publishing, go to the settings of the workspace in the Power BI Service.

Go to the Premium tab

and copy the workspace connection link (a URL of the form powerbi://api.powerbi.com/v1.0/myorg/<workspace name>). This workspace connection can be used just like an Analysis Services server name: you can use it with the ALM Toolkit (under Extensibility) to compare models and pick and choose what you want to update.

The Power BI tabular model can then be processed in the same way as an Analysis Services model. Thanks to these new external tools, we can do so much more with Power BI datasets.

Composite Report Models

  • September 2020

We have looked at the enterprise semantic model from the BI developer's point of view. Now it's time to look at what we can do for the data analyst.

Previously, there has been lots of talk about composite modelling:

“Allows a report to have multiple data connections, including DirectQuery connections or import”

Composite models allow the developer to create an aggregated dataset: you reduce table sizes by importing data at an aggregated level (so you get the full suite of DAX to work with), while still being able to drill down to granular data in DirectQuery mode.

Composite report models are basically composite reports, as opposed to composite models. I got a little confused between the two, as they are both called composites, but they are quite different.

As a data analyst, you get data from a certified dataset. This is essentially a live connection, because you are connecting to a Power BI tabular model.


These screen grabs are from the conference; we will be researching this with our own datasets in due course.

The analyst will now be able to combine data from multiple datasets and create relationships between them. The composite model can be mashed up with the analyst's own local data. This will bring so much more power to the analyst.

It will be really interesting to see how this works over the next few months. Again, it's uncertain whether this will be available for Pro users, but we will be looking at it in much more detail soon.

Full Application Lifecycle Management

  • Public Preview May 2020

Power BI currently consists of the app workspace (for collaboration) and apps (for consumers). This gives you your development, test and production environments.

Deployment pipelines are the next level of lifecycle management. If you use DevOps, you have seen and probably used pipelines for other business requirements. For Premium capacity workspaces, pipelines can now be created to deploy to development, test and production environments.

This is a fantastic new development for modern enterprise BI. Each workspace can be compared within the Service, allowing you to be more agile and responsive to users' needs. We are really excited about this one.

Drive a Data Culture with pervasive BI throughout your Organisation

Automatic Performance optimisation with Azure Synapse Analytics

This relates to the data stack: Microsoft are working on deep integration with Azure Synapse Analytics.

We will be looking at this in more detail later but there are big changes coming:

  • Materialised views to improve performance within the Synapse layer.
  • Usage-based optimisation against Synapse.

Common Data Service

This sits with the Action key point for driving data culture. It is another area the Microsoft team were very excited about; as yet, we are being cautious and want to do some more research around the topic.

You will now be able to DirectQuery the Common Data Service. The CDS ties in with Power Apps and seems to be used very much within that domain. It's worth noting again at this point that Power BI does not exist alone: it is part of the Power Platform.

Internal data is stored in the CDS; external data is brought in via connectors, and there are 350+ connectors that can be used for external data. Data within the CDS is smart, secure and scalable.

We will be looking at CDS in much more detail in relation to Power BI

This has been just a first, high-level look at some of the offerings from the Business Applications Summit. There are so many great sessions to watch for more in-depth detail. It looks like an incredibly exciting time to be involved with Microsoft business apps.

Use Data Lake Storage V2 as Dataflow Storage

This blog post follows on from https://debbiesmspowerbiazureblog.home.blog/2019/11/28/setting-up-an-azure-data-lake-v2-to-use-with-power-bi-data-flows-in-service-as-a-data-source/

Dataflows are essentially an online collection and storage tool. Power Query connects to data at source, then collects and transforms that data; the dataflow stores the resulting tables in the cloud. They are stored in a data lake, which is all automated for you.

Dataflows unify data from all your different sources. It should be noted that a data warehouse is still the recommended architecture, with dataflows over the top.

Dataflows also introduce the concepts of the Common Data Service (CDS) and the Common Data Model (CDM). The CDM allows organisations to use standard data formats to provide consistency across deployments. Now, Azure Data Lake Gen2 storage can be combined with dataflows to store them in a structured, centralised data source.

Thanks to https://docs.microsoft.com/en-us/common-data-model/use for helping me understand the differences between the two.

When you integrate the CDM with Data Lake Gen2, you get structural consistency, and you can use CDM folders in the lake that contain your schemas in standard CDM format.

Dataflows store their definitions and data in CDM folders, in model.json format. The presence of a model.json file shows that the folder adheres to the CDM.
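To make the model.json idea concrete, here is a minimal sketch (plain Python, standard json module only) of the kind of structure a CDM folder's model.json describes. The dataflow, entity, attribute and file names are invented for illustration:

```python
import json

# A minimal, illustrative model.json for a CDM folder:
# one entity, two attributes, one data partition.
model = {
    "name": "SalesDataflow",
    "version": "1.0",
    "entities": [
        {
            "$type": "LocalEntity",
            "name": "Customers",
            "attributes": [
                {"name": "CustomerId", "dataType": "string"},
                {"name": "Region", "dataType": "string"},
            ],
            # Each partition points at a data file inside the CDM folder.
            "partitions": [
                {
                    "name": "Part001",
                    "location": "https://mylake.dfs.core.windows.net/powerbi/SalesDataflow/Customers/part001.csv",
                }
            ],
        }
    ],
}

print(json.dumps(model, indent=2))
```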

Of course, this can be quite difficult when you are working with data that does not adhere to the CDM format. I can see it being really useful when you are starting from scratch, but having looked at my own data sources, they are always quite far from CDM format.

You can find information about the CDM here https://docs.microsoft.com/en-us/common-data-model/

And more information about CDS here

https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/data-platform-intro

Advantages to setting up your own Data Lake Gen2 Store

Essentially, you can get by using the internal data lake store, but there are lots of reasons (and some of these advantages will only arrive in future releases) why setting up your own store is a good thing.

  • Great for re-use: if you are given access to the workspace, you can use the dataflows already created for you.
  • Centralise your prepped data

Remember, this doesn't replace your data warehouse. It just adds a preparation and transformation layer above it. Instead of having to wait to get your small change added to the warehouse, you can add it to this layer.

  • Data within your data lake can be integrated into other solutions by developers
  • Data Lake Gen2 is enormously scalable
  • Dataflow data and definition files can be leveraged by developers for AI services

Steps to Set Azure Data Lake Gen2 as Dataflow Storage

  • The storage account must be created in the same Azure Active Directory tenant as the Power BI tenant.
  • The storage account must be created in the same region as your Power BI tenant (see below for how to determine where your Power BI tenant is located).
  • The storage account must have the hierarchical namespace feature enabled.
  • The Power BI service must be granted a Reader role on the storage account.
  • A filesystem named powerbi must be created.
  • Power BI services must be authorised to use the powerbi filesystem you create.

Once configured, it can't be changed: you can't go back to the default data store for your dataflows.
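If you prefer scripting the storage account to clicking through the portal, a rough sketch with the azure-mgmt-storage Python SDK might look like the following. The subscription ID, resource group and account names are placeholders, and the location must match your own Power BI tenant region:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Kind, Sku, StorageAccountCreateParameters

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

params = StorageAccountCreateParameters(
    sku=Sku(name="Standard_RAGRS"),  # RA-GRS replication, as recommended below
    kind=Kind.STORAGE_V2,            # general purpose v2
    location="northeurope",          # must match your Power BI tenant region
    is_hns_enabled=True,             # hierarchical namespace = Data Lake Gen2
)

poller = client.storage_accounts.begin_create(
    resource_group_name="my-rg", account_name="mydatalake", parameters=params
)
account = poller.result()
print(account.primary_endpoints.dfs)  # the dfs endpoint Power BI will use
```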

Set Storage Account in Same Region as your Power BI tenant

Log into the Power BI Service, then go to Help (?) and About Power BI.

The data is stored in North Europe (Ireland).

When you set up your data lake, ensure the North Europe region is selected.

Set up Azure Data Lake V2 in Azure Portal

Currently, when you go into Azure and look for Data Lake V2, you can only find Gen 1.

So the question is: how do you set up Gen2 in the Azure Portal? (We are currently on the 25th of November 2019; hopefully this will get easier in the future.)

First of all, I go to the subscription where I want to add the new Data Lake V2.

Open up the Portal menu (now hidden to the left of the screen).

Choose Create a resource

Next, choose Storage and then Storage account.

Note that the account kind is StorageV2 (general purpose v2).

Make sure the location is the same as your Power BI Service. I'm not using this functionality at the moment, but there is no harm in applying this logic.

It is recommended to set the replication setting to read-access geo-redundant storage (RA-GRS).

For the time being, I am leaving everything else as standard

Next go to Advanced

The most important setting here is Data Lake Storage Gen2: enable the hierarchical namespace, and your storage account will be created as Data Lake Storage V2.

Click Review and Create

Grant Reader Role to Power BI Service

This is all built in so it should be fairly straightforward.

In Azure, go to your new storage account (if you aren't already in it) and go to Add role assignment.

Once there, choose the Reader role and select Power BI Service, which, as you can see, is already in the list.

It takes about 30 minutes for this to take effect.
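If you would rather script the role assignment, here is a hedged sketch with recent versions of the azure-mgmt-authorization SDK. The Reader role definition GUID is the well-known built-in one; the principal ID is a placeholder for the Power BI Service enterprise application's object ID, which you can find in Azure AD as described further down:

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Scope the assignment to the storage account (names are placeholders).
scope = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/my-rg"
    "/providers/Microsoft.Storage/storageAccounts/mydatalake"
)

# Built-in "Reader" role definition (well-known GUID).
reader_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}/providers/Microsoft.Authorization"
    "/roleDefinitions/acdd72a7-3385-48ef-bd42-f606fba81ae7"
)

client.role_assignments.create(
    scope=scope,
    role_assignment_name=str(uuid.uuid4()),  # assignment IDs are new GUIDs
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=reader_id,
        principal_id="<power-bi-service-object-id>",  # placeholder from Azure AD
    ),
)
```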

Create a powerbi Filesystem

Power BI needs to use a filesystem specifically named powerbi, so dataflows can be stored in this specific filesystem.

We now have a few options available to us. I have some files to add, so I am going to add them to a container.

Click on Containers and then + File System.

Note that, to store dataflows, it must be named powerbi. Click OK.
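The same filesystem can also be created programmatically. A minimal sketch with the azure-storage-file-datalake Python SDK (the account name is a placeholder):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Connect to the Data Lake Gen2 (dfs) endpoint of the storage account.
service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Power BI expects a filesystem with exactly this name for dataflow storage.
service.create_file_system(file_system="powerbi")
```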

Clicking on your new storage account, you are told to download Azure Storage Explorer.

I already have Azure Storage Explorer downloaded. If you don't have it, it's something you will absolutely need in order to work with Azure storage accounts.

Once downloaded, open Azure Storage Explorer.

You will need to add your Azure storage accounts by clicking the little connector icon.

You will be asked to sign into your account with your Office 365 credentials and 2FA.

This will log you into all your subscriptions and services.

You are good to go

Here you find your subscription, then go to Data Lake Storage Gen2 and find the new powerbi filesystem.

Grant Power BI permissions to the file system

Before we connect, we need to grant permission for Power BI to use the filesystem. (Again, this is specific to using Data Lake V2 as a dataflow store, but at this point we may as well set up the permissions.)

Go to the Azure Portal and then Azure Active Directory.

Then select Enterprise applications.

Change the Application type dropdown to All Applications.

Power Query Online, Power BI Premium and Power BI Service are all in the list.

You will need the Object IDs of these applications.

Back to Azure Storage Explorer (Manage Access)

Navigate back to the powerbi filesystem, right-click and choose Manage Access.

Click Add and grab the object ID of Power BI Service to manage its access.

Set Read, Write and Execute access for Power BI Service, and repeat the process for Power BI Premium.

Repeat for Power Query Online, but set only Write and Execute access.

The Other group also needs setting up in the same dialog.
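The steps above use the Storage Explorer UI, but for reference here is a rough Python sketch of the same ACL assignment with the azure-storage-file-datalake SDK. The object IDs are placeholders for the enterprise applications found above, and note that setting a full ACL also has to include the base owning user/group/other entries:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Root directory of the powerbi filesystem.
root = service.get_file_system_client("powerbi").get_directory_client("/")

# Placeholder object IDs - use the ones from your own tenant's Azure AD.
POWER_BI_SERVICE = "11111111-1111-1111-1111-111111111111"
POWER_BI_PREMIUM = "22222222-2222-2222-2222-222222222222"
POWER_QUERY_ONLINE = "33333333-3333-3333-3333-333333333333"

# rwx for the two Power BI apps, -wx for Power Query Online, plus the
# mandatory base entries for the owning user, owning group and other.
acl = ",".join([
    "user::rwx",
    "group::r-x",
    "other::--x",
    f"user:{POWER_BI_SERVICE}:rwx",
    f"user:{POWER_BI_PREMIUM}:rwx",
    f"user:{POWER_QUERY_ONLINE}:-wx",
])
root.set_access_control(acl=acl)
```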

Connect the Data Lake Gen2 Storage Account to Power BI Dataflows

To do this, you need to be a Power BI admin. Go to the Power BI Service and navigate to the Admin portal.

From here, connect your Azure Data Lake Storage Gen2.

Add the subscription ID, resource group name and storage account name of your data lake.

It is now connected to Power BI

Allow Admins to Assign Workspaces

Finally, still in the Admin portal, go to Dataflow settings.

Switch "Allow workspace admins to assign workspaces to this storage account" to On.

Workspace admins can now assign their workspaces to the filesystem created.

Things to Consider

Here is where it starts to get a little bit hazy

  • This is all very much still in preview, and there will be lots of updates coming.
  • Once your dataflow storage location is configured it can't be changed, so don't do this on a whim.
  • You have to be an owner of the dataflow, or be authorised to the CDM folder in the data lake, to use the dataflow.
  • Once you have created a dataflow, you can't change its storage location.
  • It is your organisation's dataflow storage account, so there can only be one.

Because of this, and the fact that it's still in development, I am going to wait before setting up a central storage account for our dataflows.

I'm still unsure what you would do with dataflows that are already set up. Do they stay in the default area, or can you reprocess them into the central data lake?

What happens if you want to move the data lake to a new subscription? Is it not possible?

I will come back to this when I have a few more answers to these questions.

Setting up an Azure Data Lake V2 to use with Power BI dataflows in Service (as a data source)

Prior to the brand new Azure Data Lake, I was adding all my files into Blob Storage. However, Azure Data Lake V2 is built on Blob Storage and Data Lake V1.

It's built for big data, and a fundamental change is that we now have a hierarchical namespace, which organises your files into directories.

So now we can do things like using all the files from a specific directory, or deleting all the files from a specific directory. We can categorise our files within the data lake.
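As a small illustration of what the hierarchical namespace buys you, this Python sketch lists every file under one directory and then deletes the whole directory in a single call, neither of which flat blob storage offers directly (account, filesystem and directory names are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("myfilesystem")  # placeholder

# List everything under one directory (recursive by default).
for item in fs.get_paths(path="Workshop1Files"):
    print(item.name, "dir" if item.is_directory else "file")

# Delete the whole directory, and everything inside it, in one operation.
fs.get_directory_client("Workshop1Files").delete_directory()
```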

Set up Azure Data Lake V2 in Azure Portal

Currently, when you go into Azure and look for Data Lake V2, you can only find Gen 1.

So the question is: how do you set up Gen2 in the Azure Portal? (We are currently on the 25th of November 2019; hopefully this will get easier in the future.)

First of all, I go to the subscription where I want to add the new Data Lake V2.

Open up the Portal menu (now hidden to the left of the screen).

Choose Create a resource

Next, choose Storage and then Storage account.

Note that the account kind is StorageV2 (general purpose v2).

I've set the location to North Europe, simply because I know that's where our Power BI data in the Service is stored, and I may as well stick with this.

For the time being, I am leaving everything else as standard

Next go to Advanced

The most important setting here is Data Lake Storage Gen2: enable the hierarchical namespace, and your storage account will be created as Data Lake Storage V2.

Click Review and Create

Create a file System within a Container

We now have a few options available to us. I have some files to add, so I am going to add them to a container.

Click on Containers and then + File System.

Click OK

Clicking on your new storage account, you are told to download Azure Storage Explorer.

I already have Azure Storage Explorer downloaded. If you don't have it, it's something you will absolutely need in order to work with Azure storage accounts.

Once downloaded, open Azure Storage Explorer.

You will need to add your Azure storage accounts by clicking the little connector icon.

You will be asked to sign into your account with your Office 365 credentials and 2FA.

This will log you into all your subscriptions and services.

You are good to go

Here you find your subscription, then go to Data Lake Storage Gen2 and find the new filesystem.

I have added a folder called Workshop1Files to my filesystem.

Obviously, Data Lake Storage gives you many ways of working with files and automating their arrival in the storage area. In this case, I am simply going to move a file into my new folder to work with.

Double-click on the folder, then click Upload and Upload Files.

And now your file is in the cloud, in an Azure Data Lake, ready to use.
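If you would rather script the upload than drag files through Storage Explorer, a minimal sketch with the same Python SDK (names are placeholders) might be:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Target the folder and file name inside the filesystem (placeholders).
file_client = service.get_file_system_client("myfilesystem").get_file_client(
    "Workshop1Files/workshop1.csv"
)

# Upload the local file, overwriting any existing copy.
with open("workshop1.csv", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```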

Connect to your Azure File with Power BI Desktop

The first test: can we access this data from within Power BI Desktop?

Open Power BI Desktop and Get Data

Choose Azure Data Lake Storage Gen2 (currently in beta).

Add the URL

Data Lake Storage Gen2 URLs have the following pattern: https://<accountname>.dfs.core.windows.net/<filesystemname>/<subfolder>

If you right-click on the file in Storage Explorer and go to Properties, the URL there has a different structure:

https://<accountname>.blob.core.windows.net/<filesystemname>/<subfolder>

If you try to connect with the original (blob) URL from Storage Explorer, you get an error.

And if you change the URL from blob to dfs, it still fails.

There is a missing piece to the puzzle: go back to the Azure Data Lake storage account in Azure and add the Storage Blob Data Reader role to your account.

Then try again, and hopefully you are in.

There is no need to combine files, because we have specified a single file.

There are different ways you can load files. I loaded one file, but you can load all the files in the filesystem:

https://storageaccount.dfs.core.windows.net/filesystemname

or all files under a directory in the filesystem (you can include subdirectories in this):

https://storageaccount.dfs.core.windows.net/filesystemname/directoryname/directoryname

Connect to your Azure File with Power BI Data Flow

I am creating dataflows in the Power BI Service to ensure they can be reused across the company. The question is: can I connect to the above file in the Service via a dataflow?

In the Power BI Service, add a dataflow, which takes you into the Power Query editor in the Service. I already had some dataflows connected to an Azure database.

The data is in Azure Data Lake Storage, so the first thing I do is try the Azure route.

However, there is no Azure Data Lake Storage Gen2 connector; this must be something coming in the future. So then I go to File and click Get Data > Text/CSV.

You will need to add the file path and your credentials (as per the previous advice, use dfs not blob in the URL). This seems a little flaky at the moment: I choose Organisational Account first, before adding the URL, and then it seems to work.

Remember, go back to Azure Storage Explorer: if you click on Properties, you can grab the URL from there.

We don't need a gateway setting up, because everything is now in the cloud.

Clicking Next, nothing happens; it just keeps bouncing back to the same window.

Attempting to use the Blob Storage connector (using the Azure account key for authentication) also doesn't work, both with blob in the URL and with dfs in the URL.

It would appear that, currently, I have hit a brick wall: there is no Data Lake Gen2 connector for dataflows.

I will be keeping an eye on this because, obviously, if you are pushing the new generation of data lakes and dataflows, there needs to be a Gen2 connector for dataflows.

Update

I had a reply back on the Power BI forum (not a good one):

"The feature hasn't been planned. If there is any news, the document What's new and planned for Common Data Model and data integration will be updated."

I have found this in Ideas

https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/38930032-add-azure-data-lake-storage-gen2-as-a-data-sourc

Please help us get this working by voting for this idea.
