Where to create your reports: Power BI Service or Power BI Desktop?

Since starting to use Power BI, we have been advocating the following ‘basic’ model:

However, you can also create reports within the Power BI Service. This is mentioned within the certification route for DA-100 Analysing Data with Power BI, which makes things a little more confusing, and I have been asked on many occasions: “If you can do everything in Service, what is the point of using Desktop?”

This post will hopefully answer that question.

Creating Reports in Service

First of all, let's have a look at how creating a report works in the Service.

Within the Adventureworks workspace, click on New and go to Report.

Now select a dataset to create a report.

You are shown the list of datasets available to you, so immediately it's clear what this route could be used for:

  • The report analyst: the DAX and data model have already been built, and the analyst can simply add visuals over the dataset.

Within the Service, a very basic report is created.

As the analyst, I realise that I want more DAX.

If you open the options against a table, you can't create DAX.

  • So if you create your reports in Service, you can't create DAX.
  • You also can't do anything with the data model if required.
  • And you can't go to the data pane like you can in Desktop.

The Data and Model views aren't available like they are within Desktop.

When you are dealing with a larger model, it's always useful to be able to flick across to the model area to get a good idea of what you are working with. Because of this, it's not recommended to work with reports in the Service if you use the data model as a guide whilst creating reports.

Saving your reports in Service

Continuing with the process, click Save.

The Adventureworks report is now added as content. Let's have a look at lineage by changing the view to Lineage.

The new report is now there, but this leaves some confusion: which reports are maintained within Desktop and which aren't? There is no way of knowing; both of the above reports look the same.

Within the Adventureworks report that was created in Desktop, you can go to Edit

and Download report (which is currently still in preview).

Obviously you would want to get the current file from the owner of that file, but this is one way of generating the pbix file that sits behind the report.

You can do the same process with the Service-created report, which increases complexity and possible confusion: how do we, as users, know which reports have a pbix file behind them and which do not?

The only answer is to go to the lineage view and find the contact to ask.

Editing a Report in Service

Before looking at editing a report, let's get a quick overview of roles in Power BI.

Viewer

The Viewer role isn't something we set in the workspace any more; allowing report readers to view content is best done within the App.

It seems the only reason to use Viewer is if you want to test what your report users can do within the workspace.

Viewers can only view content, set their own personal bookmarks, create comments and so on; they can't edit the content.

Contributor

The Contributor role is the developer role. Contributors can:

  • Publish reports into the workspace
  • Edit the content in the workspace if required, for example by building dashboards
  • Delete any content within the workspace
  • Access all the workspace objects, and Analyse in Excel, etc.

At Contributor level you can't publish to an app; only a Member or Admin can do this.

Member

Members can do all the actions of Contributors, but they can also publish and update apps. This essentially allows the Member to push content into the production area (bearing in mind that Premium Power BI pipelines allow you to fully manage your lifecycle with development, test and production workspaces).

Admin

The Admin has full access to the workspace. In addition to Member actions, they can add more admins and users, and delete the workspace.

Issues with Editing Content

For this example, I am a Contributor within someone else's app workspace.

Contributor allows you to edit, so without letting the owner of the report know, I add a new page with a visual and save.

In the meantime, the report creator decides to update and republish their report from the pbix file.

Once republished by the pbix owner, all my work is gone.

This is another reason to be very careful when editing reports in Service. If you intend to continue to update your pbix file, anything created in Service should be moved into the pbix before republishing, which relies on your business governance.

Final notes

The only dataset that can't be added to a pbix file is a real-time streaming dataset. Data is added to a temporary cache, so it expires, and you can only use streaming visuals for this type of streaming dataset.

Otherwise, with all other datasets, the recommendation is to create a file within Power BI Desktop.

This is a great visual to check if you are thinking of editing in the Service:

Even then, you need to think about usage and whether the report has a pbix file underneath it.

Personally, I find that creating and editing reports in the Service adds too much complexity; if you have Contributor access, you should have Power BI Desktop.

Let's have a final look at the types of Power BI user.

Your central BI dev team should be working with Power BI dataflows in the Service where possible, to separate the transformations in the Service away from the model and the DAX. They should be creating reports in Desktop.

Your power users will also be using Desktop, and may also be taking advantage of promoted dataflows.

Consumers should be accessing content from an App only.

The only use case for editing in the Service could be at analyst level, where the analyst is taking advantage of a promoted or certified dataset and building out from what is there (with the Contributor role).

At this point, it's recommended to come up with some governance: is this use case OK within your company, or should all reports have an underlying file created in Desktop?

The choice is at individual company level, but we prefer to ensure that Desktop is used and reports are not created or edited at Service level. That way we can put governance in place on the pbix files themselves and be in a better position to manage the full scope of Power BI.

Power BI Service Connected to an Azure SQL Server Dataset with User Credentials

We have an Azure SQL Database which has an admin username and password.

We have also added the following to the database in SQL Server Management Studio:

-- Create the user with a password, for users that authenticate at the database

CREATE USER Org_powerbi WITH PASSWORD = 'password1';

-- And create a role

CREATE ROLE db_Org_read;

-- Execute the stored procedure sp_addrolemember. Here the Org_powerbi user is added to the role created above

EXEC sp_addrolemember 'db_Org_read', 'Org_powerbi';

-- We have a schema called Org. We are granting the SELECT permission on this schema to the role above, which means its members can run SELECT statements

GRANT SELECT ON SCHEMA::Org TO db_Org_read;

So we have added a user and granted it SELECT permissions on the Org schema.
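Before moving to Power BI, it can be worth sanity-checking the new user from Management Studio. The sketch below is purely illustrative: Org.SomeTable is a placeholder for one of your own tables in the Org schema, and the impersonation test assumes you are running it as an admin on the database.

-- Impersonate the new user and confirm it can read from the Org schema
EXECUTE AS USER = 'Org_powerbi';
SELECT TOP (10) * FROM Org.SomeTable;  -- placeholder table name
REVERT;

-- Confirm the user really is a member of the role
SELECT r.name AS role_name, m.name AS member_name
FROM sys.database_role_members drm
JOIN sys.database_principals r ON drm.role_principal_id = r.principal_id
JOIN sys.database_principals m ON drm.member_principal_id = m.principal_id
WHERE r.name = 'db_Org_read';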

When we use this user ID and password in Desktop it works fine. We can refresh the data.

However, in the Service (Dataset – Scheduled refresh), the login credentials are reported as invalid.

Scheduled refresh

Authentication Method – Basic

Choose Basic if you plan to use SQL authentication instead of Windows authentication

Privacy level Setting for this Data Source – Organizational

An Organizational data source limits the visibility of a data source to a trusted group of people. An Organizational data source is isolated from all Public data sources, but is visible to other Organizational data sources.

The Solution

The user account you create needs to be added to the Data Source Credentials in the Power BI Service once the dataset has been uploaded.

Go to Settings

And again, Settings

And edit the data source credentials of the dataset

This will add the user to the data source credentials.

Once added, you can go back to Scheduled refresh on the dataset in your app workspace, and the login credentials should now work.

Power BI Service Data Lineage View

I was logging into Power BI this morning when I saw this exciting new feature.

We are always looking at new solutions to provide good data lineage, so this is well worth a look.

Data lineage includes the data origin, what happens to it and where it moves over time. Data lineage gives visibility while greatly simplifying the ability to trace errors back to the root cause in a data analytics process. 

Wikipedia

I have an app workspace set up for Adventureworks, so let's have a look at lineage using this project.

Column 1 is my data source. I can see I’m using a local database and I’m also using an xlsx spreadsheet to bring in data.

In most of my projects I'm working on the ETL in Data Factory, transforming data in stored procedures, etc. For example, for a social media feed, I have a Logic App that moves tweets into an Azure Table storage (NoSQL) table. Data Factory then transfers this data across into a central Azure Data Warehouse. The Power BI lineage would pick up at the data warehouse stage; it won't take into account that there is a lot of work before this.

Column 2 is the data set in Power BI

Column 3 provides Report information

Column 4 displays the Dashboards

You can click on a data flow node to drill down into more detail.

Currently you can't go any further to look at the data items.

Click on the link icon to see the data flow for that item, in this case the report.

This is a great start, but there definitely needs to be more information here to make it something that you would want to use as a proper data lineage tool:

  • It would be good to see the fields in each entity for the datasets
  • As an extra, it would be great to see what fields are being used in measures and calculated fields
  • Reports – for me, I'd like to know, for every page in my report:
    • What fields I am using from the data source
    • What calculated columns I have created (even better with the DAX logic)
    • Any name changes from data source to Power BI
    • What measures I have created (even better with the DAX logic)
  • For the dashboards, what items I am using (fields, measures, calculated columns)
  • An important part of data lineage is getting an understanding of the entire process. This includes data transformations before Power BI. If you can't do that in here, it would be great to be able to extract all the information out so you can use it alongside your other lineage information to provide the full story. For example:

Azure Data Catalog

Azure Data Catalog is a fully managed cloud service. Users can discover and consume data sources via the catalog, and it is a single, central place for the whole organisation to contribute to and understand all of its data sources.

https://eun-su1.azuredatacatalog.com/#/home

I have already registered our data catalog, and I have downloaded the desktop app.

As an example, I want to connect to Azure Table Storage (connecting using the Azure account name and access key).

At this point I'm registering everything in the storage table. Then I can view the information in the Azure portal.

You can add a friendly name, a description, an expert (in this case me), tags and management information.

I have added a data preview so you can view the data within the object. There is also documentation and column information to look at.

In the data catalog you can manually add lots of description to your tables, along with documentation.

This is great for providing lots of information about your data. You can explore databases and open the information in other formats (great if you need to supply information to another data lineage package).

I will be having a look at Azure Data Catalog in more detail later to see how it could help to provide full data lineage.

Azure Data Factory

Data Factory is the Azure ETL orchestration tool. Go into Monitoring for lineage information. However, there doesn't seem to be a way to export this information for use elsewhere, and Data Factory won't take into account the work done in, for example, a stored procedure.

Again, this is another area to look into further.

Stored Procedures

When you use stored procedures to transform your data, it's harder to provide automated lineage on your code. There are automated data lineage tools for SQL out there, but it would be great if there were a specific tool within Azure that creates data lineage information from your stored procedures.
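In the meantime, a rough starting point is SQL Server's own dependency views, which can list the objects and columns a stored procedure touches. The sketch below is illustrative only: Org.LoadSales and Org.DimCustomer are made-up object names, and this only captures static references, not dynamic SQL.

-- List the objects and columns referenced by a stored procedure
SELECT referenced_schema_name,
       referenced_entity_name,
       referenced_minor_name AS column_name
FROM sys.dm_sql_referenced_entities('Org.LoadSales', 'OBJECT');

-- And, going the other way, the objects that reference a given table
SELECT referencing_schema_name, referencing_entity_name
FROM sys.dm_sql_referencing_entities('Org.DimCustomer', 'OBJECT');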

Azure Logic Apps

Data for my project is collected via Logic Apps before being processed into an Azure Data Warehouse.

Essentially, we need our data lineage to capture everything in one place.

And, just as important, everything should be as automated as possible. If I quickly create a measure, the data lineage should reflect this with no manual input needed (unless you want to add some description to the new measure explaining why it was created).
