In a previous blog we looked at setting up a Pipeline.
However, we found an issue when you Deploy to Test or Production. Working through it gave us a much better understanding of the permissions you need in order to create and control the process.
The deployment hangs and doesn't continue on to creating a new Premium App Workspace. If you click refresh, a workspace gets created but it is non-Premium. In other cases nothing gets created at all.
This comes down to the person setting up the Pipeline. They may be an Admin in the initial Power BI App Workspace, but that doesn't necessarily mean they can go on to actually create the Premium workspaces.
In our example, the Power BI Administrator set up the Premium App Workspace and then assigned me as admin. I didn't set it up.
There are two ways of providing the capacity, especially when working against Dev and Test. The workspaces can be backed by Power BI Embedded in an A SKU,
or you can have a Premium capacity workspace (you must have this if you want the Production area).
Example using the A SKU
We are using Power BI Embedded created in Azure
I am able to create the Premium Test and Premium Production environments, but after testing with a colleague, the issue happened.
Let's have a look at what we have set up.
Power BI App Workspace
We are both Admin
Azure Power BI Embedded
In Azure, search for Power BI Embedded. We already have this set up.
Go to Access Control (IAM)
We have User 2 (Rebecca) set as Owner. We also tried this at contributor level but the issue still occurred.
Contributor – Lets you manage everything except access to resources
Owner – Lets you manage everything including access to resources
You need to go to Power BI Capacity Administrators. I was previously set as the capacity administrator on the Power BI Embedded capacity. Once Rebecca was added here, we were able to move successfully through the Power BI Pipeline steps without anything hanging.
Therefore, if you are a user in charge of setting up Power BI Pipelines, you must be a Power BI capacity administrator.
Still to test: do you have to be Owner or Contributor in order to use the Power BI Pipeline once it is set up?
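As a quick sanity check before setting a pipeline up, you can list the capacities your account can see through the Power BI REST API. This is a hedged sketch: the Get Capacities endpoint and its `admins` property reflect my reading of the API docs, token acquisition (e.g. via MSAL) is not shown, and the email address is made up.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def capacities_request(access_token):
    """Build the GET request for the Power BI 'Get Capacities' endpoint."""
    return urllib.request.Request(
        f"{API_BASE}/capacities",
        headers={"Authorization": f"Bearer {access_token}"},
    )

def admin_capacities(payload, user_email):
    """Names of capacities where user_email appears in the admins list."""
    return [
        c["displayName"]
        for c in payload.get("value", [])
        if user_email in c.get("admins", [])
    ]

# To run for real (requires a valid Azure AD token):
# with urllib.request.urlopen(capacities_request(token)) as resp:
#     print(admin_capacities(json.load(resp), "rebecca@contoso.com"))
```

If the pipeline creator's account comes back with an empty list here, that lines up with the hanging behaviour we saw above.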
Azure Power BI Premium capacity
Power BI Premium is not created as an Azure service.
Power BI Premium is managed by the Power BI administrator within the Power BI Service, under Settings and the Admin portal.
You don't actually have to be a Capacity Admin in Premium, but you do need capacity assignment privileges.
The Capacity or Power BI Service Admin can arrange this for you. You also need to be able to create new workspaces.
At this point, make sure you change the current value of the parameter to check they are all working. This report has been published up to the Premium workspace.
You don’t have to do this. You can use Rules within the Pipeline if you wish.
Start a Power BI Deployment pipeline
The Look and feel of the Deployment Pipeline is great
Create a Pipeline
Now we are in the actual main pipeline area. You can only assign one workspace to the pipeline. As we move through the pipeline, it automatically creates the other workspaces for us.
Our Pipeline testing workspace is going to be set up as the development area
Assign a Workspace
You don't need to start with Development. You can start with Test or Production, but in this case we are going straight to the Dev area. Also, you don't need to have all three environments. This workspace now gets assigned to the Development stage of the pipeline.
At this point you can see what will be part of the Dev stage. Show more shows you the content within. You can selectively choose the items that you want to deploy up to the next stage, but in this example all three of them are required.
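If you prefer scripting this setup, the same create-and-assign steps can be driven through the Pipelines operations of the Power BI REST API. Treat this as a sketch: the endpoint paths and body shapes are my understanding of the docs and worth verifying, and the names, IDs and token are placeholders.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def post_request(path, body, token):
    """Authenticated JSON POST against the Power BI REST API (not yet sent)."""
    return urllib.request.Request(
        f"{API_BASE}/{path}",
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

def create_pipeline(name, token):
    """Pipelines - Create Pipeline."""
    return post_request("pipelines", {"displayName": name}, token)

def assign_workspace(pipeline_id, stage_order, workspace_id, token):
    """Pipelines - Assign Workspace. Stage order: 0=Dev, 1=Test, 2=Prod."""
    return post_request(
        f"pipelines/{pipeline_id}/stages/{stage_order}/assignWorkspace",
        {"workspaceId": workspace_id}, token)

# To execute for real:
# urllib.request.urlopen(create_pipeline("Pipeline testing", token))
```

Assigning our existing workspace to stage 0 mirrors what we just did in the UI.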
Rename Pipeline Testing to Pipeline testing (Dev). Click on … and go to Workspace settings
Your report consumers and testers will want to use your reports and dashboards as apps. You can create Apps at every stage. To do this click on … and Publish app
Deploy to test
Clicking Deploy to test creates a copy in your Test area
It will copy your Reports, Datasets and dashboards into the test area. If you have dataflows you should note that these currently don’t get included
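The Deploy to test button also has a REST equivalent, useful if you later want to automate promotions. Another hedged sketch: the `deployAll` path and `sourceStageOrder` body are my reading of the Pipelines - Deploy All API (stage 0 = Development), so check the docs before relying on it.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def deploy_all_request(pipeline_id, source_stage_order, token):
    """Deploy everything from one stage to the next (0=Dev, 1=Test)."""
    return urllib.request.Request(
        f"{API_BASE}/pipelines/{pipeline_id}/deployAll",
        data=json.dumps({"sourceStageOrder": source_stage_order}).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Dev -> Test would be:
# urllib.request.urlopen(deploy_all_request("<pipeline id>", 0, token))
```

Note that, as with the button, dataflows would not come across.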
Rename to Pipeline testing (Test) if required
At this point we may want the Test area to be connected to a different data source than the Development environment. Because we set up parameters in the pbix file to switch between databases, we can use parameter rules. If you don't have parameters set up, you can create a data source rule instead.
At this point, go back to your new Test Premium workspace.
Against the dataset, click … and Settings.
I have changed the Parameter to the correct one
Now refresh the credentials
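Changing that parameter by hand in each workspace gets tedious, and the Datasets - Update Parameters In Group API can do the same job. A sketch under the usual caveats: the workspace and dataset IDs and the parameter name are invented, and for import datasets you still need to refresh afterwards.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_request(group_id, dataset_id, updates, token):
    """Set M parameters (e.g. {'Database': 'SalesTest'}) on a dataset."""
    body = {"updateDetails": [{"name": k, "newValue": v}
                              for k, v in updates.items()]}
    return urllib.request.Request(
        f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}"
        "/Default.UpdateParameters",
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# req = update_parameters_request("<test workspace id>", "<dataset id>",
#                                 {"Database": "SalesTest"}, token)
# urllib.request.urlopen(req)  # then refresh credentials / the dataset
```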
And repeat when you get to Production App Workspace
Deployment Settings (Back in the Pipeline)
Get to Deployment Settings by clicking on the lightning bolt.
Parameters have been set in the pbix files so these should be used in this example.
You can use rules (below) if you don't have parameters, but remember to check your dataset settings first.
Because the source is a database, the pipeline knows to ask for a server and a database. Make sure your Database is set up correctly first within Service.
Deploy to Production
Clean up and set up the same rules (ensuring that after deployment you check your dataset settings before setting up the rules).
Updates to Development Area
You can also just deploy specified items that you have worked on
For this test, go back to the desktop and add a couple of visuals. Then Publish into the development workspace.
Look to compare the changes
The comparison shows that two items have changed. Deploying into test will copy the new information across
You will also see icons for new and deleted.
Now we can see that Production is still using the previous datasets, reports and dashboards. We won't copy across until we are happy the changes are correct.
These are now three individual Workspaces with individual data sets and reports. You will need to set up scheduled refresh for each area.
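Since each stage now holds its own dataset, each one needs its own refresh. Alongside scheduled refresh, the Datasets - Refresh Dataset In Group API can trigger a refresh per workspace. A sketch with placeholder IDs; the `notifyOption` value is my best recollection of the docs.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_request(group_id, dataset_id, token):
    """Kick off an on-demand refresh of one dataset in one workspace."""
    return urllib.request.Request(
        f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes",
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# One call per stage, because Dev, Test and Prod each have their own dataset:
# for ws_id in (dev_ws, test_ws, prod_ws):
#     urllib.request.urlopen(refresh_request(ws_id, dataset_ids[ws_id], token))
```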
You can also publish downstream if required by clicking …, if the item is available.
This is not available for dataflows or Excel items.
The Production workspace must be in Premium. For Dev and Test you can use Power BI Embedded (an A SKU) to save money. (A SKUs can be set up within Azure as test environments, and they can be paused.)
The evening was incredibly well thought out, with 30-minute sessions on each of the Power Platform areas.
Power Virtual Agents
Our attendance was mostly based on the Power BI sessions that night. We wanted to focus on what to get excited about with Power BI and when to get excited about it. However there were also some great overviews of how to use all the applications together which helped us to understand the Power platform as a whole.
Power BI is split into key areas to drive data culture in your organisation.
And each of these areas contain some fantastic new updates.
Each area is going to be looked at in a lot more detail in blog posts to follow, but in the first instance, let's take a look at all the exciting updates.
All this comes from providing amazing data experiences to customers.
AI Infused Experiences
The very first AI visual for Power BI was the Key Influencer. Next came the Decomposition Tree and then the Q&A visual. All these visuals have proved big hits with report consumers who get the ability to understand all the factors that drive a metric, and can ask more and more questions against data in their own way.
Let's have a look at some of the updates. Even more exciting is the new visual coming for Smart Narratives.
Key Influencers Update
Key Influencers are fantastic and we have been using them from the moment they were added into Power BI as a preview.
We have used the visual across lots of projects. For example, social media analysis: what influences a negative tweet? Customer churn is another great use case for the Key Influencer.
Key Influencers now supports continuous analysis for numeric targets
Binning Support, Formatting options and Mobile Support
More visual updates go into preview, and the visual will now be usable with Live Connect.
Counts will go into preview.
All the Key Influencer improvements should be moving to GA (General Availability).
Power BI Decomposition Trees Update
The Key influencer allows you to analyse a category within your data and discover influences and segments. The Decomposition tree allows a report consumer to analyse a business metric however they want.
You will be able to conditionally format the visual very soon. Using the above visual, you might have the most engaged businesses in Nottingham, but conditional formatting could highlight the highest percentage of meeting cancellations. In other words, we can apply conditional formatting based on another metric.
You will also be able to drill through from the Decomposition Tree visual to more detailed data.
There is a reason why people love this visual and we cannot wait to start implementing these updates into our reports.
The Decomposition Tree will now be out of Preview and in General Availability
We can now include Q&A in reports as well as in dashboards, and there are some great new updates for this.
Add Terms within Q&A allows for better synonym matching, and Suggest Questions will allow you to tailor some ready-made questions for your users.
New Q&A Visual Updates (TBA)
Direct Query will be coming for Q&A Visuals.
New AI Visual – Smart Narratives
Available Later this year
We got a sneak peek of the new Smart Narratives visual and it looks so good.
Report authors will be able to add dynamic interactive narratives to reports and visuals. These narratives update when you slice and dice the data.
It automatically does trend analysis
The visual calculates the growth automatically, with no user input required.
You can also add dynamic values as part of the narrative and even use Q&A to create the value
This is one development we are really looking forward to.
Power BI End User Personalisation
Another development that is going to change things for report consumers in a really good way is personalisation
You may love a stacked area chart, but Julie in HR may hate them. Consumers can now click on a visual, go to Personalise and change the visual to suit their needs better. The change is saved specifically for that user (as a modified view with a personal bookmark), and it's easy to go back to the original visual.
This is currently in Preview so if you want to take advantage of it, make sure you go to Options and Settings > Options > Preview Features
PowerPoint for Data – Onboarding and Lessons Learned
Microsoft acknowledge that PowerPoint has really good onboarding features. Lots of people happily use PowerPoint, and they should have the same experience with Power BI.
All the following updates come from lessons learned with PowerPoint:
Lasso select of visuals and data points. This is great: finally you can lasso (drag a rectangle around) a number of visuals together in Desktop. You can even do this with data points.
Drop shadows. How do you make a great report look even nicer? Add shadows to your visuals. Another feature I can't wait to use.
Power BI Templates Experience
Report authors will get lots of help to create report pages with pre-made templates like PowerPoint layouts. Obviously Templates can already be created for Power BI but this will make everything much more intuitive and easy to use.
I’m a big fan of storyboarding in PowerPoint. I wonder if we will see this come into play in Power BI?
Modern Enterprise BI
Power BI is no longer just a business-led self-service tool. It can now be used right across your large-scale business enterprise. We can now use Power BI as an enterprise-scale analytics solution, bringing together all our insights to drive actions and improve performance.
There are lots of key points to consider within this Microsoft strategy area. For example:
Admin and Governance
Lineage and impact Analysis
The modern enterprise BI has the most impact when customers are using Power BI Premium capacity nodes. Let's have a look at some of these areas in a little more detail, and specifically understand what Power BI licence you need to make use of these new capabilities.
Power BI Lineage and Impact Analysis
Lineage and Impact Analysis went into Public Preview in October 2019. We are very much looking forward to looking at this in more detail very soon.
The real excitement is the ability to incorporate more Azure services into the lineage, which will make it much more essential when looking at how your data is structured.
Within the Power BI service, Change the view to Lineage View
You get little logos to show if your dataflows or data sets are promoted or certified.
Impact analysis is available from your dataset. Clicking Impact Analysis will allow you to assess the impact of a dataset change. How will your changes affect downstream reports and dashboards?
You can also see your visitors and views and even notify people about upcoming changes.
It appears to be available for Pro as well as Premium but as yet, we aren’t aware of any differences between the two.
This will be explored in much more detail in a post coming soon.
Enterprise Semantic Models
Another big game changer for Power BI Users
Again, we are moving away from creating your dataset within a Power BI pbix file where it is only available to that one user. Just like an Analysis Services tabular model, we can now create the model within Power BI and make it available for everyone to use, from business users and analysts to power users.
The enterprise semantic model comes with some great updates:
Shared and certified Datasets
When you certify a dataset in Power BI, you are stating that this dataset is a single version of the truth. When we connect to a certified dataset, the model may contain a large amount of data, and your specific reporting requirements may only need a few tables from the central model.
Power BI Premium Only
The XMLA endpoint allows third parties to connect just like you can with Analysis Services models. This is yet another game changer, as it allows organisations to create the one version of the truth using Power BI.
Previously, this could have been done using Analysis Services, either in the cloud or on premises: your own centralised tabular model. This data model could be connected to from various data visualisation and data management tools, e.g. SQL Server Management Studio, DAX Studio, the ALM Toolkit, etc.
Now, with the XMLA endpoint's open-platform connectivity, the datasets you create in Power BI will be usable from a variety of other data visualisation tools if your users don't want to use Power BI.
This is excellent for IT-led self-service. Your centralised Power BI team can create the dataflows and models, and business users can take those models and run with them. Obviously Power BI is fantastic, but you don't lose the users who absolutely want to stick with the visualisation tool that they know.
This is all about delivering a single version of the truth via the semantic data model.
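To make the XMLA connectivity concrete, the workspace connection follows a predictable shape that Analysis Services clients (SSMS, DAX Studio, the ALM Toolkit, ADOMD-based libraries) accept as a server name. A small sketch; the workspace and dataset names are examples.

```python
def xmla_connection_string(workspace_name, dataset_name):
    """Build the server/catalog pair an Analysis Services client expects.
    The powerbi:// URL is the workspace connection shown on the Premium tab."""
    return (
        f"Data Source=powerbi://api.powerbi.com/v1.0/myorg/{workspace_name};"
        f"Initial Catalog={dataset_name};"
    )

# Paste the Data Source part into SSMS / DAX Studio as the server name:
# xmla_connection_string("Pipeline testing (Test)", "SalesModel")
```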
Power BI Extensibility
Available later this year
This will enable external tool extensibility to unlock additional semantic modelling capabilities.
These external tools will all be able to get access to the Power BI tabular model (dataset) in the same way as they would an Analysis Services tabular model.
This is due out later this year, and as yet it's unclear whether this is just for Premium or if it will be available to Pro users too.
Translations (Available with Power BI Extensibility)
Translations allow you to create multi-cultural datasets. These metadata translations are a feature of the Analysis Services semantic model, and were previously locked away in the Analysis Services engine.
The extensibility model for Power BI will soon allow us to finally use translations within Power BI Desktop.
Clicking Tabular Editor allows you to connect to your Power BI dataset and use Analysis Services features. Translations are one of the major draws of Analysis Services Tabular.
This should be available later this year, and will be looked at in much more detail within future posts
Deploy to Workspace: incremental metadata-only deployment
This is a Premium-only feature. Imagine that you have implemented your translations and want to publish your new dataset.
There are no data changes, so you don't want the publish to involve the data. When you publish you will get impact analysis.
However, what you actually want is an incremental metadata-only deployment. So instead of simply publishing, go to Settings within the workspace in the Power BI Service.
Go to your Premium tab
And copy the workspace connection link. This workspace connection can be used just like an Analysis Services server name. You can use it with the ALM Toolkit (under extensibility) to look at comparisons and pick and choose what you want to update.
The Power BI tabular model has then been processed in the same way as you would an Analysis Services model. Thanks to these new external tools we can do so much more with Power BI datasets.
Composite Report Models
We have looked at the enterprise semantic model from the BI developer's point of view. Now it's time to look at what we can do for the data analyst.
Previously, there has been lots of talk about composite modelling
“Allows a report to have multiple data connections, including DirectQuery connections or import”
Composite models allow the developer to create an aggregated dataset, which reduces table sizes by having imported data at the aggregated level (so you get the full suite of DAX to work with) while still letting you drill down to granular data in DirectQuery mode.
Composite report models are basically composite reports as opposed to composite models. I got a little confused between the two as they are both called composites, but they are quite different.
As a data analyst you get data from a certified dataset. This is essentially a live connection, because you are connecting to a Power BI tabular model.
The analyst will now be able to combine data from multiple data sets and create relationships between them. Composite modelling can be mashed up with local data by the analyst. This will bring so much more power to the analyst.
It will be really interesting to see how this works over the next few months. Again, it's uncertain whether this will be available for Pro users, but we will be looking at this in much more detail soon.
Full Application Lifecycle Management
Public Preview May 2020
Power BI currently consists of the App Workspace (for collaboration) and Apps (for consumers). This gives you your development, test and production environments.
Deployment Pipelines are the next level of lifecycle management. If you use DevOps you have seen, and probably used, pipelines for other business requirements. For Premium capacity workspaces, pipelines can now be created to deploy to Development, Test and Production environments.
This is a fantastic new development for modern enterprise BI. Each Workspace can be compared within Service and allows you to be more agile and responsive to users needs. We are really excited about this one.
Drive a Data Culture with pervasive BI throughout your Organisation
Automatic Performance optimisation with Azure Synapse Analytics
This relates to the data stack. Microsoft are working on deep integration with Azure Synapse Analytics.
We will be looking at this in more detail later but there are big changes coming:
Materialised views to improve performance within the Synapse layer.
Usage-based optimisation against Synapse.
Common Data Service
This sits with the Action key point for driving data culture. This is another area that the Microsoft team were very excited about. As yet we are being cautious and want to do some more research around this topic.
You will now be able to DirectQuery the Common Data Service. The CDS ties in with Power Apps and seems to be used very much within that domain. It's worth noting again at this point that Power BI does not exist alone. It is part of the Power Platform.
Internal data is stored in CDS. External data is brought in via connectors; there are 350+ connectors that can be used for external data. However, data within the CDS is smart, secure and scalable.
We will be looking at CDS in much more detail in relation to Power BI
This is just a first high level look at some of the offerings from the Business Applications summit. There are so many great sessions to look at for more in depth details. It looks like an incredibly exciting time to be involved with Microsoft business Apps.
Taking the first part of Azure DevOps: Azure is Microsoft's cloud computing platform. It hosts hundreds of services in over 58 regions (e.g. North Europe, West US, UK South) and is available in over 140 countries.
As you can see, lots of Azure services have already been consumed throughout these blogs. Azure SQL Databases, Azure Data Lake gen2, Azure Blob Storage, Azure Data Factory, Azure Logic Apps, Cognitive Services etc.
Cloud services are split into Infrastructure as a Service, IaaS (VMs etc.), Platform as a Service, PaaS (see the services above) and Software as a Service, SaaS (Office 365, Dropbox, etc.).
You can save money by moving to this OpEx model (operational expenditure) from the CapEx model (capital expenditure), because you pay for what you need as you go, rather than having to spend money up front on your hardware, software, data centres etc.
Cloud services use economies of scale: Azure can do everything at a lower cost because it is operating at such a large scale, and these savings are passed on to customers.
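A toy calculation makes the OpEx-versus-CapEx point. All the figures below are made up for illustration, not real Azure prices.

```python
def opex_monthly_cost(hourly_rate, hours_used):
    """Pay-as-you-consume: you are billed only for the hours you actually use."""
    return hourly_rate * hours_used

def capex_monthly_cost(hardware_price, months_of_life):
    """Up-front purchase sized for peak demand, spread over its lifetime
    (ignoring power, cooling, patching and staff for simplicity)."""
    return hardware_price / months_of_life

# Hypothetical: a service used 200 hours a month at £1.50 per hour...
pay_as_you_go = opex_monthly_cost(1.50, 200)     # 300.0 per month
# ...versus £36,000 of peak-sized hardware written off over 36 months.
owned_hardware = capex_monthly_cost(36_000, 36)  # 1000.0 per month
```

On these invented numbers, paying only for consumed hours is far cheaper, and the gap widens once you include the on-premises running costs the sketch ignores.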
On Demand Provisioning
When there are suddenly more demands on your service you don’t have to buy in more hardware etc. You can simply provision extra resources very quickly
Scalability in Minutes
Once demand goes down you can easily scale down and reduce your costs, unlike on-premises, where you have to provision for maximum hardware requirements just in case.
Pay as you Consume
You only pay for what you use
You can focus on your business needs and not on the hardware specs (Networking, physical servers, patching etc)
Every unit of usage is managed and measurable.
What is DevOps?
A set of practices intended to reduce the time between committing a change to a system and the change being placed into normal production, also ensuring high quality
Testing, Reviews, Moving to production. This is the place where developers and the Operations team meet and work together
If we don't work within a DevOps framework, what do we do?
Developers will build their apps, etc and finally add it into Source Control
Source Control or Version Control allows you to track and manage code changes. Source Control Management Systems provide a history of development. They can help resolve conflicts when merging code from different sources
Once the code is in source control, the testing team can take it and create their own builds to do testing.
This will then be pushed back to the development team, going back and forth until everyone is happy. Here we can see that the environments we are using are Dev and Test.
Once Complete, it is released into Production.
This is a very siloed approach. Everyone is working separately, things can take time, and you will get bottlenecks.
With DevOps, everyone works together. You become a team and time to market becomes faster. Developers and Operations work as a single team.
Each stage uses specific tools from a variety of providers and here are a few examples
Code – Eclipse, Visual Studio, Team Foundation Services, Jira, Git
Build – Maven, Gradle, Apache Ant
Test – JUnit, Selenium
Release – Jenkins, Bamboo
Deploy – Puppet, Chef, Ansible, SaltStack
Monitor – New Relic, Sensu, Splunk, Nagios
We need all these tools to work together so that no manual intervention is required. It also means you can choose the ones that you have experience in.
Components of Azure DevOps
People with a Scrum and Project Management background will know how to create the features within the Boards. Epics, Stories, Tasks etc
Developers create and work on tasks. Bugs can be logged here by the testers
Push the development into Source Control to store your information. Check in your code within Azure Repos.
There are lots of repository types to choose from in Repos to suit your needs, like Git or TFS.
Developers build code, and that code needs to get into Repos. From there it is built within a Pipeline.
The code is then released into Dev, Test, Prod, QA etc., and from, say, the Test or Dev environments we can…
Azure Test Plans
Test, using Azure Test Plans. For example, if you have deployed a web service, you want to make sure it's behaving correctly. Once tested, the code will go back to the pipeline to be built and pushed to another environment.
Collect dependencies and put them into Azure Artifacts
What are dependencies?
Dependencies are logical relationships between activities or tasks, meaning that the completion of one task is reliant on another.
Work items are the artifacts used to track work on Azure Boards.
So you create work items here and interact with them on the board
Work with Epics, Features, Tasks, Bugs etc.
Includes support for Scrum (agile process framework for managing complex work with an emphasis on software development) and Kanban (a method for managing and improving work across human systems. Balances demands with capacity)
How do you prioritise your work items?
Say your sprint is 2 weeks (10 working days). What work can be accomplished within this sprint?
Overall picture of the particular sprint or release
We can use Git or Team Foundation Server (TFS). The example uses Git.
You create your own branch from the master branch, do your testing and changes, and push back from your branch to the master branch.
Where is your code? It's in Git.
Select the Git source, like Azure Repos Git or GitHub etc.
Get the code from the master branch
How do you want to build the project?
Choose from lots of templates: Azure Web App, ASP.NET, Maven, Ant, ASP.NET with containers, C# function, Python package, Android etc.
Next provide the Solution path and the Azure Subscription that you want to deploy to
This takes the source code from the GiT repository and builds the source code.
The build will then give you logs to show you how the build of the project went.
Next time when you check in code, it will automatically trigger the pipeline to build the code
Then the build needs to be Released via a Release Pipeline
This is where you release to the correct Azure Subscription and the code will be deployed. You can also add in approvals to ensure you get the pre approval required to release the code.
This is just a whistle-stop tour of DevOps. Test Plans and Artifacts haven't been discussed in much detail, but it gives you the basics of what is included in DevOps and how you can start to think about using it.
What do you create in Azure and can it be handled within DevOps?
Can we start using the Boards?
How do we get started with Azure DevOps?
Which teams members have the right interests in the specific DevOps areas?