I love the show Taskmaster, and I really want to see how much other people love Taskmaster and when people are tweeting about it. I want to see everything happening in real time in my Power BI report.
For this example, I am going to use a streaming dataset. The data will be pushed into the Power BI service, but Power BI will only store the data in a temporary cache, which expires; we can give it a window of time to look at. There is no underlying database with a streaming dataset, so I won't be able to use my usual report visuals. I will need to use the streaming visuals available in Power BI.
There are other ways to do real-time data, such as push datasets and PubNub streaming datasets, but for this example I want to concentrate on the streaming dataset.
Click on Create in the right-hand corner of the screen
Then click on Streaming dataset
Choose API to create an API endpoint
API – “a set of functions and procedures that allow the creation of applications which access the features or data of an operating system, application, or other service.”
Our streaming dataset will contain these values. Extra time dimension fields have been created to use drill-down features. The outcome of the sentiment analysis is placed in the field 'score', which runs from 0 (negative) to 1 (positive).
Set 'Historic data analysis' to 'on' if you want to save the data for analysis later.
Power BI creates a JSON-like format (a REST web service in the background).
JSON – an open-standard file format that uses human-readable text to transmit data objects consisting of attribute–value pairs and array data types.
REST – ‘Representational State Transfer’. It is primarily used to build Web services that are lightweight, maintainable, and scalable. A service based on REST is called a RESTful service. REST is not dependent on any protocol, but almost every RESTful service uses HTTP as its underlying protocol.
Once you click Done, you can see that we have the new dataset in our datasets list.
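The API option gives us a push URL for the dataset. As a rough sketch of what the Flow will be doing for us behind the scenes, here is how a row could be pushed to that endpoint from Python. The push URL is a placeholder (use the one Power BI shows for your dataset), and the field names and values are my own illustration, not the exact schema above.

```python
import json
import urllib.request

# Placeholder push URL - Power BI shows the real one on the dataset's API info page
PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"

def build_rows_payload(tweet_text, location, score, created_at):
    """Build the JSON body Power BI expects: a 'rows' array of objects
    whose keys match the fields defined on the streaming dataset."""
    return json.dumps({
        "rows": [{
            "TweetText": tweet_text,
            "Location": location,
            "Score": score,
            "CreatedAt": created_at,
        }]
    })

def push_row(payload):
    """POST one batch of rows to the streaming dataset endpoint."""
    req = urllib.request.Request(
        PUSH_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

payload = build_rows_payload("Loving #Taskmaster tonight!", "London", 0.92,
                             "2019-05-01T20:00:00Z")
```

Calling `push_row(payload)` against your real push URL is what adds a row to the temporary cache.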
Microsoft Flow / Azure Logic App
The next step is to create a Microsoft Flow (you can use an Azure Logic App if you prefer).
Microsoft Flow is a cloud-based software tool that allows employees to create and automate workflows across multiple applications and services without the need for developer help.
When you are in the Flow you create
New Steps to build up the logic required for your data. The Flow will create
the Twitter feed for your Reports.
For the first step I have simply added #Taskmaster to the 'When a new tweet is posted' trigger. You will also need to log into your Twitter account at this point to get access to tweets.
Click on New Step and add an action
The next step is to add some sentiment analysis. I have used Text Analytics to do this. I searched for 'sentiment' and can then click on Text Analytics – Detect Sentiment.
I have already logged into my Azure account and used the Key, Name and Endpoint details from Azure.
You may need to set up your own
Cognitive Service in Azure before you do this section.
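Under the hood, the Detect Sentiment action posts a batch of documents to the Text Analytics REST endpoint using your key. As a rough sketch (the endpoint host and key are placeholders, and the v2-style request body shown here is my assumption about what the connector sends):

```python
import json

# Placeholder values - use the Key and Endpoint from your own Cognitive Service
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
KEY = "<your-text-analytics-key>"

# The key travels in this header on every Cognitive Services request
headers = {"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"}

def build_sentiment_request(tweets):
    """Build the document batch the sentiment API expects: each document
    needs an id, a language, and the text to score (0 negative .. 1 positive)."""
    return json.dumps({
        "documents": [
            {"id": str(i), "language": "en", "text": text}
            for i, text in enumerate(tweets, start=1)
        ]
    })

body = build_sentiment_request([
    "Loving #Taskmaster tonight!",
    "Worst episode ever #Taskmaster",
])
```

POSTing `body` with those headers to the endpoint returns a score per document id; in the Flow, that score is what lands in the 'score' field.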
The next step is to search for Power BI. Select Power BI – Add rows to a dataset.
Next, add all the information about
the RealTimeData streaming dataset
Most of the details can be added from the Dynamic Content picker. As you can see, Score comes from the Detect Sentiment group. All the Twitter data comes from the very first step, the 'When a new tweet is posted' group.
The following are the expressions
you need to add for the Time fields
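As a rough illustration of what those expressions compute, here is the same derivation sketched in Python. The field names (Year, Month, Day, Hour) and the timestamp format are my assumptions; in the Flow itself this is done with formatDateTime expressions over the tweet's created-at value.

```python
from datetime import datetime, timezone

def time_dimension_fields(created_at):
    """Derive the extra time dimension fields from a tweet's ISO-8601
    created-at timestamp, mirroring what the Flow's formatDateTime
    expressions produce for the streaming dataset."""
    ts = datetime.strptime(created_at, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    return {
        "Year": ts.strftime("%Y"),
        "Month": ts.strftime("%m"),
        "Day": ts.strftime("%d"),
        "Hour": ts.strftime("%H"),
    }

fields = time_dimension_fields("2019-05-01T20:15:00Z")
```

Splitting the timestamp out like this is what makes the drill-down from year to hour possible in the report.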
Save your Workflow
Testing Service in Microsoft Flow
The real-time analysis will be triggered by the hashtags in new tweets. This service only allows you to get current tweets; no historical tweets can be generated at this point. The visualisation tiles will only be kept for an hour, so this is the kind of service you need to monitor fast-paced information.
In the workflow you can save, then click Test and create a tweet containing one of the hashtags. The service will run and you should get a pass or fail.
We can now use this information to create a sentiment report
Back to Power BI Service
We already found the dataset in the relevant workspace, e.g. My Workspace. For this example, click on Create report against the dataset and add a table tile to the report pane. (The data may be refreshing at this point.) Leave it like this for the time being. Save your report and pin the table visual to a dashboard by clicking the pin.
I created the table simply to create a report and then a dashboard. Once at dashboard level, Custom Streaming Data tiles can be added. I am deleting the table once the dashboard is created, for simplicity.
Click on +Add Tile
Then choose the Custom Streaming Data tile. At present there aren't many visualisations to choose from for this type of data.
The #Taskmaster dataset has been chosen
I am going to create a very simple bar chart to look at the last hour's data and see what the sentiment score is, based on location.
Now I can see the sentiment score by Location.
Azure has provided the Cognitive Text services to analyse
the data for a score.
It would be useful to understand what tweets are driving the sentiments by location.
I would also prefer to group on positive, neutral and negative scores.
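Grouping the score into bands could be as simple as a banding rule over the 0-to-1 score; a quick sketch (the 0.45/0.55 cut-offs are my own choice, not anything Azure prescribes):

```python
def sentiment_band(score):
    """Band a 0-1 Text Analytics score into Negative / Neutral / Positive.
    Scores near 0 are negative and near 1 are positive; the cut-offs
    are arbitrary and can be tuned."""
    if score < 0.45:
        return "Negative"
    if score <= 0.55:
        return "Neutral"
    return "Positive"

print(sentiment_band(0.92))  # Positive
```

The same rule could be applied in the Flow before the row is pushed, so the streaming dataset carries a ready-made description field.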
In my next post I am going to look at how to create a sentiment description and add Description, Name and Count into the Flow and streaming dataset.
The PowerApp will be a new enhancement to the reports
The PowerApp hasn’t been built yet
I am concerned about adding the visual into Power BI Service in Edit mode, because the standard way to create reports is in Desktop. I have many more updates to do after the PowerApp has been added, and I don't know how adding visuals in the Service only will work with the ongoing App amendments.
Possible Data Solution 1
In the PowerApp, the user adds the CustID (and an automatic date is assigned); these get inserted into the table Staging Worst Served Amendments. Dim Customer and the fact table are then checked using the CustID, and the items are set to 'No' and 0 as above (this is clearly the most difficult bit, because it's an update based on the CustID).
The next refresh will again change 153 to Worst Served = Yes; however, an extra bit of script in the stored procedure that creates the table will check the Worst Served Amendments table and, if the customer is there, reset the flag to No.
The above Staging table is only used for the overnight data refresh
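The reset rule in the stored procedure amounts to: after the refresh rebuilds the table, any CustID present in the amendments table gets its flag forced back to 'No'. A small Python sketch of that rule (table and column names are illustrative):

```python
def apply_amendments(customers, amendments):
    """After the overnight refresh rebuilds the customer table, force the
    Worst Served flag back to 'No' for any CustID found in the staging
    amendments table."""
    amended_ids = {a["CustID"] for a in amendments}
    for row in customers:
        if row["CustID"] in amended_ids:
            row["WorstServed"] = "No"
    return customers

customers = [
    {"CustID": 153, "WorstServed": "Yes"},  # amended by the user, should reset
    {"CustID": 200, "WorstServed": "Yes"},  # untouched, should stay Yes
]
amendments = [{"CustID": 153}]
apply_amendments(customers, amendments)
```

In SQL this would be a single UPDATE joining on CustID, run at the end of the procedure that rebuilds the table.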
Changing Import to Direct Query
To develop against the above resolution, both the fact table and the property table need to be changed to DirectQuery. What do we lose if we do this?
All my DAX functions for time will be lost against my fact table (Year to Date, This Time Last Month, etc.). Therefore I really don't want to lose these DAX queries by changing to DirectQuery. Also, I currently can't seem to change the setting from Import to DirectQuery in Properties. Unless this is a bug, it seems you would have to start again with the tables and re-import them as DirectQuery.
Is there another way that the new logic can be added without setting both these tables to Direct Query?
Possible Data Solution 2
Filters in Power BI
Where Dim Customer WorstServed = Yes
Where Fact WorstServed = 1
And Staging Worst Served Amendments CustID is NULL
Issues with the Above Solution
You can't have a filter for an empty customer ID, because this is a left outer join. There may be other issues too: what happens if the user accidentally adds in multiple CustIDs and the relationship changes to many-to-many? Normally I would deal with this by merging the tables with a left join in Power Query Editor. As a consequence, I need to think about another solution to this problem.
Patch – modifies or creates one or more records in a data source, or merges records outside of a data source.
Defaults – use Patch with the Defaults function to create records (insert a new record into a table).
Value() – converts a text value to a number.
UpdateContext – creates a context variable to temporarily hold a piece of information. In the above case we are calling it TimerStart and setting it to true. We are basically starting the timer when we click Update, because we have a timer hidden on the screen.
On Start we are going to use the Context Variable TimerStart
Hide the Timer because we don’t need to see it
Create a context variable called SuccessMessage and
set to true
Create the context variable SuccessMessage and reset it to False
Create another context variable called TimerStart and set it to false. TimerStart was started on Update, and now on Timer End it's being turned off again.
UpdateContext – creates a context variable to temporarily hold a piece of information. In the above case we are calling it TimerStart and setting it to true. We are basically starting the timer when we click Delete, because we have a timer hidden on the screen, in the same way as when we update.
Visibility relates to the SuccessMessage context variable: it's visible on timer start (true) and disappears on timer end (false).
What appears when the timer is ON
This is a very basic Power App that will now allow the user to add and remove customers from this table. Save and then Publish the App so it can be used in Power BI
Add the table into Power BI
The new Worst Served table needs to be brought into Power BI as DirectQuery, so any changes the user makes will be reflected straight away in the reports. Just to make sure that everything is OK, I'm going to add one customer into the table before it is added, just to have a record to work with.
In your Power BI Desktop file, Edit Queries and Choose
recent sources if you have already connected to your data source.
Select the Worst Served Table
We can now have composite models where some tables are
import and others are direct query. The
new table is added in as a direct query
Close and Apply
Note the message regarding potential risks when you are
querying data at source and have other tables imported in memory
Next, go into Edit Queries and Merge Queries, and merge the customer table with the DirectQuery table. This connects the tables, so grab the Customer ID. This will be added to your customer dimension.
Note that so far, DimCustomer hasn't been automatically changed to Dual mode after being merged with the DirectQuery table, so we don't need to worry about Dual mode in order to create our new logic.
Close and Apply
Back in the Desktop, go to the Marketplace and grab the PowerApps visual. The how-to guide states not to add the PowerApp within your Desktop report, so publish the report and let's move to the Power BI Service.
Power BI Service, Edit report and Add in PowerApps Visual
On the report you want to update Click Edit Report
The Power App visual will now be available in Visualisations because it was added in the desktop file
Add the PowerApp in Power BI Service
In order to test out the new service, I'm adding a blank page. Click on the PowerApps visual and add it to the new blank page. I want to add the CustID and the Date from the new Worst Served staging table.
Then I can choose App, rather than Create new, because I have already published an app. I'm choosing my Worst Served app and clicking Add. I've clicked Go to PowerApps Studio, which opens the PowerApp in Studio, and you also have the PowerApp in Power BI Service.
The very first issue I have is a formatting one: my PowerApp is tiny and unreadable. Time to go back to PowerApps Studio and change the settings. App Settings – change the app to the default 16:9 size. For the time being I'm not going to think about making the app look good; I just want to be able to see the text on the Power BI page.
Save and publish the PowerApp
Go back to the Power BI service
Power BI service
I had to delete and re-add the PowerApp to get it to refresh. It's also worth noting that if you don't create your visual at the right size before adding your app, the app will have scroll bars etc., and you can't change the size of the PowerApp, only the size of the visual that holds it.
The Power App doesn’t look great but it will do to test.
First of all we want to see if it works, so add a table with the Worst Served data items.
Add a CustID and click Update
It works. Now delete the item just added. Again, it works. That is part one done. Part two is that we want it to update worst served customers from the customers table.
How do the updates affect the pbix file? Click Save in the Power BI Service and go back to Power BI Desktop. The new visuals aren't there. This is as expected, because they were not created in Desktop.
Imagine that you have created your Power BI App Visual and
you still have a list of updates, changes and bug fixes that you want to do within
Power BI Desktop
If you update and republish, you lose the PowerApp visual. Currently this option is only viable if the PowerApp is the very last thing you do and you don't need to update the pbix file at all.
As a consequence I don’t think that Editing your reports within Power BI Service is a good idea.
Having chatted to a few people on the forums, editing reports in the Service is normally done when you need something quickly and you intend to go back and update the pbix file with the amendment.
What happens when you add the PowerApp in Desktop
In Desktop, add the PowerApp and the table to check it's working. Then publish to the Service.
Note the change in how it works. If you Update or Delete, the table doesn’t change until you Click Refresh. If you add it in Service you don’t need to click refresh.
For the time being I’m going to accept this as it’s the only
way to move forward and keep working with the pbix file.
I have a Direct Query table in a composite model, all the other tables are Imported into Power BI
The whole reason for the Direct Query table is to use it
with a PowerApp. The PowerApp can be used to update the data in this table from
the PowerApp and the update should appear in the report straight away
However, I need to use it against an imported table, so the report will remove records that are in the DirectQuery table. Here is the problem:
I need to join the tables together to ensure that the customer
in the direct Query table is not only updated here, but pulls out the record
from the Customer Table
Joining doesn't work because it needs to be a left outer join: you can't filter for a NULL value in the staging table if you join the tables. Therefore, one of the best ways of dealing with this would be to merge the tables in Power Query Editor using a left outer join.
This blog post looks at what happens when you merge a DirectQuery table and an imported table.
Only one way to find out. Go into Edit Queries Mode
First of all, I click
on the main Customer table, Then merge Queries
Straight away, I notice that the record I have just added isn't in the DirectQuery table within Merge Queries. Now add in the customer number of the DirectQuery table and there it is, with the latest Customer ID linked to the imported dataset. The CustomerID is back, and we also have our null values because a left join has been used.
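What the merge is doing is a plain left outer join, which is why unmatched customers come back with nulls: those are exactly the rows we can keep with a filter. A sketch in Python with made-up rows (column names are illustrative):

```python
def left_outer_join(customers, staging):
    """Left outer join DimCustomer onto the staging amendments table on
    CustID; customers with no amendment row get None (null) for the
    staging column, just like the Power Query merge."""
    staged_ids = {row["CustID"] for row in staging}
    joined = []
    for cust in customers:
        joined.append({
            "CustID": cust["CustID"],
            "StagingCustID": cust["CustID"] if cust["CustID"] in staged_ids else None,
        })
    return joined

rows = left_outer_join(
    [{"CustID": 153}, {"CustID": 200}],  # DimCustomer
    [{"CustID": 153}],                   # staging amendments (from the PowerApp)
)

# Filtering for StagingCustID == null keeps only customers with no amendment
kept = [r for r in rows if r["StagingCustID"] is None]
```

Customer 153 matches the staging row, so it is excluded by the null filter; customer 200 survives, which is the behaviour the report needs.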
Could this work? Back in my test page (after closing and applying), it certainly seems to. Adding the filter removes this customer from the table.
If I remove another using the PowerApp, will it work?
I have to refresh in the desktop to see what happens.
Well, this is good news. You can have a DirectQuery table working with a PowerApp and merge it with another table, so you can manipulate the data without it being in DirectQuery mode.
One of the great new features is you can now hover over a
table and see the Storage mode and last refresh date.
All the tables in the current report are set as Import: the data is imported into Power BI memory and we have full functionality using the VertiPaq engine.
Dual mode can be Import OR DirectQuery, but the functionality for both is at DirectQuery level. You will lose some functionality if you move to Dual mode; for example, time-period DAX can't be used in this mode.
Once you change to Dual mode it's irreversible, so it's best to check on a test copy before you go for it.
Why Use Dual Storage?
Firstly, the new composite storage mode allows you to set different types of storage per table; you don't have to have the entire model set as Import or DirectQuery. With Dual storage set, Power BI chooses which way to go, although you lose the full functionality of Import mode if a table is set to Dual storage.
Dual tables are only ever on the one side of a one-to-many relationship.
For example, if you are taking advantage of aggregations and you run a query against the higher-level aggregated fact table (Import) with Date (Dual) and Product (Dual), Date and Product are treated as Import, so the join is done in memory. If you then run a query that the aggregated table can't answer, so the main fact table at the lowest level of granularity is used, the Dual tables run as DirectQuery and the joins between the fact and the dimensions are done at source (in SQL). This is much more viable than trying to join data in a SQL database against data in the in-memory VertiPaq engine.
Personally, I'm still unsure of DirectQuery and Dual mode because you lose so much functionality. The time functionality is always my first go-to when I create any report, and any measures and calculated columns that are too complex for SQL aren't included; for me, these are the most powerful features.
For this example, add a table in DirectQuery mode from SQL Server. Here I'm using my AdventureWorks DB from my local database. Add another file (any Excel file you have); DirectQuery is not available for Excel. Before composite models were available, you couldn't add Excel if your other table was DirectQuery: everything had to use the same storage mode.
Close and Apply
Now that the composite model is available, you can mix storage modes. However, you need to be aware that with DirectQuery the actual querying happens at the source (in SQL), rather than all the querying being done within Power BI.
You can hover over the tables to see what table storage is
set for each table
Note that we can see the data for Excel, which can only be imported data. However, we can't see the data in the Data pane for the DirectQuery fact table.
We can go to the Model view and connect the tables together. To show this, I'm going to add DimDate from the SQL database as an imported table (remember the fact table is DirectQuery).
A composite model is one where tables have different storage types. There are now three storage types:
Import – the best way to use Power BI, because everything is cached in memory.
DirectQuery – Power BI will query your data source, and your measures and calculated columns will be created within SQL if you are connecting to a SQL database.
Dual – can act as both Import and DirectQuery; we will look at this in more detail later.
Live connection is only available using Analysis Services as a data source; in this case you can only use SSAS and no other data source.
Audit Reports – a service within the Trust Portal used to determine Azure compliance with GDPR.
Compliance Manager – determines whether or not your services meet industry standards.
GDPR – standards enforced by a government agency.
Germany – country with a dedicated data trustee and a physically isolated instance of Azure.
Azure Government – only available in the US.
ISO – international standards from a non-regulatory body.
NIST – standards from a non-regulatory agency based in the United States: the National Institute of Standards and Technology.
Cloud Shell, CLI and Powershell
CLI – cross-platform command-line tool.
Azure Cloud Shell
Azure Cloud Shell – web-based tool available after you log onto the Azure portal. Use the CLI or PowerShell when you need to work with Azure without opening a web browser.
Locks – prevent users from deleting resources.
Advisor – uses information from the Security Center to recommend best practices.
Initiatives – define a set of policies.
Cloud Computing terms
Fault Tolerance – e.g. a power outage in a data center: automatic failover keeps the service in continual operation.
High Availability – Having data available when
you need it
Fault tolerance and high availability are both good for the scenario where you are moving on-premises data centers to the cloud: the data is mission-critical, there is a need for access to the data sources at all times, and changes are incremental and easy to predict.
Elasticity – handles sudden spikes in traffic.
Scalability – increase the number of VMs easily.
Multiple Locks applied to different scopes. The most
restrictive lock is applied
The lock applies to all resources contained in a
scope and any new resources added to the scope
NSG – Network Security Group: restricts inbound traffic to a virtual machine to specified IP addresses.
DDoS – Distributed Denial of Service protection: prevents a flood of HTTP traffic to a VM that hosts IIS.
Firewall – Create a rule that restricts network traffic
Limit access to resources at the resource group and resource levels.
Notifies if App service usage exceeds the usage quota