We have looked into using Direct Query against an OLTP database (Transactional) in https://debbiesmspowerbiazureblog.home.blog/2020/02/07/the-dos-and-donts-of-power-bi-relationships-and-modelling/
In a nutshell, you will cause yourself a lot of headaches in the future if you do this, because OLTP systems are specifically for users to enter data, not for analytics. You don't want to cause issues for the people trying to do the actual work by entering information into your systems.
So what do you do when there are parts of your database that you need to report on in real time?
The first thing to understand is that, just because you need some real time reporting, it does not mean that you have to treat everything as real time reporting. You need to separate out your needs. It may be that only one report needs to be real time and the rest can be dealt with as overnight imports etc.
You also need to make sure that you do need real time reporting. In many cases, after some investigation, this isn’t actually required.
There are a few key things to look at here
Power BI Push data sets
With a push dataset, data is pushed into the Power BI Service, which automatically creates a new database in the Service to store the data. Reports can be created over this dataset rather than over your OLTP database with DirectQuery, and your visuals will be updated in real time.
You cannot do as much with this dataset, as we will see later, but it separates the two functions to a greater extent, as your analytics will run over the push dataset.
A push dataset only keeps 200,000 rows; once that limit is reached, the oldest rows are removed as new rows arrive, which is something to keep in mind.
We will have a look at this in more detail later
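As a side note, a push dataset can also be created programmatically through the Power BI REST API, where the row-retention behaviour is explicit. A minimal PowerShell sketch: the dataset name, table, and columns are placeholders of my own, and $accessToken is assumed to already hold a valid Azure AD access token for Power BI.

#Assumed: $accessToken already holds a valid Azure AD access token
$headers = @{ Authorization = "Bearer $accessToken" }
#basicFIFO retention keeps the newest rows and drops the oldest once the limit is hit
$body = @{
    name        = "SalesPushExample" #placeholder dataset name
    defaultMode = "Push"
    tables      = @(@{
        name    = "Sales" #placeholder table name
        columns = @(
            @{ name = "CustKey"; dataType = "Int64" },
            @{ name = "Quantity"; dataType = "Int64" }
        )
    })
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Post -Headers $headers -ContentType "application/json" `
    -Uri "https://api.powerbi.com/v1.0/myorg/datasets?defaultRetentionPolicy=basicFIFO" `
    -Body $body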
Column Store Indexing in SQL
“A column store index is an index that was designed mainly for improving the query performance for workloads with very large amounts of data (eg. data warehouse fact tables). This type of index stores the index data in a column based format rather than row based as is done with traditional indexes.”
These can be added to OLTP systems to help you run real-time reporting from your transactional systems, but this comes with a lot of issues and would need a lot of admin time, so it may not be the way forward. This will be looked at in another post, but a rough sketch of the idea is below.
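As a rough illustration (the table, column, and index names here are hypothetical), a nonclustered columnstore index is added to an existing rowstore table like this, which is what allows analytic queries to scan an OLTP table efficiently:

--Hypothetical example: add an updatable nonclustered columnstore index
--to an existing OLTP table so analytic scans don't hammer the rowstore indexes
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_Sales_Analytics
ON dbo.Sales (CustKey, DateOfSale, Quantity, Cost);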
SQL triggers
“A trigger is a stored procedure in database which automatically invokes whenever a special event in the database occurs. For example, a trigger can be invoked when a row is inserted into a specified table or when certain table columns are being updated.”
This could be used to create a separate table to work with, allowing you to run analysis over that table rather than the tables within your OLTP schema.
You could have this table in a separate database, which would ensure that no usability issues occur for your transactional system users.
I will have a look at triggers in another blog post, but a quick sketch of the idea is below.
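To make the idea concrete, here is a rough sketch (the table names are hypothetical) of an AFTER INSERT trigger that copies new rows into a separate reporting table:

--Hypothetical sketch: copy newly inserted sales rows into a reporting table
--so analysis can run over reporting.Sales instead of the OLTP table itself
CREATE TRIGGER trg_Sales_AfterInsert ON dbo.Sales
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO reporting.Sales (CustKey, DateOfSale, Quantity, Cost)
    SELECT CustKey, DateOfSale, Quantity, Cost
    FROM inserted; --the virtual table holding the newly inserted rows
END;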
Example 1: Power BI Push data sets
Two very simple tables have been created in the Azure SQL Database. A customer table and a sales table (So we can see how more than one table is affected)
fact.RealTimedatatestingDE

dim.RealTimedatatestingcustomerDE

An UpdateDate column has been added to both tables, with GETDATE() set as the default binding for the column.

Once the seed data (the starting test data) is in, make sure the ID of the dim table auto-populates, and set the key column as a primary key; a sketch of how the two tables might be defined is below.
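Something like the following would do the job (the data types here are my assumption; the post doesn't pin them down):

--A sketch of the two test tables; data types are assumed
CREATE TABLE dim.RealTimedatatestingcustomerDE (
    CustKey INT IDENTITY(1,1) NOT NULL PRIMARY KEY, --auto-populating key
    [Name] VARCHAR(100) NOT NULL,
    UpdateDate DATETIME NOT NULL DEFAULT (GETDATE())
);

CREATE TABLE fact.RealTimedatatestingDE (
    CustKey INT NOT NULL,
    DateKey INT NOT NULL,
    ProductKey INT NOT NULL,
    DateOfSale DATETIME NOT NULL,
    Quantity INT NOT NULL,
    Cost MONEY NOT NULL,
    UpdateDate DATETIME NOT NULL DEFAULT (GETDATE())
);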

The last table I want is an audit table to record what the max update date is for each table; a possible definition is sketched below.
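Based on how it's queried later by the PowerShell script, the audit table can be defined roughly like this (data types assumed):

--Sketch of the audit table; one row per tracked table,
--holding the last update date pushed to Power BI
CREATE TABLE [audit].[RealTimeDataAudit] (
    TableName VARCHAR(200) NOT NULL PRIMARY KEY,
    UpdateDate DATETIME NULL
);

--Seed a row for the fact table so the later UPDATE has something to hit
INSERT INTO [audit].[RealTimeDataAudit] (TableName, UpdateDate)
VALUES ('fact.RealTimedatatestingDE', NULL);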

Push Data Restrictions
It should be noted that you can't work with more than one dataset if you use push

Notice here, where an example push dataset is imported into Desktop, that's the only dataset you can add.
This is the same if you create the Report in the Service
If you go into the Power BI Service and to the dataset (this is just an example; we haven't created our dataset yet)

And click Create Report

A table is added, but you can't do much with the data. You can't create measures or remove unwanted rows in Power BI within the Service.
Resolution
Two need to become one. For this, a view is added in SQL
CREATE VIEW dbo.VW_RealTimeDataStreamExample
AS
SELECT d.CustKey, f.DateKey, f.ProductKey, f.DateOfSale, f.Quantity, f.Cost,
       d.UpdateDate AS CustomerUpdateDate,
       f.UpdateDate AS FactUpdateDate, d.Name
FROM [fact].[RealTimedatatestingDE] f
INNER JOIN [dim].[RealTimedatatestingcustomerDE] d ON f.CustKey = d.CustKey
Power BI Service
Go to the Power BI Service and choose + Create, then Streaming dataset

Create one push dataset for the view


Setting Historical Data Analysis to On changes it into a push dataset. The only time you would ever want to leave this set to Off is if you are looking at IoT data in real time.
Get the PowerShell script

You now have your streaming data sets in your data sets list within the App Workspace. Click on the i to see the API info for the streaming data set

This gives you the sample scripts for pushing to your data set. The Push URL is all you need to populate your dataset.
Copy the PowerShell script
We have one streaming table to work with over a view of two tables
Test the Streaming DataSet
We can either create a report in Service

By clicking on Create Report, or in desktop by connecting to the hybrid dataset RealTimeDataTestingDE
For this test I am going to create the report in Service

You can't create measures in the Service. I am simply going to add a table in order to view the data.
Open PowerShell
Search for PowerShell and then choose Run as Administrator

Quick tip: cls will clear the screen if you make any mistakes.
Then you can copy the script into PowerShell. For this example, the dim table will be dealt with first.

PS C:\Users\Debbie Edwards> $endpoint = "https://api.powerbi.com/beta/REMOVED/datasets/REMOVED/rows?noSignUpCheck=1&key=REMOVED"
First, a variable is created to hold the Push URL from the streaming dataset.
You can then test it by simply typing the variable name:
PS C:\Users\Debbie Edwards> $endpoint
I've omitted the keys and replaced them with REMOVED, so yours will look different on screen.
The next variable to be created is $payload, containing all the columns within the table that we set up in the streaming dataset; the service-generated sample looks something like the sketch below.
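For reference, the sample script from the dataset's API Info screen ends with a $payload hash table and an Invoke-RestMethod call; roughly like this, where the values are just dummy placeholders to be replaced with real data:

#Roughly what the service-generated sample looks like; values are placeholders
$payload = @{
    "CustKey" = 1
    "DateKey" = 20200220
    "ProductKey" = 255
    "DateOfSale" = "2020-02-20T00:00:00.000Z"
    "Quantity" = 100
    "Cost" = 98.6
    "CustomerUpdateDate" = "2020-02-20T00:00:00.000Z"
    "FactUpdateDate" = "2020-02-20T00:00:00.000Z"
    "Name" = "AAAAA555555"
}
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))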
Next we want to create the connection to the Azure SQL database, and we want to do this before we create all our endpoint variables.
At this point, have a look at this fantastic blog at https://sqlitybi.com/streaming-data-from-sql-server-to-power-bi-using-powershell/ which really helped with setting up PowerShell.
Here is the PowerShell code:
$userName = 'Demo' #SQL user name
$password = 'password' #SQL user password
$database = 'Demo Database' #Database name
#Uncomment the connection string below to use Windows credentials if required
#$connectionString = "Data Source=$dataSource;Initial Catalog=$database;Integrated Security=SSPI"
#The connection string below uses SQL Server authentication
$connectionString ="Server=$dataSource;Database=$database;trusted_connection=false; user id =$userName;Password=$password;Integrated Security=False"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
#Take the update date from fact because this will always be updated if the dim is updated
$query = '
DECLARE @TableName Varchar(200) = ''fact.RealTimedatatestingDE'';
DECLARE @MaxUpdateDate DateTime = (SELECT MAX(FactUpdateDate) FROM [dbo].[VW_RealTimeDataStreamExample]);
DECLARE @OldUpdateDate DateTime = (SELECT ISNULL([UpdateDate],0) FROM audit.RealTimeDataAudit WHERE TableName = @TableName);
UPDATE audit.RealTimeDataAudit
SET [UpdateDate] = @MaxUpdateDate
WHERE TableName = @TableName;
SELECT [CustKey], [DateKey], [ProductKey], [DateOfSale], [Quantity], [Cost], [CustomerUpdateDate], [FactUpdateDate], [Name]
FROM [dbo].[VW_RealTimeDataStreamExample]
WHERE [FactUpdateDate] = @MaxUpdateDate; '
$command = $connection.CreateCommand()
$command.CommandText = $query
$dataset = $command.ExecuteReader()
#paste the endpoint URL which you have inside Power BI
$endpoint = "https://api.powerbi.com/beta/168c1fe3-a841-49b5-b692-7b3132c0a997/datasets/1007dc63-732c-482d-9a0b-b5bca977f38d/rows?key=X80TTAT4miAln6ftW9ielhjJdBvKGXxUdoIQvwJ62K5XVOt0eXgPHyL6wMaJMVSKu%2BbZN2PmSpFJjRRGWZ%2BTXA%3D%3D"
#push data to Power BI
while ($dataset.Read()) {
    $payload = @{
        "CustKey" = $dataset['CustKey']
        "DateKey" = $dataset['DateKey']
        "ProductKey" = $dataset['ProductKey']
        "DateOfSale" = $dataset['DateOfSale']
        "Quantity" = $dataset['Quantity']
        "Cost" = $dataset['Cost']
        "CustomerUpdateDate" = $dataset['CustomerUpdateDate']
        "FactUpdateDate" = $dataset['FactUpdateDate']
        "Name" = $dataset['Name']
    }
    Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))
}
$connection.Close();
$connection.Dispose();
There are a few things changed from the sample code taken from the push dataset. Firstly, a while loop is added to send rows from the data reader to Power BI one at a time, although you don't have to use a while loop.
Then the default values are replaced with values from the dataset.
Straight away you can spot an issue. This is fine for new records, but what if the data is also being updated? That would involve actually deleting a record in the real-time dataset based on the key, and at present I don't know if that's possible.
In the meantime, let's stick to inserts only (though see the sketch below for a related tweak).
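One related tweak worth noting: the query in the script declares @OldUpdateDate but never actually uses it, and filtering on [FactUpdateDate] = @MaxUpdateDate can miss rows written between two runs of the script. A watermark-style filter (my own variation, not from the original sample) would pick up everything since the last recorded push while still being inserts only:

--Hypothetical variation on the final SELECT in the query above:
--use the previously recorded audit date as a watermark so every row
--written since the last push is returned, not just the newest batch
SELECT [CustKey], [DateKey], [ProductKey], [DateOfSale], [Quantity], [Cost],
       [CustomerUpdateDate], [FactUpdateDate], [Name]
FROM [dbo].[VW_RealTimeDataStreamExample]
WHERE [FactUpdateDate] > @OldUpdateDate;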
Press Enter
Test the PowerShell script
Time to add some data via SQL to see if it's working
INSERT INTO [dim].[RealTimedatatestingcustomerDE] (Name)
SELECT 'Example 3'

INSERT INTO [fact].[RealTimedatatestingDE] (CustKey, DateKey, ProductKey, DateOfSale, Quantity, Cost)
SELECT CustKey, 20200220, 255, '2020-02-20 00:00:00.000', 100, 90
FROM [dim].[RealTimedatatestingcustomerDE]
WHERE Name = 'Example 3'
Go back to the report in the Power BI Service

The data can now be viewed in the report
Weakness
The push dataset can only show data added from the time it's turned on, so we don't get to see any of the data that was already in the tables before the push dataset was created.
Using the approach above, it's not going to be able to deal with updates; inserts only. This can be looked at in a later post, but it would involve updating data in the push dataset, or deleting based on ID and then recreating.
Orchestrating the PowerShell script
Add another clean record with the INSERT SQL script above.
Going back to the Service, you can click Refresh, but note that there is no new data in the push dataset; the script only ran for the one record. We need to automate pushing the data to Power BI.
Azure Automation with PowerShell
Currently the service being used is an Azure SQL Database in POC_Resource_Group, so it would be good to add an Automation account in here.

Note that Create Azure Run As Account is set to Yes. This means it's a Service Principal and has a Contributor role in this subscription.
Set up Credentials
Runbook credentials are very similar to Azure SQL Server credentials. In your new Automation Account, click on Credentials.

The new credential is set up as azure_pocmaintenance, with the same user name and a password that has been saved for later; a sketch of how the runbook could use it is below.
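Rather than hardcoding the SQL user name and password inside the script, the runbook could read them from this credential asset. A minimal sketch, assuming the asset is named azure_pocmaintenance as above:

#Inside the runbook: read the stored credential asset instead of
#hardcoding the SQL user name and password in the script
$credential = Get-AutomationPSCredential -Name 'azure_pocmaintenance'
$userName = $credential.UserName
$password = $credential.GetNetworkCredential().Password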

Next, make sure that all the PowerShell modules are up to date

Next, go to Runbooks and create a runbook



Now we are ready to add the PowerShell script that we have created

You can use the Test pane to check everything is working



It looks like there is an issue. The server isn't accessible to the runbook, so we need to give it access.
We also need to amend the script slightly in order for it to work in the runbook, with one extra row at the top (row 1):
$dataSource = 'tcp:<server_name>.database.windows.net,1433'
Now the connection string
$connectionString ="Server=$dataSource;Database=$database;trusted_connection=false; user id =$userName;Password=$password;Integrated Security=False"
relates to the $dataSource variable.

This time the test completes.
(At this point you will need to be able to add some records via SSMS (SQL Server Management Studio) in order to test, using the INSERT INTO SQL above.)
Publish the Runbook


Has simply publishing the runbook worked? Let's add a record and check the report.
No, the new records haven't come through.
Schedule the Runbook
Back in the runbook

We want to link the runbook to a schedule, via Link to Schedule.

The schedules only run hourly, daily, weekly, or monthly. Is this a one-time runbook, given that it just continually pushes data into Power BI? I have set it to start at 11 AM, in 10 minutes' time, so this can be tested then.
I ran the insert scripts from above again.

It's missing the attempts made before the schedule was created. Let's try adding another record.
Having a look at the runbook in Azure

It completed, but it seems to have just processed the one record and nothing after the last update. This means it's not a one-time runbook; it has to be scheduled for a recurrence.
You can set the runbook to run once an hour via a recurring schedule, but we need it to run more often than this.
Run via a Logic App
Can we sort out this issue using a Logic App?

Start with a Recurrence trigger

For this example, set it to 3 minutes. Choose Start Time and Time Zone as extra parameters for the recurrence trigger.

Next, search for Azure Automation and choose the Create Job action

For this example, sign in to the Tenant

Once signed in, add the Subscription, Resource Group, Automation Account, and Runbook name of the runbook above.
You can Save and Test the Logic App to ensure it runs successfully
Fully Test the Logic App
Now that the Logic App is active and set to run every three minutes from the start date, start inserting records using the insert scripts above. You can then check that the Logic App is running jobs.

So four jobs have run. Let's have a look at the report in the Power BI Service.

The records are being transferred to Power BI, although it seems that you have to refresh the screen for them to appear.
Conclusion
So, there are issues with this: you can only add new records, and you miss any data that was in the table before the push dataset was created. Also, the Logic App probably needs setting up with a Service Principal. But for this example we have:
- Set up a Power BI real-time dataset (set to Push)
- Created a PowerShell script moving data from the SQL view to the push dataset
- Created an Azure Automation account and a runbook containing the PowerShell script
- Created an Azure Logic App to run the Automation runbook every 3 minutes
Other sites used to help with this project
https://www.blue-granite.com/blog/power-bi-streaming-datasets-for-more-than-iot
https://docs.microsoft.com/en-us/azure/automation/shared-resources/schedules
https://docs.microsoft.com/en-us/power-bi/desktop-automatic-page-refresh (For Direct Query only)
https://stefanstranger.github.io/2017/06/24/AzurLogicAppsScheduleYourRunbooksMoreOftenThanEveryHour/