Azure Logic App – Copying a file from SharePoint to a Data Lake

I have been asked to set up a Logic App in Azure (the Azure equivalent of Power Automate) to copy specific file(s) from a SharePoint folder and add them to an Azure Data Lake.

The first example file is around 16,000 rows and is not likely to grow significantly. The same is true of the other files.

There is a specific use case behind this first Logic App:

  • The data in the CSV file(s) is updated every day, so the file name remains the same
  • We need to copy the file and overwrite the file in the data lake every day, after the task that updates the SharePoint file has run (around 5 PM every day)
  • We want the Logic App to run via Data Factory
  • Once the Logic App has run, we want to trigger the pipeline that populates the SQL database from the file in the data lake

Set up the Logic App

In Azure, go to Logic App and select New.

Log Analytics: gives you richer debugging information about your Logic Apps during runtime.

Consumption Plan: easiest to get started with and fully managed (pay-as-you-go model). Suits workflows that grow slowly or are fairly static.

Standard Plan: newer than the Consumption plan. Runs on a single tenant and is charged at a flat monthly fee, which gives you potential cost savings.

Create the Logic App

Once you have added tags and created the resource, it's time to create the Logic App itself.

Because we want to trigger it from Azure Data Factory, we want to go for the When a HTTP request is received trigger.

The HTTP Post URL will be used in Data Factory to trigger the Logic App.

I have added a JSON schema that captures some of the important information for this project, like the Container for the data lake, the folder, the file name, and isFolder (which becomes more important a little later).

{
    "properties": {
        "Container": {
            "type": "string"
        },
        "fileName": {
            "type": "string"
        },
        "folder": {
            "type": "string"
        },
        "isFolder": {
            "type": "boolean"
        }
    },
    "type": "object"
}

List Folder

Now we want to list the SharePoint folder, so create a new step and search for List folder.

Returns files contained in a SharePoint folder.

Next you have to sign into SharePoint with a valid account that has access to the SharePoint site.

Here is where we have a question. For this test, my own username and password have been used, but obviously I change my password periodically, which means this connection will need updating manually whenever that happens.

What we need is a way of logging into SharePoint that isn't tied to a user and can be used within the Logic App. This needs further thought and investigation.

When you log in, a SharePoint API connection is created in the Azure resource group.

To get the site address, you can go into SharePoint, click on the … against the file, and copy the link.

The link needed amending slightly, because it needs to be in the form tenant.sharepoint.com/sites/ProjectMainArea

If you have access, you should then be able to click the folder icon against File Identifier and select the correct area.

For Each

Next stop: a For each over the 'Body' from the List folder step, in which we get the file content. Go to Next Step and choose the For each control (because there will be multiple files).

Get File Content

Now we want to Get file content from SharePoint.

Gets file contents using the file identifier. The contents can be copied somewhere else or used as an attachment.

You need to access the same SharePoint site address as before. Then click on File Identifier and choose Id from the SharePoint dynamic content pop-up.

So here we can see that from the List folder step we have lots of file metadata we can use, like DisplayName, Id, LastModified, etc.

We know we need Id for Get file content.
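
For reference, switching to code view shows roughly how this action is wired up. This is only a sketch: the site URL is the placeholder from earlier, the connection name will differ per environment, and For_each assumes the default loop name.

"Get_file_content": {
    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['sharepointonline']['connectionId']"
            }
        },
        "method": "get",
        "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://tenant.sharepoint.com/sites/ProjectMainArea'))}/files/@{encodeURIComponent(items('For_each')?['Id'])}/content"
    }
}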

We are at a point where we can run this now as a test.

Note that so far we have this set up

But we hit specific issues:

Status 404 File not found

cannot write more bytes to the buffer than the configured maximum buffer size of 10457600

So we have two issues to resolve, and after a bit of help on the Q&A forums we find out that:

List folder "Returns files contained in a SharePoint folder" actually also returns folders, which error because they are not files.

Logic Apps aren't really set up for large files. There doesn't appear to be any way we can get past the size issue, so we need to check our files and also think of ways to bring through smaller data sets if need be.

Thankfully our files are well below the threshold, and the business thinks they won't increase too much.

So here is where we can start applying criteria, which we want to do anyway because we only want certain files:

  1. If it's a folder we don't want to use it
  2. If it's over 10457600 bytes in size we don't want to use it
  3. Only bring through files called…….

So we need to change our For each.

Within the For each, add a new step and search for Condition.

And add your conditions (using And/Or).

Then you can move the Get file content action into the True branch.

So if IsFolder is false and the size is less than 10457600, we can grab file A OR file B.
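
In code view, the condition expression ends up looking something like this (a sketch: FileA.csv and FileB.csv are placeholders for the actual file names, and For_each again assumes the default loop name):

"expression": {
    "and": [
        { "equals": [ "@items('For_each')?['IsFolder']", false ] },
        { "less": [ "@items('For_each')?['Size']", 10457600 ] },
        {
            "or": [
                { "equals": [ "@items('For_each')?['DisplayName']", "FileA.csv" ] },
                { "equals": [ "@items('For_each')?['DisplayName']", "FileB.csv" ] }
            ]
        }
    ]
}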

When you now test this Logic App, Get file content should succeed, with most items not even meeting the criteria.

Create Blob

Finally, within the True section, we need to add the file to our Data Lake.

Search for Create Blob

Here you have to sign into your Blob Storage, which again creates another API connection in Azure.

You have to supply the storage account name and choose an authentication type. Access Key has been used here, with the details added directly. Normally in Data Factory the access key would be obtained through a Key Vault, so more information is needed to come up with the most secure way of doing this. There are two other authentication types to choose from.

More investigation is needed into these other approaches.
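
For reference, here is one way the Create blob action can look in code view, composing the folder path from the Container and folder values in our trigger schema and writing the content from Get file content. Again a sketch only; the exact path and connection details depend on your environment:

"Create_blob": {
    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['azureblob']['connectionId']"
            }
        },
        "method": "post",
        "path": "/datasets/default/files",
        "queries": {
            "folderPath": "@{triggerBody()?['Container']}/@{triggerBody()?['folder']}",
            "name": "@{items('For_each')?['DisplayName']}",
            "queryParametersSingleEncoded": true
        },
        "body": "@body('Get_file_content')"
    }
}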

Now we can do a full test of the Logic App.

Testing the Logic App

When you trigger the Logic App:

The Body contains a long list of every object, which is really handy for knowing what details are available inside this action.

To test, this was copied into a Word document.

Next comes the Get file content step.

Now most of the files don’t satisfy the condition.

Next was clicked to get to a file in Get file content (the first one appeared as number 32).

And now we can see the Body of the Create blob. (This happens for every file specified in the criteria.)

And if you use the Microsoft Azure Storage Explorer app, you can check that the files have indeed been updated (either a new file is created or the existing one is overwritten).

Data Factory

Now we have saved the Logic App, we want to trigger it in Data Factory.

Create a pipeline and choose a Web activity.

Copy the HTTP POST URL from the Logic App and paste it here.

For the Body, I simply used the JSON from the start of this article.
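
For example, something like this (the values are placeholders for this project's actual container, folder, and file):

{
    "Container": "datalakecontainer",
    "folder": "sharepoint-extracts",
    "fileName": "DailyExtract.csv",
    "isFolder": false
}

Remember to set the Web activity Method to POST, since the Logic App trigger expects an HTTP POST.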

Now you can trigger this pipeline along with all your other pipelines to run the data into your Data Lake and then into SQL, ready to be used for analytics.

https://www.mssqltips.com/sqlservertip/5893/transfer-files-from-sharepoint-to-blob-storage-with-azure-logic-apps/

Power BI February 2022 Updates – Dynamic M Query Parameters

Now supports SQL Server and more data sources

But what are Dynamic M Query Parameters and what does this mean?

It feels like they have been upgraded for use with DirectQuery data sources, so you can restrict the amount of data being requested at the lowest level.

Let's have a look at a simple example using taxi data from a Microsoft learning path.

First of all, you need to open Power BI – Options and Settings – Options.

It's in preview, so make sure the feature is ticked before continuing.

Get Data – Azure SQL Database (the guidance mentions SQL Server, but it seems both can be used for this test).

Load

Then go to Transform data.

Right-click on TripFares to get to the Advanced Editor.

let
    Source = Sql.Database("dataengineering-sqlserv.database.windows.net", "taxi-data-db"),
    dbo_TripFares = Source{[Schema="dbo",Item="TripFares"]}[Data]
in
    dbo_TripFares

It's currently bound to a table, but we need to bind it to a query for this process.

Click the cog against Source.

Go into Advanced Options and add the SQL statement:

SELECT * FROM dbo.TripFares

And then go back and look at the Advanced Editor:

let
    Source = Sql.Database("dataengineering-sqlserv.database.windows.net", "taxi-data-db", [Query="SELECT * FROM dbo.TripFares"]),
    dbo_TripFares = Source{[Schema="dbo",Item="TripFares"]}[Data]
in
    dbo_TripFares

So now it's nearly bound to a query, but you will note that it looks like the table step is erroring.

You can go back to the Advanced Editor and change it to:

let
    Source = Sql.Database("dataengineering-sqlserv.database.windows.net", "taxi-data-db",
        [Query="SELECT * FROM dbo.TripFares"])
in
    Source

We only need the query, not the dbo_TripFares navigation step.

Now we can add the Dynamic M Query Parameters. I will go for an easy one first as a demo.
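
If you create the parameter through Manage Parameters in Power Query, the underlying M for it looks something like this (the name paramPaymentType matches what the next query expects; "CRD" as the current value is just this demo's choice):

// Parameter query "paramPaymentType"
"CRD" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]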

And then I change the Advanced Editor code again:

let
    Source = Sql.Database("dataengineering-sqlserv.database.windows.net", "taxi-data-db",
        [Query="SELECT * FROM dbo.TripFares WHERE payment_type = '" & paramPaymentType & "'"])
in
    Source

Note the new WHERE clause that concatenates the value of our parameter into the query.

In SQL, it will read: SELECT * FROM dbo.TripFares WHERE payment_type = 'CRD'

When it runs the first time, you are asked to approve it, and you can actually see the SQL it's going to use, which is good. (Note: I had to change to CSH to get the message to appear, but I am running with CRD.)

When it comes through, it is restricted to the selected payment type.

We are going to change the code again:

let
    filterQueryPaymentType = "SELECT * FROM dbo.TripFares WHERE payment_type = '" & paramPaymentType & "'",
    Source = Sql.Database("dataengineering-sqlserv.database.windows.net", "taxi-data-db",
        [Query=filterQueryPaymentType])
in
    Source

This builds the SQL command first, and we then pass the filter query into the data source.

Now we know that the query works, let's use it in Power BI reporting and bind it to a table.

This will need a lookup table of all the payment types to work.

I am going to simply create the reference table in M:

let
    Source = Sql.Database("dataengineering-sqlserv.database.windows.net", "taxi-data-db",
        [Query="SELECT DISTINCT payment_Type FROM dbo.TripFares"])
in
    Source

Close and Apply

Now bind the table to the parameter in the Modelling tab.

Click on the Payment_Type column, then Advanced, then Bind to parameter.

Click Continue

Multi-select is not going to be used for this demo.

I have added a quick table. The metrics have come through as strings, and there will be lots of things you need to test in DirectQuery mode, but I will ignore that for the time being.

I dragged Payment type from the Payment Type lookup into a slicer.

Click the slicer and see your data change. Every time you click the slicer, a direct query will happen, but only for the payment type selected, hopefully making things much quicker.

And there you go, you have set up a restricted direct query. This will help with any DirectQuery reports you need to create based on real-time data.

You are still hitting the SQL DB a lot though, so this would need thinking through.

And remember, DirectQuery doesn't give you the full Power BI reporting suite, so your reports may be more basic. And usually I like to work with star schemas, but here we have the extra complexity of lookup tables to work with the parameters.

I will hopefully be looking at a date/time example soon. This is clearly an extremely important piece of the DirectQuery real-time puzzle.

Power BI Admin APIs to return a list of email subscriptions

Get Dashboard Subscriptions brings back a list of everyone who has subscribed to a dashboard.

What is a Power BI Subscription?

Subscriptions are a great way to sign yourself and other users up to receive emails about report content.

There are certain governance rules we follow:

  1. Report viewers view content via an app. We don't want report viewers coming into the app workspace; we want them to see carefully selected and brought-together content.
  2. If we use subscriptions, we want to push through a really nice screenshot of a report that changes and gets users wanting to come and see more content within that app. Therefore we always have a report or dashboard with visuals that don't need scroll bars, to engage viewers so they want to see more.
  3. Because of this, we want to be able to subscribe people to app content.

Go to an app. Note you can add your subscription here, which is a link to the dashboard.

For this example, the app dashboard is subscribed to.

Then we go to Try it out from the Microsoft API page.

Dashboard Subscriptions

https://docs.microsoft.com/en-gb/rest/api/power-bi/admin/dashboards-get-dashboard-subscriptions-as-admin

And try the API.

Add the Dashboard ID to the parameters.
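
Under the covers, the Try It page issues a GET like this (dashboardId is the GUID you supplied; the bearer token needs Power BI admin rights):

GET https://api.powerbi.com/v1.0/myorg/admin/dashboards/{dashboardId}/subscriptions
Authorization: Bearer {access-token}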

But this is where the logic is not quite working (the hope is that this will be resolved fairly quickly). The above API doesn't give you information back if you subscribe via the app, only when you subscribe to the actual dashboard in the workspace.

We want all our report viewers accessing the pre-built app, so this is where the information is most required.

When the user is added to a dashboard subscription in the workspace, the API is tested again.

What this can show us is anyone in the workspaces who has subscribed to the actual dashboard. We want all viewers with app access.

Get report Subscriptions as Admin

https://docs.microsoft.com/en-gb/rest/api/power-bi/admin/reports-get-report-subscriptions-as-admin

This is the same as above, but for reports.

Get user Subscriptions as Admin

https://docs.microsoft.com/en-gb/rest/api/power-bi/admin/users-get-user-subscriptions-as-admin

I get my user ID from Azure Active Directory.

And see what I'm subscribed to, but again, only workspace content.

Logically, I feel like our viewers should be subscribing through the apps.

This is really good stuff, but I feel like they need to resolve the issue with apps. Apps are the go-to areas for users to view content, so this is where we want people to subscribe.

If you look at the information coming back, we can see the artifact type is report, but nowhere does it mention whether the report is in an app or in the workspace, and I feel like this is actually important information. I only know because I have tested against both the app and the workspace.

If this could be resolved, these APIs would be really useful to help us understand subscription uptake.

Power BI Sparklines

Sparklines are small charts you can add to a table or matrix, and they are new for the start of 2022.

Above we have a matrix showing products by country, and I would like to also see this measure on a timeline.

Select the metric in Values and choose the Add a Sparkline button.

Here we choose the measure Total Sales, which has already been summarised. Date will be used as the X axis so we can see the trend across time.

Rename it to Sparkline if you want (double-click the value to make it editable), and now you can see the trends.

And you can also format the sparkline (right at the bottom of the formatting pane).

So sparklines are great for adding to your tabular information; I think the line charts are pretty good.

Current restrictions

Only 5 are supported per visual, so if you have more categories that won't work.

Apparently date hierarchies don't work at the moment, but they will in the future.

It would be good to be able to assign conditional formatting to the colours: going up could be green, staying around the same amber, and going down red.

They really do bring tables to life though, and are a really great addition to Power BI.
