This is a measure that has been created in the table, so it doesn’t exist in the table structure itself. It returns the value, in this case when the % Sales Forecast calculated column has been filtered down to one distinct value.
Create a measure in the new % Sales Forecast table that takes sales and multiplies it by the measure.
For training courses, and to have a look at updates to Power BI generally, an AdventureWorks database instance has been set up on my machine, complete with 10 years of data that is great for training courses.
However, this database needs moving from the local machine to Azure.
Previously, bacpac files were created to import into Azure via SQL Server Management Studio, but they have always errored.
An Azure service principal is a security identity used by user-created apps, services, and automation tools to access specific Azure resources. Think of it as a ‘user identity’ (login and password or certificate) with a specific role, and tightly controlled permissions to access your resources
I am constantly having to remind myself how to set up a Service Principal for access to things like Azure Data Lake Gen 2 when I am setting up a Data Factory (or using the storage with another web app).
So I wanted to write a blog post specifically on this.
As the example, imagine you are moving data from an Azure SQL Database to files in Azure Data Lake Gen 2 using Azure Data Factory.
You attempt to add a Data Lake connection, but you need a Service Principal to get everything authorised.
You need this so the Data Factory will be authorised to read and add data in your data lake.
An application (e.g. Data Factory) must be able to participate in a flow that requires authentication, so it needs to establish secure credentials. The default method for this is a client ID and a secret key.
There are two types of permissions:
Application permissions: no user context is required. The app (e.g. Data Factory) needs to access the Web API by itself.
Delegated permissions: the client application (e.g. Data Factory) needs to access the Web API as a signed-in user.
Create an App
In Azure choose App Registrations
Here you can create an app – New Registration
Provide a name for your app. e.g. DataFactoryDataLakeApp
Grant your Registered App permissions to Azure Storage
This will enable your app to authorise requests to the storage account with Azure Active Directory (AD).
You can get to your app by going to Azure Active Directory
Then App Registrations and choose the App
In your new App, go to Overview and View API Permissions
Next go to Add a permission
Go to the Azure Storage API, which contains Data Lake Gen 2.
Notice that we are setting up Delegated Permissions for Azure Storage
You are warned that Permissions have been changed and you need to wait a few minutes to grant admin consent.
I am not an admin, so I always get my admin to go into Azure Active Directory and grant admin consent for Peak Indicators.
Note that your app now has configured permissions for Azure Active Directory Graph and Azure Storage
Assign your new app to a subscription
Now that you have an app, you need to assign it Contributor status at the level of service you require in Azure: subscription level, resource group level, or resource level.
For this app I am going to set it up against the subscription. First go to the Subscription you want to add it to and then Access Control (IAM)
I have added the app as a contributor
Creating a Key Vault
We will be creating IDs and secrets in the next steps, but instead of simply trying to remember your secret, why not store it in a Key Vault? A Key Vault lets you:
Centralise Application Secrets
Store Secrets and Keys Securely
Monitor Access And Use
Let’s set one up in our proof-of-concept area.
Create a Key Vault if you don’t have one already.
Remember to add any tags you need before Review + Create.
Once deployment has completed you can go to the resource, but for the time being that is all you need to do.
Application ID and Tenant ID
You can now go into your new app in Azure (App registrations) to get the details Data Factory will need when you set up the connection.
Tenant in Data Factory will be mapped to Directory (tenant) ID from the app Overview.
Service principal ID in Data Factory will be mapped to Application (client) ID from the app Overview.
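To sketch where these values end up, a Data Factory linked service for Data Lake Gen 2 (connector type AzureBlobFS) maps them roughly like this. The storage account name and secret below are placeholders, and in practice the secret is better referenced from the Key Vault than pasted inline:

```json
{
  "name": "DataLakeGen2LinkedService",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://<storage account name>.dfs.core.windows.net",
      "servicePrincipalId": "<Application (client) ID from the app Overview>",
      "servicePrincipalKey": {
        "type": "SecureString",
        "value": "<client secret, created in the next step>"
      },
      "tenant": "<Directory (tenant) ID from the app Overview>"
    }
  }
}
```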
Create a Client Secret
Next, create your Client Secret.
In your App go to Certificates and Secrets
Click New Client Secret
I’m going to allow this secret to expire in a year (anything using the app will then start to fail, so you would need to set a new secret and re-authorise).
We can add the secret to the Key Vault so we don’t lose it, because once you leave this page you won’t see the value again.
Open a new Azure Window and Go to your new Key Vault
Go to Secrets
Click + Generate/Import
Notice I have set the expiration date to match the expiry date of the app’s client secret.
Ensuring the Access is set for the Data Lake Storage
Finally, something we have waited a long time for.
It’s amazing how such a little thing can cause so much extra complexity.
Previously you could only show this on the KPI
The Goal states 1749, but this isn’t a goal. It is last month’s measure, not a goal.
This basically means that for the KPI you can only use it in this way if a goal makes sense for your data. Maybe you might have to add more information about the goal in the visual header tooltip icon.
However, for my report I need it to state that this is last month’s figure compared to this month’s. I have lots of KPIs based on the current month and this time last month, so I had to create them like this:
Instead of one visual I have two: a KPI and a card, hiding the Goal in the KPI. This creates a lot more complexity in the reports, but I had to do it to avoid confusion.
Now we can reset the Goal to whatever we want.
Finally, I can rework all my reports and reduce the number of visuals shown.
Such a small change but it makes a big difference.