Power BI Streaming Data sets Part 1

What you need

  • Power BI
  • Microsoft Flow (or Logic Apps in Azure)
  • Cognitive Services – Text Analytics in Azure

I love the show Taskmaster, and I really want to see how much other people love Taskmaster and when they are tweeting about it.

I want to see everything happening real time in my Power BI report.

For this example, I am going to use a streaming dataset. The data will be pushed into the Power BI service, but Power BI only stores it in a temporary cache which expires; we can set a window of time to look at.

There is no underlying database with a streaming dataset, so I won't be able to use my usual report visuals. I will need to use the streaming visuals available in Power BI.

There are other ways to do real-time data, such as Push datasets and PubNub streaming datasets, but for this example I want to concentrate on the Streaming dataset.

Log into the Power BI Service https://powerbi.microsoft.com/en-us/landing/signin/

I'm going to do this simply in My Workspace.

Click on Create in the right hand corner of the screen

Then click on Streaming dataset

Choose API to create an API endpoint

API – "a set of functions and procedures that allow the creation of applications which access the features or data of an operating system, application, or other service."

Our Streaming Dataset will contain these values.

Extra time dimension fields have been created to use drill-down features.

The outcome of the sentiment analysis is placed in the field 'score', ranging from 0 to 1, negative to positive.

Set 'Historic data analysis' to 'On' if you want Power BI to keep the data for analysis later.

Power BI expects the data in a JSON format (there is a REST web service in the background).

JSON – an open-standard file format that uses human-readable text to transmit data objects consisting of attribute–value pairs and array data types.

REST – ‘Representational State Transfer’. It is primarily used to build Web services that are lightweight, maintainable, and scalable. A service based on REST is called a RESTful service. REST is not dependent on any protocol, but almost every RESTful service uses HTTP as its underlying protocol.
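
For illustration, a single row pushed to this endpoint would look something like the following JSON (the field names here are just my example, matching the values described above):

[
  {
    "Tweet": "Loving the new series of #Taskmaster!",
    "Location": "Leeds, England",
    "score": 0.92,
    "Time": "10/01/2019 20:15:00",
    "Month": "10",
    "Day": 1,
    "Hour": "20",
    "Minute": "15"
  }
]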

Once you click done, you can see that we have the new data set in our datasets list.

Microsoft Flow / Azure Logic App

The next step is to create a Microsoft Flow (you can use an Azure Logic App if you prefer).

Microsoft Flow is a cloud-based software tool that allows employees to create and automate workflows across multiple applications and services without the need for developer help.

Go to https://flow.microsoft.com and sign up for a new account if you need one.

Go to My Flows and Automated Flow

Click on When a new tweet is posted

And Create

When you are in the Flow you create New Steps to build up the logic required for your data. The Flow will create the Twitter feed for your Reports.

In the first step I have simply added #Taskmaster as the search text in 'When a new tweet is posted'.

You will also need to log into your Twitter account at this point to get access to tweets.

Click on New Step and add an action

The next step is to add some sentiment analysis. I have used Text Analytics to do this.

I have searched for sentiment and can then click on Text Analytics – Detect Sentiment

I have already logged into my Azure Text Analytics

I used the Key, Name and Endpoint details from Azure

You may need to set up your own Cognitive Service in Azure before you do this section.

Next step is to search for Power BI

Select Power BI – Add Rows to a Dataset

Next, add all the information about the RealTimeData streaming dataset

Most of the details can be added from the Dynamic Content picker. As you can see, Score comes from the Detect Sentiment group. All the Twitter data comes from the very first step, the 'When a new tweet is posted' group.

The following are the expressions you need to add for the Time fields

  • Time – formatDateTime(utcNow(),'MM/dd/yyyy HH:mm:ss')
  • Month – formatDateTime(utcNow(),'MM')
  • Day – dayOfMonth(utcNow())
  • Hour – formatDateTime(utcNow(),'HH')
  • Minute – formatDateTime(utcNow(),'mm')

Save your Workflow

Testing Service in Microsoft Flow

The real time analysis will be triggered by hashtags in Twitter

This service only allows you to get current tweets; no historical tweets can be retrieved at this point.

The visualization tiles will only be kept for an hour, so this is the kind of service you need to monitor fast-paced information.

In the workflow designer, save your changes.

Click Test and create a tweet containing one of the hashtags in the workflow.

The service will run and you should get pass or fail information.

We can now use this information to create a sentiment report in Power BI.

Back to Power BI Service

You will find the dataset in the relevant workspace, e.g. My Workspace.

For this example, click on Create report against the dataset and add a table tile to the report pane. (The data may be refreshing at this level)

Leave it like this for the time being. Save your report and pin the table visual to a dashboard by clicking the pin icon.

I created the table simply to create a report and then a dashboard. Once at dashboard level, Custom Streaming Data tiles can be added. I am deleting the table once the dashboard is created, for simplicity.

Click on +Add Tile

Then choose the Custom Streaming data

At present, there aren't many visualisations to choose from for this type of data.

Click Next

The #Taskmaster dataset has been chosen

I am going to create a very simple bar chart to look at the last hour's data and see what the sentiment score is based on location.

Now I can see the sentiment score by Location.

Azure has provided the Cognitive Text services to analyse the data for a score.

It would be useful to understand what tweets are driving the sentiments by location.

I would also prefer to group on Positive, Neutral and Negative scores.

In my next post I am going to look at how to create a sentiment description and add Description Name and Count into the Flow and streaming dataset

PowerApps in Power BI – Simple Power App to Insert a record into a Table to be used for a report Filter

First of all I would like to big up Guy in a Cube https://www.youtube.com/channel/UCFp1vaKzpfvoGai0vE5VJ0w

for being one of my go-to channels for anything Power BI.

I had seen in the August Power BI release that there was a new PowerApps visual in preview, but it's Guy in a Cube that gave me the confidence to try it out.

In the September 2019 release it has become fully available, and I have quite a lot of business logic that I could get off the ground and into my projects using PowerApps.

My initial challenge to resolve for the business:

I have worst served customers in my Data

SELECT [Top 20 Worst Served] FROM [fact].[Fact]

  • Which is a 1 or 0

SELECT [Top 20 Worst Served]  FROM [dim].[Customer]

  • Which is a true or false flag

I can then have reports for worst served customers filtered by the above

However, in some cases there may have been an investigation and it's been deemed that the customer shouldn't be in worst served. Note that this information isn't captured in the source data set.

They want to be able to immediately say this customer is fine and the results will then update accordingly.

Every time the data is refreshed, these data items get fully refreshed in case they are not worst served any more.

In the current Reports, Filters are set on the page for both the above columns in the dim and the fact

  • Where Dim Customer WorstServed = Yes
  • Where Fact WorstServed = 1

Question 1

  • Do the users want to see an immediate change to the report?
  • Or are they happy for it to come through on the next refresh?

The users want to see the report metrics change immediately based on them adding in information

Quick Steps to adding the PowerApp into Power BI, from Guy in a Cube

  • The tables that are to be updated need to be in Direct Query mode
  • The tables connected to these tables on the 1 side of the relationship should be in Dual mode (which acts as Import or Direct Query)
  • Design your reports in the Desktop
  • Once designed DO NOT ADD THE POWER APP INTO THE PBIX FILE
  • Publish your report
  • In Power BI Service click on Edit Report
  • Then in the Visualisations pane, if you don't have it already, go to the Marketplace and choose PowerApps
  • Add the PowerApp into your report
  • Choose an App (or Create New App)
  • Add in the columns required to use the PowerApp
  • In PowerApps give access to the people who need to be able to update the data

Things to Consider

  • The users want to see the change immediately rather than on the next refresh
  • My report pages are already created and I have many tables (all Imported) and relationships in the model
  • Dual storage mode – I have never used this before. How many tables would need this change in the data source?
  • The PowerApp will be a new enhancement to the reports
  • The PowerApp hasn't been built yet
  • I am concerned about adding the visual into Power BI Service in Edit mode, because the standard way to create reports is in Desktop. I have many more updates to do after the PowerApp has been added, and I don't know how adding visuals in the Service only will work with the ongoing app amendments

Possible Data Solution 1

In the PowerApp, the user adds the CustID (and an automatic date is assigned), and these get inserted into the table Staging Worst Served Amendments.

Then Dim Customer and the fact table are checked using the CustID, and the items are set to 'No' and 0 as above (this is clearly the most difficult bit, because it's an update based on the CustID).

The next refresh will again change customer 153 back to Worst Served = Yes; however, an extra bit of script in the stored procedure that creates the table will check the Worst Served Amendments table and, if the customer is there, reset the flag to No.

The above Staging table is only used for the overnight data refresh
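
As a rough sketch of that extra step in the stored procedure (the column names, join key and flag values here are my assumptions based on the filters above, not the real warehouse code):

-- Reset the flags for any customer present in the amendments table
UPDATE c
SET    c.[Top 20 Worst Served] = 'No'
FROM   [dim].[Customer] AS c
INNER JOIN [staging].[WorstServedAmendments] AS a
        ON a.[CustID] = c.[CustID];

UPDATE f
SET    f.[Top 20 Worst Served] = 0
FROM   [fact].[Fact] AS f
INNER JOIN [staging].[WorstServedAmendments] AS a
        ON a.[CustID] = f.[CustID];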

Changing Import to Direct Query

To develop against the above resolution, both the fact table and the property table need to be changed to Direct Query. What do we lose if we do this?

All my DAX functions for time will be lost against my fact table (Year to Date, This time last month etc).

Therefore I really don't want to lose these DAX measures by changing to Direct Query.
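
For example, a typical measure of the kind I'd lose looks something like this (a sketch only, with assumed table and column names):

Sales YTD = TOTALYTD ( SUM ( 'Fact'[SalesAmount] ), 'Date'[Date] )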

Also, currently I can't seem to change the setting from Import to Direct Query in Properties. Unless this is a bug, it seems you would have to start again with the tables and re-import them as Direct Query.

Is there another way that the new logic can be added without setting both these tables to Direct Query?

Possible Data Solution 2

Filters in Power BI

  • Where Dim Customer WorstServed = Yes
  • Where Fact WorstServed = 1
  • And Staging Worst Served Amendments CustID is NULL

Issues with the Above Solution

You can't filter for an empty CustID using a relationship alone, because what we really need is a left outer join.

There may be other issues. What happens if the user accidentally adds in multiple CustIDs and the relationship changes to many-to-many?

Normally I would deal with this by merging the tables in a left join in Power Query Editor

As a consequence I need to think about another Solution to this problem

Possible Solution 3

I have already checked that you can merge a Direct Query and an Imported table. See https://debbiesmspowerbiazureblog.home.blog/2019/09/19/what-happens-when-you-merge-a-direct-query-and-an-imported-table-in-a-power-bi-composite-model/

This means that we can left join the tables and filter out the non-null rows in the report.

Creating the Staging table for the Insert

Top tip: within the Power App your tables will need a primary key, or else the PowerApp will fail.

First I need an actual table in my Azure SQL database. In SQL Server Management Studio I created the table via SQL

CREATE TABLE [staging].[WorstServedAmendments](
	[CustID] [varchar](255) NOT NULL,
	[AmendmentDate] [datetime] NULL,
 CONSTRAINT [PK_WorstServedAmendments] PRIMARY KEY CLUSTERED 
(
	[CustID] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

Creating the Power App

I work with such a great team. Thanks to Liam Chapman and Happy Wired for giving me a heads up on PowerApps and writing the code for me

Log into https://powerapps.microsoft.com/en-us/

I need an Update button, a Delete button and a text box to add the customer number into.

It would also be good to have a little message to say that the button has been successful

These objects can be added via Controls.

  • Timer1 Relates to a hidden Timer on the page
  • txtUpdateMsg relates to a hidden text box that will appear on update
  • btnDelete relates to the Delete Button
  • btnUpdate relates to the Update Button
  • txtCustID (hidden in image) relates to the textbox where the user will add the customer number
  • lblForTxtBox relates to the description label

Data Source (Start Amending from here)

I've connected to my Azure SQL database and to my new worst served table.

The CustID is a varchar within the SQL Table

Update Button

Let's start with the basics: the Update button.

OnSelect

Patch('[staging].[WorstServedAmendments]',Defaults('[staging].[WorstServedAmendments]'),{CustID:txtCustID.Text, AmendmentDate:Now()});
UpdateContext({TimerStart:true});

Code

Patch – Modifies or creates one or more records in a data source, or merges records outside of a data source

Defaults – Use Patch with Defaults function to create records (Insert a new record into a table)

Value() – converts a text value to a number. It is only needed if the target column is numeric; as CustID is a varchar here, the text from the textbox can be passed straight in

UpdateContext – Create a context variable to temporarily hold a piece of information. In the above case we are calling it TimerStart and setting it to true. We are basically starting the timer when we click update because we have a timer hidden on the screen

Timer1

The Start property uses the context variable TimerStart.

Hide the timer, because we don't need to see it.

On timer start, create a context variable called SuccessMessage and set it to true.

On timer end, reset SuccessMessage to false.

Also on timer end, set TimerStart back to false. TimerStart was switched on by the Update button and is now switched off again when the timer finishes.
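
Putting those together, the Timer1 properties end up looking something like this (my own sketch of the settings described above; the Duration value is just an example):

Duration: 2000                // milliseconds the success message stays visible
Start: TimerStart             // context variable set by the Update and Delete buttons
Visible: false                // the timer itself stays hidden
OnTimerStart: UpdateContext({SuccessMessage: true})
OnTimerEnd: UpdateContext({SuccessMessage: false, TimerStart: false})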

Delete Button

OnSelect

RemoveIf('[staging].[WorstServedAmendments]', CustID = txtCustID.Text);
UpdateContext({TimerStart:true});

Code

RemoveIf – Remove a record if it exists

UpdateContext – Create a context variable to temporarily hold a piece of information. In the above case we are calling it TimerStart and setting it to true. We are basically starting the timer when we click Delete because we have a timer hidden on the screen in the same way as when we update

txtUpdateMsg

The Visible property relates to the SuccessMessage context variable: the message is visible while the timer runs (true) and disappears when the timer ends (false).

What appears when the timer is ON

This is a very basic Power App that will now allow the user to add and remove customers from this table. Save and then Publish the App so it can be used in Power BI

Add the table into Power BI

The new worst served table needs to be added to Power BI as Direct Query, so any changes the user makes are reflected straight away in the reports.

Just to make sure that everything is OK, I'm going to add one customer into the table before it's added, just to have a record to work with.

In your Power BI Desktop file, Edit Queries and Choose recent sources if you have already connected to your data source.

Select the Worst Served Table

We can now have composite models where some tables are import and others are direct query.  The new table is added in as a direct query

Close and Apply

Note the message regarding potential risks when you are querying data at source and have other tables imported in memory

Next go into Edit Queries and Merge Table

And merge the customer table with the direct Query table

Click OK.

This connects the tables, so grab the CustID column from the merged table.

This will be added to your customer dimension
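
Behind the scenes, the Merge Queries step generates M roughly like the following (a sketch only; the query and column names will be different in your own model):

let
    Source = DimCustomer,
    Merged = Table.NestedJoin(
        Source, {"CustID"},
        WorstServedAmendments, {"CustID"},
        "WorstServedAmendments", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(
        Merged, "WorstServedAmendments", {"CustID"}, {"WorstServed.CustID"})
in
    Expanded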

Note that, so far, DimCustomer hasn't been automatically changed to Dual mode after being merged with the Direct Query table, so we don't need to worry about Dual mode in order to create our new logic.

Close and Apply

Back in the Desktop, go to the Marketplace and grab the PowerApps visual.

The how-to guide states not to add the Power App within your Desktop report, so publish the report and let's move to the Power BI Service.

Power BI Service, Edit report and Add in PowerApps Visual

On the report you want to update Click Edit Report

The Power App visual will now be available in Visualisations because it was added in the desktop file

Add the Power App in power BI service

In order to test out the new service I’m adding in a blank page

Click on the power App visual and add it to the new blank page

I want to add the CustID and the Date from the worst served new staging table

Then I can choose App rather than create new because I have already published an App

I'm choosing my Worst Served app and clicking Add.

I've clicked 'Go to PowerApps Studio', which opens the PowerApp in Studio, and you also have the PowerApp in the Power BI Service.

The very first issue I have is a formatting one: my Power App is tiny and unreadable. Time to go back to PowerApps Studio and change the settings.

PowerApps studio

App Settings – changing the app to the default size of 16:9. For the time being I'm not going to think about making the app look good; I just want to be able to see the text on the Power BI page.

  • Save and publish the PowerApp
  • Go back to the Power BI service

Power BI service

I had to delete and re-add the PowerApp to get it to refresh.

It's also worth noting that if you don't create your visual at the right size before adding your app, the app will have scroll bars etc., and you can't change the size of the PowerApp, only the size of the visual that holds it.

The Power App doesn’t look great but it will do to test.

First of all we want to see if it works so add a table with the Worst served Data items

Add a CustID and click Update

It works. Now delete the item just added. Again it works.  This is part one done. Part 2 is that we want it to update worst served Customers from the customers table

How do the updates affect the pbix file?

Click Save in the Power BI Service and go back to the Power BI Desktop file.

The new visuals aren’t there. This is as expected, because they were not created in Desktop.

Imagine that you have created your Power BI App Visual and you still have a list of updates, changes and bug fixes that you want to do within Power BI Desktop

If you update and republish, you lose the PowerApp visual.

Currently this option is only viable if the PowerApp is the very last thing you do and you don't need to update the pbix file at all.

As a consequence I don’t think that Editing your reports within Power BI Service is a good idea.

Having chatted to a few people on the forums, editing reports in the Service is normally done when you need something quickly and you intend to go back and update the pbix file with the amendment.

What happens when you add the PowerApp in Desktop

In Desktop Add the PowerApp and the table to check its working. Then Publish into the Service.

Note the change in how it works. If you Update or Delete, the table doesn’t change until you Click Refresh. If you add it in Service you don’t need to click refresh.

For the time being I’m going to accept this as it’s the only way to move forward and keep working with the pbix file.

I have created an idea on the power BI forum to deal with this issue https://community.powerbi.com/t5/Issues/PowerApp-with-DirectQuery-Mode-Need-to-be-able-to-have-correct/idi-p/791690#M48425

Omitting Worst Served Customers using the new Filter

I have Customer 1000000 and 1000001 in my Direct Query table and they have both been removed from the report

Great, I now have a Power App that can be used to omit records from a report straight away, without waiting for a refresh.

There are some issues to be ironed out but this really opens up what you can do with Power BI

What happens when you merge a direct query and an Imported table in a Power BI composite Model

I have a Direct Query table in a composite model, all the other tables are Imported into Power BI

The whole reason for the Direct Query table is to use it with a PowerApp. The PowerApp can be used to update the data in this table from the PowerApp and the update should appear in the report straight away

However, I need to use it against an imported table, so that the report will remove records that are in the Direct Query table. Here is the example.

I need to join the tables together to ensure that the customer in the direct Query table is not only updated here, but pulls out the record from the Customer Table

Creating a relationship doesn't work, because what's needed is a left outer join; you can't filter for a NULL value in the staging table if you simply relate the tables.

Therefore, one of the best ways of dealing with this would be to merge the tables in Power Query Editor using a left outer join.

This blog post is to look at what happens when you merge a direct query and an Imported table

Only one way to find out. Go into Edit Queries Mode

First of all, I click on the main Customer table, then Merge Queries.

Straight away, I notice that the record I have just added isn’t in the direct query table within Merge Query

Now add in the customer number of the direct query table and there it is, with the latest Customer ID linked to the Imported Data set

The CustomerID is back, and we also have our null values because a left join has been used.

Could this work? Back in my test page (After closing and applying)

It certainly seems to. Adding the filter removes this customer from the table.

If I remove another using the PowerApp, will it work?

I have to refresh in the desktop to see what happens.

Well, this is good news. You can have a Direct Query table working with a PowerApp and merge it with another table, so you can use it without that table being in Direct Query mode.

Great news

Power BI Cross Filter Direction

Your Power BI Data Model contains multiple tables that need to be joined in the Relationship Pane.

Cardinality is well understood (one-to-many, many-to-many etc.), but what does the cross filter direction mean?

Example using Azure SQL Database Adventureworks Data Warehouse

You can Create a SQL data warehouse in Azure.

Note that Sample Source has been chosen which opens up the AdventureWorksDW option

  • DimProduct
  • DimProductCategory
  • DimProductSubCategory
  • FactResellerSales

In Power BI, Get Data and choose Azure SQL Data Warehouse.

Please note. If you are using the Warehouse to do a quick demo against, always remember to either Delete the Warehouse or Pause the Warehouse unless you want to create unnecessary costs within Azure

Login to your Server in SQL Server Management Studio with the Username and Password set up in Azure

Note, if there are issues you may need to go to the SQL Server in Azure and add your IP address to the firewall configuration.

Choose the Dims and Fact table(s) and then click OK

Then Close and Apply (Power Query Editor)

Next, In Power BI Desktop go to the relationships pane

The relationships are already set up with Cross filtering set to single.

Create a Report to test the Filtering

Add a Slicer and then go to DimProductCategory and Choose EnglishProductCategoryName for the Slicer Value

Add  Slicer 2 and then go to DimProduct and Choose EnglishProductName for the Slicer Value

Now you can select a Product Category and only those Products in that Product category will be shown

In this example I have chosen Accessories and EnglishProductName only displays Accessories

If you unselect the Category ALL products are then available in Slicer 2

If I then, for example, choose Cable Lock from the list, nothing happens to Slicer 1. Everything is still visible.

How do you enable cross filtering?

Enable Cross Filtering

Go back to the Relationships pane, click on Manage Relationships and edit the relationship between DimProduct and DimProductSubCategory.

Set the Cross Filter Direction to Both (And tick apply security filter in both directions)

You can now select an English Product Name and Slicer 1 will only show the Category that the Product Name Belongs to

Conclusion

Cross filtering, in the case of slicers, allows you to move not only top down but also bottom up through the hierarchy.

This is great when the reports need multiple slicers to select the data for the reports.

Dual Mode Storage mode Setting in Power BI

Added July 2018 as part of the composite model. See https://debbiesmspowerbiazureblog.home.blog/2019/09/10/power-bi-composite-models/

One of the great new features is you can now hover over a table and see the Storage mode and last refresh date.

All the tables in the current report are set as Import. The data is imported into power BI memory and we have full functionality using the Vertipaq engine.

Dual mode can be Import OR Direct Query, BUT the functionality for both is at Direct Query level. You will lose some functionality if you move to Dual mode; for example, time-period DAX can't be used in this mode.

Once you change to Dual mode, it's irreversible. Best to check on a test copy before you go for it.

Why Use Dual Storage?

Firstly, the new Composite storage mode allows you to set different types of storage per table. You don’t have to have the entire model set as Import or Direct Query.

With Dual storage set, Power BI chooses which way to go, although you lose the full functionality of Import mode if you are set to Dual storage.

Dual tables are only ever on the one side of a one-to-many relationship.

For example, if you are taking advantage of aggregations and you run a query against the higher-level aggregated fact table (Import) with Date (Dual) and Product (Dual), then Date and Product behave as Import tables, so the join is done in memory by the VertiPaq engine.

If you then run a query that can't use the aggregated table, so the main fact table at the lowest level of granularity is used, the Dual tables run as Direct Query and the joins between the fact and the dims are resolved at source (in SQL). This is much more efficient than trying to join data in the SQL database with data in the in-memory VertiPaq engine.

Personally, I'm still unsure of Direct Query and Dual mode because you lose so much functionality. The time intelligence functions are always my first go-to when I create any report, and any measures and calculated columns that are too complex for SQL aren't available; for me, these are the most powerful features.

Power BI composite Models

For this example, add a Table in Direct Query mode from SQL Server. Here I’m using my AdventureWorks DB from my local database

Add another file (any Excel file you have). Direct Query is not available for Excel. Before composite models were available, you couldn't add Excel if your other table was Direct Query; everything had to use the same storage mode.

Close and Apply

Now that composite models are available you can mix storage modes. However, you need to be aware that the Direct Query table will be queried at the source (SQL) rather than all the querying being done within Power BI.

You can hover over the tables to see what table storage is set for each table

Note that we can see the data for Excel which can only be Imported data

However, we can't see the data in the data pane for the Direct Query fact table.

We can go to the model view and connect the tables together

To show this I'm going to add DimDate from the SQL database as an Imported table (remember the fact is Direct Query).

A composite model is one where tables have different storage modes. There are now three storage modes:

  • Import – the best way to use Power BI, because everything is cached in memory
  • Direct Query – as above, Power BI will query your data source, and your measures and calculated columns will be created within SQL if you are connecting to a SQL database
  • Dual – can act as both Import and Direct Query. We will look at this in more detail later

Live Connection is only available using Analysis Services as a data source; in this case you can only use SSAS and no other data source.

Power BI grouping fields within a table

I have a fact table that now contains many measures and calculated columns; so many, in fact, that it's quite difficult to scroll through.

There is now a way to group all your fields within a table to make it easier to use.

Let's look at the first group. I have lots of measures relating to Rolling Quarter; these can all go in a folder together.

Go to model view where you can find Properties for the fields

Click on one of the fields in your table that you want to group and set the Display folder name

You can then go through each field and set the display folder to group the measures

To speed up the process, hold down Ctrl, select multiple fields and set the display folder for all the selected fields.

This makes it much easier to look after your measures.

Power BI Connecting to a Sharepoint Document Library

I have some folders in Sharepoint and I want to be able to connect to these in Power BI

The folders are in

Get Data

There is no specific connector to Sharepoint Document Library at present

Choose SharePoint Folder and add in the SharePoint location. The root URL is required here.

You may then need to authenticate your Sharepoint connection

When you get to the next screen, notice that the content shown is files, and we don't want this. Click Edit.

Simply Remove Files and change it to Contents

Click the tick and your contents will be refreshed to show Files

In this case click on Table against the Uploads folder which takes the user down to the next level.

Here are the 3 files that I want to append with all the data
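
For reference, the equivalent step in Power Query (M) looks roughly like this (the site URL and folder name are placeholders, and your document library may sit one level deeper):

let
    Source = SharePoint.Contents("https://yourtenant.sharepoint.com/sites/YourSite"),
    Uploads = Source{[Name = "Uploads"]}[Content]
in
    Uploads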

Hopefully Power BI will introduce a better way of adding in Sharepoint Document Library soon

Power BI Forecasting

In many situations there is a business need to look at forecasts of, for example sales.

The following table shows example data of measures created to use for a Sales Forecasting measure

Creating some Key Measures

Using DAX

We have a Sales Table that contains Quantity and a Products table that contains Price

The following is a calculated column to get the total sales from Quantity and Current Price for each row
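
The column itself isn't reproduced here, but it will be something along these lines (a sketch; the table and column names are assumptions, and it relies on a relationship from Sales to Products):

Total Sales = Sales[Quantity] * RELATED ( Products[Price] )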

Add a filter to your report page for the year in order to see the above table

To create the data for the table above, create a table and drag Date (Date table joined to sales)

Next create a new measure

This becomes column 2 of the above table. At each date it looks at the Totals Sales at that point last year
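
Again as a sketch (names assumed, and it relies on a date table joined to Sales), the measure could be written like this:

Sales LY =
CALCULATE (
    SUM ( Sales[Total Sales] ),
    SAMEPERIODLASTYEAR ( 'Date'[Date] )
)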

Create your 2 years ago measure

Next is the 3 years ago measure

These become the next columns in the table above
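
The two and three years ago measures follow the same pattern, for example (again with assumed names):

Sales 2 Years Ago =
CALCULATE (
    SUM ( Sales[Total Sales] ),
    DATEADD ( 'Date'[Date], -2, YEAR )
)
-- the 3 years ago measure is identical with -3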

Creating the Sales Forecast using variables

Or, instead of creating three different measures to use for forecasting, we can use variables (the three measures are basically set inside the forecast measure instead of creating all four measures).

We can improve on this. We want our forecast to show a 5% growth rate
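
The exact formula isn't shown above, so the following is only my sketch of the shape it could take: averaging the previous three years and applying the 5% growth rate (all names assumed):

Sales Forecast =
VAR SalesLY = CALCULATE ( SUM ( Sales[Total Sales] ), DATEADD ( 'Date'[Date], -1, YEAR ) )
VAR Sales2YearsAgo = CALCULATE ( SUM ( Sales[Total Sales] ), DATEADD ( 'Date'[Date], -2, YEAR ) )
VAR Sales3YearsAgo = CALCULATE ( SUM ( Sales[Total Sales] ), DATEADD ( 'Date'[Date], -3, YEAR ) )
RETURN
    DIVIDE ( SalesLY + Sales2YearsAgo + Sales3YearsAgo, 3 ) * 1.05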

If you look at the sales forecast in a bar chart by day it would show you how much money you need to earn every day over the year based on the previous years data

Showing the cumulative Sales Forecast
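
A cumulative version of the forecast can then be built on top of it, along these lines (again a sketch with assumed names):

Cumulative Sales Forecast =
CALCULATE (
    [Sales Forecast],
    FILTER (
        ALLSELECTED ( 'Date'[Date] ),
        'Date'[Date] <= MAX ( 'Date'[Date] )
    )
)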

Forecasts can now be derived by anything in your model. Product name etc

Another forecasting example using Power BI Analytics

Now if you go into the analytics pane

Scroll all the way to the bottom you will see forecasting. The following are the forecasting defaults

  • Add: Forecast 1
  • Forecast Length: 10 points (10 days into the future because we are using day)
  • Ignore Last: 0 points
  • Confidence: interval 95%
  • Seasonality: Auto Points

Change Forecast Length to 12 months (Into the future)

Seasonality: Change to 365 points because there are 365 days in the year and this now takes you to a full forecasting year

This is a great way of playing with the forecasting settings to see the effects of any change in the defaults.

Currently you can only forecast on a line chart. Hopefully there will be lots more Forecasting updates in the future.

Azure Fundamentals training (Quick Notes)

I’ve been spending some time revising for the Azure Fundamentals course. Here is a quick list of some of the  more problematic test questions I’ve come across

Azure Advisor

  • Detects threats and vulnerabilities
  • Ensures Fault Tolerance
  • Helps reduce spending
  • Protects data from accidental deletions
  • Speeds up your apps

Application Gateway

Multiple instances of a web application are created across three availability zones. The company also configures a networking product to evenly distribute service requests based on 3 different URLs.

Application insights

  • Feature of Azure monitor
  • Visually analyse telemetry data

ATP (Azure threat Protection)

  • Pass the ticket – Attacker stealing KERBEROS data
  • Pass the hash – Attacker stealing NTLM data
  • Suspected Brute Force attack – Multiple attempts to guess a user's password

Compliance

http://servicetrust.microsoft.com – Compliance manager URL

  • Audit Reports – Service is within the trust Portal to determine Azure Compliance with GDPR
  • Compliance manager – Determines whether or not your services meet industry standards
  • GDPR – Standards enforced by a government Agency
  • Germany – Country with a dedicated trustee for customer data.
    • Physically isolated instance of Azure
  • Azure government – Only available in the US
  • ISO – International standards based on a non-regulatory agency
  • NIST – Standards based on a non-regulatory agency, based in the United States
    • National Institute of Standards and Technology

Cloud Shell, CLI and Powershell

Azure CLI

  • Az login
  • Cross platform command based line tool

Azure Cloud Shell

  • New-AzureRmVM
  • Web based tool after you log onto the Azure portal

Azure Powershell

  • Connect -AzureRMAccount
  • Use when you need to log into Azure without opening a web browser

Azure Governance

  • Locks – Prevent users from deleting resources
  • Advisor – Uses information from the Security Center to recommend best practices
  • Initiatives – Define a set of policies

Cloud Computing terms

  • Fault Tolerance – Power Outage in a data center. Automatic Failover for continual operation
  • High Availability – Having data available when you need it

Fault tolerance and High Availability are both good for the scenario when you are moving on premise data centers to the cloud. The data is mission critical, there is a need for access to the data sources at all times. Changes are incremental and easy to predict.

  • Elasticity – Sudden spikes in traffic
  • Scalable – Increase the Number of VMs easily

Azure Locks

  • Multiple Locks applied to different scopes. The most restrictive lock is applied
  • The lock applies to all resources contained in a scope and any new resources added to the scope

Networking

  • NSG – Network Security Group. Inbound traffic for a virtual machine from specified IP addresses
  • DDoS – Distributed Denial of Service protection. Prevents a flood of HTTP traffic to a VM that hosts IIS
  • Firewall – Create a rule that restricts network traffic

RBAC

Limit Access to Resources at the resource groups and resource Scope

Service Health

  • Notifies if App service usage exceeds the usage quota
  • Respond to planned Service outages
  • Implement a web hook to display health incidents