Visually presenting Windows Virtual Desktop monitoring information in Power BI

A question I get asked frequently is "how do I visualise Windows Virtual Desktop logging and monitoring information?"

A new PaaS role in Windows Virtual Desktop is the Diagnostics service, from which you can get a whole host of - well, diagnostics. As part of the Spring Release of Windows Virtual Desktop we have now made it easier to connect every Windows Virtual Desktop ARM object to one of three services for the storage of those diagnostics - more on that later. One of those three services, Log Analytics, can be easily integrated with another service called Power BI to visualise a whole host of data sources. The official Windows Virtual Desktop documentation on this is here.

This article is broken down into three sections.

Section 1 - how to create a Log Analytics Workspace.
Section 2 - how to connect the Windows Virtual Desktop ARM objects to Log Analytics. This also has a growing collection of Kusto sample queries to help you out.
Section 3 - how to integrate Log Analytics with Power BI to visualise that data.

This article assumes you know about Windows Virtual Desktop already, and is written in the context of the Spring 2020 release.

Section 1 - Create a Log Analytics Workspace

1. In the Azure portal create a new service and search for "Log Analytics"

Click on Create

2. Give your Log Analytics Workspace a name and Resource Group.

3. Choose a pricing tier

4. Click on Next: Tags

5. Click on Review and Create

6. Click on Create.
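If you prefer scripting this, the same workspace can be created with the Azure CLI. This is a sketch only - the resource group name, workspace name and location below are placeholders you should replace with your own:

```shell
# Create a resource group to hold the workspace (names are placeholders)
az group create --name rg-wvd-monitoring --location westeurope

# Create the Log Analytics workspace in that resource group
az monitor log-analytics workspace create \
  --resource-group rg-wvd-monitoring \
  --workspace-name law-wvd-diagnostics \
  --location westeurope
```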

Section 2 - Connect Windows Virtual Desktop Objects to Log Analytics

1. Go to the Windows Virtual Desktop Hub in the Azure portal and select Workspaces

2. Select the Workspace you want to log. Click on Diagnostics Settings in the Monitoring section, and on the left click on + Add diagnostics setting

3. Give your connection a name, select the items that you would like to log, then on the right select "Send to Log Analytics" and choose your Subscription and your Log Analytics Workspace:

Click on Save at the top.

4. Now repeat this for all Windows Virtual Desktop objects, such as Host pools and Application Groups, that you also want to log.
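The portal steps above can also be scripted with the Azure CLI. A sketch, assuming placeholder subscription, resource group, host pool and workspace names - the log categories shown are examples and should match the items you ticked in the portal:

```shell
# Attach a diagnostic setting to a host pool, sending logs to Log Analytics.
# All IDs below are placeholders - substitute your own.
az monitor diagnostic-settings create \
  --name wvd-to-log-analytics \
  --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.DesktopVirtualization/hostPools/<hostpool>" \
  --workspace "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>" \
  --logs '[{"category": "Checkpoint", "enabled": true},
           {"category": "Error", "enabled": true},
           {"category": "Connection", "enabled": true}]'
```

Repeat this for each Windows Virtual Desktop object (Workspaces, Host pools, Application Groups) you want to log.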


Once data has been sent to Log Analytics we can go and take a look.

5. In the Azure portal go to your Log Analytics Workspace. In the General section go to Logs

You may need to wait some time for these events to start appearing in Log Analytics

In the new query section you will see all of the tables with data. The Windows Virtual Desktop specific tables are at the very bottom, starting with WVD....


6. On the right is where you enter your queries against this data. To do this you need to run a Kusto query. This article won't cover how to do this, as it is a broad subject. Kusto query syntax is available here.

But to get started below are some sample queries you can run. 

You enter your query, select your timeframe and click on the Run button, which will return the data beneath. The results of these queries will look something like this:



Sample Kusto Queries

This interim section has a collection of Kusto queries you can modify and run in your workspace. It is quite a long list of queries that you might find useful. If you want, you can skip down to Section 3 on Power BI to see how you can visualise this data within Power BI reports.

Get all connections for a single user (replace userupn with your user)
WVDConnections
| where UserName == "userupn"
| take 100
| sort by TimeGenerated asc, CorrelationId
Get the number of connections for a user by day
WVDConnections
| where UserName == "userupn"
| take 100
| sort by TimeGenerated asc, CorrelationId
| summarize dcount(CorrelationId) by bin(TimeGenerated, 1d)

Get the session duration by user
let Events = WVDConnections | where UserName == "userupn" ;
Events
| where State == "Connected"
| project CorrelationId , UserName, ResourceAlias , StartTime=TimeGenerated
| join (Events
| where State == "Completed"
| project EndTime=TimeGenerated, CorrelationId)
on CorrelationId
| project Duration = EndTime - StartTime, ResourceAlias
| sort by Duration asc
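The join logic in this query - matching each Connected event to its Completed event by CorrelationId and subtracting the timestamps - can be sketched in Python with some made-up events:

```python
from datetime import datetime

# Hypothetical connection events: (CorrelationId, State, TimeGenerated)
events = [
    ("c1", "Connected", datetime(2020, 6, 1, 9, 0)),
    ("c1", "Completed", datetime(2020, 6, 1, 10, 30)),
    ("c2", "Connected", datetime(2020, 6, 1, 11, 0)),
    ("c2", "Completed", datetime(2020, 6, 1, 11, 45)),
]

# Emulate the join: pair the Connected and Completed rows that share
# a CorrelationId, then subtract the timestamps to get the duration
connected = {cid: t for cid, state, t in events if state == "Connected"}
completed = {cid: t for cid, state, t in events if state == "Completed"}
durations = {cid: completed[cid] - connected[cid]
             for cid in connected if cid in completed}

for cid, duration in sorted(durations.items(), key=lambda kv: kv[1]):
    print(cid, duration)
```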

Get errors for a specific user
WVDErrors
| where UserName == "tom@tomhicklingcom"
| take 100

Get occurrences of a specific error
WVDErrors
| where CodeSymbolic =="ErrorSymbolicCode"
| summarize count(UserName) by CodeSymbolic

Get error occurrence across all users
WVDErrors
| where ServiceError =="true"
| summarize usercount = count(UserName) by CodeSymbolic
| sort by usercount desc
| render barchart

Get a list of connections made by your users
WVDConnections
| project-away TenantId,SourceSystem
| summarize arg_max(TimeGenerated, *), StartTime = min(iff(State== 'Started', TimeGenerated , datetime(null) )), ConnectTime = min(iff(State== 'Connected', TimeGenerated , datetime(null) )) by CorrelationId
| join kind=leftouter (
WVDErrors
| summarize Errors=makelist(pack('Code', Code, 'CodeSymbolic', CodeSymbolic, 'Time', TimeGenerated, 'Message', Message, 'ServiceError', ServiceError, 'Source', Source)) by CorrelationId
) on CorrelationId
| join kind=leftouter (
WVDCheckpoints
| summarize Checkpoints=makelist(pack('Time', TimeGenerated, 'Name', Name, 'Parameters', Parameters, 'Source', Source)) by CorrelationId
| mv-apply Checkpoints on
(
order by todatetime(Checkpoints['Time']) asc
| summarize Checkpoints=makelist(Checkpoints)
)
) on CorrelationId
| project-away CorrelationId1, CorrelationId2
| order by TimeGenerated desc

View feed activity of your users
WVDFeeds
| project-away TenantId,SourceSystem
| join kind=leftouter (
WVDErrors
| summarize Errors=makelist(pack('Code', Code, 'CodeSymbolic', CodeSymbolic,
'Time', TimeGenerated, 'Message', Message, 'ServiceError', ServiceError, 'Source', Source)) by CorrelationId
) on CorrelationId
| join kind=leftouter (
WVDCheckpoints
| summarize Checkpoints=makelist(pack('Time', TimeGenerated, 'Name', Name, 'Parameters', Parameters, 'Source', Source)) by CorrelationId
| mv-apply Checkpoints on
(
order by todatetime(Checkpoints['Time']) asc
| summarize Checkpoints=makelist(Checkpoints)
)
) on CorrelationId
| project-away CorrelationId1, CorrelationId2
| order by TimeGenerated desc

Count of brokered sessions by state
WVDConnections
| where State =~ "Started" and Type =~"WVDConnections"
| extend CState=iff(SessionHostOSVersion=="<>","Failure","Success")
| summarize Count=count() by State=CState
| render piechart

Failed connections
WVDConnections
| where State =~ "Started" and Type =~"WVDConnections"
| extend Multi=split(_ResourceId, "/")
| extend CState=iff(SessionHostOSVersion=="<>","Failure","Success")
| where CState=~"Failure"
| project TimeStamp=TimeGenerated, UserName, ResourceGroup=Multi[4],
HostPool=Multi[8],ResourceAlias, SessionHost=SessionHostName, ClientOS=ClientOS,
ClientWvdVersion=ClientVersion, CorrelationId
| order by TimeStamp desc
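The _ResourceId parsing used here (and in several later queries) relies on fixed positions in the ARM resource ID path. A quick Python sketch with a made-up resource ID shows why indexes 4 and 8 give the resource group and host pool:

```python
# Hypothetical Azure resource ID as it appears in the _ResourceId column
resource_id = ("/subscriptions/0000-0000/resourcegroups/rg-wvd"
               "/providers/microsoft.desktopvirtualization/hostpools/hp-prod")

# split(_ResourceId, "/") yields an array; index 0 is empty because the
# ID starts with "/", so index 4 is the resource group and index 8 the
# host pool name
multi = resource_id.split("/")
print(multi[4])  # rg-wvd
print(multi[8])  # hp-prod
```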

Details of failed connections
WVDConnections
| where State =~ "Started" and Type =~"WVDConnections"
| extend Multi=split(_ResourceId, "/")
| extend CState=iff(SessionHostOSVersion=="<>","Failure","Success")
| where CState =~"Failure"
| order by TimeGenerated desc
| where State =~ "Started" | extend Multi=split(_ResourceId, "/")
| project ResourceAlias, ResourceGroup=Multi[4], HostPool=Multi[8],
SessionHostName, UserName, CState=iff(SessionHostOSVersion=="<>","Failure","Success"),
CorrelationId, TimeGenerated
| join kind= leftouter (
WVDErrors
) on CorrelationId
| extend DurationFromLogon=datetime_diff("Second",TimeGenerated1,TimeGenerated)
| project TimeStamp=TimeGenerated, DurationFromLogon, UserName, ResourceAlias,
SessionHost=SessionHostName, Source, CodeSymbolic, ErrorMessage=Message, ErrorCode=Code,
ErrorSource=Source ,ServiceError, CorrelationId
| order by TimeStamp desc

Session duration by host pool
WVDConnections
| where Type =~"WVDConnections" and State =~ "Started"
| extend Multi=split(_ResourceId, "/")
| project ResourceAlias, HostPool=toupper(tostring(Multi[8])), SessionHostName, UserName,
CState=iff(SessionHostOSVersion=="<>","Failure","Success"), CorrelationId,
TimeGenerated, ResourceGroup=Multi[4],
DesktopGroup_s=toupper(strcat(tostring(Multi[4]), ".", tostring(Multi[8])))
| join kind= leftouter (
WVDCheckpoints
) on CorrelationId
| extend DurationFromLogon=datetime_diff("Second",TimeGenerated1,TimeGenerated)
| where Name=~"RdpStackLogon"
| project UserName, ResourceGroup, DesktopGroup_s,SessionHost=SessionHostName,
TimeStamp=TimeGenerated1, DurationFromLogon
| summarize DurationInSeconds=avg(DurationFromLogon) by HostPool=DesktopGroup_s
| render columnchart kind=unstacked

Session duration by user
WVDConnections
| where Type =~"WVDConnections" and State =~ "Started"
| extend Multi=split(_ResourceId, "/")
| project ResourceAlias, HostPool=toupper(tostring(Multi[8])), SessionHostName, UserName,
CState=iff(SessionHostOSVersion=="<>","Failure","Success"), CorrelationId,
TimeGenerated, ResourceGroup=Multi[4],
DesktopGroup_s=toupper(strcat(tostring(Multi[4]), ".", tostring(Multi[8])))
| join kind= leftouter (
WVDCheckpoints
) on CorrelationId
| extend DurationFromLogon=datetime_diff("Second",TimeGenerated1,TimeGenerated)
| where Name=~"RdpStackLogon"
| project UserName, ResourceGroup, DesktopGroup_s,SessionHost=SessionHostName,
TimeStamp=TimeGenerated1, DurationFromLogon
| summarize DurationInSeconds=avg(DurationFromLogon) by UserName
| order by DurationInSeconds desc

Admin activities by time
WVDManagement
| summarize Count=count() by bin(TimeGenerated, 15m)
| render scatterchart

Get average session logon time by hostpool
WVDConnections
| where TimeGenerated > ago(24h)
| where State == "Started"
| project CorrelationId , UserName, ConnectionType,
StartTime=TimeGenerated, _ResourceId
| join (WVDConnections
| where State == "Connected"
| project EndTime=TimeGenerated, CorrelationId)
on CorrelationId
| project Duration = EndTime - StartTime, _ResourceId
| summarize AvgDuration=avg(Duration) by _ResourceId
| extend Multi=split(_ResourceId, "/")
| project AvgDuration, ResourceGroup=Multi[4], HostPool=Multi[8]
| sort by AvgDuration desc

Get total session connection for all users that have been active in the last 7 days
WVDConnections
| where TimeGenerated > ago(7d)
| where State == "Connected"
| project CorrelationId , UserName, ConnectionType , StartTime=TimeGenerated
| join (WVDConnections
| where State == "Completed"
| project EndTime=TimeGenerated, CorrelationId)
on CorrelationId
| project SessionDuration = EndTime - StartTime, ConnectionType, UserName
| summarize Duration = sum(SessionDuration) by UserName
| sort by Duration desc

Just for reference the difference between Completed, Started and Connected in many of these outputs is as follows:

Started: User clicked the icon in the app to initiate the remote connection.

Connected: User has successfully established a connection to the VM.

Completed: User/Server disconnected the session.


Get the amount of data ingested into Log Analytics over a time period per day
Usage
| where TimeGenerated > startofday(ago(31d))
| where IsBillable == true
| where Solution == "LogManagement"
| where DataType == "WVDFeeds" or DataType == "WVDAgentHealthStatus"
or DataType == "WVDCheckpoints" or DataType == "WVDConnections"
or DataType == "WVDErrors"
| summarize TotalVolumeGB = sum(Quantity) / 1000 by bin(TimeGenerated, 1d), Solution
| render barchart
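The arithmetic here is simple: the Quantity column in the Usage table is measured in MB, so dividing the daily sum by 1000 gives GB. A Python sketch with made-up Usage rows:

```python
from collections import defaultdict
from datetime import date

# Hypothetical Usage rows: (day, DataType, Quantity in MB)
usage = [
    (date(2020, 6, 1), "WVDConnections", 120.0),
    (date(2020, 6, 1), "WVDErrors", 30.0),
    (date(2020, 6, 2), "WVDConnections", 250.0),
]

wvd_tables = {"WVDFeeds", "WVDAgentHealthStatus", "WVDCheckpoints",
              "WVDConnections", "WVDErrors"}

# sum(Quantity) / 1000 per day: MB summed per day, converted to GB
volume_gb = defaultdict(float)
for day, data_type, quantity_mb in usage:
    if data_type in wvd_tables:
        volume_gb[day] += quantity_mb / 1000

print(dict(volume_gb))  # daily ingestion volume in GB
```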


Get the amount of data a metric produces
let daystoSearch = 31d;
let tables = search *
| where TimeGenerated >= ago(daystoSearch)
| summarize  RecordCount=count() by $table
| project leftName=$table, RecordCount, Clause=1 ;
Usage
| where TimeGenerated >= ago(daystoSearch)
| where IsBillable == true
| where DataType !=""
| summarize TotalSize=sum(Quantity) by DataType
| project rightName=DataType, TotalSize, Clause=1
| join kind=leftouter (tables) on Clause
| where leftName == rightName
| project TableName=leftName , MBperRecord=round(TotalSize / RecordCount,6),
RecordCount, TotalGB = round(TotalSize/1024, 6)
| sort by MBperRecord desc
| where TableName == "Perf" or TableName == "WVDManagement" or
TableName == "WVDCheckpoints" or TableName  == "WVDFeeds" or
TableName == "WVDConnections" or TableName == "WVDAgentHealthStatus"
or TableName == "Event"


Get which clients are running a specific version of the client
WVDFeeds
| where TimeGenerated > ago(24h) and ClientVersion == "1.2.2061.0"
| sort by UserName desc, TimeGenerated asc


Get the regional Gateway a user has connected through
WVDConnections
| project TimeGenerated, UserName, GatewayRegion, ClientVersion,
SessionHostName, SessionHostSxSStackVersion, SessionHostAgentVersion


Correlate all connection events for one user
WVDConnections
| where State == "Started"
| join ( WVDConnections | where State == "Connected" ) on CorrelationId, UserName
| join ( WVDConnections | where State == "Completed" ) on CorrelationId, UserName
| extend Duration = datetime_diff('minute', TimeGenerated2, TimeGenerated)
| project StartTime = TimeGenerated, EndTime = TimeGenerated2, UserName,
ConnectionType, ClientOS, ClientVersion, ClientSideIPAddress, SessionHostName

Get end user diagnostic details for this session connection
An end user needs to open the Experience Monitor in the Windows AVD client (MSRDC) and then click on the Send Diagnostics and Close button.

WVDCheckpoints
| where Name == "ExperienceMonitor::User clicked on the Send Diagnostics button"
| where UserName contains "tom"

Get the logged in duration for user connections
WVDConnections 
| project CorrelationId,State,TimeGenerated, UserName
| evaluate pivot(State,min(TimeGenerated),CorrelationId,UserName)
| project UserName, Duration=Completed-Connected
| summarize DurationInHours=sum(Duration)/1h by UserName
| order by UserName
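The pivot trick here turns one row per state into one row per connection, with a column per state holding the earliest timestamp. A Python sketch of the same computation, using invented sample rows:

```python
from datetime import datetime

# Hypothetical rows: (CorrelationId, State, TimeGenerated, UserName)
rows = [
    ("c1", "Connected", datetime(2020, 6, 1, 9, 0), "anna"),
    ("c1", "Completed", datetime(2020, 6, 1, 11, 0), "anna"),
    ("c2", "Connected", datetime(2020, 6, 1, 13, 0), "anna"),
    ("c2", "Completed", datetime(2020, 6, 1, 14, 30), "anna"),
]

# Emulate pivot(State, min(TimeGenerated), CorrelationId, UserName):
# one entry per CorrelationId, keyed by state with the earliest timestamp
pivot = {}
for cid, state, t, user in rows:
    entry = pivot.setdefault(cid, {"UserName": user})
    entry[state] = min(t, entry.get(state, t))

# Duration = Completed - Connected, summed per user, expressed in hours
hours_by_user = {}
for entry in pivot.values():
    duration = entry["Completed"] - entry["Connected"]
    user = entry["UserName"]
    hours_by_user[user] = hours_by_user.get(user, 0.0) + duration.total_seconds() / 3600

print(hours_by_user)  # total logged-in hours per user
```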




Section 3 - Integrate Log Analytics with Power BI

As you will have seen, there is a lot of data and a powerful query language to retrieve it. However, most of these queries just return rows of data, which is not very visual. This is where Power BI comes in.

Power BI allows you to visualise data and is useful for presenting the data returned by these queries. In addition, Power BI reports can be scheduled, so you can refresh them to get the latest data rather than running point-in-time queries.

For this section you will need access to Power BI. You can download the Power BI Desktop application from the Microsoft Store here.

To get this data into Power BI we first need to export it from Log Analytics in the correct format.

1. Go back to Log Analytics and run a query that is relevant to you. When it has completed, click on the Export button and select "Export to Power BI (M query)".

That will download a text file. You might want to rename this file to represent the query.

2. Now open Power BI. Click on Get Data and at the bottom select Blank Query

3. This will open a New Query window. Click on "Advanced Editor"

4. In the Query1 window, paste all the text, as is, from the text file(s) you just downloaded over the existing text.

Click on Done. You may be asked to authenticate to Azure at this point, if you haven't already done this.

5. This will then run this query. On the right hand side give this query a name


You can optionally add any number of queries here, by right-clicking in the Queries section and selecting New Query.


6. Once you are happy with your queries, at the top left click on Close & Apply. This will add the new query's dataset to the Power BI report.


Now we need to publish this query into Power BI so that others can consume it as a report. By default, this will publish a Dataset and a blank Report. So before publishing, you should modify this default Report to present the data using the visualisations you want, so that it is ready to go when viewed through Power BI.
Again, this is a broad subject and you can spend a lot of time configuring your reports with a large number of data visualisations. More information on how to do this and the options is available here.

7. In Power BI you will now see a blank report, but on the right-hand side in the "Fields" section will be the data retrieved from the query. Just to the left of this is the "Visualizations" section.
In order to create a visual report, you need to select your data and choose an appropriate Visualization.


As a simple example, most people will be interested in errors on the platform. So I have loaded both the "Get error occurrence across all users" and "View feed activity of your users" queries into this report.
I have then added three visualisations.
 
The Fields section above shows the data that I have selected, and the Report section displays the specifics of that data that I need.

To start, I have added both sets of data from "Get error occurrence across all users":


This has created a simple table showing me the number of errors:

Then, to get more detail about where those errors were occurring, I have added a second visualization using the Error Type, UserName, ClientOS, ClientType and ClientVersion - all the types of information an admin may need when trying to troubleshoot a connection issue.


To produce this table:


And finally, I have added a graph of the types of error being reported

 
 


The world is your oyster here in terms of the data and visualizations that you can add. This is just a very simple introduction to what is possible to get you started.

Now to publish this report into Power BI so that it can be consumed by others using the Power BI web component.

8. Click on the Publish button (top right) - You will be prompted to save your report as a pbix file.



9. Select the Power BI Workspace you want this published to. Select "My Workspace" unless you want it in another workspace


Click on Select.
This will now publish this to the workspace.


Click on the first link Open 'name of your pbix file' in Power BI.
This will Open Power BI in your Browser.

Now you need to configure this dataset so that it is regularly refreshed and always current.

10. Select the Workspace to which you published this report, then click on More options for the Dataset in question (not the Report), and then click on Settings.

11. Click on Data Source credentials and Edit credentials and Sign in. Then sign in with OAuth2 using your Azure credentials.


12. To set a schedule to update this dataset, click on Scheduled Refresh and switch "Keep your data up to date" on; then you can set up your update schedule, e.g. Daily. You can also add specific times for refreshes within your refresh frequency.




So now in Power BI you have a visual report of Windows Virtual Desktop errors that will be automatically updated, which you can use as a support tool to manage your Windows Virtual Desktop estate.
 
Power BI has so much capability that you should investigate it further to derive more value for your environment. To see the massive choice of visualisations available, take a look at this Visual Vocabulary guide.
