Babbling About Azure B-Series Burstable VMs

The Azure B-Series burstable VMs have been around for quite some time now, but up until now, I hadn’t had a reason to use them.

For those of you who are unfamiliar with them, the B-Series VMs are a family of Azure virtual machines that provide burstable compute power. They’re based on the Intel Haswell or Broadwell E5-2673 CPUs, which have a base clock speed of 2.3-2.4GHz and can “turbo” up to 3.2GHz.

The idea is that servers that sit idle a lot of the time, but are occasionally really, really busy, can use more compute power than expected when they need it (sounds like me at work).

However, funnily enough, the pricing is also burstable, in that it can spike beyond what you expect if you’re not watching it carefully. Unsurprisingly, the pricing can be quite complex to understand.

Each VM size has a “baseline” CPU amount, from 10% to 135%. Your VM will earn CPU credits any time it is under the baseline CPU threshold, and spend CPU credits any time it is over (up to a maximum).

Preview pricing – no longer accurate!
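
If you want to keep an eye on whether a VM is banking or burning credits, the B-Series expose credit metrics through Azure Monitor. Here’s a rough sketch, assuming the Az PowerShell modules and the metric name I’d expect (“CPU Credits Remaining”) – the resource group and VM names are placeholders:

# Hedged sketch: pull the banked burst credits for a B-series VM from Azure Monitor.
# Assumes the Az modules; resource group and VM names below are placeholders.
$vm = Get-AzVM -ResourceGroupName "rg-prod" -Name "app-server-01"
Get-AzMetric -ResourceId $vm.Id -MetricName "CPU Credits Remaining" `
    -StartTime (Get-Date).AddDays(-1) -EndTime (Get-Date) `
    -TimeGrain 01:00:00 -AggregationType Average |
    Select-Object -ExpandProperty Data |
    Select-Object TimeStamp, Average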

They all have the “s” moniker, meaning they support Premium SSDs, and there are options for “memory-optimized” VM sizes (those tagged with an “m”).

So recently, we had a customer that needed to move a server over to Azure quick smart, and we only had estimates from them on what the compute profile would be like.
We decided to start with a 2vCPU 8GB RAM D2v3 (a comparable customer running the same application was quietly working away with a single-core VM), but as Murphy would have it, we started seeing performance issues when full load hit the server.

Interestingly enough, we noticed that this was only intermittent, and so suggested that one of the B-series VMs would work. The added advantage was that we could double the core count and RAM allocation while keeping the cost of the server roughly the same.

Monthly cost D2s vs. B4ms

And the result? Fantastic. Now, the server can idle along most of the time, and then burst up to 400% CPU when needed – i.e. all four allocated vCPUs running flat out, rather than being throttled down to the baseline.

Before, and after.

Now, having run for a few more weeks, we’ve got enough data to see that even the B4ms is too large for this server, and we could consider downsizing this to the B2ms – a saving of around 50%!
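
If and when we do pull that trigger, the resize itself is only a few lines. A rough sketch with the Az module (resource group and VM names are placeholders, and the resize restarts the VM):

# Hedged sketch: downsize an existing VM to a B2ms using the Az module.
# The VM restarts as part of the resize (and may need a stop/deallocate first
# if the target size isn't available on the current cluster).
$vm = Get-AzVM -ResourceGroupName "rg-prod" -Name "app-server-01"
$vm.HardwareProfile.VmSize = "Standard_B2ms"
Update-AzVM -ResourceGroupName "rg-prod" -VM $vm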

Search for and Delete Emails in PowerShell

I am on the on-call roster for work. Most of the time, it’s just annoying servers alerting that they’ve reached a critical drive space level, some service failing that someone thought was important to monitor, and someone who has gotten themselves locked out.

However, I occasionally get an interesting call. The other night it was a panicked C-level who wanted to delete an email that had been sent to all of their users (from an external party). I put them through the normal obstacles – do you want this done right away (it was close to midnight already), do you want to pay the associated charges, are you authorised to do this, etc. etc. – but when it came down to it, they were adamant that this email needed to be deleted, and the sender blocked from emailing their staff in the future.

Well, screw logging into a few hundred mailboxes manually, searching for an email, deleting it, deleting it from Deleted Items, then deleting it from Recoverable Deleted Items… Surely I can script this, right?

Thankfully, there are a lot of scripts/posts out there that show you how to do bits and pieces of this. I just had to piece together the bits I needed and rewrite them for my particular use case.

Step 1 – Install the EWS Managed API
https://www.microsoft.com/en-us/download/details.aspx?id=42951

Step 2 – Ensure your ‘audit’ account has Impersonation Rights to the mailboxes in question – set up a new Management Role (https://msdn.microsoft.com/en-us/library/office/dn722376(v=exchg.150).aspx)

Step 3 – View script on GitHub
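
The full script is on GitHub, but the core pattern is worth sketching here. This is a minimal, hedged example of the EWS impersonate/search/delete flow – the DLL path assumes a default EWS Managed API 2.2 install, and the credentials, mailbox list, and subject line are all placeholders (the real script does far more error handling):

# Minimal sketch of the EWS impersonate/search/delete pattern (not the full script).
Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2013_SP1)
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials("audit@example.com", "Password123")
$service.Url = "https://outlook.office365.com/EWS/Exchange.asmx"

$mailboxes = Get-Content "C:\Scripts\mailboxes.txt"   # one SMTP address per line (placeholder)

foreach($mailbox in $mailboxes){
    # Impersonate the target mailbox (requires the impersonation rights from Step 2)
    $service.ImpersonatedUserId = New-Object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress, $mailbox)

    # Find the offending email by subject
    $filter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+ContainsSubstring([Microsoft.Exchange.WebServices.Data.ItemSchema]::Subject, "Offending subject line")
    $view = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)
    $results = $service.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox, $filter, $view)

    foreach($item in $results.Items){
        # HardDelete bypasses Deleted Items; SoftDelete/MoveToDeletedItems are the gentler options
        $item.Delete([Microsoft.Exchange.WebServices.Data.DeleteMode]::HardDelete)
    }
}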

Further Reading and credit:

Virtually none of this code is original, all stolen from other better authors.

StackOverflow (I’m really sorry I can’t find the original Q&A to give full credit!)

https://blogs.technet.microsoft.com/circularlogging/2015/02/10/using-ews-impersonation-and-powershell-to-log-into-an-exchange-online-mailbox/

Different Search Filters
https://msdn.microsoft.com/en-us/library/office/dn579422(v=exchg.150).aspx

EWS Best Practice
https://blogs.msdn.microsoft.com/webdav_101/2015/05/11/best-practices-ews-authentication-and-access-issues/

Change Your Netscaler Login Page to Bing Wallpaper of the Day

If you have done a Bing search for “How to change my Netscaler Login page to the Bing daily Wallpaper”, you have come to the right place.

I first thought of this after I was woken up by our on-call team at 3:00a.m. one morning. While I was busily a) cursing their mothers and b) trying to resolve some now forgotten issue, I thought Hey, you know what would be cool? If the background picture on our Netscaler login page changed to match the Bing Wallpaper of the Day every day!

Disclaimer: I’m not sure if this abides by Bing’s Terms and Conditions – I advise you to seek your own legal advice before implementing anywhere except a lab.

This process involves three parts:

  • A bash script to copy the Bing Wallpaper of the Day down to your Netscaler
  • Some edits to our crontab & rc.netscaler files
  • Some edits to a CSS file

Part one – the cool part!

This part is a modified version of jadijadi’s bash script that does the same thing for Ubuntu Desktop: https://gist.github.com/jadijadi/56d90cc8c2956dd59119. I had to modify some bits, as the Netscaler didn’t have wget or a version of grep with the -P option.

P.S. don’t judge me on my bash scripting, it’s not my forte!

 

#!/bin/bash

# $bing is needed to form the fully qualified URL for
# the Bing pic of the day
bing="www.bing.com"

# $xmlURL is needed to get the xml data from which
# the relative URL for the Bing pic of the day is extracted
#
# The mkt parameter determines which Bing market you would like to
# obtain your images from.
# Valid values are: en-US, zh-CN, ja-JP, en-AU, en-UK, de-DE, en-NZ, en-CA.
#
# The idx parameter determines where to start from. 0 is the current day,
# 1 the previous day, etc.
xmlURL="http://www.bing.com/HPImageArchive.aspx?format=xml&idx=1&n=1&mkt=en-US"

# $saveDir is used to set the location where Bing pics of the day
# are stored
saveDir="/var/customisations/images/"

# Create saveDir if it does not already exist
mkdir -p $saveDir

# The file extension for the Bing pic
picExt=".jpg"

# Extract the relative URL of the Bing pic of the day from
# the XML data retrieved from xmlURL, form the fully qualified
# URL for the pic of the day, and store it in $picURL

# Form the URL for the default pic resolution
# Netscaler grep doesn't have -P, so we use perl instead
defaultPicURL=$bing$(echo $(curl -s $xmlURL) | perl -nle "print $& if m{<url>(.*)</url>}" | cut -d ">" -f 2 | cut -d "<" -f 1)

# Set picName to the desired picName
picName=bingWallpaper.jpg
# Download the Bing pic of the day at desired resolution
curl -s -o $saveDir$picName $defaultPicURL

echo Saved Bing Wallpaper to $saveDir$picName and Netscaler images directories
cp $saveDir$picName /var/netscaler/gui/vpn/images/
cp $saveDir$picName /netscaler/ns_gui/vpn/images/

  • Copy this file somewhere persistent on your Netscaler, e.g. /var/customisations/get-bingWallpaper.sh
  • Log into the Netscaler, then open the shell
  • Navigate to the location you saved the script: cd /var/customisations
  • Make the file executable: chmod u+x get-bingWallpaper.sh
  • Run the script: sh get-bingWallpaper.sh

Part Two – Make It Happen Daily

Now, we need to add the script into our crontab file. For those non-Linux types out there, the crontab file is like Windows Scheduled Tasks. We can add stuff into this file and make it run periodically.

However, the crontab file will get wiped on reboot of the Netscaler, so we need a way to make sure the crontab file always has our command to run get-bingWallpaper.sh – this is where the rc.netscaler file comes in.

Anything in the /nsconfig/rc.netscaler file will automatically get executed at boot time on the Netscaler.

Add the following to your rc.netscaler file


echo "30 5 * * * nsroot sh /var/customisations/get-bingwallpaper.sh" >> /etc/crontab
sh /var/customisations/get-bingwallpaper.sh

Crontab format is “minute hour mday month wday who command”, so our entry reads “On the 30th minute of the 5th hour every day of the month every month every day of the week, get nsroot to execute sh /var/customisations/get-bingwallpaper.sh”

Now, at 5:30a.m. every day, we should get a new Bing Wallpaper file downloaded!

Part Three – CSS

If you’re using the default theme, then your CSS file will be in /var/netscaler/logon/themes/Default/css/base.css

You want to add/edit the “background” section like so:


background: black url(/vpn/images/bingWallpaper.jpg) no-repeat center center fixed;
background-size: cover;

If you’re using a custom theme, then you’ll have a different CSS file (and hopefully you know where that is!)

 

I hope you enjoy the wonderful comments from all your users about the lovely pictures they get on their logon screen every day.

Clean Citrix UPM Profiles

UPDATE: This is now built into Profile Manager, and I’d recommend you use this instead of the below script.
https://docs.citrix.com/en-us/profile-management/current-release/configure/include-and-exclude-items/enable-logon-exclusion-check.html
Thanks Citrix!

I have a hate/love relationship with Citrix Profile Manager.

On the one hand, I HATE how much time I seem to spend on it, tweaking my UPM policies to troubleshoot slow logons, trying to figure out which parts of the Google Chrome User Data folder I need to make it work properly, trying to figure out what some obscure folder in AppData is for and what will potentially break if I exclude it.

On the other hand, I love the fancy features like Profile Streaming, Active Writeback, and mirroring of credentials (and browser sessions!) across different servers between logon/logoff. I also appreciate some handy tools that Citrix/the Citrix Community have produced for UPM, like the UPM Log Parser and UPMConfigCheck.

 

Now, I’m happy to be able to add to this with a useful script of my own!

One thing I’ve noticed with Profile Manager is that the profile store doesn’t get ‘cleaned’ when you change your Citrix Policies.

So, let’s say you’ve been including all of AppData\Local\Google\Chrome in your “Folders to synchronize” list (or you selected “Migration of existing profiles: Local and Roaming”), and you start getting complaints of slow logins because the AppData\Local\Google\Chrome\User Data\Default\Cache folder is growing large.

After some brief Googling and a few prayers, you add AppData\Local\Google\Chrome\User Data\Default\Cache to your list of excluded folders.

But lo and behold! Your logins are still slow? The AppData\Local\Google\Chrome\User Data\Default\Cache folder still exists in your users’ profile store? Whaaa…?
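
(If you want to see how bad it has got, a quick and dirty way to size up the offending cache folders across your profile store is something like this – the share path is a placeholder, and it assumes the usual <store>\<username>\UPM_PROFILE layout:)

# Rough sketch: report the Chrome cache size per user in the UPM profile store.
# \\fileserver\UPMStore is a placeholder for your actual store path.
Get-ChildItem "\\fileserver\UPMStore" -Directory | ForEach-Object {
    $cache = Join-Path $_.FullName "UPM_PROFILE\AppData\Local\Google\Chrome\User Data\Default\Cache"
    if(Test-Path $cache){
        $sizeMB = (Get-ChildItem $cache -Recurse -File -Force | Measure-Object Length -Sum).Sum / 1MB
        [PSCustomObject]@{ User = $_.Name; CacheMB = [math]::Round($sizeMB,1) }
    }
} | Sort-Object CacheMB -Descending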

Unfortunately, possibly for clever reasons, Profile Manager won’t retroactively clean up your profile store, nor do the exclusion lists etc. apply on logon (i.e., UPM just copies everything in your profile store down to the server).

This presents you with the difficult choice of either a) PowerShell’ing your way through each user profile and deleting the folders you no longer want, or b) resetting the profile.

 

Muralidhar Maram from Citrix wrote a handy little CLI tool that will do this for you, but with one caveat… it only works if you’re using Citrix GPOs to deploy your UPM settings – not Citrix Studio Policies. Muralidhar probably has a lot to do, so I’ve written what is hopefully a useful script to clean up a UPM profile.

Caveats:

  • It only works (has been tested) if you’re using Citrix Studio policies 😛
  • It will ignore the AppData\Roaming folder (i.e., just copy it in its entirety). If you don’t want it to do this, comment out line 100 (and test, because I haven’t)
    $savedFoldersList += “$($pathToUserStore)\AppData\Roaming”
  • It doesn’t delete the old UPM folder, but you could easily modify the script to.
  • It assumes your PathToUserStore setting is based on the #SAMAccountName# variable. If you’re using something different (e.g. #profilePath#), you’ll need to modify the top do-until loop.

How it works:

It looks in the registry for your Profile Manager policies (so you have to run this on a VDA), and copies any files/folders in your “Directories to synchronize”, “Folders to Mirror”, “Files to synchronize” list from your current UPM folder to a new UPM folder.

It then goes through your “Exclusion list – directories” and “Exclusion list – files” policies and deletes any files/folders in your new UPM folder that match these.

Then it renames your old UPM folder to UPM_PROFILE.upm_backup_<date>, renames your new UPM folder to UPM_PROFILE (so it’ll get used at next logon/logoff), and resets the permissions on it.

The script:

Ta da.

 


#########################################################################################
# Start up
# Run from a VDA
#########################################################################################

do{
	$user = Read-Host "Enter the SAMAccount name of the user you wish to clean up"
	#PathToUserStore
	$pathToUserStore = ((Get-ItemProperty "HKLM:\Software\Policies\Citrix\UserProfileManagerHDX").PathToUserStore -replace "#SAMAccountName#",$user) + "\UPM_PROFILE"
	if(!(Test-Path $pathToUserStore)){
		Write-Host "Can't find $($pathToUserStore), re-enter your username." -fore Red
	}
}until(Test-Path $pathToUserStore)

Write-Host "Scanning UPM folder..." -fore yellow
$savedFoldersList = @()
$savedFilesList = @()
$deleteFoldersList = @()
$excFoldersList = @()
$excFilesList = @()
#SyncFileList
foreach($file in (Get-ItemProperty "HKLM:\Software\Policies\Citrix\UserProfileManagerHDX\SyncFileList").SyncFileList){
    $file = $file `
        -replace "!ctx_localappdata!","AppData\Local" `
        -replace "!ctx_internetcache!","AppData\Local\Microsoft\Windows\Temporary Internet Files" `
        -replace "!ctx_localsettings!","AppData\Local" `
        -replace "!ctx_roamingappdata!","AppData\Roaming" `
        -replace "!ctx_startmenu!","AppData\Roaming\Microsoft\Windows\Start Menu"
        # Also need to check if the file has a wildcard in it
        if($file -match "\*" -and $file -match "\."){
                # Get the parent directory of the file
                $periodIndex = $file.LastIndexOf(".")
                $parentDir = $file.Substring( 0,$periodIndex ) -replace "\*"
                $fileExt  = $file.Substring( $periodIndex, ($file.Length - $periodIndex) )
                foreach($wildcardFile in (gci -Path "$($pathToUserStore)\$($parentDir)*" -Include "*$($fileExt)" -Force)){
                        $savedFilesList += $wildcardFile.FullName
                }
        }else{
            $savedFilesList += "$($pathToUserStore)\$($file)"
        }

}
#SyncDirList
foreach($folder in (Get-ItemProperty "HKLM:\Software\Policies\Citrix\UserProfileManagerHDX\SyncDirList").SyncDirList){
    $folder = $folder `
        -replace "!ctx_localappdata!","AppData\Local" `
        -replace "!ctx_internetcache!","AppData\Local\Microsoft\Windows\Temporary Internet Files" `
        -replace "!ctx_localsettings!","AppData\Local" `
        -replace "!ctx_roamingappdata!","AppData\Roaming" `
        -replace "!ctx_startmenu!","AppData\Roaming\Microsoft\Windows\Start Menu"
    $savedFoldersList += "$($pathToUserStore)\$($folder)"
}
#MirrorFoldersList
foreach($folder in (Get-ItemProperty "HKLM:\Software\Policies\Citrix\UserProfileManagerHDX\MirrorFoldersList").MirrorFoldersList){
    $folder = $folder `
        -replace "!ctx_localappdata!","AppData\Local" `
        -replace "!ctx_internetcache!","AppData\Local\Microsoft\Windows\Temporary Internet Files" `
        -replace "!ctx_localsettings!","AppData\Local" `
        -replace "!ctx_roamingappdata!","AppData\Roaming" `
        -replace "!ctx_startmenu!","AppData\Roaming\Microsoft\Windows\Start Menu"
    $savedFoldersList += "$($pathToUserStore)\$($folder)"
}
#SyncExclusionListDir
foreach($folder in (Get-ItemProperty "HKLM:\Software\Policies\Citrix\UserProfileManagerHDX\SyncExclusionListDir").SyncExclusionListDir){
    $folder = $folder `
        -replace "!ctx_localappdata!","AppData\Local" `
        -replace "!ctx_internetcache!","AppData\Local\Microsoft\Windows\Temporary Internet Files" `
        -replace "!ctx_localsettings!","AppData\Local" `
        -replace "!ctx_roamingappdata!","AppData\Roaming" `
        -replace "!ctx_startmenu!","AppData\Roaming\Microsoft\Windows\Start Menu"
    $excFoldersList += "$($pathToUserStore)\$($folder)"
}
#SyncExclusionListFiles
foreach($file in (Get-ItemProperty "HKLM:\Software\Policies\Citrix\UserProfileManagerHDX\SyncExclusionListFiles").SyncExclusionListFiles){
    $file = $file `
        -replace "!ctx_localappdata!","AppData\Local" `
        -replace "!ctx_internetcache!","AppData\Local\Microsoft\Windows\Temporary Internet Files" `
        -replace "!ctx_localsettings!","AppData\Local" `
        -replace "!ctx_roamingappdata!","AppData\Roaming" `
        -replace "!ctx_startmenu!","AppData\Roaming\Microsoft\Windows\Start Menu"
    # Also need to check if the file has a wildcard in it
    if($file -match "\*" -and $file -match "\."){
        # Get the parent directory of the file
        $periodIndex = $file.LastIndexOf(".")
        $parentDir = $file.Substring( 0,$periodIndex ) -replace "\*"
        $fileExt  = $file.Substring( $periodIndex, ($file.Length - $periodIndex) )
        $exFile = $null
        foreach($exFile in (gci -Path "$($pathToUserStore)\$($parentDir)*" -Include "*$($fileExt)" -Force).FullName){
            if($exFile){
                $excFilesList += $exFile
            }
        }
    }else{
        $excFilesList += "$($pathToUserStore)\$($file)"
    }
}

# Add in system folders/files
$savedFoldersList += "$($pathToUserStore)\Citrix"
$savedFoldersList += "$($pathToUserStore)\WINDOWS"
$savedFoldersList += "$($pathToUserStore)\AppData\Roaming"
foreach($file in (gci "$($pathToUserStore)\*" -Include *.dat,*.log*,*.blf,*.ini,*.pol,*.bin -File -Force)){
    $savedFilesList += $file.FullName
}

#########################################################################################
# How many files...?
#########################################################################################

$preFileCount = (Get-Item $pathToUserStore).GetFiles("*",[System.IO.SearchOption]::AllDirectories).Count

#########################################################################################
# Copy only the saved FOLDERS to a new location
#########################################################################################

Write-Host "Copying saved folders to new location"

$folder = $null
$pathToNewUserStore = $pathToUserStore -replace "UPM_PROFILE","UPM_PROFILE_NEW"
if(!(Test-Path $pathToNewUserStore)){
    mkdir $pathToNewUserStore | Out-Null
}
foreach($folder in $savedFoldersList){
    $destDir = ( $folder -replace [regex]::Escape($pathToUserStore),$pathToNewUserStore )
    robocopy $folder $destDir /e /r:0 /w:0 /mt:64 /dcopy:t /copyall /log:robocopy.log | Out-Null
}

#########################################################################################
# Copy only the saved FILES to a new location
#########################################################################################

Write-Host "Copying saved files to new location"

$file = $null
foreach($file in $savedFilesList){
    $destFile = ( $file -replace [regex]::Escape($pathToUserStore),$pathToNewUserStore )
    if(Test-Path $file){
        New-Item -ItemType File -Path $destFile -Force | Out-Null
        Copy-Item -Path $file -Destination $destFile -Force | Out-Null
    }
}

#########################################################################################
# Now delete the $excFoldersList from our copied profile
#########################################################################################

Write-Host "Deleting excluded folders from new profile"

$folder = $null
foreach($folder in $excFoldersList){
    $folder = ($folder -replace [regex]::Escape($pathToUserStore),$pathToNewUserStore)
    if(Test-Path $folder){
        Remove-Item $folder -Recurse -Force
    }
}

#########################################################################################
# Now delete the $excFilesList from our copied profile
#########################################################################################

Write-Host "Deleting excluded files from new profile"

$file = $null
foreach($file in $excFilesList){
    $destFile = ( $file -replace [regex]::Escape($pathToUserStore),$pathToNewUserStore )
    if(Test-Path $destFile){
        Remove-Item $destFile -Force
    }
}

#########################################################################################
# Remove/rename the old profile folder
#########################################################################################

Write-Host "Renaming profile folders"

Rename-Item $pathToUserStore "$($pathToUserStore).upm_backup_$(Get-Date -Format dd_MM_yy)"
sleep 1
Rename-Item $pathToNewUserStore $pathToUserStore

#########################################################################################
# Reset permissions
#########################################################################################

Write-Host "Resetting permissions"

icacls.exe $pathToUserStore /setowner $user /T /C /Q | Out-Null
icacls.exe $pathToUserStore /reset /T /C /Q | Out-Null

$postFileCount = (Get-Item $pathToUserStore).GetFiles("*",[System.IO.SearchOption]::AllDirectories).Count

Write-Host "Removed a total of $($preFileCount - $postFileCount) files." -fore yellow

Allowing non-Admins to Shadow, Message, and more on 2012 R2 RDS…

Although this one has been posted a few times before, I thought I’d add it in for my own reference as it taught me something I wasn’t aware of – setting permissions through WMI objects.

Windows Server 2012 is great. In some respects, even better than Server 2008 R2.

In many respects however, it is absolute bollocks, so I’m glad they came along with 2012 R2 to improve some of this.

One of the really bollocks-y things about the 2012 range of Server OSs is having Terminal Services Manager removed… and not replaced (until R2 – and even then…)

This can make managing RDS installations a pain.

 

We had a number of clients moving from XenApp 6.5 to XenApp 7.6 (and 2008 R2 servers > 2012 R2 servers), who used to be able to shadow other users using the Citrix Shadow Taskbar. Unfortunately, this was no longer possible, and because of some Active Directory peculiarities, we weren’t able to give them access to Citrix Director.

So. We had to allow some non-admins to shadow some other non-admins. How?

Initially we looked at Remote Assistance (Citrix uses this in Citrix Director), but because the admin-initiated-request way required making an RPC call to the DCOM server on the VDA itself to generate a request etc. etc., and getting users to initiate the Remote Assistance request was always a pain, we scrapped this idea.

Luckily, someone invented the internet (and Google).

It turns out, the Terminal Services Managers group still exists in WMI, and the required permissions are still there – all you have to do is grant them. Warning: I don’t know how to reverse this, so YMMV, etc.

This will allow your selected group to do the following:

  • Shadow other users*
  • List logged in users
  • Disconnect users
  • Log users off
  • Kill processes of other users

Open an administrative PowerShell prompt and run the following command (make sure you’ve actually created the group first):
$protocols = @(
    "ICA-CGP",
    "ICA-CGP-1",
    "ICA-CGP-2",
    "ICA-CGP-3",
    "ICA-TCP",
    "ICA-SSL",
    "ICA-HTML5"
)

foreach($protocol in $protocols){
    (Get-WmiObject -Namespace "root/cimv2/terminalservices" -Class Win32_TSPermissionsSetting | Where-Object {$_.TerminalName -eq $protocol}).AddAccount("SAAAS\Client_RDSAdmins",2)
}

You will also need to add in the Group Policy settings as per https://technet.microsoft.com/en-us/library/cc771538.aspx
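
Once the permissions and the Group Policy setting are in place, your delegated group can shadow from any machine that can reach the session host – roughly like this (server name and session ID are placeholders):

# List sessions on the target server to find the session ID
quser /server:XA76SRV01

# Shadow session 3 with control; drop /control for view-only.
# /noConsentPrompt only works if the GPO allows shadowing without the user's consent.
mstsc /v:XA76SRV01 /shadow:3 /control /noConsentPrompt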

Citrix XenApp – Login Monitor

This has been an ambition of mine ever since I started playing with some of the ICA Client DLLs in PowerShell and watching Citrix sessions launch by magic (if you could get it to work, it was magic, plain and simple).

Since then, I’ve wanted to be able to have my monitoring system (in this case, Nagios) log into all of my XenApp servers, and report back whenever there was an issue getting into one.

It’s taken months of working on this on and off, but I finally have something in a reasonably stable state… almost stable enough to add it to our after-hours alerting group (not while I’m on call – just the other engineers, of course!)

 

The Overview

The script runs on one VM, contacts the controllers for my XenApp 6.5 and XenApp 7.6 farms, gathers a list of current XenApp nodes, then iterates through each one and attempts to log in.

If it logs in successfully, a file is updated on a central share, and the monitoring script moves onto the next server in the queue. If it waits for longer than the timeout period, it’ll assume failure and alert in Nagios.
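
The “file is updated on a central share” part is handled by a tiny logon script that runs for the monitoring accounts on each XenApp server. Something along these lines would do it (this is a hypothetical sketch – the real one just has to append a row the monitor can parse):

# Hypothetical logon script for the MonitoringService-* accounts (deployed via GPO).
# Appends a result row to the per-server file that the monitor polls.
$resultFile = "\\saaas.com\LoginMonitor\$($env:COMPUTERNAME).txt"
"$($env:COMPUTERNAME),$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss'),False" | Out-File $resultFile -Append -Encoding ascii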

Disclaimer!

This script is not a clean, finished, completely working product. There are better ways to do what I’ve done. There are probably safer ways to do this without periodically killing wfica32 processes… there are definitely smarter ways to do a lot of the below… But hopefully this will provide you with inspiration to rewrite this to suit your own environment.

Requirements

Permissions

I’ve set up 3 accounts in total: one account that runs the monitoring script (MonitorService), one account that logs into the Server 2012 R2 nodes (MonitorService-2012R2), and one account that logs into the Server 2008 R2 nodes (MonitorService-2008R2).

MonitorService needs local administrative permissions on the server that you’re running this check from and on your controller servers (i.e., the XenApp 6.5 ZDC and the XenApp 7.6 Delivery Controller), and needs to be a read-only administrator of both farms.

MonitorService-2012R2 and MonitorService-2008R2 need to have permission to log directly onto the XenApp nodes – this is important, as we aren’t using the brokering process through a Web Interface/Storefront/Netscaler – we’re connecting directly to the XenApp server itself.
For XenApp 6.5, you need to create a user policy, then under ICA, select Desktop Launches and allow it (I do this only for a specified AD group, let’s call it MonitoringServiceAccounts). Then, you need to add this AD group to the Remote Desktop Users group on each 2008 R2 XenApp node – I used a GPO to do this.
For XenApp 7.6, you only need to add the group to the “Direct Access Users” group on the 2012 R2 XenApp node.
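
If you’d rather script those group memberships than click through them, something like this on each node does the trick (“SAAAS\MonitoringServiceAccounts” being the example group from above):

# Add the monitoring AD group to the relevant local group on each node.
# 2008 R2 / XenApp 6.5 nodes:
net localgroup "Remote Desktop Users" "SAAAS\MonitoringServiceAccounts" /add
# 2012 R2 / XenApp 7.6 nodes:
net localgroup "Direct Access Users" "SAAAS\MonitoringServiceAccounts" /add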

Firewall

The server running the script needs to be able to reach your XenApp nodes on 2598/1494, or port 443 if you can set up SSL Relay.
It also needs to be able to reach your controllers on the WinRM port 5985/5986.
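
A quick way to sanity-check that connectivity from the monitoring server before going any further (host names are placeholders):

# Rough connectivity check from the monitoring server.
Test-NetConnection -ComputerName "XA76SRV01" -Port 2598   # ICA session reliability
Test-NetConnection -ComputerName "XA76SRV01" -Port 1494   # ICA
Test-NetConnection -ComputerName "XA76DC01" -Port 5985    # WinRM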

Share

I’ve set up a central share that stores the result entries – this should be accessible from all XenApp nodes and all MonitoringService accounts should have modify permissions to the share.

Roaming Profiles

I’ve set up my Monitoring Service accounts to use Roaming Profiles, and there is a bit of pain involved in that. First and foremost, you should use a separate account (or at least a separate profile path) for logging into different OSs. Secondly, I’ve scripted in clearing out the Roaming Profile for each account at the end of each monitoring run. This helps keep things running more smoothly.

Client Drives, Audio Redirection & Printer Mapping

Disable these for your monitoring accounts, ideally through Citrix policies. It will speed up the login process and prevent the Citrix Receiver from crashing under stress.

The Script

OK, here goes… I always hate posting code online as I tend to see the glaring errors, poor naming conventions and general shoddiness of the code… if it bugs you – clean it up and post me a nicer, shinier version!


$start = Get-Date

#Get list of servers
Write-Host "Generating list of servers from XenApp 6.5 and XenApp 7.6 environments" -fore Cyan

## XenApp 7.6
try{
    $xa76session = New-PSSession -ComputerName XA76DC01.saaas.com
    Invoke-Command -Session $xa76session -ScriptBlock {Add-PSSnapin Citrix*}
    $XA76Servers = Invoke-Command -Session $xa76session -ScriptBlock {Get-BrokerMachine | select @{n="ServerName";e={$_.DNSName -replace "\.saaas.com"}},InMaintenanceMode} | Select ServerName,InMaintenanceMode
}catch{
    Write-Host "Failed to retrieve list of servers from XenApp 7.6 farm. Using default list." -fore Yellow
    $XA76Servers = Import-CSV "C:\Scripts\LoginMonitor\ServerLists\XA76Servers.csv"
}

## XenApp 6.5
try{
    $xa65Session = New-PSSession -ComputerName XA65ZDC01.saaas.com -ErrorAction SilentlyContinue
    Invoke-Command -Session $xa65session -ScriptBlock {Add-PSSnapin Citrix*}
    $XA65servers = Invoke-Command -Session $xa65session -ScriptBlock {Get-XAServer | select ServerName,@{n="InMaintenanceMode";e={ if($_.LogOnMode -like "Prohibit*"){$true}elseif($_.LogOnMode -eq "AllowLogons"){$false} }} } | Select ServerName,InMaintenanceMode
}catch{
    Write-Host "Failed to retrieve list of servers from XenApp 6.5 farm. Using default list." -fore Yellow
    $XA65Servers = Import-CSV "C:\Scripts\LoginMonitor\ServerLists\XA65Servers.csv"
}

Write-Host "Got list of servers, closing connection to XenApp farms" -fore Cyan
if($xa65Session){
    Remove-PSSession $xa65Session
}
if($xa76session){
    Remove-PSSession $xa76session
}

# Global variables
$masterResultTable = Import-CSV "C:\Scripts\LoginMonitor\LoginMonitorResults.csv"
## Set the logon user for the 7.6 and 6.5 farms respectively
$XA76Servers | Add-Member -MemberType NoteProperty -Name LogonUser -Value "MonitoringService-2012R2"
$XA65Servers | Add-Member -MemberType NoteProperty -Name LogonUser -Value "MonitoringService-2008R2"
$servers = $null
$servers += $XA76Servers
$servers += $XA65Servers

# Create ICA Template
$icaTemplate = '[Encoding]
InputEncoding = ISO8859_1
[WFClient]
Version=2
ProxyType=None
HttpBrowserAddress=XASERVER:80
ConnectionBar=0
CDMAllowed=False
CPMAllowed=Off

[ApplicationServers]
XASERVER=

[XASERVER]
Address=XASERVER
InitialProgram=
CGPAddress=*:2598
ClientAudio=Off
DesiredColor=2
DesiredHRes = 1024
DesiredVRes = 768
TWIMode = False
KeyboardTimer = 0
MouseTimer = 0
ConnectionBar=0
Username=XAUSERNAME
Clearpassword=MonitoringServicePassword
Domain=saaas
TransportDriver=TCP/IP
WinStationDriver=ICA 3.0
BrowserProtocol=HTTPonTCP
Compress=On
EncryptionLevelSession=Basic
[Encrypt]
DriverNameWin32=PDCRYPTN.DLL
DriverNameWin16=PDCRYPTW.DLL
[Compress]
DriverName=PDCOMP.DLL
DriverNameWin16=PDCOMPW.DLL
DriverNameWin32=PDCOMPN.DLL
'

# Find my session ID (so I don't go closing other people's processes!)
$session = [System.Diagnostics.Process]::GetCurrentProcess().SessionId

# Launch Desktops

foreach($server in $servers){
    if(Get-Process wfica32 -ErrorAction SilentlyContinue){
        # Close hung wfica32 and wfcrun32 processes
        Write-Host "Force restarting wfcrun/wfica processes" -fore yellow
        Get-Process wfica32 | ? {$_.SessionId -eq $session} | Stop-Process -Force
        Get-Process wfcrun32 | ? {$_.SessionId -eq $session} | Stop-Process -Force
    }

    # Sleep a bit for wfica32 to catch its breath.
    Start-Sleep 3

    $result = $null

    # Create launch result file
    Write-Host "Creating result file for $($server.ServerName)..."
    Write-Output "ComputerName,LastLogonTime,InMaintenanceMode" | Out-File "\\saaas.com\LoginMonitor\$($server.ServerName).txt" -Force -Encoding ascii

    # Create ICA file
    $icaFile = "C:\Scripts\LoginMonitor\LaunchFiles\$($server.ServerName).ica"
    $icaTemplate -replace "XASERVER","$($server.ServerName)" -replace "XAUSERNAME","$($server.LogonUser)" | Out-File $icaFile -Force -Encoding ASCII

    # Launch Desktop
    Start-Process "C:\Program Files (x86)\Citrix\ICA Client\wfica32.exe" "$($icaFile)"
    Write-Host "Launching desktop on $($server.ServerName)."

    $launchTime = Get-Date

    # If this server isn't in our result table already, add it. Else, update the existing entry.
    if($server.ServerName -notin $masterResultTable.Server ){
        $masterResultTable += [PSCustomObject]@{Server = $server.ServerName;LaunchTime = $launchtime; LastLogonTime = $null; Result = $result}
    }else{
        $masterResultTable[$masterResultTable.Server.IndexOf("$($server.Servername)")].LaunchTime = $launchtime
        $masterResultTable[$masterResultTable.Server.IndexOf("$($server.Servername)")].LastLogonTime = $null
        $masterResultTable[$masterResultTable.Server.IndexOf("$($server.Servername)")].Result = $result
    }

    # Get Results
    $loginResultFile = "\\saaas.com\LoginMonitor\$($server.ServerName).txt"
    $launchTime = $masterResultTable[$masterResultTable.Server.IndexOf("$($server.Servername)")].LaunchTime
    # Check to see if the user has logged in
    Write-Host "Checking to see if the login result file for $($server.ServerName) has been updated" -NoNewLine

    # Wait until the LastLogonTime in the result file is newer than the LaunchTime,
    # or until we've been waiting for two minutes.
    # Servers in maintenance mode still get a login attempt, but won't be marked as failures.
    do{
        Write-Host "." -NoNewline
        $loginResult = Import-Csv $loginResultFile
        Start-Sleep 1
    }until((($loginResult.LastLogonTime -as [datetime]) -gt $launchtime) -or ((Get-Date).AddMinutes(-2) -gt $launchTime))

    # Create the result variable depending on whether the LastLogonTime has been updated, or whether the server is in Maintenance Mode
    if(($loginResult.LastLogonTime -as [datetime]) -gt $launchtime){
        Write-Host " Logged into $($server.ServerName) successfully!" -Fore Green
        $result = "Success"
    }elseif($server.InMaintenanceMode -eq $true){
        Write-Host " $($server.Servername) is in maintenance mode, skipping checks..." -fore yellow
        $result = "InMaintenanceMode"
    }else{
        Write-Host " User has not successfully logged into $($server.ServerName) in two minutes, skipping :(" -Fore red
        $result = "Failure"
    }

    # Update the master table
    $masterResultTable[$masterResultTable.Server.IndexOf("$($server.ServerName)")].Result = $result
    $masterResultTable[$masterResultTable.Server.IndexOf("$($server.ServerName)")].LastLogonTime = ($loginResult.LastLogonTime -as [datetime])

    # Update the CSV
    $masterResultTable | Export-CSV C:\Scripts\LoginMonitor\LoginMonitorResults.csv -NoTypeInformation -Force

}

# Stop any still-running Citrix processes - this tends to ruin Citrix Receiver. Better to just log off.
Get-Process wfica32 | ? {$_.SessionId -eq $session} | Stop-Process -Force
Get-Process wfcrun32 | ? {$_.SessionId -eq $session} | Stop-Process -Force
# Gives "Receiver has stopped working" error
#Get-Process receiver | Stop-Process -Force -Confirm:$false -ErrorAction SilentlyContinue
#Get-Process SelfServicePlugin | Stop-Process -Force -Confirm:$false -ErrorAction SilentlyContinue

# Reset the roaming profile to avoid corruption issues
if((Test-Path .\Blank) -eq $false){
mkdir blank
}
# MonitoringService needs to have modify perms to the below folders
robocopy ".\Blank" "\\saaas.com\RoamingProfiles\MonitoringService-2008R2.v2\" /MIR /NJH /NJS /NDL /NS
robocopy ".\Blank" "\\saaas.com\RoamingProfiles\MonitoringService-2012R2.v2\" /MIR /NJH /NJS /NDL /NS

$masterResultTable | Export-CSV C:\Scripts\LoginMonitor\LoginMonitorResults.csv -NoTypeInformation -Force
$end = Get-Date
$length = $end - $start
Write-Output "$($end) - Script took $($length.Hours) hours, $($length.Minutes) minutes, and $($length.Seconds) seconds to complete." | Out-File "C:\Scripts\LoginMonitor\LoginMonitor.log" -Force

# Log off
shutdown /l

Nagios

The last piece of the puzzle, Nagios simply runs a check using the NSClient plugin to see if the result file (LoginMonitorResults.csv) has been updated and what the last logon result for each host was.

I aim to “decentralize” this more in future – by getting the NSClient check to run the “Was the login successful” logic. The login monitor itself could also be repurposed to run on each XenApp node individually, so each server is checking itself, rather than one server checking them all (which tends to cause more false positives etc.)
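
For reference, the Nagios side can be as dumb as an NSClient++ external script that parses the results CSV and returns a Nagios-style exit code. A sketch of that check (paths and thresholds are assumptions):

# Hedged sketch of a check script NSClient++ could run as an external script.
# Exit codes follow the Nagios convention: 0 = OK, 1 = WARNING, 2 = CRITICAL.
$resultsFile = "C:\Scripts\LoginMonitor\LoginMonitorResults.csv"
$results = Import-Csv $resultsFile
$failures = $results | Where-Object { $_.Result -eq "Failure" }

if($failures){
    Write-Host "CRITICAL: login failures on $($failures.Server -join ', ')"
    exit 2
}elseif((Get-Item $resultsFile).LastWriteTime -lt (Get-Date).AddHours(-2)){
    Write-Host "WARNING: $resultsFile hasn't been updated in over 2 hours"
    exit 1
}else{
    Write-Host "OK: all XenApp logins succeeded"
    exit 0
}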

Logging Perfmon to SQL Database

Disk Queue Length!

This is what we wanted to track prior to making a significant change to our Citrix environment. Namely, moving AppData\Roaming from being redirected to our file server and moving it back to local disk.

It seems to be Citrix’s recommendation to redirect AppData\Roaming, but everyone we talked to said it wasn’t a great idea. We certainly found this out whenever we had the occasional file server issue and everyone’s Citrix sessions would grind to a halt – something we didn’t see before making that change.

So, we’ve come full circle again and are looking at not redirecting AppData\Roaming. But before we made the change, we wanted data from before and after, to make sure that the change wouldn’t just move the problem from our file server to our Citrix servers.

So – how do you log using Perfmon without actually writing to the disk that you’re monitoring? Well, you could a) add another drive, and save the Perfmon data to that drive, or way cooler – b) save the Perfmon data to a SQL database!

Now, technically we are using typeperf – not perfmon. But it works the same-ish (just commandline).

Steps:

  1. Create an empty SQL Database named PerfmonDB on your SQL server SAAAS-SQL01
  2. Create a user DSN on the VDA that you’re wanting to log
  3. Create a batch script that starts typeperf at boot, or some interval
    e.g. typeperf -f SQL -s SAAAS-VDA01 "\LogicalDisk(C:)\Current Disk Queue Length" -si 5 -o SQL:PerfmonDB!RedirectAppData
  4. Log!

This will create 3x tables in PerfmonDB:

  • CounterData – the data
  • CounterDetails – details on the counter(s) – I have used 3x in here
  • DisplayToID

With this data, you should be able to chart the Average Disk Queue length on your VDAs over a number of hours/days!
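
As a starting point for that charting, a query along these lines pulls averages per machine out of the standard Perfmon SQL schema. This sketch assumes the SqlServer PowerShell module’s Invoke-Sqlcmd; adjust the WHERE clause and grouping to taste:

# Rough sketch: average and peak disk queue length per machine from the logged data.
$query = @"
SELECT d.MachineName,
       AVG(c.CounterValue) AS AvgDiskQueueLength,
       MAX(c.CounterValue) AS MaxDiskQueueLength
FROM   CounterData c
JOIN   CounterDetails d ON c.CounterID = d.CounterID
WHERE  d.CounterName = 'Current Disk Queue Length'
GROUP  BY d.MachineName
ORDER  BY AvgDiskQueueLength DESC
"@
Invoke-Sqlcmd -ServerInstance "SAAAS-SQL01" -Database "PerfmonDB" -Query $query | Format-Table -AutoSize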

But hang on… that sounds like a lot of manual work… what if I’m doing this for SAAAS-VDA01 to SAAAS-VDA67??? I’m going to get RSI before lunch!

Glad you asked… Here’s a handy little script that will do all of the typeperf setup in a single step. You’ll still need to create the SQL database manually, and it assumes that you are running it under an account that has permission to the SQL database.

#Create ODBC connection
Add-OdbcDsn -Name PerfmonDB -DsnType User -Platform "64-bit" -DriverName "SQL Server" -SetPropertyValue @("Trusted_Connection=Yes", "Database=PerfmonDB", "Server=SAAAS-SQL01")

#Create the typeperf batch script
#Counter - C: Current Disk Queue Length, run every 5 seconds
mkdir C:\Perfmon
$task = "typeperf -f SQL -s $($env:COMPUTERNAME) `"\LogicalDisk(C:)\Current Disk Queue Length`" -si 5 -o SQL:PerfmonDB!RedirectAppData"
$task | Out-File "C:\Perfmon\StartPerfmon.bat" -Force -Encoding ascii

#Schedule the batch script to run on startup
schtasks.exe /create /TN "Start Perfmon Logging" /SC ONSTART /TR "C:\Perfmon\StartPerfmon.bat" /ru SAAAS\Administrator /rp *