DSC: Script Resource GetScript
If you look on the TechNet page for the Script Resource you will see
GetScript = { <# This must return a hash table #> }
Which is, technically speaking, true...usually...right up until you try to run Get-DscConfiguration on a machine, at which point it will get to that Script resource and die saying:
The PowerShell provider returned results that are not valid from Get-TargetResource. The <keyname> key is not a valid property in the corresponding provider schema file. The results from Get-TargetResource must be in a Hashtable format. The keys in the Hashtable must be the same as the properties in the corresponding provider schema file.
The consensus around the web is that the error means you have to return a hashtable whose keys match the properties in the schema. In this case, the schema for the Script resource is:
#pragma namespace("\\\\.\\root\\microsoft\\windows\\DesiredStateConfiguration")

[ClassVersion("1.0.0"),FriendlyName("Script")]
class MSFT_ScriptResource : OMI_BaseResource
{
  [Key] string GetScript;
  [Key] string SetScript;
  [Key] string TestScript;
  [write,EmbeddedInstance("MSFT_Credential")] string Credential;
  [Read] string Result;
};
Which means in order for your Script resource to be compliant you need to return:
GetScript = {
    return @{
        Result     = ''
        GetScript  = $GetScript
        TestScript = $TestScript
        SetScript  = $SetScript
    }
}
But when you think about it, this doesn't make a lot of sense. For every other resource I can think of, matching the schema makes absolute sense, because the properties in the schema describe the state of the resource you want to control, not how you control it or how you test for it.
It would be like Get-TargetResource for the Registry resource not just returning the information about the key, its value, etc., but returning that AND the entire contents of MSFT_RegistryResource.psm1, which would make literally no sense. We don't care HOW you check or HOW you set; we care about the resource being controlled. Returning a GetScript that contains the contents of GetScript is...batty.
Luckily, the statement that "the keys need to match the properties" can be interpreted to mean you need to match ALL of them, or it can be interpreted to mean the keys you return just need to exist in the schema, and in the case of the Script resource, Result does exist. And that is what we need to return.
GetScript = {return @{Result=''}}
They really need to update the TechNet page to say "GetScript needs to return a hash table with at least one key matching a parameter in the schema for the resource".
No need to return potentially hundreds of lines of code in some M.C. Escher-like construct containing itself. Just stick to returning information about the resource you are controlling. If your script sets the contents of a file, return the contents of that file. Not the contents of the file AND the script you used to set it AND the script you used to test it.
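To make this concrete, here is a minimal sketch of a compliant Script resource; the file path and contents are invented for illustration:

Script MOTDFile
{
    # Return only the state of the resource being controlled.
    GetScript = {
        return @{ Result = [string](Get-Content -Path "C:\scripts\motd.txt" -ErrorAction SilentlyContinue) }
    }
    # Report $true only if the file exists with the expected contents.
    TestScript = {
        (Test-Path -Path "C:\scripts\motd.txt") -and
        ([string](Get-Content -Path "C:\scripts\motd.txt") -eq "Welcome to the farm.")
    }
    # Enforce the desired contents.
    SetScript = {
        Set-Content -Path "C:\scripts\motd.txt" -Value "Welcome to the farm."
    }
}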
PowerShell DSC: Remote Monitoring Configuration Propagation
So if you are like me, you are not really interested in crossing your fingers and hoping your servers are working right. Which is why it is uniquely frustrating that DSC does not have anything resembling a dashboard (not a complaint, really; it is early days, but in practical application, not knowing something went down is...not really an option unless you like being sloppy).
The way I build my servers is this: I have an XML file with a list of servers, their role, and their role GUID. Baked into the master image is a simple bootstrap script that goes and gets the build script. Since I'm using DSC, the "build" script doesn't really build much; it mostly just bootstraps the DSC process. The first script to run is:
$nodeloc = "\\dscserver\DSC\Nodes\nodes.xml"

# Get node information.
try {
    [xml]$nodes = Get-Content -Path $nodeloc -ErrorAction 'Stop'
    $role = $nodes.hostname.$env:COMPUTERNAME.role
}
catch {
    Write-Host "Could not find matching node, exiting."
    Break
}

# Set correct build script location.
switch($role) {
    "XenAppPKG"  { $scriptloc = "\\dscserver\DSC\Scripts\pkgbuild.ps1" }
    "XenAppQA"   { $scriptloc = "\\dscserver\DSC\Scripts\qabuild.ps1" }
    "XenAppProd" { $scriptloc = "\\dscserver\DSC\Scripts\prodbuild.ps1" }
}
Write-Host "Script location set to:"$scriptloc

if((Test-Path -Path "C:\scripts") -ne $true){
    New-Item -Path "C:\scripts" -ItemType Directory -Force -ErrorAction 'Stop'
}

Write-Host "Checking build script availability..."
while((Test-Path -Path $scriptloc) -ne $true){
    Start-Sleep -Seconds 15
}

Write-Host "Fetching build script..."
while((Test-Path -Path "C:\scripts\build.ps1") -ne $true){
    Copy-Item -Path $scriptloc -Destination "C:\scripts\build.ps1" -ErrorAction 'SilentlyContinue'
}

Write-Host "Executing build script..."
& C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -file "C:\scripts\build.ps1"
The information it looks for in the nodes.xml file looks like this:
<hostname>
  <A01 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <A02 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <A03 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <A04 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <B01 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <B02 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <B03 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
  <B04 role="XenAppProd" guid="22e35281-49c6-40f3-9fd7-ad7f8d69c84d" />
</hostname>
I won't go any further into this, as most of it has already been covered here before. The main gist is that my solution to this problem relies on the fact that I use the XML file to provision DSC on these machines.
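For reference, the provisioning step amounts to handing that role GUID to the Local Configuration Manager as its ConfigurationID. A stripped-down, WMF 4.0-style sketch of that step (the pull server URL and the timing values are placeholders, not my production settings):

Configuration LCMSetup
{
    Node $env:COMPUTERNAME
    {
        LocalConfigurationManager
        {
            # The role GUID pulled from nodes.xml for this machine.
            ConfigurationID = "22e35281-49c6-40f3-9fd7-ad7f8d69c84d"
            RefreshMode = "Pull"
            DownloadManagerName = "WebDownloadManager"
            DownloadManagerCustomData = @{
                ServerUrl = "http://dscserver:8080/PSDSCPullServer.svc"
                AllowUnsecureConnection = "TRUE"
            }
            ConfigurationMode = "ApplyAndAutoCorrect"
            ConfigurationModeFrequencyMins = 30
        }
    }
}
LCMSetup -OutputPath "C:\scripts\LCM"
Set-DscLocalConfigurationManager -Path "C:\scripts\LCM"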
There are a couple of modifications I need to make to my DSC config to enable tracking. Note that the param block is only there so I can override the GUID from the command line if I want; in reality you could just set the ValueData to ([GUID]::NewGuid()).ToString() and be fine.
The first bit of code takes place before I start my Configuration block; the actual Registry resource is the very last resource in the Configuration block (less chance of false positives due to an error mid-config).
param (
    [string]$guid = ([GUID]::NewGuid()).ToString()
)

...

Registry verGUID
{
    Ensure    = "Present"
    Key       = "HKLM:\SOFTWARE\PostBuild"
    ValueName = "verGuid"
    ValueData = $guid
    ValueType = "String"
}
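Producing a MOF with a fresh verGUID is then just a matter of running the configuration script, or pinning the GUID from the command line. Hypothetical usage (the script name and the explicit GUID are made up, and assume the script invokes the Configuration at the bottom):

# Let the default mint a brand new verGUID...
.\prodconfig.ps1
# ...or pin one explicitly.
.\prodconfig.ps1 -guid "0b1763ae-4a6b-4f4f-9d3c-7c5b2f9e1a2d"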
From here we get to the important part:
[regex]$node = '(\[Registry\]verGUID[A-Za-z0-9\";\r\n\s=:\\ \-\.\{]*)'
[regex]$guid = '([a-z0-9\-]{36})'
$path = "\\dscserver\Configuration\"
$pkg = @()
$qa = @()
$prod = @()
$watch = @{}
$complete = @{}

[xml]$nodes = (Get-Content "\\dscserver\DSC\Nodes\nodes.xml")

# Find a list of machine names and role guids.
foreach($child in $nodes.hostname.ChildNodes) {
    switch($child.Role) {
        "XenAppPKG"  { $pkg += $child.Name;  $pkgGuid = $child.guid }
        "XenAppQA"   { $qa += $child.Name;   $qaGuid = $child.guid }
        "XenAppProd" { $prod += $child.Name; $prodGuid = $child.guid }
    }
}

# Convert DSC GUIDs to the latest verGUID.
$pkgGuid = $guid.Match(($node.Match((Get-Content -Path ($path+$pkgGuid+".mof")))).Captures.Value).Captures.Value
$qaGuid = $guid.Match(($node.Match((Get-Content -Path ($path+$qaGuid+".mof")))).Captures.Value).Captures.Value
$prodGuid = $guid.Match(($node.Match((Get-Content -Path ($path+$prodGuid+".mof")))).Captures.Value).Captures.Value

# See if credentials exist in this session.
if($creds -eq $null){ $creds = (Get-Credential) }

# Make an initial pass, determine configured/incomplete servers.
if($pkg.Count -gt 0 -and $pkgGuid.Length -eq 36) {
    foreach($server in $pkg) {
        $test = Invoke-Command -ComputerName $server -Credential $creds -ScriptBlock {
            (Get-ItemProperty -Path "HKLM:\SOFTWARE\PostBuild" -Name verGUID -ErrorAction 'SilentlyContinue').verGUID
        }
        if($test -ne $pkgGuid) {
            Write-Host ("Server {0} does not appear to be configured, adding to watchlist." -f $server)
            $watch[$server] = $pkgGuid
        }else{
            Write-Host ("Server {0} appears to be configured. Adding to completed list." -f $server)
            $complete[$server] = $true
        }
    }
}else{
    Write-Host "No Pkg server nodes found or no verGUID detected in Pkg config. Skipping."
}

if($qa.Count -gt 0 -and $qaGuid.Length -eq 36) {
    foreach($server in $qa) {
        $test = Invoke-Command -ComputerName $server -Credential $creds -ScriptBlock {
            (Get-ItemProperty -Path "HKLM:\SOFTWARE\PostBuild" -Name verGUID -ErrorAction 'SilentlyContinue').verGUID
        }
        if($test -ne $qaGuid) {
            Write-Host ("Server {0} does not appear to be configured, adding to watchlist." -f $server)
            $watch[$server] = $qaGuid
        }else{
            Write-Host ("Server {0} appears to be configured. Adding to completed list." -f $server)
            $complete[$server] = $true
        }
    }
}else{
    Write-Host "No QA server nodes found or no verGUID detected in QA config. Skipping."
}

if($prod.Count -gt 0 -and $prodGuid.Length -eq 36) {
    foreach($server in $prod) {
        $test = Invoke-Command -ComputerName $server -Credential $creds -ScriptBlock {
            (Get-ItemProperty -Path "HKLM:\SOFTWARE\PostBuild" -Name verGUID -ErrorAction 'SilentlyContinue').verGUID
        }
        if($test -ne $prodGuid) {
            Write-Host ("Server {0} does not appear to be configured, adding to watchlist." -f $server)
            $watch[$server] = $prodGuid
        }else{
            Write-Host ("Server {0} appears to be configured. Adding to completed list." -f $server)
            $complete[$server] = $true
        }
    }
}else{
    Write-Host "No Production server nodes found or no verGUID detected in Production config. Skipping."
}

# Pause for meatbag digestion.
Start-Sleep -Seconds 10

# Monitor incomplete servers until all servers return matching verGUIDs.
if($watch.Count -gt 0){ $monitor = $true }else{ $monitor = $false }

while($monitor -ne $false) {
    $monitor = $false
    $cleaner = @()
    foreach($server in $watch.Keys) {
        $test = Invoke-Command -ComputerName $server -Credential $creds -ScriptBlock {
            (Get-ItemProperty -Path "HKLM:\SOFTWARE\PostBuild" -Name verGUID -ErrorAction 'SilentlyContinue').verGUID
        }
        if($test -eq $watch[$server]) {
            $complete[$server] = $true
            $cleaner += $server
        }else{
            $monitor = $true
        }
    }
    foreach($item in $cleaner){ $watch.Remove($item) }
    Clear-Host
    Write-Host "Configured Servers:`r`n"$complete.Keys
    Write-Host "`r`n`r`nIncomplete Servers:`r`n"$watch.Keys
    if($monitor -eq $true){ Start-Sleep -Seconds 10 }
}

Clear-Host
Write-Host "Configured Servers:`r`n"$complete.Keys
Write-Host "`r`n`r`nIncomplete Servers:`r`n"$watch.Keys
At the end of the day, is this a perfect solution? No. Bear in mind I just slapped this together to fill a void; things could be refactored, cleaned up, and probably streamlined, but honestly, a PowerShell script is not a good dashboard. I would also rather the servers themselves flag their progress in a centralized location than have them pinged by a script.
But that is really something best implemented by the PowerShell devs, as anything 3rd party would, IMO, be rather ugly. So if all we have right now is ugly, I'll take ugly and fast.
As always, use at your own risk. I cannot imagine how you could eat a server with this script, but don't go using it as some definitive health metric; just use it as a way to get a rough idea of the health of your latest configuration push.
App-V: ADM-Get-Assoc
A script to connect to the App-V database and find all the software assigned to a particular user or group.
All the instructions you should need are in the script itself.
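If you just want the general shape of it before opening the script: it is plain ADO.NET against the App-V data store, along the lines of the rough sketch below. The server, database, table, and column names here are placeholders; the real query lives in the script.

# Placeholder names throughout; see the script itself for the real query.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Server=sqlserver;Database=APPVIRT;Integrated Security=True"
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT Name FROM dbo.APPLICATIONS"   # hypothetical query
$reader = $cmd.ExecuteReader()
while($reader.Read()){ $reader["Name"] }
$reader.Close()
$conn.Close()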
App-V: PrimalScript 2011
Pretty straightforward on this one, but there are a couple of steps you need to take beforehand. First, download the following:
Install both of these on your clean Sequencer. When it comes time to install PrimalScript (you should already know how to get to this point), set the install directory to the folder on your Q: drive, choose Complete, and uncheck the two VC++ 2k8 options. The rest of the install should go pretty smoothly.
The only oddness I saw with the sequence (after the basic cleanup) was that on first launch PowerShell.exe hung (I may have been impatient), and I had to enter the serial # twice, which may have been a residual issue from the first launch. I haven't been able to reproduce it yet.
First update: The oddness WAS caused by my impatience with its initial load. :sheepish:
Second: the same thing applies to the new 2012 batch of all these apps; just install the redists first, select custom, blah blah blah.
PowerShell: Run via SCCM with Administrative rights.
If you have tried to run a PowerShell script with SCCM before, you might have found it odd and not exactly intuitive. Here are a couple of tips.
The most frustrating part of this problem is simply...not being able to tell what is wrong. The error message comes and goes before you have a chance to see it.
It isn't necessary for you to do this (as I've already done it), but to get at the problem I modified my SCCM command line as follows:
%COMSPEC% /K powershell.exe -noprofile -file script.ps1 -executionpolicy Bypass
When running this, the first thing you will notice is an error from cmd.exe saying UNC paths are not supported, reverting back to the Windows directory. Well, now your script won't launch because there is no correct working directory (which WOULD have been the network share on your distribution point). To get around THIS you have to set the following:
Key: HKLM\Software\Wow6432Node\Microsoft\Command Processor
Value: DisableUNCCheck
Data (DWORD): 1
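If you would rather script that change than set it by hand, something along these lines will do it (run elevated):

# Let cmd.exe accept a UNC working directory (32-bit hive on x64).
$key = "HKLM:\SOFTWARE\Wow6432Node\Microsoft\Command Processor"
if(-not (Test-Path -Path $key)){ New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name DisableUNCCheck -Value 1 -Type DWord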
Now if you run your program again, you will see it says the execution of scripts on the local machine has been disabled. Luckily you have a handy-dandy command prompt (thanks to the /K switch), so you can type powershell -command "Get-ExecutionPolicy -list".
You will see that everything is Undefined. If you open up a regular command prompt and type the same thing, you should see whatever your actual settings are (in my case it was Bypass set at the LocalMachine scope and everything else undefined; it was set to Bypass for TESTING reasons).
A whoami in the SCCM-kicked command prompt shows nt authority\system, as you would expect.
So the problem appears to be that when run as the system account, -executionpolicy is ignored, and PowerShell doesn't appear to be getting/setting its execution policy in the same place everything else is. (Worth noting, too: anything placed after -file on the command line is handed to the script as an argument, so the -executionpolicy switch in the command line above may never have reached powershell.exe at all.)
For instance, right now on the same machine I have two windows open, one powershell run as administrator (via a domain account in the local admins group), the other via the command prompt SCCM launches. Here are the Get-ExecutionPolicy -list results from each:
Local Admin: [Get-ExecutionPolicy -list output]

SCCM: [Get-ExecutionPolicy -list output]
Same machine, two different settings. My first attempt was to use:
powershell.exe -noprofile -command "Set-ExecutionPolicy Bypass LocalMachine" -File script.ps1
This failed, and ultimately it appears that powershell.exe will run either -Command or -File, but not both.
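In principle you could chain the policy change and the script into a single -Command invocation, along the lines of this untested sketch:

powershell.exe -noprofile -command "Set-ExecutionPolicy Bypass LocalMachine; & '.\script.ps1'"

I opted for separate SCCM programs instead.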
So the solution to running PowerShell scripts as admin via SCCM is to do the following:
Create an SCCM Program with the following command line:
powershell.exe -noprofile -command "Set-ExecutionPolicy Bypass LocalMachine"
Then one with the following:
powershell.exe -noprofile -file script.ps1
And finally a cleanup program:
powershell.exe -noprofile -command "Set-ExecutionPolicy RemoteSigned LocalMachine"
Obviously RemoteSigned should be whatever your organization has decided on as the standard execution policy level. The default, I believe, is Restricted; most will probably use RemoteSigned, and the security-(over)conscious will probably use AllSigned.
Is this ideal? No. Of course not. PowerShell paying attention to the -ExecutionPolicy switch would be ideal.
But it works.
And on a bit of a side tangent, I think it is a pretty "microsoft" view of security to put this rather convoluted security system in place and then still have VBScript executable right out of the gate on a Windows 7 box. And batch files.
What is the point of all this obnoxious security when, at the end of the day, they can just use VBScript? Had they made you toggle a setting somewhere to enable VBScript and batch files, had they not made five scopes of security policy for PowerShell, had they basically not admitted that they don't know how to do a secure scripting language (your security should probably come from user rights, and not purely from the execution engine, though that's a finer point you could argue several ways), I'd probably be a lot more sympathetic.
Had SCCM not been such a myopic, monolithic dinosaur, this wouldn't be a problem. These are all symptoms of what will eventually kill them: legacy. 64-bit filesystem redirection, the registry as a whole (more specifically Wow6432Node), Program Files (x86), needing to use PowerShell, VBScript, or batch files to do a simple file transfer or shortcut placement.
This is System Center Configuration Manager. It can use BITS to copy a multi-gigabyte install "safely" to a client machine...but only if it's then going to run an installer.
It can't copy a file and place a shortcut, or add a key to the registry, it only manages the configuration in the most outmoded and obsolete ways possible.
/rant
A few quick notes. You see how MachinePolicy is set? That WILL override your SCCM packages; it means GPO is controlling the ExecutionPolicy, so while your command will take effect, it will be overruled.
The second note: the reason the two are different is that the SCCM version is using the x86 settings, not the x64 ones. Which, while it explains the difference, does not explain why running it with -ExecutionPolicy Bypass is ignored, nor why running a script as a user in SCCM works fine but as an admin does not. The end result being, you still need the workaround.
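Since this is 32-bit redirection at work, one more avenue that may be worth a try (untested here, so treat it as a sketch): on an OS that supports the sysnative alias, a 32-bit process like the SCCM client can reach the native 64-bit PowerShell directly:

%windir%\sysnative\WindowsPowerShell\v1.0\powershell.exe -noprofile -executionpolicy Bypass -file script.ps1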
App-V: Scripting Multi-Line Batch Files.
Adding a script to an App-V package is all fine and good (even though it doesn't support anything beyond batch-file-era scripting languages), but if your script consists of multiple lines and you are adding it after the fact (especially for testing purposes), you may very well find it likes to butcher the script, condensing it all down to one long line.
The solution is very simple.
\r\n
Put that at the end of each line. It is the escape sequence for a carriage return (Enter...ish) plus a newline.
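For example, a two-line batch script embedded in an OSD ends up looking something like this sketch (the TIMING/EVENT attributes and the paths are made up; adjust them to your package):

<SCRIPT TIMING="PRE" EVENT="LAUNCH" PROTECT="FALSE" WAIT="TRUE">
  <SCRIPTBODY>copy \\server\share\settings.ini "%TMP%\settings.ini"\r\necho done\r\n</SCRIPTBODY>
</SCRIPT>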
Is this a bit duct-tapey? Why yes, yes it is. But I suppose pointlessly "enhancing" the UI is a lot more important than making a sophisticated product, so get used to duct taping App-V together.