Tales of a Forced ConfigMgr Site Restore

I was at Microsoft’s TechEd North America 2014 last week learning all kinds of new and cool technology. On Monday, I got pinged by my client’s helpdesk support that no ConfigMgr operations could take place. No one could connect to the ConfigMgr console, no OS deployments would work, nor would any software deployments. WTF? Nothing like this had ever happened before where absolutely nothing would work. “Great,” I thought. “What better time for this to happen than when I and the two other guys that help admin SCCM are all gone. I’m going to have to miss a session to look into this.”

I grabbed a comfy chair there at the Houston Convention Center and thought this shouldn’t be too difficult. I mean, what’s the worst that could have happened? All three of us admins were gone and no one else makes changes to the core site server. I verified the helpdesk wasn’t crazy and found that the directory on the site server’s file system where the installation files lived was gone. I thought I was nuts and had just forgotten where it was installed. Opening SQL Server Management Studio showed no ConfigMgr database at all! OK, now I really thought I was off my rocker. How could the entire install and database just vanish? After some frantic minutes I finally saw that the site server itself had been UNINSTALLED. This ended up being due to another helpful admin who decided to push an untested antivirus upgrade to it. Anyone running Kaspersky antivirus alongside Configuration Manager 2012, be forewarned!

Needless to say, I was forced to “test” out my site backup last Monday. I can’t complain about that. I was just using the integrated site backup maintenance task every night, plus the afterbackup.bat batch file to back up the certificates and SSRS database, and it restored without a hitch; files, database and all. Fast forward through TechEd and me coming back to work…

I open up the ConfigMgr console and notice all of my clients are showing up as Inactive. That’s strange; however, everything else appears to be working. I did a couple of software deployments, no one’s complaining, so I get busy doing something else. The next day, I’m trying a new deployment and notice that the clients aren’t reporting to the management point and haven’t been since the restore. Looking…looking…looking… Aha! …a problem with the management point. No problem, it’s probably just some fallout and the role needs to be reinstalled. After removing the MP role from the site server and reinstalling, I notice in the MPSetup.log file that it’s failing on an MSXML6 prereq because it’s looking at that wrong file path again (the old installation file path). What does a good IT pro do whenever a piece of software is looking at a wrong file path? The registry! Time to start replacing all instances of the folder path in the registry. But wait…there’s more! It turns out there are also file path references in the database. Never fear: ApexSQL’s SQL Search tool made that a cinch by searching all tables at once.
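For the registry half of that cleanup, something along these lines is a reasonable starting point. It’s just a sketch of the idea rather than the exact commands I ran; the old path and the HKLM:\SOFTWARE\Microsoft\SMS key are assumptions you’d swap for your own values.

## Hypothetical old installation path left over from before the restore
$OldPath = 'D:\Program Files\Microsoft Configuration Manager'

## Walk the SMS registry tree and report any value still pointing at the old path
Get-ChildItem -Path 'HKLM:\SOFTWARE\Microsoft\SMS' -Recurse -ErrorAction SilentlyContinue |
	ForEach-Object {
		$Key = $_
		foreach ($ValueName in $Key.GetValueNames()) {
			$Data = $Key.GetValue($ValueName)
			if ("$Data" -like "*$OldPath*") {
				## Output the match so it can be reviewed before anything is changed
				[pscustomobject]@{
					Key   = $Key.Name
					Value = $ValueName
					Data  = $Data
				}
			}
		}
	}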

The next management point role install went without a hitch, and the clients were now showing up as Active in the console and successfully getting their new policies from the server. Success!

Now, after all this I have some rhetorical questions as to why it had to be this hard.

  1. Why didn’t any component under the Monitoring tab show a problem? According to the component statuses, all was good. No problems to see here.
  2. Why didn’t the new Support Center tool see that a problem was afoot? I ran the troubleshooting diagnostics on a client and green checks abounded. In hindsight, it appears that tool just checks that the client was able to process its policy correctly, which it did.

In summary, this was unfortunate, but it allowed me to see the holes in my restore procedure. Here’s what I should have done and will do from now on.

  1. Back up the \\SITESERVER\SMS_SITECODE\Client directory.  This isn’t backed up by default, and when the site system is uninstalled, it’s gone.  The client installer itself DOES get restored, but any hotfixes you had been installing along with the client do not.  I used the little trick of putting a ClientPatch folder under the i386 and x64 client folders to install the MSP immediately after every client install (as yet unsupported).  A minimal copy sketch follows this list.
  2. Back up your afterbackup.bat file.  It is not backed up by default.  This file is located in CONFIGMGRINSTALLPATH\inboxes\smsbkup.box in case you forgot.
  3. Pay attention to the Post-Recovery tasks instead of just thinking it’s done once the database is back.  I missed a couple of minor things there as well.
  4. Look into forgoing the built-in backup task altogether in favor of a ConfigMgr SQL Server maintenance plan.
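
Here’s a minimal sketch of the kind of copy step I mean for items 1 and 2 above, run either from a scheduled task or from afterbackup.bat itself. The server and share names are placeholders, not my real environment.

## Placeholder paths; point these at your real site server and backup share
$SiteInstall = '\\SITESERVER\c$\Program Files\Microsoft Configuration Manager'
$BackupShare = '\\BACKUPSERVER\ConfigMgrBackup\Extras'

## The client installer folder, including any ClientPatch hotfix folders
Copy-Item -Path "$SiteInstall\Client" -Destination "$BackupShare\Client" -Recurse -Force

## afterbackup.bat itself, which the built-in backup task does not capture
Copy-Item -Path "$SiteInstall\inboxes\smsbkup.box\afterbackup.bat" -Destination $BackupShare -Force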

 



Remove All Direct Membership Rules from a ConfigMgr Collection

Download this script on the Technet Script Repository

This script’s pretty self-explanatory.  I wrote it because I had a few collections that had a ton of direct membership rules and using the GUI was oh, so painful, as it usually is.

<#
.NOTES
Created on: 5/22/2014 10:56 AM
Created by: Adam Bertram
Filename: Remove-CMDirectMembershipRule.ps1
Credits: http://www.david-obrien.net/2013/02/24/remove-direct-membership-rules-configmgr/
.DESCRIPTION
This script removes all direct membership rules from a specified collection name
.EXAMPLE
.\Remove-CMDirectMembershipRule.ps1 -SiteCode 'CON' -SiteServer 'SERVERNAME' -CollectionName 'NAMEHERE'
.PARAMETER SiteCode
Your Configuration Manager site code
.PARAMETER SiteServer
Your Configuration Manager site server name
.PARAMETER CollectionName
The collection name you'd like to remove direct membership rules from
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $False,
ValueFromPipeline = $False,
ValueFromPipelineByPropertyName = $True)]
[string]$SiteCode = 'UHP',
[Parameter(Mandatory = $False,
ValueFromPipeline = $False,
ValueFromPipelineByPropertyName = $True)]
[string]$SiteServer = 'CONFIGMANAGER',
[Parameter(Mandatory = $True,
ValueFromPipeline = $True,
ValueFromPipelineByPropertyName = $True)]
[string]$CollectionName
)

begin {
try {
if ([Environment]::Is64BitProcess) {
# this script needs to run in an x86 shell, but we need to access the x64 reg-hive to get the AdminConsole install directory
throw 'This script must be run in an x86 shell.'
}
$ConfigMgrModule = "$($env:SMS_ADMIN_UI_PATH | Split-Path -Parent)\ConfigurationManager.psd1"
if (!(Test-Path $ConfigMgrModule)) {
throw 'Configuration Manager module not found in admin console path'
}
Import-Module $ConfigMgrModule

$BeforeLocation = (Get-Location).Path
} catch {
Write-Error $_.Exception.Message
}
}

process {
try {
Set-Location "$SiteCode`:"
$CommonWmiParams = @{
'ComputerName' = $SiteServer
'Namespace' = "root\sms\site_$SiteCode"
}
#HACK: This should be 1 WQL query using JOIN but it's not immediately obvious why that doesn't work
#Get-WmiObject -ComputerName $SiteServer -Namespace "ROOT\sms\site_$SiteCode" -Query "SELECT DISTINCT * FROM SMS_CollectionMember_a AS collmem JOIN SMS_Collection AS coll ON coll.CollectionID = collmem.CollectionID"

$CollectionId = Get-WmiObject @CommonWmiParams -Query "SELECT CollectionID FROM SMS_Collection WHERE Name = '$CollectionName'" | select -ExpandProperty CollectionID
if (!$CollectionId) {
throw "No collection found with the name $CollectionName"
}

## Find the collection members
$CollectionMembers = Get-WmiObject @CommonWmiParams -Query "SELECT Name FROM SMS_CollectionMember_a WHERE CollectionID = '$CollectionId'" | Select -ExpandProperty Name

if (!$CollectionMembers) {
Write-Warning 'No collection members found in collection'
} else {
foreach ($member in $CollectionMembers) {
Remove-CMDeviceCollectionDirectMembershipRule -CollectionID $CollectionID -ResourceName $member -force
}
}
} catch {
Write-Error $_.Exception.Message
}
}

end {
Set-Location $BeforeLocation
}



Software Updates Stuck in “AssignmentStateDetecting” Status

Today, it looks like I’m still running into issues relating to last week’s site system restore. For whatever reason, when attempting to deploy software updates, none of the clients could find a WSUS location. I kept seeing the line “AssignmentStateDetecting” in the UpdatesDeployment client log after repeated update deployment evals. I was also noticing that the ScanAgent.log client log file would stick on the line “Locations requested for ScanJobID…”. Let’s get to troubleshooting!

Under the Component Status monitoring node in the console, I went into the SMS_SITE_SYSTEM_STATUS_SUMMARIZER component and found it telling me “Site System Status Summarizer still cannot access storage object \\SUPPOINT\D$\SMS on site system…”. I also noticed that in the Configuration Manager Service Manager tool, right-clicking any component on that site server would fail. I immediately tried to browse to D:\ on SUPPOINT and it didn’t exist! Bingo! For some reason, that entire SMS directory was gone, but the site system was still there in the console.

At first, I attempted to remove the software update point and reinstall it, which did no good at all. The reason was that the entire site system role was hosed. So, I deleted all the roles from the site system (since this server only housed the software update point role) and re-added it again. After waiting a little while and getting impatient staring at “Site Component Manager detected that this component should be installed on this site system…” over and over, I stopped and restarted the SMS_SITE_COMPONENT_MANAGER service, which forced an install. After a little bit, the D:\SMS folder showed up on SUPPOINT.
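
If you’d rather bounce that service from your desk than RDP into the box, something like this does the trick. It’s just a sketch of the restart described above, and SITESERVER is a placeholder name.

## Restart the site component manager service on the primary site server (placeholder name)
$Svc = Get-Service -ComputerName 'SITESERVER' -Name 'SMS_SITE_COMPONENT_MANAGER'
$Svc.Stop()
$Svc.WaitForStatus('Stopped', '00:02:00')
$Svc.Start()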

I then initiated an update deployment scan on a client and saw “Job Error (0x87d00600)…”. Poking around in the All Software Updates node, I noticed all of my updates were now expired and software syncs from Microsoft were failing with “Failed to sync update…” in the wsyncmgr.log file on the server. Yay! Now I’m off to remove all the updates and download them again. *sigh*


All ConfigMgr Software Updates Expired After Site Restore

As I type, I’m in the middle of a major Configuration Manager problem. You can’t do too many update deployments when every single update in the database is expired! This is what it took to get my ConfigMgr software updates back up and going after a restore. According to my Microsoft support person, this isn’t the first time this has had to be done after a restore, so beware, ConfigMgr admins! I hope you never have to go through this.

If what I’m about to explain hoses your database, that’s in no way my fault. This was my solution for a database in which all the updates were expired and the WSUS sync was failing with CCISource error: 2. From my experience, this error code always has to do with some kind of corruption in the ConfigMgr database. Apparently, I was only still able to see the updates because their metadata was still available; the updates themselves were long gone anyway. I was hoping I wouldn’t have to go through an entire resync again and set up all update groups, packages and deployments, but alas, that was a false hope.

Here are the steps that Microsoft support and I went through to get my ConfigMgr software updates back up and going again:

  1. Kicked off the SMS_SITE_BACKUP component to start a backup, monitored via the smsbkup.log file and confirmed success.
  2. Ran SQL query
  3. select * from CI_ConfigurationItems
    
  4. Found the ModelName attribute by looking in the wsyncmgr.log file for the line “Referenced configuration items are not available yet:…“. On this line, you’ll find a “ModelName=Site_ABCDEF/SUM_XYZ” value. You need the site model name, which in this example is ABCDEF.
  5. Ran SQL queries to find all affected updates
  6. SELECT * FROM CI_DocumentStore WHERE Document_ID NOT IN (SELECT Document_ID FROM CI_CIDocuments)
    SELECT * FROM v_updatecis WHERE ModelName NOT LIKE '%ABCDEF/SUM%'
    
  7. Stopped the SMS_EXECUTIVE service, but in my case it did not stop, so the smsexec.exe process was killed.
  8. Stopped SMS_SITE_COMPONENT_MANAGER service
  9. Ran this SQL query, which came back with no rows affected
  10. UPDATE CI SET IsExpired=1 FROM v_updatecis ci where ci.citype_id in (1,8) and modelname like 'Site_ABCDEF/SUM_%'
    
  11. Started the SMS_EXECUTIVE service again.
  12. Removed the software update point role
  13. Checked SUPsetup.log on SUP server to confirm role removal
  14. Removed the Windows Server Update Services role and all subfeatures from the SUP server and rebooted
  15. Ran SQL queries to remove all expired updates
  16. update ci_configurationitems set isexpired=1 where citype_id in (1,8)
    delete Ci_configurationitemrelations where toci_id in (Select ci_id from ci_configurationitems where citype_id in (1,8))
    delete ci_assignmenttargetedCIs where ci_id in (Select ci_id from ci_configurationitems where citype_id in (1,8))
    
  17. The first UPDATE query usually takes a little while and, in my case, actually failed with the error:
  18. The DELETE statement conflicted with the REFERENCE constraint “ci_currenterrordetails_ciid_fk”. The conflict occurred in the database…..
    It looks like CIType_ID 8 was causing the foreign key constraint violation because

    delete ci_configurationitems where citype_id = 1

    worked on its own.

    A SQL engineer took over and created a temporary stored procedure to correct the error. Afterwards, the database was clean!

  19. Renamed the SMS, UpdateServicesPackages and WsusContent WSUS folders to .old on the server holding the software update point role
  20. Reinstalled WSUS on the SUP server with the same content directory as before
  21. Ran WSUS to perform the post-installation tasks
  22. Added the SUP role back again documenting all checked categories
  23. Unchecked all categories but a couple to speed up the initial sync
  24. Confirmed a successful software update point role install in SUPsetup.log
  25. Checked wsusctrl.log to ensure the SUP role is connecting to the WSUS server
  26. Went to the WSUS console –> Options –> Products and Classifications and unchecked all but a couple of categories to match the ConfigMgr categories
  27. Ensured the UpdateServicesPackages and WsusContent folders had been recreated on the SUP role server
  28. Started a manual ConfigMgr WSUS sync
  29. Removed all update groups, packages and deployments
  30. Checked wsyncmgr.log; the sync was successful and updates were showing up correctly in the console (a quick log-tailing sketch follows this list)
  31. Checked all product categories needed again and initiated another sync to get the rest of the updates into the database.
  32. Recreated all update groups, deployment packages and deployments.
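
A quick way to babysit the sync during steps 28 through 30 is to tail wsyncmgr.log while it runs. This is a tiny sketch assuming the default ConfigMgr log path on the site server; adjust it to your install folder.

## Watch the WSUS sync log as new lines are written (Ctrl+C to stop)
Get-Content -Path 'C:\Program Files\Microsoft Configuration Manager\Logs\wsyncmgr.log' -Tail 50 -Wait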


Unable to Connect to WSUS Database: Solved!

The saga continues… I still can’t get a successful ConfigMgr software update sync from my WSUS server. The latest issue: I couldn’t access the Windows Internal Database to run any kind of maintenance script. When trying to connect via either sqlcmd or Management Studio with my logged-in Windows credentials, I was receiving the error “The system cannot find the file specified” when attempting to connect with the server name np:\\.\pipe\MSSQL$MICROSOFT##SSEE\sql\query, and the error “Login failed” when connecting with the server name np:\\.\pipe\MICROSOFT\##WID\tsql\query. Also, event ID 18456 – “Token-based server access validation failed with an infrastructure error…” was being generated in the Application event log, along with “Login failed for user…. server is in script upgrade mode” in C:\Windows\WID\Log\error.log.

Microsoft support suspected problems with the WID so we had to get SQL access to it. To anyone that’s trying to access the WSUS Windows Internal Database via SQL Server Management Studio Express and cannot get connected, here’s what I had to go through:

    1. Download and install SQL Server Management Studio Express.
    2. Open up SQL Server Configuration Manager and go to Native Client 11.0 Configuration -> Client Protocols and ensure TCP/IP is enabled.
    3. Connect to the database on the command line via sqlcmd -S np:\\.\pipe\MICROSOFT\##WID\tsql\query
    4. Execute the query
      SELECT session_id from sys.dm_Exec_requests where status <> 'sleeping' and session_id <> @@spid
      SELECT @@spid --This gets the allowed user SPID
      sp_who 53 --The 53 was my @@spid. This is used to figure out the account that has access
      

For some reason, this fixed the connection issue and we were now able to connect!


Rebuild/Defragment Your WSUS Database Indexes

I’ve realized it’s important to keep your SQL indexes properly maintained. I’m no SQL DBA, but from what I hear, consistently ensuring your indexes are defragmented is a good thing. During my recent ConfigMgr software update/WSUS debacle, Microsoft support ran a handy little script on my WSUS server that I decided to share (a sqlcmd one-liner for running it against the WID follows the script). Use this at your own discretion; I’m in no way responsible for an imploded WSUS Windows Internal Database!

USE SUSDB; 
GO 
SET NOCOUNT ON; 
 
-- Rebuild or reorganize indexes based on their fragmentation levels 
DECLARE @work_to_do TABLE ( 
    objectid int 
    , indexid int 
    , pagedensity float 
    , fragmentation float 
    , numrows int 
) 
 
DECLARE @objectid int; 
DECLARE @indexid int; 
DECLARE @schemaname nvarchar(130);  
DECLARE @objectname nvarchar(130);  
DECLARE @indexname nvarchar(130);  
DECLARE @numrows int 
DECLARE @density float; 
DECLARE @fragmentation float; 
DECLARE @command nvarchar(4000);  
DECLARE @fillfactorset bit 
DECLARE @numpages int 
 
-- Select indexes that need to be defragmented based on the following 
-- * Page density is low 
-- * External fragmentation is high in relation to index size 
PRINT 'Estimating fragmentation: Begin. ' + convert(nvarchar, getdate(), 121)  
INSERT @work_to_do 
SELECT 
    f.object_id 
    , index_id 
    , avg_page_space_used_in_percent 
    , avg_fragmentation_in_percent 
    , record_count 
FROM  
    sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL , NULL, 'SAMPLED') AS f 
WHERE 
    (f.avg_page_space_used_in_percent < 85.0 and f.avg_page_space_used_in_percent/100.0 * page_count < page_count - 1) 
    or (f.page_count > 50 and f.avg_fragmentation_in_percent > 15.0) 
    or (f.page_count > 10 and f.avg_fragmentation_in_percent > 80.0) 
 
PRINT 'Number of indexes to rebuild: ' + cast(@@ROWCOUNT as nvarchar(20)) 
 
PRINT 'Estimating fragmentation: End. ' + convert(nvarchar, getdate(), 121) 
 
SELECT @numpages = sum(ps.used_page_count) 
FROM 
    @work_to_do AS fi 
    INNER JOIN sys.indexes AS i ON fi.objectid = i.object_id and fi.indexid = i.index_id 
    INNER JOIN sys.dm_db_partition_stats AS ps on i.object_id = ps.object_id and i.index_id = ps.index_id 
 
-- Declare the cursor for the list of indexes to be processed. 
DECLARE curIndexes CURSOR FOR SELECT * FROM @work_to_do 
 
-- Open the cursor. 
OPEN curIndexes 
 
-- Loop through the indexes 
WHILE (1=1) 
BEGIN 
    FETCH NEXT FROM curIndexes 
    INTO @objectid, @indexid, @density, @fragmentation, @numrows; 
    IF @@FETCH_STATUS < 0 BREAK; 
 
    SELECT  
        @objectname = QUOTENAME(o.name) 
        , @schemaname = QUOTENAME(s.name) 
    FROM  
        sys.objects AS o 
        INNER JOIN sys.schemas as s ON s.schema_id = o.schema_id 
    WHERE  
        o.object_id = @objectid; 
 
    SELECT  
        @indexname = QUOTENAME(name) 
        , @fillfactorset = CASE fill_factor WHEN 0 THEN 0 ELSE 1 END 
    FROM  
        sys.indexes 
    WHERE 
        object_id = @objectid AND index_id = @indexid; 
 
    IF ((@density BETWEEN 75.0 AND 85.0) AND @fillfactorset = 1) OR (@fragmentation < 30.0) 
        SET @command = N'ALTER INDEX ' + @indexname + N' ON ' + @schemaname + N'.' + @objectname + N' REORGANIZE'; 
    ELSE IF @numrows >= 5000 AND @fillfactorset = 0 
        SET @command = N'ALTER INDEX ' + @indexname + N' ON ' + @schemaname + N'.' + @objectname + N' REBUILD WITH (FILLFACTOR = 90)'; 
    ELSE 
        SET @command = N'ALTER INDEX ' + @indexname + N' ON ' + @schemaname + N'.' + @objectname + N' REBUILD'; 
    PRINT convert(nvarchar, getdate(), 121) + N' Executing: ' + @command; 
    EXEC (@command); 
    PRINT convert(nvarchar, getdate(), 121) + N' Done.'; 
END 
 
-- Close and deallocate the cursor. 
CLOSE curIndexes; 
DEALLOCATE curIndexes; 
 
 
IF EXISTS (SELECT * FROM @work_to_do) 
BEGIN 
    PRINT 'Estimated number of pages in fragmented indexes: ' + cast(@numpages as nvarchar(20)) 
    SELECT @numpages = @numpages - sum(ps.used_page_count) 
    FROM 
        @work_to_do AS fi 
        INNER JOIN sys.indexes AS i ON fi.objectid = i.object_id and fi.indexid = i.index_id 
        INNER JOIN sys.dm_db_partition_stats AS ps on i.object_id = ps.object_id and i.index_id = ps.index_id 
 
    PRINT 'Estimated number of pages freed: ' + cast(@numpages as nvarchar(20)) 
END 
GO 
 
 
--Update all statistics 
PRINT 'Updating all statistics.' + convert(nvarchar, getdate(), 121)  
EXEC sp_updatestats 
PRINT 'Done updating statistics.' + convert(nvarchar, getdate(), 121)  
GO
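
To actually run the script above against the WSUS Windows Internal Database, save it to a .sql file and feed it to sqlcmd over the WID named pipe from the previous post. The file path below is a placeholder, and the pipe name is the one from my environment, so adjust both to taste.

## Run the index maintenance script against the WID over its named pipe using a trusted connection
sqlcmd -S np:\\.\pipe\MICROSOFT\##WID\tsql\query -E -i C:\Scripts\WsusIndexMaintenance.sql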


Problems Downloading ConfigMgr Software Updates License Terms: SOLVED!

To finish off the marathon troubleshooting session I’ve had lately, I’ve finally fixed the last lingering software update sync problem. The problem was noticed by the following symptoms:

I was seeing this error in the wsyncmgr.log over and over:

Failed to sync update XXXXXXXXXXXXXXXXXXXXX. Error: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted. Source: Microsoft.UpdateServices.Internal.BaseApi.SoapExceptionProcessor.DeserializeAndThrow

These events were showing up in the Application event log:

Event ID: 2 - Content file download failed. Reason: Value does not fall within the expected range. Source File: /msdownload/update/v5/eula/xpsp2eula_chs.txt Destination File: D:\Sources\WsusContent\3F\257205304B7517CD24386B2D997F74743DC6E73F.txt.
Event ID: 7 - The server is failing to download some updates.

These lines were constantly coming up in C:\Program Files\Update Services\LogFiles\SoftwareDistribution.log:

WsusService.15	ContentSyncAgent.SetProxySettings	Value does not fall within the expected range.
WsusService.15	EventLogEventReporter.ReportEvent	EventId=364,Type=Error,Category=Synchronization,Message=Content file download failed. Reason: Value does not fall within the expected range. Source File: /msdownload/update/v5/eula/officefilevalidation_eula_bg-bg-950bcb06-a2aa-4b25-bae4-2c198be23859.txt Destination File: D:\Sources\WsusContent\C5\CC06CFA16ED105D62526DBDD724A27F0A6A57FC5.txt.
Warning	w3wp.3	SoapExceptionProcessor.SerializeAndThrow	Discarding stack trace for user APOLLO\SERVER$, IP Address xxx.xxx.xxx.xxx, exception System.InvalidOperationException: The Microsoft Software License Terms have not been completely downloaded and cannot be accepted.
Warning	w3wp.40	SoapUtilities.CreateException	ThrowException: actor = http://wsusserver:8530/ClientWebService/client.asmx, ID=9e952ee5-8337-4f24-a76f-75c212cd27ba, ErrorCode=InvalidCookie, Message=, Client=?

Microsoft support couldn’t easily fix this and tried to force me to open a new ticket. Since the client used a proxy for all Internet traffic, they blamed it on the proxy. I refused to open another ticket and decided to look into it myself. Luckily, after just a few minutes of Googling, I came across the KB2838998 hotfix. I downloaded and installed it, rebooted the server with the SUP role on it, tried the sync again and it worked like a champ!
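
For reference, installing a downloaded .msu like that hotfix can be scripted with wusa.exe. The file name and path below are placeholders for whatever the KB2838998 download ends up being called on your system.

## Install the hotfix silently and hold the reboot so it can be scheduled manually
wusa.exe 'C:\Temp\KB2838998.msu' /quiet /norestart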

I sure as hell hope my days go back to normal next week! I didn’t get to write any cool Powershell scripts!


CMClient Module: Invoke Common ConfigMgr Client Actions via Powershell

UPDATE 06/13/14 – Added Enable/Disable-CMClientDebugLogging functions.

I’m pleased to share my most recent module: CMClient! This module groups together a lot of the common actions I tend to perform on ConfigMgr clients. At this point, it only has functions to kick off common client triggers. These triggers are Machine Policy Download, Discovery Data Cycle, Compliance Evaluation, Application Deployment Evaluation, Hardware Inventory, Software Inventory, Update Deployment Evaluation and Update Scan.

Each of these functions is simply an easy way to trigger client schedule IDs. Each one feeds into the function that does the actual work: Invoke-CMClientAction. If you specify -AsJob on any function, that gets passed to Initialize-CMClientJob. I’ve tested these functions on nearly all of my 5,000 clients so far, so I believe the module is fairly bug-free (a short usage example follows the module code below). If you notice any problems, please let me know.

I intend to add a lot more functions to this at a later point in time. Functions like Set-CMClientBusinessHours, Get-CMClientBusinessHours, Disable-CMClientBusinessHours, Get-CMClientApplicationDeploymentState and Get-CMClientUpdateDeploymentState are coming soon! I’ve got those functions done, but they haven’t been prettified yet. :)

For now, I’m just releasing the PSM1 file. When I get the manifest done and all of the pretty stuff available, I’ll be sure to update this post.

Enjoy!

<#	
	===========================================================================
	 Created with: 	SAPIEN Technologies, Inc., PowerShell Studio 2014 v4.1.58
	 Created on:   	6/9/2014 1:57 PM
	 Created by:   	Adam Bertram
	 Filename:     	CMClient.psm1
	-------------------------------------------------------------------------
	 Module Name: CMClient
	===========================================================================
#>
#region Invoke-CMClientAction
<#
.SYNOPSIS
	This is a helper function that initiates many ConfigMgr client actions.
.DESCRIPTION

.PARAMETER  Computername
	The system you'd like to initate the action on.
.PARAMETER  AsJob
	A switch parameter that initates a job in the background
.PARAMETER  ClientAction
	The client action to initiate.
.EXAMPLE
	PS C:\> Invoke-CMClientAction -Computername 'Value1' -AsJob
	This example shows how to call the Invoke-CMClientAction function with named parameters.
.NOTES
#>
function Invoke-CMClientAction {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true)]
		[string]$Computername,
		[Parameter(Mandatory = $true)]
		[ValidateSet('MachinePolicy',
			'DiscoveryData',
			'ComplianceEvaluation',
			'AppDeployment', 
			'HardwareInventory',
			'UpdateDeployment',
			'UpdateScan',
			'SoftwareInventory')]
		[string]$ClientAction,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		try {
			$ScheduleIDMappings = @{
				'MachinePolicy' = '{00000000-0000-0000-0000-000000000021}';
				'DiscoveryData' = '{00000000-0000-0000-0000-000000000003}';
				'ComplianceEvaluation' = '{00000000-0000-0000-0000-000000000071}';
				'AppDeployment' = '{00000000-0000-0000-0000-000000000121}';
				'HardwareInventory' = '{00000000-0000-0000-0000-000000000001}';
				'UpdateDeployment' = '{00000000-0000-0000-0000-000000000108}';
				'UpdateScan' = '{00000000-0000-0000-0000-000000000113}';
				'SoftwareInventory' = '{00000000-0000-0000-0000-000000000002}';
			}
			$ScheduleID = $ScheduleIDMappings[$ClientAction]
		} catch {
			Write-Error $_.Exception.Message
		}
		
	}
	Process {
		try {
			## $args[0] represents the computername and $args[1] represents the scheduleID
			$ActionScriptBlock = {
				[void] ([wmiclass] "\\$($args[0])\root\ccm:SMS_Client").TriggerSchedule($args[1]);
				if (!$?) {
					throw "Failed to initiate a $ClientAction on $($args[0])"
				}
			}
			
			if ($AsJob.IsPresent) {
				$Params = @{
					'Computername' = $Computername;
					'OriginatingFunction' = $ClientAction;
					'ScriptBlock' = $ActionScriptBlock;
					'ScheduleID' = $ScheduleID
				}
				Initialize-CMClientJob @Params
			} else {
				Invoke-Command -ScriptBlock $ActionScriptBlock -ArgumentList $Computername,$ScheduleID
			}
		} catch {
			Write-Error $_.Exception.Message
		}
		
	}
	End {
		
	}
}
#endregion

#region Initialize-CMClientJob
<#
.SYNOPSIS
	This is a helper function that starts and manages background jobs for the all
	functions in this module.
.DESCRIPTION

.PARAMETER  OriginatingFunction
	The function where the job request came from.  This is used to keep track of
	which background jobs were started by which function.
.PARAMETER  ScriptBlock
	This is the scriptblock that is passed to the system.
.PARAMETER  Computername
	The computer name that the function is connecting to.  This is used to keep track
	of the functions initiated on computers.
.PARAMETER  ScheduleID
	This is the schedule ID to designate which client action to initiate.  This is needed
	because it has to be sent to the background job scriptblock.
.EXAMPLE

.NOTES
#>
function Initialize-CMClientJob {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true)]
		[string]$OriginatingFunction,
		[Parameter(Mandatory = $true)]
		[scriptblock]$ScriptBlock,
		[Parameter(Mandatory = $true)]
		[string]$Computername,
		[Parameter(Mandatory = $true)]
		[string]$ScheduleID
	)
	
	Begin {
		## The total number of jobs that can be concurrently running
		$MaxJobThreads = 10
		## How long to wait when the max job threads has been met to start another job
		$JobWaitSecs = 1
	}
	Process {
		try {
			Write-Verbose "Starting job `"$ComputerName - $OriginatingFunction`"..."
			Start-Job -ScriptBlock $ScriptBlock -Name "$ComputerName - $OriginatingFunction" -ArgumentList $Computername, $ScheduleID | Out-Null
			While ((Get-Job -state running).count -ge $MaxJobThreads) {
				Write-Verbose "Maximum job threshold of $MaxJobThreadsget-db has been met.  Waiting $JobWaitSecs second(s) to try again...";
				Start-Sleep -Seconds $JobWaitSecs
			}			
		} catch {
			Write-Error $_.Exception.Message
		}
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientMachinePolicyDownload
<#
.SYNOPSIS
	This function invokes a machine policy download on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the machine policy download on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientMachinePolicyDownload -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientMachinePolicyDownload {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[alias('Name')]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername'	= $Computername;
			'ClientAction'  = 'MachinePolicy';
			'AsJob'			= $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientDiscoveryDataCycle
<#
.SYNOPSIS
	This function invokes a DDR cycle on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientDiscoveryDataCycle -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientDiscoveryDataCycle {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'DiscoveryData';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientComplianceEvaluation
<#
.SYNOPSIS
	This function invokes a compliance evaluation on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientComplianceEvaluation -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientComplianceEvaluation {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'ComplianceEvaluation';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientApplicationDeploymentEvaluation
<#
.SYNOPSIS
	This function invokes an application deployment eval on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientApplicationDeploymentEvaluation -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientApplicationDeploymentEvaluation {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'AppDeployment';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientHardwareInventory
<#
.SYNOPSIS
	This function invokes a hardware inventory cycle on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientHardwareInventory -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientHardwareInventory {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'HardwareInventory';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientUpdateDeploymentEvaluation
<#
.SYNOPSIS
	This function invokes an update deployment eval on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientUpdateDeploymentEvaluation -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientUpdateDeploymentEvaluation {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'UpdateDeployment';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientUpdateScan
<#
.SYNOPSIS
	This function invokes an update scan eval on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientUpdateScan -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientUpdateScan {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'UpdateScan';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Invoke-CMClientSoftwareInventory
<#
.SYNOPSIS
	This function invokes a software inventory scan on a ConfigMgr client
.DESCRIPTION

.PARAMETER  Computername
	The name of the system you'd like to invoke the action on
.PARAMETER  AsJob
	Specify this parameter if you'd like to run this as a background job.
.EXAMPLE
	PS C:\> Invoke-CMClientSoftwareInventory -Computername 'Value1' -AsJob
.NOTES

#>
function Invoke-CMClientSoftwareInventory {
	[CmdletBinding()]
	param
	(
		[Parameter(Mandatory = $true,
				   ValueFromPipeline = $true,
				   ValueFromPipelineByPropertyName = $true)]
		[string]$Computername,
		[Parameter()]
		[switch]$AsJob
	)
	
	Begin {
		
	}
	Process {
		$Params = @{
			'Computername' = $Computername;
			'ClientAction' = 'SoftwareInventory';
			'AsJob' = $AsJob.IsPresent
		}
		Invoke-CMClientAction @Params
	}
	End {
		
	}
}
#endregion

#region Set-CMClientDebugLogging
<#
.NOTES
	 Created on:   	5/23/2014 5:27 PM
	 Created by:   	Adam Bertram
	 Filename:      Set-CMClientDebugLogging.ps1
.DESCRIPTION
	This simple script either enables or disables ConfigMgr client debug logging on a remote computer
.EXAMPLE
	.\Set-CMClientDebugLogging.ps1 -Computername COMPUTERNAME -Enable
.EXAMPLE
	.\Set-CMClientDebugLogging.ps1 -Computername COMPUTERNAME -Disable
.PARAMETER Computername
	The computer name you'd like to connect to
.PARAMETER Enable
	Add this parameter to enable debug logging
.PARAMETER Disable
	Add this parameter to disable debug logging
#>
function Set-CMClientDebugLogging {
	[CmdletBinding(DefaultParameterSetName = 'Enable')]
	param (
		[Parameter(Mandatory = $True,
				   ValueFromPipeline = $True,
				   ValueFromPipelineByPropertyName = $True)]
		[ValidateScript({ Test-Connection $_ -Count 1 -Quiet })]
		[string[]]$Computername,
		[Parameter(ParameterSetName = 'Enable',
				   Mandatory = $True,
				   ValueFromPipeline = $True,
				   ValueFromPipelineByPropertyName = $True)]
		[switch]$Enable,
		[Parameter(ParameterSetName = 'Disable',
				   Mandatory = $True,
				   ValueFromPipeline = $True,
				   ValueFromPipelineByPropertyName = $True)]
		[switch]$Disable
	)
	
	begin {
		try {
			$Registry = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $Computername)
			$LogLevel = @{ $true = 0; $false = 1 }[$Enable.IsPresent]
		} catch {
			Write-Error $_.Exception.Message
		}
	}
	
	process {
		try {
			$RegistryKey = $Registry.OpenSubKey("SOFTWARE\Microsoft\CCM\Logging\@Global", $true)
			$RegistryKey.SetValue("Loglevel", $LogLevel, [Microsoft.Win32.RegistryValueKind]::Dword)
			
			if ($Disable.IsPresent) {
				## Delete the enabled value rather than the entire Enabled key. I am not deleting the
				## entire key to enable this line to be backward compatible with XP.
				$RegistryKey = $Registry.OpenSubKey("SOFTWARE\Microsoft\CCM\Logging\Enabled", $true)
				$RegistryKey.DeleteValue('Enabled')
			} else {
				$RegistryKey = $Registry.OpenSubKey("SOFTWARE\Microsoft\CCM\Logging", $true)
				$NewKey = $RegistryKey.CreateSubKey('Enabled')
				$NewKey.Close()
				$RegistryKey = $Registry.OpenSubKey("SOFTWARE\Microsoft\CCM\Logging\Enabled", $true)
				$RegistryKey.SetValue('Enabled', 'True', [Microsoft.Win32.RegistryValueKind]::String)
			}
			
		} catch {
			Write-Error $_.Exception.Message
		}
	}
}
#endregion
	
#region Enable-CMClientDebugLogging
<#
.NOTES
	 Created on:   	5/23/2014 5:27 PM
	 Created by:   	Adam Bertram
	 Filename:      Enable-CMClientDebugLogging.ps1
.DESCRIPTION
	This simple script enables ConfigMgr client debug logging on a remote computer. This function
	is just a wrapper for Set-CMClientDebugLogging
.EXAMPLE
	.\Enable-CMClientDebugLogging.ps1 -Computername COMPUTERNAME
.PARAMETER Computername
	The computer name you'd like to connect to
#>
function Enable-CMClientDebugLogging {
	[CmdletBinding(DefaultParameterSetName = 'Enable')]
	param (
		[Parameter(Mandatory = $True,
				   ValueFromPipeline = $True,
				   ValueFromPipelineByPropertyName = $True)]
		[ValidateScript({ Test-Connection $_ -Count 1 -Quiet })]
		[string[]]$Computername
	)
	
	process {
		try {
			Set-CMClientDebugLogging -Computername $Computername -Enable
		} catch {
			Write-Error $_.Exception.Message
		}
	}
}
#endregion
		
#region Disable-CMClientDebugLogging
<#
.NOTES
	 Created on:   	5/23/2014 5:27 PM
	 Created by:   	Adam Bertram
	 Filename:      Disable-CMClientDebugLogging.ps1
.DESCRIPTION
	This simple script disables ConfigMgr client debug logging on a remote computer. This function
	is just a wrapper for Set-CMClientDebugLogging
.EXAMPLE
	.\Disable-CMClientDebugLogging.ps1 -Computername COMPUTERNAME
.PARAMETER Computername
	The computer name you'd like to connect to
#>
function Disable-CMClientDebugLogging {
	[CmdletBinding(DefaultParameterSetName = 'Enable')]
	param (
		[Parameter(Mandatory = $True,
				   ValueFromPipeline = $True,
				   ValueFromPipelineByPropertyName = $True)]
		[ValidateScript({ Test-Connection $_ -Count 1 -Quiet })]
		[string[]]$Computername
	)
	
	process {
		try {
			Set-CMClientDebugLogging -Computername $Computername -Disable
		} catch {
			Write-Error $_.Exception.Message
		}
	}
}
#endregion

Export-ModuleMember Enable-CMClientDebugLogging
Export-ModuleMember Disable-CMClientDebugLogging
Export-ModuleMember Invoke-CMClientUpdateScan
Export-ModuleMember Invoke-CMClientUpdateDeploymentEvaluation
Export-ModuleMember Invoke-CMClientHardwareInventory
Export-ModuleMember Invoke-CMClientApplicationDeploymentEvaluation
Export-ModuleMember Invoke-CMClientComplianceEvaluation
Export-ModuleMember Invoke-CMClientDiscoveryDataCycle
Export-ModuleMember Invoke-CMClientMachinePolicyDownload
Export-ModuleMember Invoke-CMClientSoftwareInventory
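
Here’s a quick usage sketch, assuming you’ve saved the above as CMClient.psm1. The computer names are placeholders.

Import-Module .\CMClient.psm1

## Kick off a machine policy download on a single client
Invoke-CMClientMachinePolicyDownload -Computername 'PC01'

## Queue hardware inventory on several clients as background jobs and collect the output
'PC01', 'PC02' | Invoke-CMClientHardwareInventory -AsJob
Get-Job | Wait-Job | Receive-Job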

Download this script on the Technet Script Repository



Convert ConfigMgr Application to Package/Program

UPDATE 07/11/14 – Added the ability to convert applications with multiple deployment types. The script will now create a package for each deployment type.

Here’s a script I recently wrote to solve a necessary problem. At my client, we were seeing way too many flaky problems with using applications inside OSD task sequences. The same programs would work just fine in a package, so it was decided that no more applications would go inside OSD task sequences. This script, based on David O’Brien’s script, was written not only to convert all of the existing applications but also to be used in a larger application provisioning script that creates an application and immediately converts it to a package for OSD (a bulk-conversion example follows the script).

Download this script on the Technet Script Repository

#Requires -Version 3

<#
.SYNOPSIS
 	This creates a ConfigMgr package/program for each deployment type from the attributes of a ConfigMgr application.
.DESCRIPTION
	This reads a ConfigMgr application and gathers all needed attributes from the application itself and its
	deployment types.  It then creates a package/program for each deployment type in the application.  The resulting
	package(s) will be named $ApplicationName - $DeploymentTypeName" with each program being named $DeploymentTypeName.
.NOTES
	Created on: 07/03/2014
	Created by: Adam Bertram
	Filename:   Convert-CMApplicationToPackage.ps1
	Credits:    http://www.david-obrien.net/2014/01/24/convert-configmgr-applications-packages-powershell/
	Todos:	    If the app has a product code, use this for installation program management in the program
		    Create an option to distribute the package to DPs after creation
.DESCRIPTION
 	This gets all common attributes of a ConfigMgr application that a ConfigMgr package/program has and uses
	these attributes to create a new packages with a program inside for each application deployment type.
.EXAMPLE
    .\Convert-CMApplicationToPackage.ps1 -ApplicationName 'Application 1'
	This example converts the application "Application 1" into a package called "Application 1" and a program
	called "Install Application 1" if it has a single deployment type.
.EXAMPLE
     .\Convert-CMApplicationToPackage.ps1 -ApplicationName 'Application 1' -SkipRequirements
	This example converts the application "Application 1" into a package called "Application 1" and a program
	called Install "Application 1" excluding disk space and OS requirements if it has a single deployment type.
.PARAMETER ApplicationName
 	This is the name of the application you'd like to convert.
.PARAMETER SkipRequirements
	Use this switch parameter if you don't want to bring over any disk or OS requirements from the application.
.PARAMETER OsdFriendlyPowershellSyntax
	Use this switch parameter to convert any program that's simply a reference to a PS1 file that normally works
	in a non-OSD environment to a full powershell syntax using powershell.exe.
.PARAMETER SiteServer
	The ConfigMgr site server name
.PARAMETER SiteCode
	The ConfigMgr site code.
#>
[CmdletBinding()]
param (
	[Parameter(Mandatory,
			   ValueFromPipeline,
			   ValueFromPipelineByPropertyName)]
	[string]$ApplicationName,
	[switch]$SkipRequirements,
	[switch]$OsdFriendlyPowershellSyntax,
	[string]$SiteServer = 'YOURSITESERVER',
	[string]$SiteCode = 'YOURSITECODE'
)

begin {
	try {
		## This helper function gets all of the supported platform objects that's supported for creating OS requirements for a package,
		## looks for a match between each CI_UniqueID and the OS string and if there's a match, creates a new lazy property instance
		## populates the necessary values and returns an array of objects that can be used to populated the SupportOperatingSystemPlatforms
		## lazy WMI property on the SMS_Program object.
		function New-SupportedOsObject([Microsoft.SystemsManagementServer.DesiredConfigurationManagement.Rules.Rule]$OsRequirement) {
			$SupportedPlatforms = Get-WmiObject -ComputerName $SiteServer -Class SMS_SupportedPlatforms -Namespace "root\sms\site_$SiteCode"
			$SupportedOs = @()
			## Define the array of OS strings to convert to objects
			if ($OsRequirement.Expression.Operator.OperatorName -eq 'OneOf') {
				$AppOsList = $OsRequirement.Expression.Operands.RuleId
			} elseif ($OsRequirement.Expression.Operator.OperatorName -eq 'NoneOf') {
				## TODO: Query the site server for all possible operating system values and remove all OSes in
				## $OsRequirement.DeploymentTypes[0].Requirements[0].Expression.Operands.RuleId
				return $false
			}
			foreach ($AppOs in $AppOsList) {
				foreach ($OsDetail in $SupportedPlatforms) {
					if ($AppOs -eq $OsDetail.CI_UniqueId) {
						$instance = ([wmiclass]("\\$SiteServer\root\sms\site_$SiteCode`:SMS_OS_Details")).CreateInstance()
						if ($instance -is [System.Management.ManagementBaseObject]) {
							$instance.MaxVersion = $OsDetail.OSMaxVersion
							$instance.MinVersion = $OsDetail.OSMinVersion
							$instance.Name = $OsDetail.OSName
							$instance.Platform = $OsDetail.OSPlatform
							$SupportedOs += $instance
						}
					}
				}
			}
			$SupportedOs
		}
		
		if (!(Test-Path "$(Split-Path $env:SMS_ADMIN_UI_PATH -Parent)\ConfigurationManager.psd1")) {
			throw 'Configuration Manager module not found.  Is the admin console installed?'
		} elseif (!(Get-Module 'ConfigurationManager')) {
			Import-Module "$(Split-Path $env:SMS_ADMIN_UI_PATH -Parent)\ConfigurationManager.psd1"
		}
		$Location = (Get-Location).Path
		Set-Location "$($SiteCode):"
		
		$Application = Get-CMApplication -Name $ApplicationName
		if (!$Application) {
			throw "$ApplicationName not found"
		}
		$ApplicationXML = [Microsoft.ConfigurationManagement.ApplicationManagement.Serialization.SccmSerializer]::DeserializeFromString($Application.SDMPackageXML)
		
		$SetProgramProps = @{ }
	} catch {
		Write-Error $_.Exception.Message
		exit
	}
}
process {
	try {
		$DeploymentTypes = $ApplicationXML.DeploymentTypes
		
		for ($i = 0; $i -lt $DeploymentTypes.Count; $i++) {
			$PackageName = "$ApplicationName - $($ApplicationXML.DeploymentTypes[$i].Title)"
			$ProgramName = $ApplicationXML.DeploymentTypes[$i].Title
			
			if (Get-CMPackage -Name $PackageName) {
				throw "$PackageName already exists"
			}
			
			$PackageProps = @{
				'Name' = $PackageName;
				'Version' = $ApplicationXML.SoftwareVersion;
				'Manufacturer' = $ApplicationXML.Publisher;
				'Path' = $ApplicationXML.DeploymentTypes[$i].Installer.Contents.Location;
			}
			
			## 07/03/2014 - Even though the New-CMProgram documentation leads you to believe you can use a string for the RunType
			## param, it won't work.  You must use the [Microsoft.ConfigurationManagement.Cmdlets.AppModel.Commands.RunType] object.
			$NewProgramProps = @{
				'StandardProgramName' = $ProgramName;
				'PackageName' = $PackageName;
				'RunType' = [Microsoft.ConfigurationManagement.Cmdlets.AppModel.Commands.RunType]::($ApplicationXML.DeploymentTypes[$i].Installer.UserInteractionMode)
			}
			
			$AppCmdLine = $ApplicationXML.DeploymentTypes[$i].Installer.InstallCommandLine
			## If the command line is simply a reference to a single PS1 file
			if ($OsdFriendlyPowershellSyntax.IsPresent -and ($AppCmdLine -match '^\w+\.ps1$')) {
				$NewProgramProps.CommandLine = "powershell.exe -ExecutionPolicy bypass -NoProfile -NoLogo -NonInteractive -File $AppCmdLine"
			} else {
				$NewProgramProps.CommandLine = $ApplicationXML.DeploymentTypes[$i].Installer.InstallCommandLine
			}
			
			$SetProgramProps = @{
				'EnableTaskSequence' = $true;
				'StandardProgramName' = $ProgramName;
				'Name' = $PackageName;
			}
			
			## 07/03/2014 - Due to a bug in the New-CMprogram cmdlet, even though 15 min or 720 min is allowed via the GUI
			## for the max run time, it doesn't work via the New-CMProgram cmdlet.  To compensate, I'm adding or removing
			## 1 minute and this works.
			$Duration = $ApplicationXML.DeploymentTypes[$i].Installer.MaxExecuteTime
			if ($Duration -eq 15) {
				$Duration = $Duration + 1
			} elseif ($Duration -eq 720) {
				$Duration = $Duration - 1
			}
			$NewProgramProps.Duration = $Duration
			
			if (!$SkipRequirements.IsPresent) {
				$Requirements = $ApplicationXML.DeploymentTypes[$i].Requirements
				$RequirementExpressions = $Requirements.Expression
				$FreeSpaceRequirement = $RequirementExpressions | where { ($_.Operands.LogicalName -contains 'FreeDiskSpace') -and ($_.Operator.OperatorName -eq 'GreaterEquals') }
				if ($FreeSpaceRequirement) {
					$NewProgramProps.DiskSpaceRequirement = $FreeSpaceRequirement.Operands.value / 1MB
					$NewProgramProps.DiskSpaceUnit = 'MB'
				}
			}
			
			switch ($ApplicationXML.DeploymentTypes[$i].Installer.RequiresLogon) {
				$false {
					$NewProgramProps.ProgramRunType = 'OnlyWhenNoUserIsLoggedOn'
				}
				$true {
					$NewProgramProps.ProgramRunType = 'OnlyWhenUserIsLoggedOn'
				}
				default {
					$NewProgramProps.ProgramRunType = 'WhetherOrNotUserIsLoggedOn'
				}
			}
			
			if ($ApplicationXML.DeploymentTypes[$i].Installer.UserInteractionMode -eq 'Hidden') {
				$SetProgramProps['SuppressProgramNotifications'] = $true
			}
			
			if ($ApplicationXML.DeploymentTypes[$i].Installer.SourceUpdateCode) {
				##TODO: Look into setting installation source management on the package
			}
			
			$PostInstallBehavior = $ApplicationXML.DeploymentTypes[$i].Installer.PostInstallBehavior
			if (($PostInstallBehavior -eq 'BasedOnExitCode') -or ($PostInstallBehavior -eq 'NoAction')) {
				$SetProgramProps.AfterRunningType = 'NoActionRequired'
			} elseif ($PostInstallBehavior -eq 'ProgramReboot') {
				$SetProgramProps.AfterRunningType = 'ProgramControlsRestart'
			} elseif ($PostInstallBehavior -eq 'ForceReboot') {
				$SetProgramProps.AfterRunningType = 'ConfigurationManagerRestartsComputer'
			}
			
			$NewPackage = New-CMPackage @PackageProps
			Write-Verbose "Successfully created package name $($NewPackage.Name) ($($NewPackage.PackageID))"
			$NewProgram = New-CMProgram @NewProgramProps
			Set-CMProgram @SetProgramProps
			
			if (!$SkipRequirements.IsPresent) {
				$OsRequirement = $Requirements | where { $_.Expression -is [Microsoft.SystemsManagementServer.DesiredConfigurationManagement.Expressions.OperatingSystemExpression] }
				if ($OsRequirement) {
					$SupportedOs = New-SupportedOsObject $OsRequirement
					$NewProgram.SupportedOperatingSystems = $SupportedOs
					$NewProgram.Put()
				}
			}
			Unlock-CMObject $NewPackage
		}
		
	} catch {
		Write-Error $_.Exception.Message
	}
}

end {
	Set-Location $Location
}
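
Here’s a quick bulk-conversion sketch, assuming the script is saved as Convert-CMApplicationToPackage.ps1. The application names and path are placeholders.

## Convert a few applications, rewriting plain .ps1 install commands into OSD-friendly syntax
'7-Zip 9.20', 'Notepad++ 6.6' | ForEach-Object {
	& 'C:\Scripts\Convert-CMApplicationToPackage.ps1' -ApplicationName $_ -OsdFriendlyPowershellSyntax
}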


How to Create a Robust ConfigMgr Backup with Powershell

Download this script on the Technet Script Repository

I’ve recently completed a high-priority task on my list this week, one created by my most recent forced ConfigMgr site restore.  After that went down, I’d been procrastinating on getting a solid ConfigMgr site backup going again. Previously, I was using the afterbackup.bat file to do some post-backup tasks: a combination of SSRS backups and daily folders. It was working great, although being a batch file it was pretty rudimentary. I’m not doing it that way anymore because, during my outage, I discovered that the afterbackup.bat file had been removed and I had no backup of it whatsoever. Awesome, right?!?

I decided to write a Powershell script that does everything that batch file did, with the robustness that Powershell can provide, like error handling and verification. Previously, I was limited to, well…uh…a batch file, but now I see a whole new light.

Any good ConfigMgr admin knows that every time the built-in site backup maintenance task runs, the site server executes the afterbackup.bat file located in the %CONFIGURATION_MANAGER_INSTALL_FOLDER%\inboxes\smsbkup.box folder. People have done all kinds of stuff with this batch file, as you can tell if you’ve clicked on the links above. I’m not down with batch files anymore; I’m monogamous now with my friend Powershell. Due to my monogamy, I’m just using afterbackup.bat to launch a Powershell script which does all the work. I think I’ve come up with a pretty slick solution.

After a lot of deliberation, I decided against using a SQL Server maintenance plan to back up the database. Third-party database restoration is supported now with ConfigMgr 2012 SP1 and I’ve heard good things about this approach, but it wasn’t for me. I decided to use the built-in site backup task for a single reason: I know the restore works great. I’ve never performed a restore from a database backed up with a maintenance plan, so I’d be in uncharted waters if I ever had to do another restore.

Getting a solid SCCM site backup requires a few prereqs first:

  1. You have the site backup maintenance task enabled and verified to be working in your ConfigMgr site.
  2. You have the site backup task configured to push off the backup to a remote server.  You aren’t backing it up to the site server, are you?
  3. You have this line in your afterbackup.bat file:
    powershell.exe -ExecutionPolicy Bypass -NoProfile -NoLogo -File %SOMEREMOTEPATH%\%POSTBACKUPSCRIPT%.PS1

Now that we have that out of the way, here’s what I’ve come up with. I’ve tried to make this as robust as possible, accounting for everything I can think of, but if you see something that should be changed, please let me know.

#Requires -Version 3

<#
.SYNOPSIS
	This script checks a ConfigMgr site backup to ensure success and performs various post-backup functions that 
	back up other critical data that the built-in ConfigMgr site backup task does not.
.DESCRIPTION
	This script checks any previous backup attempt that ran within the last hour of this script running for
	a successful run, if the backup check is selected.  It assumes you also have SSRS installed on the site DB server
	and backs up both SSRS databases, exports out the SSRS encryption keys, backs up the export file and the 
	entire ReportingServices folder on the server.  Once SSRS has been backed up, it then also copies the entire 
	SCCMContentLib folder, client install folder and the afterbackup.bat file to the destination backup folder path.  
	Once complete, it then attaches the log file it creates as part of the process and emails it out to the defined recipient.

	The script intends to create 7 days' worth of backups in a central location, labeled by day of the week,
	and places a copy of all backed-up components in each day's folder.
.NOTES
	Created on: 	6/13/2014
	Created by: 	Adam Bertram
	Filename:	Start-PostConfigMgrBackupSteps.ps1
	Credits:	http://bit.ly/1i24NgC
	Requirements:	ConfigMgr, Reporting Point installed on the site DB server
	Todos:		Use the Sync framework to only copy deltas (http://bit.ly/1nh3FmP)
			Backup custom updates added via SCUP
			Verify copies were actually successful
			Retrieve more params automatically
.EXAMPLE
	.\Start-PostConfigMgrBackupSteps.ps1
	This example uses all default parameters for the script which will be the most likely way it is executed.
.PARAMETER SiteCode
	The ConfigMgr site code that the site server is a part of
.PARAMETER SiteDbServer
	The Configmgr site server that has the Reporting Services Point and the Site database server role installed.
.PARAMETER DestDbBackupFolderPath
	The UNC root folder path where the days' backup folders will be copied to
.PARAMETER SrcReportingServicesFolderPath
	The UNC folder path where you have installed reporting services to on the server.
.PARAMETER ReportingServicesDbBackupSqlFilePath
	The UNC file path to the SQL file that is dynamically created (if not exists) that the script passes to the 
	sqlcmd utility to kick off a backup of the SSRS databases.  This does not have to exist.  It is recommended
	to allow the script to create this.
.PARAMETER ReportingServicesEncKeyPassword
	The password that's set on the exported SSRS keys
.PARAMETER SrcContentLibraryFolderPath
	The folder path to ConfigMgr's content library on the site server.  This folder is called SCCMContentLib. The
	default path probably does not need to be changed.
.PARAMETER SrcClientInstallerFolderPath
	The folder path where the ConfigMgr client install is located on the site server.  This is backed up if you
	have any hotfixes being installed with your clients and may be located in here.
.PARAMETER SrcAfterBackupFilePath
	The file path where the afterbackup.bat file is located.  You should not have to change this from the default.
.PARAMETER LogFilesFolderPath
	The folder path where the script will create a log file for each day it runs.
.PARAMETER CheckBackup
	Use this switch to first check to ensure a recent backup was successful.  This parameter is recommended
	when running inside the afterbackup.bat file.
#>
[CmdletBinding()]
param (
	[string]$SiteCode = 'YOURSITEHERE',
	[ValidateScript({Test-Connection $_ -Quiet -Count 1})]
	[string]$SiteDbServer = 'YOURSITESERVERHERE',
	[ValidateScript({ Test-Path $_ -PathType 'Container' })]
	[string]$DestDbBackupFolderPath = '\\YOUR\DESTINATION\BACKUP\FOLDER\PATH',
	[ValidateScript({ Test-Path $_ -PathType 'Container' })]
	[string]$SrcReportingServicesFolderPath = "\\$SiteDbServer\f$\Sql2012Instance\MSRS11.MSSQLSERVER\Reporting Services",
	[string]$ReportingServicesDbBackupSqlFilePath = "\\$SiteDbServer\c$\ReportingServicesDbBackup.sql",
	[string]$ReportingServicesEncKeyPassword = 'my_password',
	[ValidateScript({ Test-Path $_ -PathType 'Container' })]
	[string]$SrcContentLibraryFolderPath = "\\$SiteDbServer\f$\SCCMContentLib",
	[ValidateScript({ Test-Path $_ -PathType 'Container' })]
	[string]$SrcClientInstallerFolderPath = "\\$SiteDbServer\c$\Program Files\Microsoft Configuration Manager\Client",
	[ValidateScript({ Test-Path $_ -PathType 'Leaf' })]
	[string]$SrcAfterBackupFilePath = "\\$SiteDbServer\c$\Program Files\Microsoft Configuration Manager\inboxes\smsbkup.box\afterbackup.bat",
	[string]$LogFilesFolderPath = "$DestDbBackupFolderPath\Logs",
	[switch]$CheckBackup
)

begin {
	Set-StrictMode -Version Latest
	try {
		## This function builds a SQL file called $ReportingServicesDbBackupSqlFile that backs up both
		## reporting services databases to a subfolder called ReportsBackup under today's
		## destination backup folder
		function New-ReportingServicesBackupSqlFile($TodayDbDestFolderPath) {
			Add-Content -Value "declare @path1 varchar(100);
			declare @path2 varchar(100);
			SET @path1 = '$TodayDbDestFolderPath\ReportsBackup\ReportServer.bak';
			SET @path2 = '$TodayDbDestFolderPath\ReportsBackup\ReportServerTempDB.bak';
			
			USE ReportServer;
			BACKUP DATABASE REPORTSERVER TO DISK = @path1;
			BACKUP DATABASE REPORTSERVERTEMPDB TO DISK = @path2;
			DBCC SHRINKFILE(ReportServer_log);
			USE ReportServerTempDb;
			DBCC SHRINKFILE(ReportServerTempDB_log);" -Path $ReportingServicesDbBackupSqlFilePath
		}
		
		function Convert-ToLocalFilePath($UncFilePath) {
			## Converts a UNC admin-share path (\\SERVER\c$\folder\file) to a local path (c:\folder\file)
			$Split = $UncFilePath.Split('\')
			$FileDrive = $Split[3].TrimEnd('$')
			$Filename = $Split[-1]
			$FolderPath = $Split[4..($Split.Length - 2)] -join '\'
			if ($Split.Count -eq 5) {
				"$FileDrive`:\$Filename"
			} else {
				"$FileDrive`:\$FolderPath\$Filename"
			}
		}
		
		Function Get-LocalTime($UTCTime) {
			$strCurrentTimeZone = (Get-WmiObject win32_timezone).StandardName
			$TZ = [System.TimeZoneInfo]::FindSystemTimeZoneById($strCurrentTimeZone)
			$LocalTime = [System.TimeZoneInfo]::ConvertTimeFromUtc($UTCTime, $TZ)
			$LocalTime
		}
		
		if (!(Test-Path $LogFilesFolderPath)) {
			New-Item -Path $LogFilesFolderPath -Type Directory | Out-Null
		}
		$script:MyDate = Get-Date -Format 'MM-dd-yyyy'
		$script:LogFilePath = "$LogFilesFolderPath\$MyDate.log"
		
		## Simple logging function to create a log file in $LogFilesFolderPath named today's
		## date then write a timestamp and the message on each line and outputs the log file
		## path it wrote to
		function Write-Log($Message) {
			$MyDateTime = Get-Date -Format 'MM-dd-yyyy H:mm:ss'
			Add-Content -Path $script:LogFilePath -Value "$MyDateTime - $Message"
		}
		
		$DefaultBackupFolderPath = "$DestDbBackupFolderPath\$SiteCode" + 'Backup'
		if (!(Test-Path $DefaultBackupFolderPath)) {
			throw "Default backup folder path $DefaultBackupFolderPath does not exist"
		}
		
		if ($CheckBackup.IsPresent) {
			## Ensure the backup was successful before doing post-backup tasks
			
			## $DefaultBackupFolderPath is the path where the builtin Site Backup SQL maintenance task places the
			## backup. Ensure it has today's write time before going further because if not then the backup
			## didn't run successfully
			$BackupFolderLastWriteDate = (Get-ItemProperty $DefaultBackupFolderPath).Lastwritetime.Date
			
			$SuccessMessageId = 5035
			$OneHourAgo = (Get-Date).AddHours(-1)
			Write-Log "One hour ago detected as $OneHourAgo"
			
			$WmiParams = @{
				'ComputerName' = $SiteDbServer;
				'Namespace' = "root\sms\site_$SiteCode";
				'Class' = 'SMS_StatusMessage';
				'Filter' = "Component = 'SMS_SITE_BACKUP' AND MessageId = '$SuccessMessageId'"
			}
			$LastSuccessfulBackup = (Get-WmiObject @WmiParams | sort time -Descending | select -first 1 @{ n = 'DateTime'; e = { $_.ConvertToDateTime($_.Time) } }).DateTime
			$LastSuccessfulBackup = Get-LocalTime $LastSuccessfulBackup
			Write-Log "Last successful backup detected on $LastSuccessfulBackup"
			$IsBackupSuccessful = $LastSuccessfulBackup -gt $OneHourAgo
			
			if (($BackupFolderLastWriteDate -ne (get-date).date) -or !$IsBackupSuccessful) {
				throw 'The backup was not successful. Post-backup procedures not necessary'
			}
		}
		
		$CommonCopyFolderParams = @{
			'Recurse' = $true;
			'Force' = $true;
		}
		
	} catch {
		Write-Log "ERROR: $($_.Exception.Message)"
		exit (10)
	}
}

process {
	try {
		## If today's folder exists in the root of the backup folder path
		## remove it else create a new one
		$Today = (Get-Date).DayOfWeek
		$TodayDbDestFolderPath = "$DestDbBackupFolderPath\$Today"
		if ((Test-Path $TodayDbDestFolderPath -PathType 'Container')) {
			Remove-Item $TodayDbDestFolderPath -Force -Recurse
			Write-Log "Removed $TodayDbDestFolderPath..."
		}
		
		## Rename the default backup folder to today's day of the week
		Rename-Item $DefaultBackupFolderPath $Today
		## Create the folder to put the reporting services database backups in
		New-Item -Path "$TodayDbDestFolderPath\ReportsBackup" -ItemType Directory | Out-Null
		
		## if the SQL file that gets invoked to backup the SSRS databases isn't in the root of
		## C on the site server, build it.  The root of C isn't necessary.  It just needs to be
		## somewhere on the local server
		if (Test-Path $ReportingServicesDbBackupSqlFilePath) {
			Remove-Item $ReportingServicesDbBackupSqlFilePath -Force
		}
		New-ReportingServicesBackupSqlFile $TodayDbDestFolderPath
		Write-Log "Created new SQL file in $TodayDbDestFolderPath..."
		
		## Convert the UNC path specified for the SQL file into a local path to feed to
		## sqlcmd on the site server which backs up the SSRS databases.  Confirm success
		## afterwards.
		Write-Log "Backing up SSRS Databases..."
		$LocalPath = Convert-ToLocalFilePath $ReportingServicesDbBackupSqlFilePath
		$result = Invoke-Command -ComputerName $SiteDbServer -ScriptBlock { sqlcmd -i $using:LocalPath }
		if ($result[-1] -match 'DBCC execution completed') {
			Write-Log 'Successfully backed up SSRS databases'
		} else {
			Write-Log 'WARNING: Failed to backup SSRS databases'
		}
		
		## Export the SSRS Encryption keys to a local file via remoting on the site server and
		## copy that file to the backup location
		Write-Log "Exporting SSRS encryption keys..."
		$ExportFilePath = "\\$SiteDbServer\c$\rsdbkey.snk"
		$LocalPath = Convert-ToLocalFilePath $ExportFilePath
		$result = Invoke-Command -ComputerName $SiteDbServer -ScriptBlock { echo y | rskeymgmt -e -f $using:LocalPath -p $using:ReportingServicesEncKeyPassword }
		if ($result[-1] -ne 'The command completed successfully') {
			Write-Log 'WARNING: SSRS keys were not exported!'
		} else {
			Copy-Item $ExportFilePath $TodayDbDestFolderPath -Force
			Write-Log 'Successfully exported and backed up encryption keys.'
		}		
		
		## Backup the Reporting Services SSRS folder
		Write-Log "Backing up $SrcReportingServicesFolderPath..."
		Copy-Item @CommonCopyFolderParams -Path $SrcReportingServicesFolderPath -Destination "$TodayDbDestFolderPath\ReportsBackup"
		Write-Log "Successfully backed up the $SrcReportingServicesFolderPath folder.."
				
		## Backup the SCCMContentLib folder
		Write-Log "Backing up $SrcContentLibraryFolderPath..."
		Copy-Item @CommonCopyFolderParams -Path $SrcContentLibraryFolderPath -Destination $TodayDbDestFolderPath
		Write-Log "Successfully backed up the $SrcContentLibraryFolderPath folder.."
		
		## Backup the client install folder from the site server to the backup folder
		Write-Log "Backing up $SrcClientInstallerFolderPath..."
		Copy-Item @CommonCopyFolderParams -Path $SrcClientInstallerFolderPath -Destination $TodayDbDestFolderPath
		Write-Log "Successfully backed up the $SrcClientInstallerFolderPath folder.."
		
		##TODO: Backup any SCUP updates
		## On the computer that runs Updates Publisher, browse the Updates Publisher 2011 database file (Scupdb.sdf)
		## in %USERPROFILE%\AppData\Local\Microsoft\System Center Updates Publisher 2011\5.00.1727.0000\. There is
		## a different database file for each user that runs Updates Publisher 2011. Copy the database file to your
		## backup destination. For example, if your backup destination is E:\ConfigMgr_Backup, you could copy the
		## Updates Publisher 2011 database file to E:\ConfigMgr_Backup\SCUP2011.
		
		## Backup the afterbackup.bat file that kicks off this script
		Write-Log "Backing up $SrcAfterBackupFilePath.."
		Copy-Item @CommonCopyFolderParams -Path $SrcAfterBackupFilePath -Destination $TodayDbDestFolderPath
		Write-Log "Successfully backed up the $SrcAfterBackupFilePath file..."
		
	} catch {
		Write-Log "ERROR: $($_.Exception.Message)"
	}
}

end {
	Write-Log 'Emailing results of backup...'
	## Email me the results of the backup and post-backup tasks
	$Params = @{
		'From' =  'ConfigMgr Backup <me@domain.com>';
		'To' = 'Adam Bertram <me@domain.com>';
		'Subject' = 'ConfigMgr Backup';
		'Attachment' =  $script:LogFilePath;
		'SmtpServer' = 'SOMESERVER'
	}
	
	Send-MailMessage @Params -Body 'ConfigMgr Backup Email'
}
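
For an interactive test run before you wire it into afterbackup.bat, something like this works (the site code, server name and backup path below are placeholders; leaving off -CheckBackup skips the last-backup status check so you can exercise the copy logic on demand):

.\Start-PostConfigMgrBackupSteps.ps1 -SiteCode 'ABC' -SiteDbServer 'SITESERVER' -DestDbBackupFolderPath '\\BACKUPSERVER\ConfigMgrBackup'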

The post How to Create a Robust ConfigMgr Backup with Powershell appeared first on Adam the Automator.

SCCM Admins: A Simple Installed Software Query


The piece of client information I have to retrieve from SCCM most frequently is installed software. The requests are typically nothing formal; others and I just want to get a rough number of how many clients have X software installed or to eyeball the client names. Queries are a great solution to this problem. They provide a quick method to pull information out of the SCCM database.

I built this query some time ago and it’s been used hundreds of times, so if I’m using it that much, I figured the SCCM community could get some use out of it as well.

This query mostly uses the Installed Software class. However, I also tap into the Operating System and System Resource classes to display client name and OS information. Since I was frequently getting questions like “How recent is this information?” I also added the Last Hardware Scan time from the Workstation Status class to show the timestamp.
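
For a rough idea of what a query along these lines looks like under the hood, here’s a sketch in WQL wrapped in Get-WmiObject. The site server, site code and software title are placeholders, and the real exported query uses a console prompt for the title rather than a hard-coded value, so treat this as an approximation rather than the MOF itself:

$Query = @'
select SMS_R_System.Name,
       SMS_G_System_OPERATING_SYSTEM.Caption,
       SMS_G_System_INSTALLED_SOFTWARE.ARPDisplayName,
       SMS_G_System_INSTALLED_SOFTWARE.ProductVersion,
       SMS_G_System_WORKSTATION_STATUS.LastHardwareScan
from SMS_R_System
inner join SMS_G_System_INSTALLED_SOFTWARE on SMS_G_System_INSTALLED_SOFTWARE.ResourceID = SMS_R_System.ResourceId
inner join SMS_G_System_OPERATING_SYSTEM on SMS_G_System_OPERATING_SYSTEM.ResourceID = SMS_R_System.ResourceId
inner join SMS_G_System_WORKSTATION_STATUS on SMS_G_System_WORKSTATION_STATUS.ResourceID = SMS_R_System.ResourceId
where SMS_G_System_INSTALLED_SOFTWARE.ARPDisplayName like '%Adobe Reader%'
'@
## The SMS Provider understands these extended WQL joins, so the same query can be run ad hoc
Get-WmiObject -ComputerName 'SITESERVER' -Namespace 'root\sms\site_ABC' -Query $Query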

When run, the query prompts for a collection.

[Screenshot: the collection prompt]

Since the software name is different every time, I then force the user to input the software title. Use your typical WQL syntax here. This prompt queries the software title against the Installed Software WMI class’s ARPDisplayName attribute.

[Screenshot: the software title prompt]

Finally, here’s the sample output you’ll receive. You can see that you get a nice summary of the clients that matched your software query.

[Screenshot: sample query results]

I could provide you with the query text to copy and paste, but this is an automation blog, so I exported the query as a MOF file, which allows you to simply right-click the Queries node in your SCCM console and import it from there. Download the MOF here, import it and away you go!

The post SCCM Admins: A Simple Installed Software Query appeared first on Adam the Automator.

MIFS Gone Wild: Use Powershell to Track Down Problem Clients


SCCM admins have a lot of things to worry about. Software deployments, OS imaging, patching; the tasks can sometimes feel overwhelming. Tack on client health to that list of responsibilities and you’ll soon feel like you’re going insane. What better way to ease the burden than with Powershell? Here’s a small script you can use to get a glimpse into hardware inventory problems.

When something goes wrong, SCCM places MIF files into the dataldr.box directory on the site server. This directory contains all kinds of different MIF files that couldn’t be committed to the database for one reason or another. The MIF files are arranged so that the reason they weren’t committed is indicated by the subdirectory of dataldr.box they’re in. I needed a way to easily query all these MIF files to figure out which clients were having problems, when the problem happened and what the problem was. This script was my answer to that.

$FolderPath = '\\CONFIGMGRSERVER\f$\CM2012\inboxes\auth\dataldr.box'

## Get all of the MIF files we'll be querying
$Mifs = Get-ChildItem $FolderPath -Filter '*.mif' -Recurse

## Find all the MIF files that match the regex string '//KeyAttribute<NetBIOS Name><(.*)>' and capture the client name
$MifMatches = Select-String -Path $Mifs.FullName -Pattern '//KeyAttribute<NetBIOS Name><(.*)>'
foreach ($Match in $MifMatches) { 
    $FileProps = Get-ItemProperty $Match.Path       
    ## Output an object containing the client name, time the file was created and why the MIF wasn't committed to the DB
    [pscustomobject]@{
        'Client' = $Match.Matches.Groups[1].Value
        'FileCreationTime' = $FileProps.CreationTime
        'MifProblem' = $FileProps.DirectoryName | Split-Path -Leaf
    }
}
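
To turn that raw output into a quick summary, save the snippet as a script (the file name here is just an example) and group the results by failure reason:

.\Get-ProblemMif.ps1 | Group-Object -Property MifProblem | Sort-Object -Property Count -Descending | Select-Object Count, Name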

I hope that this saves you a little bit of time tracking down client health problems in SCCM!

The post MIFS Gone Wild: Use Powershell to Track Down Problem Clients appeared first on Adam the Automator.

ConfigMgr Site Maintenance Task Powershell Module


I was getting ready to upgrade our ConfigMgr 2012 SP1 site server to R2 (I know, I’m a little behind the times!) and one of the tasks I needed to do was disable all the site maintenance tasks while the upgrade was being performed. I started to document all the site maintenance tasks that were enabled, then proceeded to disable them and hit my click limit. If I find myself clicking a repetitive pattern more than a couple of times, I stop and ask myself “Can and should this be automated?”, and, lucky for you, I decided to take some time to create a few small Powershell functions.

These functions will not only work for my upgrade scenario; I’m sure I’ll also use them for other tasks down the road.

For anyone doing a ConfigMgr upgrade who wants to knock out disabling and re-enabling the site maintenance tasks, here’s a little script to run after you’ve got these functions loaded.

## Export out all the tasks that are enabled
Get-CmSiteMaintenanceTask -Status Enabled | Export-Csv -Path 'EnabledConfigMgrSiteMaintTasks.csv' -Append -NoTypeInformation
## Disable all of the enabled tasks
Import-Csv '.\EnabledConfigMgrSiteMaintTasks.csv' | Get-CmSiteMaintenanceTask | Disable-CmSiteMaintenanceTask
## Do the R2 upgrade here
## Enable all the tasks that you disabled prior to the upgrade
Import-Csv '.\EnabledConfigMgrSiteMaintTasks.csv' | Get-CmSiteMaintenanceTask | Enable-CmSiteMaintenanceTask
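
Once the upgrade is done and the tasks are re-enabled, a quick sanity check along these lines (a sketch; the CSV is the same one exported above) confirms nothing was missed:

$Before = Import-Csv '.\EnabledConfigMgrSiteMaintTasks.csv' | Select-Object -ExpandProperty ItemName
$After = Get-CmSiteMaintenanceTask -Status Enabled | Select-Object -ExpandProperty ItemName
## No output from Compare-Object means every task that was enabled before the upgrade is enabled again
Compare-Object -ReferenceObject $Before -DifferenceObject $After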

Here are the 3 functions that make this happen. I’d throw them into a .psm1 file and import them as a module but do with them as you please.

function Get-CmSiteMaintenanceTask {
	<#
	.SYNOPSIS
		This function discovers and records the state of all site maintenance tasks on a ConfigMgr site server.
	.PARAMETER TaskName
		The name of the site maintenance task you'd like to limit the result set by.  This accepts wildcards or
		multiple names
	.PARAMETER Status
		The status (either enabled or disabled) of the site maintenance tasks you'd like to limit the result set by.
	.PARAMETER SiteServer
		The SCCM site server to query
	.PARAMETER SiteCode
		The SCCM site code
	.EXAMPLE
	
	PS> Get-CmSiteMaintenanceTask -TaskName 'Disabled*' -Status Enabled
	
	This example finds all site maintenance tasks starting with 'Disabled' that are enabled.
	#>
	[CmdletBinding()]
	[OutputType([System.Management.ManagementObject])]
	param (
		[Parameter(ValueFromPipeline,ValueFromPipelineByPropertyName)]
		[string[]]$TaskName,
		[Alias('ItemName')]
		[ValidateSet('Enabled', 'Disabled')]
		[string]$Status,
		[string]$SiteServer = 'CONFIGMANAGER',
		[ValidateLength(3, 3)]
		[string]$SiteCode = 'UHP'
	)
	
	process {
		try {
			$WmiParams = @{ 'Computername' = $SiteServer; 'Namespace' = "root\sms\site_$SiteCode"}
			
			Write-Verbose -Message "Building the WMI query..."
			if ($TaskName) {
				$WmiParams.Query = 'SELECT * FROM SMS_SCI_SQLTask WHERE '
				$NameConditions = @()
				foreach ($n in $TaskName) {
					## Allow asterisks in cmdlet but WQL requires percentage and double backslashes
					$NameValue = $n.Replace('*', '%').Replace('\', '\\')
					$Operator = @{ $true = 'LIKE'; $false = '=' }[$NameValue -match '\%']
					$NameConditions += "(ItemName $Operator '$NameValue')"
				}
				$WmiParams.Query += ($NameConditions -join ' OR ')
			} else {
				## Get-WmiObject can't take both -Class and -Query, so only use -Class when there's no name filter
				$WmiParams.Class = 'SMS_SCI_SQLTask'
			}
			if ($Status) {
				$Enabled = $Status -eq 'Enabled'
				$WhereBlock = { $_.Enabled -eq $Enabled }
			}
			if ($WhereBlock) {
				Get-WmiObject @WmiParams | where $WhereBlock
			} else {
				Get-WmiObject @WmiParams
			}
		} catch {
			Write-Error $_.Exception.Message
		}
	}
}

function Enable-CmSiteMaintenanceTask {
	<#
	.SYNOPSIS
		This function enables a ConfigMgr site maintenance task.
	.PARAMETER InputObject
		An object returned from Get-CmSiteMaintenanceTask representing the task you'd like enabled.
	.EXAMPLE
	
	PS> Get-CmSiteMaintenanceTask -TaskName 'Disabled*' -Status Disabled | Enable-CmsiteMaintenanceTask
	
	This example finds all site maintenance tasks starting with 'Disabled' that are disabled and enables them all.
	#>
	[CmdletBinding()]
	param (
		[Parameter(ValueFromPipeline)]
		[System.Management.ManagementObject]$InputObject
	)
	process {
		try {
			$InputObject | Set-WmiInstance -Arguments @{ 'Enabled' = $true } | Out-Null
		} catch {
			Write-Error $_.Exception.Message
		}
	}
}

function Disable-CmSiteMaintenanceTask {
	<#
	.SYNOPSIS
		This function disables a ConfigMgr site maintenance task.
	.PARAMETER InputObject
		An object returned from Get-CmSiteMaintenanceTask representing the task you'd like disabled.
	.EXAMPLE
	
	PS> Get-CmSiteMaintenanceTask -TaskName 'Disabled*' -Status Enabled | Disable-CmsiteMaintenanceTask
	
	This example finds all site maintenance tasks starting with 'Disabled' that are enabled and disables them all.
	#>
	[CmdletBinding()]
	param (
		[Parameter(ValueFromPipeline)]
		[System.Management.ManagementObject]$InputObject
	)
	process {
		try {
			$InputObject | Set-WmiInstance -Arguments @{ 'Enabled' = $false } | Out-Null
		} catch {
			Write-Error $_.Exception.Message
		}
	}
}
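
If you do go the .psm1 module route I mentioned above, something like this works (the module name and paths are just examples):

$ModulePath = "$env:USERPROFILE\Documents\WindowsPowerShell\Modules\CmSiteMaintenanceTask"
New-Item -Path $ModulePath -ItemType Directory -Force | Out-Null
## Assumes you've pasted the three functions above into a file named CmSiteMaintenanceTask.psm1
Copy-Item -Path '.\CmSiteMaintenanceTask.psm1' -Destination $ModulePath
Import-Module -Name CmSiteMaintenanceTask
Get-Command -Module CmSiteMaintenanceTask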

Download this script on the Technet Script Repository

The post ConfigMgr Site Maintenance Task Powershell Module appeared first on Adam the Automator.

Automatic SCCM to WSUS Software Update Sync


My client uses Configuration Manager for software updates and has for a long time.  It works well and they’re used to the workflow. As you may know, Configuration Manager uses WSUS to handle a lot of the heavy lifting regarding software updates and it works just fine (well..most of the time).  You shouldn’t try to implement another solution, with all the extra management that it brings, unless your hand is forced.  Let me tell you a story…unless you don’t want to hear my ramblings and just want to download it now.

There once was a product called Cisco ISE that tested and remediated clients that didn’t meet a certain update baseline.  This product, although very functional (and expensive), was created by a team that didn’t believe any other update delivery mechanism existed other than plain ol’ WSUS; not SCCM, not Altiris, nada; only WSUS.  Frank comments to Cisco like “we have SCCM and we can’t use its WSUS server directly because SCCM controls it” yielded nothing but a confused look, and instructions to “build the SCCM integration yourself” were given.  Needless to say, I was not happy, but what was I to do?  So I conceded.

Next comes along MDT.  Although I’m not real familiar with the process, it seems that this product cannot use SCCM’s software updates natively either and must have its own WSUS server as well.

With two products needing a separate WSUS server, yet one system (SCCM) already holding all the tested and approved software updates, what is a guy to do?  We have two options.

  1. Check each update in SCCM, look at the WSUS console and manually approve each and every one.
  2. Spend a few days and create a Powershell script that approves all WSUS updates in SCCM on a recurring basis.

Which am I to choose?  Hmm..  Automation wins every time!

Out of this circumstance was born the bouncing baby Sync-CmToWsusSoftwareUpdates.ps1 script.  Although I found multiple people (Technet Blogs, Configmgrblog.com, MyItForum) who had already had a hand at solving this problem, none of them went far enough for me.  Some were not matching updates 1:1 and all of them only approved WSUS updates that were in SCCM, so I decided (with permission from my client) to build a more robust and reliable script (IMO) with far more features.

This script has the following features.

  • Approves all WSUS updates that are deployed (in any deployment) in SCCM
  • Declines all WSUS updates that are not in SCCM (optional)
  • Lists all updates in SCCM that weren’t in WSUS
  • Logs all activity to a log file in CSV-format for easy parsing later
  • Performs a WSUS sync from Microsoft Update if any SCCM updates could not be matched (optional)

These functions got me 90% of the way to where I wanted to go.  You’ll see in the code that I also wanted to synchronize all approved classifications and products from SCCM to WSUS but, being a practicing pragmatist, I decided to let those go for now out of respect for my client’s time.  These only need to be set once in WSUS anyway.
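
To give you a taste of the core idea before you dig into the full script, here’s a heavily simplified sketch of the matching and approval logic. The server names, site code and target group below are placeholders, this is not the script itself, and it leans on the assumption that a software update’s CI_UniqueID in SMS_SoftwareUpdate lines up with its WSUS UpdateID:

## Load the WSUS administration assembly that comes with the WSUS console
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.UpdateServices.Administration')
$Wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer('WSUSSERVER', $false, 80)

## Grab the unique IDs of every update that's deployed in SCCM
$CmUpdateIds = Get-WmiObject -ComputerName 'CONFIGMGRSERVER' -Namespace 'root\sms\site_ABC' -Class 'SMS_SoftwareUpdate' |
    Where-Object { $_.IsDeployed } | ForEach-Object { $_.CI_UniqueID }

## Approve any unapproved WSUS update that SCCM has deployed
$TargetGroup = $Wsus.GetComputerTargetGroups() | Where-Object { $_.Name -eq 'All Computers' }
foreach ($WsusUpdate in $Wsus.GetUpdates()) {
    if (-not $WsusUpdate.IsApproved -and ($CmUpdateIds -contains $WsusUpdate.Id.UpdateId.ToString())) {
        $WsusUpdate.Approve([Microsoft.UpdateServices.Administration.UpdateApprovalAction]::Install, $TargetGroup) | Out-Null
    }
}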

This script has a single requirement and a few suggestions before you run it.

Requirements

  1. The WSUS 3.0 SP2 (the version I used) administration console needs to be installed on the machine you’re executing this script on.  If you’ll be running this script on Windows Server 2012 you can simply run a one liner:
    Install-WindowsFeature -Name UpdateServices-Ui

    If you’re on any other operating system you’ll probably need to download and install it.  This will install all the .NET assemblies you need to query the WSUS server remotely.

Suggestions

  1. Remove permission from anyone to manually approve, decline or even breathe on the WSUS server.  It’s not going to be a big deal if someone changes a few updates between syncs but it will take longer for the script to run.
  2. Set the WSUS server’s update source to Microsoft (or an upstream WSUS server that you’re sure has all the updates that SCCM has).  You need as many updates as possible in WSUS so that you can match on as many SCCM updates as possible.
  3. Perform at least one synchronization from Microsoft Update on the WSUS server before you run this.  We need all the updates possible on the WSUS server to match from SCCM.
  4. Set synchronization to Automatic in WSUS.  Again, because we need as much update metadata in WSUS as possible, you need to sync from Microsoft often.
  5. Remove all automatic approvals.  This may seem obvious but you don’t want any other outside process jacking with the WSUS updates that you only want synced from SCCM.
  6. Run this script as a scheduled task.  If you really want to, you can put a sticky note on your monitor with a reminder to execute this script every day at 2AM, but I’d use a scheduled task instead (a quick sketch of registering one follows this list).
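
Here’s that minimal sketch, using the ScheduledTasks module on Server 2012 or later; the script path, task name, account and 2AM schedule are all just examples:

$Action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -NoProfile -File C:\Scripts\Sync-CmToWsusSoftwareUpdates.ps1'
$Trigger = New-ScheduledTaskTrigger -Daily -At '2:00AM'
## SYSTEM works for a local run; use a service account if the script needs rights on the remote WSUS/SCCM servers
Register-ScheduledTask -TaskName 'Sync CM to WSUS Software Updates' -Action $Action -Trigger $Trigger -User 'SYSTEM'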

I’ve heavily commented the script and have added all the common Powershell advanced function ninja stuff but there are a few things I want to mention before you go off breaking your…..I mean TESTING!..this script first before you roll it into production.

Since I intended this script to run as a scheduled task it outputs nothing to the console and writes all activity to a CSV-formatted log file to allow you to do things like this. Don’t comment on this post and tell me I don’t know what I’m doing since I didn’t include any verbose logging. Instead, look at the log file, by default, in the same directory as the script.

[Screenshot: the script's CSV-formatted log file]

If you’ve got thousands of updates like my client did, this log file will be enormous if you include all of the Write-Log lines I have in here.  So, I’ve commented out a lot of the really verbose stuff to only give you the meat and potatoes.  However, feel free to uncomment that stuff if you’re troubleshooting or are just curious which updates it’s actually syncing.

[Screenshot: the commented-out Write-Log lines in the script]


As always, if you use this and something’s not working please let me know. I spent a lot of time on this but nothing is ever perfect and I’d love the opportunity to improve upon it.

The post Automatic SCCM to WSUS Software Update Sync appeared first on Adam, the Automator.

All ConfigMgr Software Updates Expired After Site Restore


As I type, I’m in the middle of a major Configuration Manager problem. You can’t do many update deployments when every single update in the database is expired! This is what it took to get my ConfigMgr software updates back up and going after a restore. According to my Microsoft support person, this isn’t the first time this has had to be done after a restore, so beware, ConfigMgr admins! I hope you never have to go through this.

If what I’m about to explain hoses your database, it is in no way my fault. This was my solution for a situation where the updates in the database were all expired and the WSUS sync was failing with CCISource error: 2. From my experience, this error code always has to do with some kind of corruption in the ConfigMgr database. Apparently, I was still able to see the updates only because the metadata was still available; the updates themselves were long gone anyway. I was hoping I wouldn’t have to go through an entire resync again and set up all the update groups, packages and deployments, but alas, that was a false hope.

Here are the steps that Microsoft support and I went through to get my ConfigMgr software updates back up and going again:

  1. Kicked off the SMS_SITE_BACKUP component to start a backup, monitored it via the smsbkup.log file and confirmed success.
  2. Ran this SQL query:

    select * from CI_ConfigurationItems

  3. Found the ModelName attribute by looking in the wsyncmgr.log file for the line “Referenced configuration items are not available yet:…“. On this line, you’ll find a “ModelName=Site_ABCDEF/SUM_XYZ” value. You need the site model name, which in this example is ABCDEF.
  4. Ran these SQL queries to find all affected updates:

    SELECT * FROM CI_DocumentStore WHERE Document_ID NOT IN (SELECT Document_ID FROM CI_CIDocuments)
    SELECT * FROM v_updatecis WHERE ModelName NOT LIKE '%ABCDEF/SUM%'

  5. Stopped the SMS_EXECUTIVE service. In my case it wouldn’t stop, so the smsexec.exe process was killed.
  6. Stopped the SMS_SITE_COMPONENT_MANAGER service.
  7. Ran this SQL query (which, in my case, affected no rows):

    UPDATE CI SET IsExpired=1 FROM v_updatecis ci where ci.citype_id in (1,8) and modelname like 'Site_ABCDEF/SUM_%'

  8. Started the SMS_EXECUTIVE service again.
  9. Removed the software update point role.
  10. Checked SUPsetup.log on the SUP server to confirm the role removal.
  11. Removed the Windows Server Update Services role and all subfeatures from the SUP server and rebooted.
  12. Ran these SQL queries to remove all expired updates:

    update ci_configurationitems set isexpired=1 where citype_id in (1,8)
    delete Ci_configurationitemrelations where toci_id in (Select ci_id from ci_configurationitems where citype_id in (1,8))
    delete ci_assignmenttargetedCIs where ci_id in (Select ci_id from ci_configurationitems where citype_id in (1,8))

  13. The first UPDATE query usually takes a little while and, in my case, it actually failed with the error “delete statement conflicted with the reference constraint ‘ci_currenterrordetails_ciid_fk’. The conflict occurred in the database…..”. It looks like CIType_Id 8 was causing the foreign key constraint, because

    delete ci_configurationitems where citype_id = 1

    worked on its own. A SQL engineer took over and created a temp stored procedure to correct the error. Afterwards, the database was clean!

  14. Renamed the SMS, UpdateServicesPackages and WsusContent WSUS folders to .old on the server holding the software update point role.
  15. Reinstalled WSUS on the SUP server with the same content directory as before.
  16. Ran the WSUS post-installation tasks.
  17. Added the SUP role back again, documenting all checked categories.
  18. Unchecked all categories but a couple to speed up the initial sync.
  19. Confirmed a successful software update point role install in SUPsetup.log.
  20. Checked wsusctrl.log to ensure the SUP role was connecting to the WSUS server.
  21. Went to the WSUS console –> Options –> Products and Classifications and unchecked all but a couple of categories to match the ConfigMgr categories.
  22. Ensured the UpdateServicesPackages and WsusContent folders had been recreated on the SUP role server.
  23. Started a manual ConfigMgr WSUS sync.
  24. Removed all update groups, packages and deployments.
  25. Checked wsyncmgr.log; the sync was successful and updates were showing up correctly in the console.
  26. Checked all needed product categories again and initiated another sync to get the rest of the updates into the database.
  27. Recreated all update groups, deployment packages and deployments.

The post All ConfigMgr Software Updates Expired After Site Restore appeared first on Adam, the Automator.


Building Logs for CMTrace with PowerShell


This post is part of the #PSBlogWeek PowerShell blogging series. #PSBlogWeek is a regular event that anyone interested in writing great content about PowerShell is welcome to volunteer for. The purpose is to pool our collective PowerShell knowledge together over a 5-day period and write about a topic that anyone using PowerShell may benefit from. #PSBlogWeek is a Twitter hashtag so feel free to stay up to date on the topic on Twitter at the #PSBlogWeek hashtag. For more information on #PSBlogWeek or if you’d like to volunteer for future sessions, contact Adam Bertram (@adbertram) on Twitter.

Once you’re done getting schooled on everything this post has to offer head on over to the powershell.org announcement for links to the other four past and upcoming #PSBlogWeek articles this week!


In a previous life, I managed Microsoft’s System Center Configuration Manager (SCCM) product. I was an SCCM ninja. One of the coolest things I got out of that was learning about the CMTrace log utility.  Part of the System Center Configuration Manager Toolkit, CMTrace is a log viewing utility that allows you to watch logs in real time and points out various potential problems through yellow and red highlighting, a sortable timestamp column and more.

[Screenshot: a log file open in CMTrace with sortable columns and red highlighting]

Just look at the beauty of the sortable columns and the red highlighting! At first, you might think that you can view any kind of text log in CMTrace, and you’d be right. However, let’s take a look at the WindowsUpdate.log file in CMTrace.

[Screenshot: WindowsUpdate.log opened in CMTrace]

Notice all the columns are gone?  CMTrace will still view regular log files but you won’t get some of the features that make CMTrace great.  You’ll soon find that a text file has to be properly formatted in order to get all of those helpful columns to show up and to properly define which lines should be highlighted yellow vs. red vs. nothing at all.

In today’s post, I’d like to show you a couple of functions called Write-Log and Start-Log. These functions were specifically built to record your script’s activity to a log file which can then be read in CMTrace. By the end of this post, you will have a function that you can call in your scripts to build log files in a way for CMTrace to read them properly.

Start-Log

To prevent having to specify the same log file path over and over again, I chose to create a function called Start-Log. This function is intended to be called at the top of your script.  This function simply creates a text file and (the important part) sets a global variable called ScriptLogFilePath.
function Start-Log {
    [CmdletBinding()]
    param (
        [ValidateScript({ Split-Path $_ -Parent | Test-Path })]
        [string]$FilePath
    )

    try
    {
        if (!(Test-Path $FilePath))
        {
            ## Create the log file
            New-Item $FilePath -Type File | Out-Null
        }

        ## Set the global variable to be used as the FilePath for all subsequent Write-Log
        ## calls in this session
        $global:ScriptLogFilePath = $FilePath
    }
    catch
    {
        Write-Error $_.Exception.Message
    }
}

This function is super simple. However, it is required to prevent us from having to pass a -LogFile parameter every single time we need to call our Write-Log function in our scripts. By simply creating a global variable ahead of time, we can then simply call Write-Log and it will know the log file path.

Write-Log

Once you’ve called Start-Log in your script, you are now able to run Write-Log to write log messages to the log file. Write-Log has two parameters: Message and LogLevel. Message is easy. That’s simply what you’d like to write to the log. LogLevel requires some explaining.  In order for CMTrace to highlight lines as red or yellow, the line needs to be recorded a certain way. More specifically, it needs to have a string like this: type="1". This type key can be 1, 2 or 3. These indicate levels of severity in your script. For example, if I’d like to log a simple informational message, then that’d be a 1. If I’d like to log a more severe activity then I might use 2, which would get highlighted yellow. Finally, I might choose 3 if I’d like that line highlighted red in CMTrace.
param (
    [Parameter(Mandatory = $true)]
    [string]$Message,
		
    [Parameter()]
    [ValidateSet(1, 2, 3)]
    [int]$LogLevel = 1
)

Notice the LogLevel parameter?  By default, it will be set to 1, but you are always able to override that if you’d like to record some more severe activity that happens during your script’s execution.

Next, you need that handy date/time column to show up right. Doing this requires a specific date/time format, which is achieved with this bit of string manipulation wizardry.

$TimeGenerated = "$(Get-Date -Format HH:mm:ss).$((Get-Date).Millisecond)+000"

Next is where I build the log line template, using the format that a line needs to have to show up correctly in CMTrace.

$Line = '<![LOG[{0}]LOG]!><time="{1}" date="{2}" component="{3}" context="" type="{4}" thread="" file="">'

After you’ve got the template, it’s then a matter of building what’s going to go in the {} placeholders. Here, I build an array which I will then pass into $Line to replace all of our {} placeholders with real information.
$LineFormat = $Message, $TimeGenerated, (Get-Date -Format MM-dd-yyyy), "$($MyInvocation.ScriptName | Split-Path -Leaf):$($MyInvocation.ScriptLineNumber)", $LogLevel

These are in the same order as the {} placeholders above. {0} will get converted to $Message, {1} will get converted to $TimeGenerated, {2} will get converted to today’s date and {4} will get converted to $LogLevel.  Notice I skipped {3}?  This is where I get all ninja on you. CMTrace has a component column that I never used much so I decided to make something out of it. I wanted to see the script’s name and the line number in which Write-Log was called. This string: "$($MyInvocation.ScriptName | Split-Path -Leaf):$($MyInvocation.ScriptLineNumber)" is what makes that happen.

I then bring these two variables together using PowerShell’s string formatting to build $Line.
$Line = $Line -f $LineFormat

It’s then just a matter of writing $Line to the text file that’s already been defined by Start-Log.
Add-Content -Value $Line -Path $ScriptLogFilePath
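
Putting the pieces above together, the assembled function looks something like this (it is exactly the snippets shown, just wrapped in a function declaration):

function Write-Log {
    param (
        [Parameter(Mandatory = $true)]
        [string]$Message,

        [Parameter()]
        [ValidateSet(1, 2, 3)]
        [int]$LogLevel = 1
    )

    ## Build the timestamp in the format CMTrace expects
    $TimeGenerated = "$(Get-Date -Format HH:mm:ss).$((Get-Date).Millisecond)+000"
    ## The CMTrace line template and the values that fill in its placeholders
    $Line = '<![LOG[{0}]LOG]!><time="{1}" date="{2}" component="{3}" context="" type="{4}" thread="" file="">'
    $LineFormat = $Message, $TimeGenerated, (Get-Date -Format MM-dd-yyyy), "$($MyInvocation.ScriptName | Split-Path -Leaf):$($MyInvocation.ScriptLineNumber)", $LogLevel
    $Line = $Line -f $LineFormat
    ## Append the formatted line to the log file that Start-Log created
    Add-Content -Value $Line -Path $ScriptLogFilePath
}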

How it Works

Let’s say I build a script that looks something like this called LogDemo.ps1:

Start-Log -FilePath C:\MyLog.log
Write-Host "Script log file path is [$ScriptLogFilePath]"
Write-Log -Message 'simple activity'
Write-Log -Message 'warning' -LogLevel 2
Write-Log -Message 'Error' -LogLevel 3

This script creates our log file at C:\MyLog.log and then proceeds to write three levels of severity to the log using the LogLevel parameter I explained above.

When I check out the output of this file with Get-Content, it looks pretty ugly.
<![LOG[simple activity]LOG]!><time="18:56:26.307+000" date="12-03-2015" component="LogDemo.ps1:3" context="" type="1" thread="" file="">
<![LOG[warning]LOG]!><time="18:56:26.307+000" date="12-03-2015" component="LogDemo.ps1:4" context="" type="2" thread="" file="">
<![LOG[Error]LOG]!><time="18:56:26.307+000" date="12-03-2015" component="LogDemo.ps1:5" context="" type="3" thread="" file="">

However, let’s break this open in CMTrace and see what it looks like.

[Screenshot: MyLog.log opened in CMTrace with the yellow and red highlighting]

Isn’t that beautiful?

Even if you’re not an SCCM admin, I highly recommend using CMTrace for all your log viewing needs. Once you’ve got the log files in the appropriate format (and you now have no excuse not to), simply open them up in CMTrace and observe the beauty of all that is CMTrace!

The post Building Logs for CMTrace with PowerShell appeared first on Adam, the Automator.

ConfigMgr: How To Find Deployments Targeted at a Client


I had an issue tonight with ConfigMgr and software updates that I thought I’d share. My place of employment’s PCs are waaay out of date with updates. Unfortunately, there are over 2,000 patches that have to be made available to workstations during deployments. Due to ConfigMgr’s 1,000-updates-per-deployment rule, 5 different software update groups were created along with 5 different deployments. Some clients were receiving all of the deployments but some were not. It was getting frustrating spot-checking 800 PCs to see if they were queued up and ready for the maintenance window.

With a few minutes of browsing the ConfigMgr 2012 SDK documentation, I found that the CCM_AssignmentCompliance class in the DeploymentAgent namespace would get me a list of all the deployments each client sees.  Once I had the WMI class I needed, the rest was cake! I was going to see if I could find the actual deployment name too, but it was getting late and I didn’t much care at the time.

Get-WmiObject -Namespace 'root\ccm\SoftwareUpdates\DeploymentAgent' -ComputerName $computer -Class 'CCM_AssignmentCompliance' | % { $_.AssignmentId }
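
If you’ve got a pile of clients to spot-check, the same one-liner drops into a loop nicely (the text file of computer names is just an example source):

foreach ($computer in (Get-Content '.\Workstations.txt')) {
    $assignments = Get-WmiObject -Namespace 'root\ccm\SoftwareUpdates\DeploymentAgent' -ComputerName $computer -Class 'CCM_AssignmentCompliance'
    ## One summary object per client: how many deployments it sees and which ones
    [pscustomobject]@{
        ComputerName    = $computer
        AssignmentCount = @($assignments).Count
        AssignmentIds   = (@($assignments) | ForEach-Object { $_.AssignmentId }) -join ';'
    }
}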

I hope this saves any other ConfigMgr admins some time.

The post ConfigMgr: How To Find Deployments Targeted at a Client appeared first on Adam, the Automator.

How to Tell a Thin Client From a Desktop with Powershell


I’m a ConfigMgr admin by day. I have to manage thousands of workstations; some desktops and some thin clients. Thin clients and desktops are two different device types and must be handled differently. When running Powershell scripts against these workstations, I always need to tell whether I’m working with a thin client or a desktop. One of the big reasons is to specify different ConfigMgr client install settings such as cache size, compressed installation, etc. This function is the one I use all the time.

This function takes one argument: a computer name. If no computer name is specified, it defaults to localhost.  It then attempts a WMI query against the device to find the Model attribute. In my environment, I define a thin client as anything with ‘hp t*’ in the model name that’s not an ‘hp touchsmart’ model; your thin clients’ models may differ here. If, for some reason, a WMI query won’t work, I grab the attributes of the remote PC’s explorer.exe to see if it’s compressed or not. In my environment, all thin clients have a compressed hard drive and no desktops do. I consider this my fallback attempt.

Function Get-DeviceChassis () {
    [CmdletBinding()]
    Param($ComputerName = 'localhost')

    PROCESS { 
        $Output = New-Object PsObject -property @{ComputerName = $ComputerName;Chassis=''}
        try {
            $Model = (Get-WmiObject -Query "SELECT Model FROM Win32_ComputerSystem" -ComputerName $ComputerName).Model;
                if (($Model -like 'hp t*') -and ($Model -notlike 'hp touchsmart*')) {
                    Write-Verbose "Found chassis to be thin client via WMI";
                    $Output.Chassis = 'Thin Client'
	    	} else {
                    Write-Verbose "Found chassis to be desktop via WMI";
                    $Output.Chassis = 'Desktop'
	    	}##endif
        } catch {
            if ((Get-Item "\$ComputerNameadmin$explorer.exe").Attributes -match 'Compressed') {
                Write-Verbose "Found chassis to be thin client via compressed files";
                $Output.Chassis = 'Thin Client'
            } else {
                Write-Verbose "Found chassis to be desktop via compressed files";
                $Output.Chassis = 'Desktop'
            }
        } finally {
            $Output
        }
    }
}
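
A couple of example calls (the computer name is made up); adding -Verbose shows which detection method was used:

Get-DeviceChassis -ComputerName 'WKS-0042' -Verbose
'WKS-0042', 'WKS-0043' | ForEach-Object { Get-DeviceChassis -ComputerName $_ }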

Download this script on poshcode.org

The post How to Tell a Thin Client From a Desktop with Powershell appeared first on Adam, the Automator.

Convert a Product Code to a GUID


As a ConfigMgr admin I sometimes have to reverse engineer packages. In this instance, I had a line-of-business application whose packager must have been on drugs. To remove this application without any remnants, I was forced to analyze the uninstall procedure. During this process I needed to convert a product code (a compressed GUID) to a normal GUID to find instances of it in the registry. To do this, you’re forced to adhere to some strange methodology.

At one time I found an obscure Vbscript to do this that was nearly 20 lines long.  I have saved you from this trouble and have created this handy Powershell function to do the dirty work for you.

function Convert-CompressedGuidToGuid {
	<#
	.SYNOPSIS
		This converts a compressed GUID also known as a product code into a GUID.	
	.DESCRIPTION
		This function will typically be used to figure out the MSI installer GUID
		that matches up with the product code stored in the 'SOFTWARE\Classes\Installer\Products'
		registry path.
	.EXAMPLE
		Convert-CompressedGuidToGuid -CompressedGuid '2820F6C7DCD308A459CABB92E828C144'
	
		This example would output the GUID '{7C6F0282-3DCD-4A80-95AC-BB298E821C44}'
	.PARAMETER CompressedGuid
		The compressed GUID you'd like to convert.
	#>
	[CmdletBinding()]
	[OutputType([System.String])]
	param (
		[Parameter(ValueFromPipeline, ValueFromPipelineByPropertyName, Mandatory)]
		[ValidatePattern('^[0-9a-fA-F]{32}$')]
		[string]$CompressedGuid
	)
	process {
		$Indexes = [ordered]@{
			0 = 8;
			8 = 4;
			12 = 4;
			16 = 2;
			18 = 2;
			20 = 2;
			22 = 2;
			24 = 2;
			26 = 2;
			28 = 2;
			30 = 2
		}
		$Guid = '{'
		foreach ($index in $Indexes.GetEnumerator()) {
			$part = $CompressedGuid.Substring($index.Key, $index.Value).ToCharArray()
			[array]::Reverse($part)
			$Guid += $part -join ''
		}
		$Guid = $Guid.Insert(9,'-').Insert(14, '-').Insert(19, '-').Insert(24, '-')
		$Guid + '}'
	}
}
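
For example, once you have the real GUID you can go hunting for leftovers under the standard per-machine Uninstall key (the compressed GUID below is the one from the help example):

$Guid = Convert-CompressedGuidToGuid -CompressedGuid '2820F6C7DCD308A459CABB92E828C144'
## Returns $true if an uninstall entry for that MSI is still hanging around
Test-Path "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\$Guid"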

The post Convert a Product Code to a GUID appeared first on Adam, the Automator.
