Microsoft System Center Configuration Manager (SCCM) is a management tool that enables IT administrators to deploy, manage, and secure applications, updates, and operating systems across enterprise environments. For more information, refer to the Microsoft product page.
Integration Method: Azure Blob
Tables: OS Patches, Software Inventory
| Events | Description |
| --- | --- |
| Software List | Get the list of software deployed on devices |
| OS Patch Details | Get the list of patches applied on the OS |
This integration has been tested against the following versions:
| Product | Version |
| --- | --- |
| Microsoft SCCM | 2403 |
| Microsoft SQL Server Management Studio (SSMS) | 2017 |
Prerequisites
Privilege requirements
Access to the SCCM server to download and run a PowerShell script
Access to the DataBee console
Hardware requirements
Minimum system requirements for the script to run on the SCCM server:
CPU - 8 cores
RAM - 16 GB
Disk space - 100 MB
Configuration Overview
Configure Azure Blob storage. This is used to share information from SCCM with DataBee
Download and schedule a PowerShell script on the SCCM server
Add the SCCM feed in DataBee
| DataBee Parameters | Microsoft Parameters |
| --- | --- |
| Storage account name | |
| Blob container name | |
| Azure queue name | |
DataBee retrieves OS patch and software inventory information from SCCM. The data is sent to Azure Blob storage by a PowerShell script that runs on the SCCM server.
Azure Configuration
Step 1: App Registration
Open the Azure Portal and go to App registrations
Create a new application registration or open your existing application.
To create a new application registration:
Click “+ New registration”
Enter the app name, select the supported account types, and click “Register”
The new application registration will be created
If using an existing application registration, locate it and click on it
Application, Directory ID & Secret Value
Copy the Application ID and Directory ID from the app's Overview page. These are your client ID and tenant ID, respectively.
Go to Certificates & secrets, then create a new secret. Make sure you copy the secret value and save it for future use.
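If you prefer scripting, the same registration can be created with the Az PowerShell module. This is a minimal sketch, assuming a recent Az module is installed and using a placeholder display name:

```powershell
# Requires the Az PowerShell module (Install-Module Az) and a signed-in session
Connect-AzAccount

# Create the app registration ("DataBeeSCCMApp" is a placeholder name)
$app = New-AzADApplication -DisplayName "DataBeeSCCMApp"

# Create a client secret for the app
$secret = New-AzADAppCredential -ObjectId $app.Id

# Record these values; they map to the DataBee feed fields in the final step
"Client ID (Application ID): $($app.AppId)"
"Tenant ID (Directory ID):   $((Get-AzContext).Tenant.Id)"
"Client secret value:        $($secret.SecretText)"
```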
Step 2: Blob Creation
Go to the home page and then to Storage accounts
Storage Account Name
Storage Account - Create a new storage account or use an existing one.
To create a new storage account, follow the steps documented here.
To use an existing storage account, locate it and click it. Record the storage account name.
Assign the Azure AD app to a role that can access the blob storage. There are two roles you can use, as shown below:
Storage Blob Data Contributor: Select this role if DataBee should delete objects after reading them.
Storage Blob Data Reader: Choose this role for read-only access when deletion is not required.
Delete on read is a setting you can choose when configuring the DataBee feed.
To add the role:
a. Under the Storage Account, navigate to Access Control (IAM) and click on “+ Add”
b. In the search bar, type storage blob, select the role, and click Next
c. Select the member for role assignment
i. Click on “+ Select members”
ii. Find the application created in Step 1 and click “Select”
d. Click “Next”
e. Add conditions if you want to provide more fine-grained access control, then click “Next”
f. Click “Review + Assign”
g. To verify the role assignment:
i. Click on “Access Control (IAM)” > “Role assignments”
ii. Search for your application; the storage blob role assignment will appear
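Equivalently, the role can be assigned from PowerShell. A sketch assuming the Az module; the subscription, resource group, and account names are placeholders:

```powershell
# Placeholder scope: substitute your subscription, resource group, and storage account
$scope = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
         "/providers/Microsoft.Storage/storageAccounts/<storage-account>"

# Grant the Step 1 app access to blob data; use "Storage Blob Data Reader"
# instead if delete-on-read will not be enabled in the DataBee feed
New-AzRoleAssignment -ApplicationId "<application-id>" `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope $scope

# Verify the assignment
Get-AzRoleAssignment -Scope $scope |
    Where-Object RoleDefinitionName -like "Storage Blob*"
```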
Storage Container Name
Click on Data storage > Containers
Add a new container by clicking on “+ Container”
Provide a name and choose anonymous access level: Blob
Click Create
Right-click the created container and click Generate SAS
Generate SAS token
Select the required permissions. Note that the upload script needs write access (see the SAS Token parameter in Step 4)
Set allowed protocols to HTTPS and HTTP
Click Generate SAS token and URL
Copy the token and keep it safe
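The container and SAS token can also be created with the Az.Storage cmdlets. A sketch assuming an account-key context (the key is used to sign the SAS token); the names and expiry window are placeholders:

```powershell
# Build a storage context from the account key (used to sign the SAS token)
$ctx = New-AzStorageContext -StorageAccountName "<storage-account>" `
    -StorageAccountKey "<account-key>"

# Create the container with anonymous read access at the Blob level
New-AzStorageContainer -Name "<container-name>" -Permission Blob -Context $ctx

# Generate a SAS token; the upload script needs read and write permission
$sas = New-AzStorageContainerSASToken -Name "<container-name>" `
    -Permission "rw" `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Protocol HttpsOrHttp `
    -Context $ctx
"SAS token: $sas"
```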
Queue Name
Click on Data storage > Queues
Click on “+ Queue”
Enter a queue name and click OK
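The queue can likewise be created in one line, reusing the storage context from the previous sketch:

```powershell
# Create the queue that DataBee will poll for Blob Created notifications
New-AzStorageQueue -Name "<queue-name>" -Context $ctx
```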
To create an event subscription, go to the Events tab and click on “+ Event Subscription”
Event Name
Provide the event subscription details, topic details, and event types
Enter event name
Choose event schema as Event Grid Schema
Add a system topic name if one is not already set
Select event type Blob Created
Select Endpoint type as Storage Queue
Configure an endpoint
Click on Configure an endpoint option
Select subscription and storage account
Select existing queue
Click Select
Click on Create
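If you script this step, older Az.EventGrid (1.x) releases expose it roughly as below. Treat this as a sketch only and verify the cmdlet syntax against your module version; the resource IDs and names are placeholders:

```powershell
# Storage account that raises the Blob Created events
$storageId = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
             "/providers/Microsoft.Storage/storageAccounts/<storage-account>"

# Deliver Blob Created events to the storage queue created above
New-AzEventGridSubscription -ResourceId $storageId `
    -EventSubscriptionName "<event-name>" `
    -EndpointType "storagequeue" `
    -Endpoint "$storageId/queueservices/default/queues/<queue-name>" `
    -IncludedEventType "Microsoft.Storage.BlobCreated"
```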
Step 3: Server Instance and Database Name
To get the server instance and database name, Microsoft SQL Server Management Studio (SSMS) is required.
To check whether Microsoft SSMS is installed, search for "SQL Server Management Studio" in the Start menu
If Microsoft SSMS is not present, install it using the steps found at this site. SSMS is used to get the SCCM database and server instance name.
Follow the steps below to retrieve server instance and database name:
Open Microsoft SQL Server Management Studio (SSMS) and click the “Connect Object Explorer” button
A pop-up will appear to connect to the server. Copy the server name and click “Connect” to connect to the server
The database name is shown under the server instance. Save the database name
Save these details to configure the PowerShell script in later steps.
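If you prefer a query to the SSMS UI, the same details can be read with Invoke-Sqlcmd from the SqlServer module. A sketch assuming the SCCM site database follows the usual CM_<site code> naming convention and that SQL Server runs locally:

```powershell
# Server instance name as reported by SQL Server
Invoke-Sqlcmd -ServerInstance "localhost" `
    -Query "SELECT @@SERVERNAME AS ServerInstanceName"

# SCCM site databases are conventionally named CM_<site code>
Invoke-Sqlcmd -ServerInstance "localhost" `
    -Query "SELECT name AS DatabaseName FROM sys.databases WHERE name LIKE 'CM[_]%'"
```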
Step 4: PowerShell Script Configuration
In this step we will configure Task Scheduler to run a task at a specified time that pushes the data to the Azure blob.
Follow the steps below to configure Task Scheduler:
Log in with an Administrator account
In the search bar, search for Task Scheduler and open it
Click on Create Task.
Fill in the General details for the task
Enter a task name
Select "Run whether the user is logged on or not"
Check the "Run with highest privileges” option
Create a new trigger
Go to the Triggers tab
Click on “New” button
Set the trigger to run daily at the time you want the script to run
Ensure the “Enabled” option is checked and click OK
Create a new action for the task
To create an action, we need the script that pushes the data to the Azure blob, along with the script parameters.
Follow the steps below to get them:
Request the script file: upload_sccm_data_v3.ps1
The script parameters are passed along with the script as arguments. They are described below.
Script Parameters:
Script File Path: The file path of the downloaded script
Storage Account Name: The name of your Azure Storage account
Storage Container Name: The name of the container within the storage account
SAS Token: The SAS token generated in Step 2; it requires write access permission (see Generate SAS token)
Output File Path: The location where the output file should be saved. Logs for the script are also stored under this location.
Database Name: Name of the database where SCCM stores the data
Server Instance Name: Name of the server instance
Sample script parameters:
SCRIPT_FILE_PATH: "C:\Users\user\DataBeeSCCM\upload_sccm_data_v3.ps1"
STORAGE_ACCOUNT_NAME: "storage_name"
CONTAINER_NAME: "container_name"
SAS_TOKEN: "'sas_token'"
OUTPUT_FILE_PATH: "C:\Users\user\DataBeeSCCM\output"
DATABASE_NAME: "demo_database"
SERVER_INSTANCE_NAME: "demo_instance"
Note: If a parameter value contains spaces, enclose it in double quotes.
After preparing the script parameters, perform the steps below:
i. Go to the actions tab
ii. Click on “New” button
iii. Enter “powershell” in the Program/script field
iv. In the “Add arguments” field, paste the string below, substituting your own parameters.
This is the argument passed to the script based on the example above:
-ExecutionPolicy Bypass -c "C:\Users\user\DataBeeSCCM\upload_sccm_data_v3.ps1" -storageAccountName "storage_name" -containerName "container_name" -sasToken "'sas_token'" -outputFilePath "C:\Users\user\DataBeeSCCM\output" -databaseName "demo_database" -ServerInstanceName "demo_instance"
Click OK
You can view the task logs in the History tab
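As an alternative to the Task Scheduler UI, the task can be registered from an elevated PowerShell session with the built-in ScheduledTasks module. A sketch using the sample parameters above; it passes the script with -File rather than the -c shorthand, and the task name and 02:00 schedule are placeholders:

```powershell
# Action: run the upload script with the sample parameters shown above
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument (
    '-ExecutionPolicy Bypass -File "C:\Users\user\DataBeeSCCM\upload_sccm_data_v3.ps1" ' +
    '-storageAccountName "storage_name" -containerName "container_name" ' +
    '-sasToken "''sas_token''" -outputFilePath "C:\Users\user\DataBeeSCCM\output" ' +
    '-databaseName "demo_database" -ServerInstanceName "demo_instance"'
)

# Trigger: daily at 02:00 (adjust to the time you want the upload to run)
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

# Register the task to run with highest privileges, without an interactive user
Register-ScheduledTask -TaskName "DataBee SCCM Upload" `
    -Action $action -Trigger $trigger `
    -User "SYSTEM" -RunLevel Highest
```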
DataBee Configuration
Navigate to Data and click Add New Data Source.
Search for Microsoft SCCM using the search bar and select Microsoft SCCM.
Select the Azure Blob option.
Enter the data source details in the form, then click Next.
DataBee Feed Configuration
Enter the Azure authentication details
Client ID – Paste the Application ID
Client Secret - Paste the Secret Value
Tenant ID - Paste the Directory ID
DataBee Blob Configuration
Enter the Azure Blob Storage Details
Blob Account Name: Paste the Storage Account name
Blob Container Name: Paste the Storage Container name
Compression: none
Content Type: JSON
Azure Queue Name: Paste the Queue name
Click Submit
Troubleshooting
Check whether the script ran successfully from Task Scheduler
Check the Task Scheduler history to verify whether the script executed successfully. To do so, follow the steps below:
Locate your script's task
Look for the latest events with Event ID 100 (task started) and Event ID 200 (task completed successfully)
If Event ID 102 appears, the task was triggered but may have encountered an issue
If Event ID 203 appears, the script failed
The event details pane shows more information about each event
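The same history can be pulled from the Task Scheduler operational log with PowerShell, assuming task history is enabled:

```powershell
# Recent start/trigger/success/failure events for scheduled tasks
Get-WinEvent -FilterHashtable @{
    LogName = 'Microsoft-Windows-TaskScheduler/Operational'
    Id      = 100, 102, 200, 203
} -MaxEvents 20 | Format-Table TimeCreated, Id, Message -AutoSize
```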
Check why data is not being pulled
First, check whether the script ran successfully by following the steps above.
If the script ran successfully, check the script logs.
The log files are available at the location you provided in the Output File Path script parameter (see Step 4). For example, if you provided the path “C:\Users\Desktop\Databee”, the log files will be under the “C:\Users\Desktop\Databee\Logs” directory.
Check the latest log file for errors and share it with the support team.
Likely errors are invalid values provided for the following parameters:
Database name, server instance name, Azure SAS token
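A quick way to surface errors from the newest log file; the path below follows the sample Output File Path used in Step 4:

```powershell
# Find the newest log file under the script's output location
$latest = Get-ChildItem "C:\Users\user\DataBeeSCCM\output\Logs" -File |
    Sort-Object LastWriteTime | Select-Object -Last 1

# Show any lines mentioning errors before sharing the file with support
Select-String -Path $latest.FullName -Pattern "error" -SimpleMatch
```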
Unable to run the script
Ensure that the user has the necessary permissions to execute PowerShell scripts.
Try running the script as an administrator.