Microsoft SCCM

Microsoft System Center Configuration Manager (SCCM) is a management tool that enables IT administrators to deploy, manage, and secure applications, updates, and operating systems across enterprise environments. For more information, refer to the Microsoft product page.

Integration Method: Azure Blob
Tables: OS Patches, Software Inventory

Events

  • Software List: Get the list of software deployed on devices

  • OS Patch Details: Get the list of patches applied on the OS

This integration has been tested against the following versions:

  • Microsoft SCCM: 2403

  • Microsoft SQL Server Management Studio (SSMS): 2017

Prerequisites

Privilege requirements

  • Access to the SCCM server to download and run a PowerShell script

  • Access to the DataBee console

Hardware requirements

    • Minimum system requirements for the script to run on the SCCM server:

      • CPU: 8 cores

      • RAM: 16 GB

      • Disk space: 100 MB

Configuration Overview

  1. Configure Azure Blob storage. This is used to share information from SCCM to DataBee.

  2. Download and schedule a PowerShell script on the SCCM server.

  3. Add the SCCM feed in DataBee.

DataBee Parameters

The DataBee feed parameters map to the following Microsoft/Azure values:

  • Client ID: Application (client) ID

  • Client Secret: Secret Value

  • Tenant ID: Directory (tenant) ID

  • Blob account name: Storage account name

  • Blob container name: Blob container name

  • Azure queue name: Azure queue name

DataBee gets OS Patches and Software Inventory information from SCCM. The data is sent to Azure Blob Storage by a PowerShell script that runs on the SCCM server.
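
The upload logic itself lives in the DataBee-provided script. As a rough illustration of the data flow only, the hedged sketch below shows how a JSON export could be pushed into a blob container with a SAS token; the account, container, file names, and SAS value are placeholders, not values from your environment.

  # Illustration of the upload step only; all values below are placeholders.
  $storageAccountName = "storage_name"
  $containerName      = "container_name"
  $sasToken           = "sas_token"        # container SAS with write permission, without the leading "?"
  $localFile          = "C:\Users\user\DataBeeSCCM\output\os_patches.json"
  $blobName           = Split-Path $localFile -Leaf

  # Put Blob call against the Azure Blob service REST API, authorized by the SAS token.
  $uri = "https://$storageAccountName.blob.core.windows.net/$containerName/${blobName}?${sasToken}"
  Invoke-RestMethod -Uri $uri -Method Put -InFile $localFile -ContentType "application/json" -Headers @{ "x-ms-blob-type" = "BlockBlob" }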

Azure Configuration

Step 1: App Registration

  1. Open the Azure Portal and click on App registrations.


     

  2. Create a new application registration or open your existing application. To create a new application registration:

    1. Click on “+ New registration”.

    2. Enter the app name, select the supported account types, and click “Register”.

    3. The new application registration will be created.

  3. If using an existing application registration, locate your application registration and click on it.
     

Application ID, Directory ID & Secret Value

  1. Copy the Application (client) ID and Directory (tenant) ID from the app's Overview page. These are your Client ID and Tenant ID, respectively.
     

  2. Go to Certificates & secrets, then create a new client secret as shown below. Make sure you copy the secret value and save it for future use.
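
If you prefer to script the registration, the Portal steps above correspond roughly to the sketch below. This is an optional, assumption-laden alternative rather than part of the official procedure: it assumes the Az PowerShell modules are installed, Connect-AzAccount has been run, and the display name "DataBee-SCCM" is only a placeholder.

  # Optional scripted alternative to Step 1 (assumes the Az.Resources module and an active Connect-AzAccount session).
  $app    = New-AzADApplication -DisplayName "DataBee-SCCM"      # placeholder display name
  $sp     = New-AzADServicePrincipal -ApplicationId $app.AppId   # service principal, needed later for the role assignment
  $secret = New-AzADAppCredential -ObjectId $app.Id              # client secret

  "Client ID (Application ID): $($app.AppId)"
  "Tenant ID (Directory ID)  : $((Get-AzContext).Tenant.Id)"
  "Client Secret Value       : $($secret.SecretText)"            # shown only once; store it securely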

     

Step 2: Blob Creation

  1. Go to the home page and then to Storage Accounts.
     

Storage Account Name

  1. Create a new storage account or use an existing one.

    1. To create a new storage account, follow the steps mentioned here.

    2. To use an existing storage account, locate the storage account and click it. Record the Storage Account Name.

  2. Assign the Azure AD application a role to access the blob storage. There are two roles you can use, as shown below:

    1. Storage Blob Data Contributor: Select this role if DataBee should delete objects after reading them.

    2. Storage Blob Data Reader: Choose this role for read-only access when deletion is not required.

Delete on read is a setting you can choose when configuring the DataBee feed.

To add the role:

a. Under the Storage Account, navigate to Access Control (IAM) and click on “+ Add” 

b. In the search bar, type storage blob, select the role and click Next
 

c. Select the member for role assignment

i. Click on “+ Select members”

ii. Find the application created in Step 1 and click “Select” 

d. Click “Next”
 

e. Add conditions if you want to provide more fine-grained access control, then click “Next”.
 

f. Click “Review + Assign”.
 

g. To verify the role assignment

i. Click on “Access Control (IAM)” > “Role assignments”

ii. Search for your application; the role assignment for the storage blob role will appear.
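
The role assignment above can also be made from PowerShell. A minimal sketch, assuming the Az.Resources and Az.Storage modules, an active session, and placeholder resource group, storage account, and application ID values:

  # Optional scripted alternative to the role assignment above; values are placeholders.
  # Use "Storage Blob Data Contributor" instead if DataBee should delete blobs after reading them.
  $role  = "Storage Blob Data Reader"
  $scope = (Get-AzStorageAccount -ResourceGroupName "my-resource-group" -Name "storage_name").Id
  New-AzRoleAssignment -ApplicationId "<application-client-id>" -RoleDefinitionName $role -Scope $scope

  # Verify the assignment.
  Get-AzRoleAssignment -Scope $scope | Where-Object { $_.RoleDefinitionName -like "Storage Blob*" }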
 

Storage Container Name

  1. Click on Data storage > Containers.

    1. Add a new container by clicking on “+ Container”.

    2. Provide a name and choose the anonymous access level: Blob.

    3. Click Create.


       

  2. Right-click on the created container and click Generate SAS.
     

  3. Generate the SAS token.

    1. Select the required permissions (the upload script’s SAS token needs write permission; see Script Parameters below).

    2. Set the allowed protocols to HTTPS and HTTP.

    3. Click on Generate SAS token and URL.

    4. Copy the token and keep it safe.
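
For reference, a hedged PowerShell sketch of the container and SAS steps above. It assumes the Az.Storage module; the resource group, account, container names, and the one-year expiry are placeholders.

  # Optional scripted alternative; values and the expiry window are placeholders.
  $ctx = (Get-AzStorageAccount -ResourceGroupName "my-resource-group" -Name "storage_name").Context
  New-AzStorageContainer -Name "container_name" -Context $ctx -Permission Blob

  # Container SAS for the upload script; it needs write access (see Script Parameters below).
  New-AzStorageContainerSASToken -Name "container_name" -Context $ctx -Permission "rw" -Protocol HttpsOrHttp -ExpiryTime (Get-Date).AddYears(1)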
       

Queue Name

  1. Create a queue. Click on Data storage > Queues.

    1. Go to Queues.

    2. Click on + Queue.

    3. Add a name and click on OK.

  2. To create an event subscription, go to the Events tab and click on + Event Subscription.

Event Name

  1. Provide the event subscription details, topic details, and event types.

    1. Enter the event name.

    2. Choose the event schema: Event Grid Schema.

    3. Add a system topic name if one is not already set.

    4. Select the event type Blob Created.
       

  2. Select the endpoint type Storage Queue.

  3. Configure an endpoint.

    1. Click on the Configure an endpoint option.

    2. Select the subscription and storage account.

    3. Select the existing queue.

    4. Click on Select.

  4. Click on Create.
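
The queue itself can also be created from PowerShell; the Event Grid subscription is simplest to create in the Portal as described above. A minimal sketch, assuming the Az.Storage module and placeholder names:

  # Optional scripted alternative for the queue only; names are placeholders.
  $ctx = (Get-AzStorageAccount -ResourceGroupName "my-resource-group" -Name "storage_name").Context
  New-AzStorageQueue -Name "queue_name" -Context $ctx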


     

Step 3: Server Instance and Database Name

To get the server instance and database name, Microsoft SQL Server Management Studio (SSMS) is required.

To check whether Microsoft SSMS is installed, search for “SQL Server Management Studio” in the Start menu.

If Microsoft SSMS is not present, install it using the steps found at this site. SSMS is used to get the SCCM database and server instance name.

Follow the steps below to retrieve the server instance and database name:

  1. Open the Microsoft SQL Server Management Studio (SSMS) and click on the “Connect object explorer” button
     

  2. A pop-up will appear to connect to the server. Copy the Server name and click “Connect”.
     

  3. The database name is shown under the server instance. Save the database name.
     

  4. Save these details to configure the PowerShell script in later steps.
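
If you prefer a command-line check over the SSMS UI, the hedged sketch below lists candidate SCCM databases. It assumes the SqlServer PowerShell module is installed and uses a placeholder server instance name; SCCM site databases are typically named CM_<SiteCode>.

  # Optional command-line check (assumes the SqlServer module; the instance name is a placeholder).
  # SCCM site databases are typically named CM_<SiteCode>, e.g. CM_PS1.
  Invoke-Sqlcmd -ServerInstance "SCCMSERVER\INSTANCE" -Query "SELECT name FROM sys.databases WHERE name LIKE 'CM[_]%'"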

Step 4: PowerShell Script Configuration

In this step, we will configure Task Scheduler to run a task at a specified time that pushes the data to the Azure blob.

Follow the steps below to configure Task Scheduler:

  1. Log in with an Administrator account.

  2. In the Search bar, search for Task Scheduler and open it.
     

  3. Click on Create Task. 

  4. Fill in the General details about the task.

    1. Enter a task name.

    2. Select "Run whether the user is logged on or not".

    3. Check the "Run with highest privileges" option.
       

  5. Create a new trigger.

    1. Go to the Triggers tab.

    2. Click on the “New” button.

    3. Set the trigger to run daily at the time you want the script to run.

    4. Ensure the "Enabled" option is checked and click OK.
       

  6. Create a new action for the task.

    1. To create an action, we need the script that pushes the data to the Azure blob, along with its parameters.

    2. Follow the steps below to get them:

      1. Request the script file: upload_sccm_data_v3.ps1

      2. The script parameters are passed along with the script as arguments. They are listed below.

Script Parameters:

  • Script File Path: The file path of the downloaded script

  • Storage Account Name: The name of your Azure Storage account

  • Storage Container Name: The name of the container within the storage account

  • SAS Token: A shared access signature with write permission for the container (see Generate SAS token above)

  • Output File Path: The location where the output file is saved. Logs for the script are also stored under this location.

  • Database Name: Name of the database where SCCM stores the data

  • Server Instance Name: Name of the server instance

Sample script parameters:

SCRIPT_FILE_PATH: "C:\Users\user\DataBeeSCCM\upload_sccm_data_v3.ps1"

STORAGE_ACCOUNT_NAME: "storage_name"

CONTAINER_NAME: "container_name"

SAS_TOKEN: "'sas_token'"

OUTPUT_FILE_PATH: "C:\Users\user\DataBeeSCCM\output"

DATABASE_NAME: "demo_database"

SERVER_INSTANCE_NAME: "demo_instance"

Note: If a parameter value contains spaces, enclose it in double quotes.

c. After preparing the script parameters, perform the steps below (a scripted alternative using Register-ScheduledTask is sketched after these steps):

i. Go to the Actions tab.
ii. Click on the “New” button.
iii. Enter “powershell” in the Program/script field.
iv. In the “Add arguments” field, paste the string below with your parameters.

This is the argument passed to the script based on the example above:

-ExecutionPolicy Bypass -c "C:\Users\user\DataBeeSCCM\upload_sccm_data_v3.ps1" -storageAccountName "storage_name" -containerName "container_name" -sasToken "'sas_token'" -outputFilePath "C:\Users\user\DataBeeSCCM\output" -databaseName "demo_database" -ServerInstanceName "demo_instance"

d. Click OK
 

  7. You can view the task logs in the History tab.
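
If you would rather script the task than use the Task Scheduler UI, the hedged sketch below registers an equivalent daily task from an elevated PowerShell prompt. It reuses the sample argument string above; the task name, run time, and run-as account are assumptions you should adjust.

  # Optional scripted alternative to steps 3 to 6 above (run from an elevated PowerShell prompt).
  $taskArgs = '-ExecutionPolicy Bypass -c "C:\Users\user\DataBeeSCCM\upload_sccm_data_v3.ps1" -storageAccountName "storage_name" -containerName "container_name" -sasToken "''sas_token''" -outputFilePath "C:\Users\user\DataBeeSCCM\output" -databaseName "demo_database" -ServerInstanceName "demo_instance"'
  $action   = New-ScheduledTaskAction -Execute "powershell.exe" -Argument $taskArgs
  $trigger  = New-ScheduledTaskTrigger -Daily -At "02:00"

  # "SYSTEM" runs whether or not a user is logged on; swap in an account with the required SQL/SCCM access if needed.
  Register-ScheduledTask -TaskName "DataBee SCCM Upload" -Action $action -Trigger $trigger -RunLevel Highest -User "SYSTEM"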
     

DataBee Configuration

  1. Navigate to Data and click on Add New Data Source.

  2. Search for Microsoft SCCM using the search bar and select Microsoft SCCM.

  3. Select the Azure Blob option.

  4. Enter the data source details in the form, then click on Next.

DataBee Feed Configuration

  1. Enter the Azure authentication details.

    1. Client ID: Paste the Application (client) ID

    2. Client Secret: Paste the Secret Value

    3. Tenant ID: Paste the Directory (tenant) ID

DataBee Blob Configuration

  1. Enter the Azure Blob Storage Details

    1. Blob Account Name: Paste the Storage Account name

    2. Blob Container Name: Paste the Storage Container name

    3. Compression: none

    4. Content Type: JSON

    5. Azure Queue Name: Paste the Queue name

  2. Click Submit

Troubleshooting

  • Check whether the script ran successfully from Task Scheduler

    • Check the Task Scheduler history to verify whether the script was executed successfully. To do that, follow the steps below (a scripted check is sketched at the end of this section):

      • Locate your script's task.

      • Look for the latest events: Event ID 100 (task started), Event ID 200 (action started), and Event ID 102 (task completed).

      • If Event ID 203 appears, it indicates the script's action failed to start.

      • You can see the details of an event by selecting it in the History list.

  • Check why data is not being pulled

    1. First, check whether the script ran successfully by following the steps above.

    2. If the script ran successfully, check the script logs.

      • The log files are available at the location you provided in the Output File Path script parameter. For example, if you provided the path “C:\Users\Desktop\Databee”, the log files will be available under the “C:\Users\Desktop\Databee\Logs” directory.

      • Check the latest log file for any errors and share it with the support team.

      • Possible errors include invalid values for the following parameters:

        • Database Name, Server Instance Name, Azure SAS token.

  • Not able to run the script

    1. Ensure that the user has the necessary permissions to execute PowerShell scripts.

    2. Try running the script as an administrator.
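
As referenced above, the Task Scheduler events can also be pulled from PowerShell. A hedged sketch, assuming task history is enabled and that the task name below matches the one you created:

  # Assumes Task Scheduler history is enabled; the task name filter is a placeholder.
  Get-WinEvent -FilterHashtable @{ LogName = "Microsoft-Windows-TaskScheduler/Operational"; Id = 100, 102, 200, 203 } -MaxEvents 200 |
      Where-Object { $_.Message -like "*DataBee SCCM Upload*" } |
      Select-Object TimeCreated, Id, Message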

