This repository contains an Azure solution for deploying the Intune Log Collector using Bicep (recommended) or ARM templates, and a custom portal UI.
- Collects any log file, directory, or event log from Intune-managed devices.
- Deploys an Azure Function App, Storage Account, and Key Vault.
- Supports deployment via Azure Template Spec (with Bicep or ARM template) or direct ARM template.
- The main logic for log gathering is in the Remediation script (`Remediation Script/Detection.ps1`).
- The script connects to the Function App, authenticating itself with the Entra ID device registration certificate, downloads the `LogsGatherRules.json` file from the `rules` container, and reads it for instructions on which files to collect. When all required data is gathered, the Function App is queried again for a SAS token scoped to the Storage Account container named `logs`, where the compressed archive of all gathered logs is uploaded (see the sketch below for an outline of this flow).
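A minimal sketch of that client-side flow is shown below. This is an illustration of the pattern only, not the actual `Detection.ps1` code: the certificate-based authentication is omitted, and the request methods, bodies, and temporary paths are assumptions.

```powershell
# Illustrative sketch only - the real logic lives in "Remediation Script/Detection.ps1"
# and authenticates to the Function App with the Entra ID device registration certificate.

# 1. Fetch the collection rules via the GetBlobContent function (request shape is an assumption).
$rules = Invoke-RestMethod -Uri $FunctionGetBlobContent -Method Post

# 2. Gather the files/event logs described by the rules, then compress everything into one archive.
Compress-Archive -Path "$env:TEMP\IntuneLogCollector\*" -DestinationPath "$env:TEMP\CollectedLogs.zip" -Force

# 3. Request a SAS URI for the 'logs' container and upload the archive as a block blob.
$sasUri = Invoke-RestMethod -Uri $FunctionGetSASUri -Method Post
Invoke-RestMethod -Uri $sasUri -Method Put -InFile "$env:TEMP\CollectedLogs.zip" -Headers @{ 'x-ms-blob-type' = 'BlockBlob' }
```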
- Azure subscription with permissions to create resources and template specs.
- The user running the deployment must have at least the Contributor role on the target resource group. This is required for all resources and for the deployment script to execute successfully.
- Azure PowerShell or Azure CLI installed.
- Connect to Azure with a specific tenant and subscription:

  ```powershell
  Connect-AzAccount -Tenant <your-tenant-id> -Subscription <your-subscription-id>
  ```

  Replace `<your-tenant-id>` and `<your-subscription-id>` with your Azure tenant and subscription IDs.

- Clone the repository:

  ```powershell
  git clone https://github.com/MSEndpointMgr/IntuneLogCollector.git
  ```
- Publish the template spec (Bicep):

  Note: The Bicep CLI must be installed locally for PowerShell deployments (see Install Bicep).

  PowerShell:

  ```powershell
  New-AzTemplateSpec `
    -Name "IntuneLogCollector" `
    -Version "1.0.0" `
    -ResourceGroupName "<your-resource-group>" `
    -Location "<your-location>" `
    -TemplateFile "Deploy/logcoll-spec.bicep" `
    -UIFormDefinitionFile "Deploy/logcollector-def.json"
  ```
- Get the template spec resource ID:

  ```powershell
  $specId = (Get-AzTemplateSpec -Name "IntuneLogCollector" -ResourceGroupName "<your-resource-group>" -Version "1.0.0").Versions.Id
  ```

  Save this resource ID for deployment.
- Deploy the solution using the template spec:

  - Open the Azure portal and navigate to `https://portal.azure.com/#create/Microsoft.TemplateSpec/resourceId/$specId`, replacing `$specId` with the resource ID retrieved in the previous step.
  - This will launch the custom deployment experience for the Intune Log Collector solution, allowing you to configure and deploy all required resources using the portal form.
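  If you prefer to deploy from the command line rather than through the portal form, the published template spec can also be deployed with `New-AzResourceGroupDeployment`. This is a sketch; supply whatever parameters your template spec version actually defines:

  ```powershell
  # Deploy the published template spec directly from PowerShell.
  # Add -TemplateParameterObject (or individual parameters) as required by the template spec.
  New-AzResourceGroupDeployment `
    -ResourceGroupName "<your-resource-group>" `
    -TemplateSpecId $specId
  ```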
- Alternatively, deploy the solution directly from the ARM template:

  Note: Bicep is the recommended format for new deployments. Direct ARM/JSON deployment is supported for legacy scenarios only.

  Use the Deploy to Azure button and follow the portal prompts to configure and deploy the solution.
After the deployment completes (using either the Azure Deploy button or Template Spec method), you can safely remove the following resources from the resource group:
- `logcoll-script-identity` (user-assigned managed identity)
- `logsContainerPolicyScript` (deployment script resource)
- `zipDeployScript` (deployment script resource)
These resources are only required during the initial deployment and are not needed for the ongoing operation of the solution.
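If you prefer to script the cleanup, something like the following works under the assumption that the resources kept the default names listed above (requires the Az.Resources and Az.ManagedServiceIdentity modules; double-check the names in your resource group before deleting anything):

```powershell
# Remove deployment-time helper resources that are not needed for ongoing operation
$resourceGroup = '<your-resource-group>'

# Deployment script resources
Remove-AzDeploymentScript -ResourceGroupName $resourceGroup -Name 'logsContainerPolicyScript'
Remove-AzDeploymentScript -ResourceGroupName $resourceGroup -Name 'zipDeployScript'

# User-assigned managed identity used only by the deployment scripts
Remove-AzUserAssignedIdentity -ResourceGroupName $resourceGroup -Name 'logcoll-script-identity'
```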
The system-assigned managed identity of the Function App must be granted the Microsoft Graph Device.Read.All application permission. This is required because the Function App queries Microsoft Entra ID for device records to validate and authorize that requests come from Intune-managed devices (using the public key of the device registration certificate, which is present in the alternativeSecurityIds property of the device record). Without this permission, the Function App will not be able to retrieve device information and will fail to process requests from clients.
Note: Assigning Microsoft Graph application permissions to a managed identity must be done via PowerShell and Microsoft Graph API. The Azure Portal does not support this for managed identities as of 2025-09-08. If this changes in the future, the steps below can simply be skipped and the permission be assigned through the Azure portal.
- Install the Microsoft Graph PowerShell modules (if not already installed):

  ```powershell
  # Install only the minimal required Microsoft Graph submodules for this scenario
  Install-Module Microsoft.Graph.Authentication, Microsoft.Graph.Applications -Scope CurrentUser
  ```

  The required commands (Connect-MgGraph, Get-MgServicePrincipal, New-MgServicePrincipalAppRoleAssignment, etc.) are available after installing these child modules. You do not need to install the full Microsoft.Graph module.
- Connect to Microsoft Graph as a Global Administrator, Cloud Application Administrator or Application Administrator:

  ```powershell
  Connect-MgGraph -Scopes 'Application.ReadWrite.All','AppRoleAssignment.ReadWrite.All'
  ```

- Find the managed identity's service principal (replace `<function-app-name>` with your Function App name):

  ```powershell
  $sp = Get-MgServicePrincipal -Filter "displayName eq '<function-app-name>'"
  ```

- Get the Device.Read.All app role ID for Microsoft Graph:

  ```powershell
  $graphSp = Get-MgServicePrincipal -Filter "appId eq '00000003-0000-0000-c000-000000000000'"
  $role = $graphSp.AppRoles | Where-Object { $_.Value -eq 'Device.Read.All' -and $_.AllowedMemberTypes -contains 'Application' }
  ```

- Assign Device.Read.All to the managed identity:

  ```powershell
  New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $sp.Id -PrincipalId $sp.Id -ResourceId $graphSp.Id -AppRoleId $role.Id
  ```

- Verify the permission assignment:

  ```powershell
  Get-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $sp.Id | Where-Object { $_.ResourceDisplayName -eq 'Microsoft Graph' }
  ```

Note: You must be a Global Administrator or have sufficient directory permissions to grant admin consent for application permissions.
Once this is done, the Function App will be able to query device records in Microsoft Graph as required for the solution to function.
- The Function App uses Microsoft Graph to look up device objects and their properties in your tenant.
- This is essential for validating device identity, alternative security IDs, and other device attributes as part of the log collection workflow.
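For background, the kind of device lookup the Function App performs can be illustrated with a short, hypothetical Graph query. This is not the Function App's code, and it additionally requires the Microsoft.Graph.Identity.DirectoryManagement module (not needed for the permission assignment above); `<entra-device-id>` is a placeholder:

```powershell
# Hypothetical illustration: read a device record and its alternativeSecurityIds,
# which carry material derived from the device registration certificate.
Connect-MgGraph -Scopes 'Device.Read.All'
$device = Get-MgDevice -Filter "deviceId eq '<entra-device-id>'"
$device.AlternativeSecurityIds | Format-List
```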
- Register the permission in Azure Portal:
- Go to Azure Active Directory > App registrations > Managed Identities.
- Find and select your Function App's managed identity (search for the Function App name).
- In the left menu, select API permissions > Add a permission.
- Choose Microsoft Graph > Application permissions.
- Search for and select `Device.Read.All`.
- Click Add permissions.
- Grant admin consent:
- Still in the API permissions blade, click Grant admin consent for [Tenant].
- Confirm the action. The status should show as "Granted for [Tenant]".
Note: You must be a Global Administrator or have sufficient directory permissions to grant admin consent for application permissions.
Once this is done, the Function App will be able to query device records in Microsoft Graph as required for the solution to function.
After deployment, you must upload the LogsGatherRules.json file to the rules container in your Storage Account. This file controls what logs are collected from endpoints.
Step-by-step instructions:
- Download or edit your `LogsGatherRules.json` file (see `Files/LogsGatherRules.json` for a sample).
- Open the Azure Portal and navigate to your deployed Storage Account.
- In the left menu, select Containers and click on the `rules` container.
- Click Upload and select your `LogsGatherRules.json` file.
- Confirm the file appears in the container.
Alternatively, you can use Azure CLI or PowerShell:
Azure CLI:
```bash
az storage blob upload --account-name <storage-account-name> --container-name rules --name LogsGatherRules.json --file Files/LogsGatherRules.json --auth-mode login
```

Replace `<storage-account-name>` with your actual storage account name.
PowerShell (Az.Storage module):
```powershell
$ctx = New-AzStorageContext -StorageAccountName '<storage-account-name>' -UseConnectedAccount
Set-AzStorageBlobContent -Context $ctx -Container 'rules' -File 'Files/LogsGatherRules.json' -Blob 'LogsGatherRules.json'
```

Replace `<storage-account-name>` with your actual storage account name. The `-UseConnectedAccount` switch uses your signed-in Azure account for authentication, so make sure you are logged in with Connect-AzAccount first.
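To confirm the upload, you can list the blobs in the `rules` container with the same context (an optional check):

```powershell
# Optional: verify that LogsGatherRules.json is now present in the rules container
Get-AzStorageBlob -Context $ctx -Container 'rules' | Select-Object Name, LastModified
```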
The Detection.ps1 script requires you to specify the Function App URLs and Storage Account details for log upload and rule retrieval.
Variables to update:
```powershell
# Enter the GetSASUri function URI
$FunctionGetSASUri = "<enter_uri_for_function_GetSASUri>"

# Enter the GetBlobContent function URI
$FunctionGetBlobContent = "<enter_uri_for_function_GetBlobContent>"
```

How to find the Function App URLs:
- In the Azure Portal, go to your deployed Function App.
- In the left menu, select Functions.
- Click on the `GetSASUri` function. In the top menu, click Get Function URL. Copy the URL and paste it into `$FunctionGetSASUri` in your script.
- Repeat for the `GetBlobContent` function and update `$FunctionGetBlobContent`.
Example:
```powershell
$FunctionGetSASUri = "https://<function-app-name>.azurewebsites.net/api/GetSASUri?code=<function-key>"
$FunctionGetBlobContent = "https://<function-app-name>.azurewebsites.net/api/GetBlobContent?code=<function-key>"
```

Other variables to update:

- `$StorageAccountLogsName`: The name of your Storage Account (lowercase, no hyphens, with environment suffix if used).
- `$StorageAccountLogsContainerName`: Should be `logs`.
- `$StorageAccountRulesName`: The name of your Storage Account (same as above).
- `$StorageAccountRulesContainerName`: Should be `rules`.
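Pulled together, the edited variable block at the top of `Detection.ps1` might look like the following. These are placeholder values only; the Function App name, function keys, and storage account name are examples, not real values:

```powershell
# Example of a fully edited variable block (placeholder values)
$FunctionGetSASUri = "https://logcoll-func.azurewebsites.net/api/GetSASUri?code=<function-key>"
$FunctionGetBlobContent = "https://logcoll-func.azurewebsites.net/api/GetBlobContent?code=<function-key>"
$StorageAccountLogsName = "logcollstorage"            # lowercase, no hyphens
$StorageAccountLogsContainerName = "logs"
$StorageAccountRulesName = "logcollstorage"           # same Storage Account as above
$StorageAccountRulesContainerName = "rules"
```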
- Prepare the Detection Script:
- Ensure the script has been modified as explained in the previous section.
- Create a Remediation in Intune:
- Go to the Microsoft Endpoint Manager admin center.
- Navigate to Devices > Scripts and remediations > Remediations.
- Click Create.
- Upload `Detection.ps1` as the detection script. (You can also add a remediation script if needed.)
- Configure assignment and schedule as desired.
- Ensure Access to Storage Account:
- Devices running the script must have internet access to the Azure Storage Account.
- The script uses the rules file in the `rules` container to determine what to collect.
- Update Log Collection Rules:
- To change what is collected, update `LogsGatherRules.json` and upload it to the `rules` container in your storage account.
Summary:
- Upload `LogsGatherRules.json` to the `rules` container in your Storage Account.
- Update the Function App URLs and Storage Account details in `Detection.ps1`.
- Use the Azure Portal to find the correct Function URLs for both `GetSASUri` and `GetBlobContent`.
- The `Files/LogsGatherRules.json` file defines which logs and files are collected from target devices.
- After deployment, this file is placed in a container named `rules` on the deployed Azure Storage Account.
- You can extend log gathering by adding custom entries to the JSON file and uploading the updated file to the `rules` container.
- By default, the sample file includes common log locations for Intune and Windows diagnostics.
The LogsGatherRules.json file supports only the attributes shown in the sample file. The Remediation script will only process these attributes:
- Type: Specifies the rule type. Allowed values include `Folder`, `MultipleFiles`, `File`, `Registry`, `MDMDiagnostics`, `MDMReport`, `WindowsUpdateClient`, `EventLog`.
- Path: The file, folder, or registry path to collect.
- LogFolderName: The name of the folder where files, logs, or exported data are copied before all gathered logs are compressed into a single archive and sent to the storage account. This helps organize collected data by source or type prior to upload.
- Recurse: (For `Folder` type) Boolean indicating whether to search subfolders.
- FileExtension: (For `Folder` type) The file extension to filter files.
- FileNames: (For `MultipleFiles` type) Semicolon-separated list of file names to collect.
- Area: (For `MDMDiagnostics` type) Semicolon-separated list of diagnostic areas.
- EventLogName: (For `EventLog` type) The name of the event log to collect.
- EventLogPath: (For `EventLog` type) The event log path (provider).
No other attributes are supported. Only use the attributes and types present in the sample file to ensure compatibility with the Remediation script.
Below are the supported rule types and their required/optional attributes, based on the sample file:
Collects files from a folder, optionally filtered by extension and recursion.
```json
{
  "Type": "Folder",
  "Path": "C:\\Windows\\Logs",
  "LogFolderName": "WindowsLogs",
  "Recurse": false,
  "FileExtension": "log"
}
```

- Type: "Folder"
- Path: Folder path
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
- Recurse: (optional) true/false
- FileExtension: (optional) file extension to filter
Collects specific files from a folder.
```json
{
  "Type": "MultipleFiles",
  "Path": "C:\\Windows\\Temp",
  "LogFolderName": "WindowsTemp",
  "FileNames": "msedge_installer.log;sample_logfile_name.log"
}
```

- Type: "MultipleFiles"
- Path: Folder path
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
- FileNames: Semicolon-separated list of file names
Collects a single file.
```json
{
  "Type": "File",
  "Path": "C:\\Windows\\System32\\drivers\\CrowdStrike\\hbfw.log",
  "LogFolderName": "CrowdStrike"
}
```

- Type: "File"
- Path: File path
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
Exports a registry key.
```json
{
  "Type": "Registry",
  "Path": "HKLM:\\SOFTWARE\\Microsoft\\IntuneManagementExtension",
  "LogFolderName": "Registry"
}
```

- Type: "Registry"
- Path: Registry key path
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
Collects MDM diagnostics for specified areas.
```json
{
  "Type": "MDMDiagnostics",
  "Area": "Autopilot;DeviceEnrollment;DeviceProvisioning",
  "LogFolderName": "MDMDiagnostics"
}
```

- Type: "MDMDiagnostics"
- Area: Semicolon-separated diagnostic areas. For a list of available diagnostic areas, see the official Microsoft documentation: MDMDiagnosticsTool.exe command-line options
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
Collects MDM report.
```json
{
  "Type": "MDMReport",
  "LogFolderName": "MDMReport"
}
```

- Type: "MDMReport"
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
Collects Windows Update Client logs.
```json
{
  "Type": "WindowsUpdateClient",
  "LogFolderName": "WindowsUpdateClient"
}
```

- Type: "WindowsUpdateClient"
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
Exports Windows event logs.
```json
{
  "Type": "EventLog",
  "EventLogName": "Application",
  "EventLogPath": "",
  "LogFolderName": "EventLogs"
}
```

- Type: "EventLog"
- EventLogName: Log name (e.g., Application, System, Operational, Admin, etc.)
- EventLogPath: (optional) Provider path (e.g., Microsoft-Windows-AAD)
- LogFolderName: Logical folder inside the compressed archive containing the different collected logs
Use only these constructs and attributes for compatibility with the Remediation script.
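For orientation, a complete rules file that combines several of the rule types above might look like the following. This is a sketch only: it assumes the top-level structure of `LogsGatherRules.json` is a JSON array of rule objects, so compare it against the shipped `Files/LogsGatherRules.json` sample before using it:

```json
[
  {
    "Type": "Folder",
    "Path": "C:\\ProgramData\\Microsoft\\IntuneManagementExtension\\Logs",
    "LogFolderName": "IntuneManagementExtension",
    "Recurse": false,
    "FileExtension": "log"
  },
  {
    "Type": "Registry",
    "Path": "HKLM:\\SOFTWARE\\Microsoft\\IntuneManagementExtension",
    "LogFolderName": "Registry"
  },
  {
    "Type": "EventLog",
    "EventLogName": "Application",
    "EventLogPath": "",
    "LogFolderName": "EventLogs"
  }
]
```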
For issues or questions, open an issue in this repository or contact the maintainers.