Azure Log Ingestion API configuration
Omada currently uses the HTTP Data Collector API to send logs to Azure Log Analytics. However, the HTTP Data Collector API is deprecated and will stop working after September 14, 2026. To ensure uninterrupted connector functionality, you must migrate to the Log Ingestion API and update your existing connectors so that they remain compatible with Log Analytics workspaces and can take advantage of the enhanced ingestion capabilities.
Here, you can find information on the following scenarios:

- New setup – configure Azure Log Analytics using the Log Ingestion API
- Migration – migrate from the legacy HTTP Data Collector API to the Log Ingestion API

Both scenarios preserve backward compatibility with existing configurations.
Prerequisites
Before starting the migration, ensure the following requirements are met:

- Azure CLI installed locally, or access to Azure Cloud Shell
- An Azure account with permissions to create Data Collection Rules (DCRs) and manage Log Analytics workspaces:

| Task | Required Role | Scope | Alternative Permission |
|---|---|---|---|
| Create Table | Owner | Log Analytics Workspace | Microsoft.OperationalInsights/workspaces/tables/write |
| Migrate Table | Owner | Log Analytics Workspace | Microsoft.OperationalInsights/workspaces/tables/write |
| Create DCR | Contributor or Monitoring Contributor | Resource Group | Microsoft.Insights/dataCollectionRules/write |
| Assign RBAC | Owner or User Access Administrator | DCR Resource | Microsoft.Authorization/roleAssignments/write |
After ensuring the above prerequisites are met, verify your role assignments:
az role assignment list --assignee <your-email> --resource-group <resource-group>
If you do not have the required permissions, contact your Azure administrator to request the Contributor role at the resource group level.
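If you only want to see the names of the roles assigned to you, you can add a JMESPath query to the same command. This is an optional variant of the check above; replace the placeholders with your own values:

```shell
# Optional variant: list only the role names you hold on the resource group.
az role assignment list `
  --assignee <your-email> `
  --resource-group <resource-group> `
  --query "[].roleDefinitionName" `
  --output tsv
```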
Configuration procedure
Follow the steps below to migrate from the HTTP Data Collector API to the Log Ingestion API.
Azure authentication
Sign in to Azure using the Azure CLI:
az login
Prepare the Log Analytics table
- Create new table
- Migrate existing table
If you are setting up the Log Ingestion API for the first time and do not have an existing table, create a custom table in Azure Log Analytics.
Required permission: Microsoft.OperationalInsights/workspaces/tables/write, or the Owner role on the Log Analytics workspace.
The following example shows how to create the custom table using the Azure CLI.
az monitor log-analytics workspace table create `
--resource-group <<resource-group-name>> `
--workspace-name <<log-analytics-name>> `
--name <<log-analytics-tablename>> `
--columns `
TimeGenerated=datetime `
Category=string `
Component_s=string `
ContextId_s=string `
Context_s=string `
CorrelationId=string `
DateTime_t=datetime `
ElapsedTime_d=real `
EventId_d=real `
Event_s=string `
Exception_s=string `
HostName_s=string `
Layer_s=string `
Level_d=real `
Location_s=string `
MayContainPII_b=boolean `
Message=string `
MetaData_s=string `
TenantId_d=real `
UserId_s=string `
Username_s=string `
--retention-time 60
Make sure to replace <<resource-group-name>>, <<log-analytics-name>>, and <<log-analytics-tablename>> with values from your configuration. Custom table names must end with the _CL suffix.
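Because the same column list must later be repeated in the DCR stream declaration, it can help to keep it in one place. The following is an optional convenience sketch, not part of the official procedure; it renders the --columns arguments for the command above from a single column map:

```python
# Optional sketch: generate the --columns arguments for
# "az monitor log-analytics workspace table create" from one column map,
# so the table schema and the DCR stream declaration cannot drift apart.
OIS_COLUMNS = {
    "TimeGenerated": "datetime",
    "Category": "string",
    "Component_s": "string",
    "ContextId_s": "string",
    "Context_s": "string",
    "CorrelationId": "string",
    "DateTime_t": "datetime",
    "ElapsedTime_d": "real",
    "EventId_d": "real",
    "Event_s": "string",
    "Exception_s": "string",
    "HostName_s": "string",
    "Layer_s": "string",
    "Level_d": "real",
    "Location_s": "string",
    "MayContainPII_b": "boolean",
    "Message": "string",
    "MetaData_s": "string",
    "TenantId_d": "real",
    "UserId_s": "string",
    "Username_s": "string",
}

def columns_args(columns):
    """Render the name=type pairs in the format the Azure CLI expects."""
    return " ".join(f"{name}={ctype}" for name, ctype in columns.items())

print(columns_args(OIS_COLUMNS))
```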
If you are migrating from the HTTP Data Collector API and already have an existing custom table, migrate the table to support Data Collection Rules (DCR).
Required permission: Microsoft.OperationalInsights/workspaces/tables/write, or the Owner role on the Log Analytics workspace.
Run the migration command for the OIS_CL Log Analytics table. Replace <<subscription-id>>, <<resource-group>>, and <<log-analytics-name>> with values from your configuration.
az rest --method post --url "https://management.azure.com/subscriptions/<<subscription-id>>/resourcegroups/<<resource-group>>/providers/Microsoft.OperationalInsights/workspaces/<<log-analytics-name>>/tables/OIS_CL/migrate?api-version=2025-02-01"
This migration preserves existing data and maintains continuity for dashboards, alerts, and integrations.
Create direct DCR using Azure CLI
Use the Azure CLI to create a Data Collection Rule (DCR) based on a JSON definition file.
If you want to keep the existing table and fields, the Data Collection Rule (DCR) must use exactly the same field names and structure. Keeping the same table and structure preserves existing data and maintains continuity for dashboards, alerts, and integrations.
Command template
Run the following command, replacing the placeholders with values from your configuration:
az monitor data-collection rule create `
--location '<<location>>' `
--resource-group '<<resource-group>>' `
--name '<<dcr-name>>' `
--rule-file '"<<path-to-dcr-json>>"' `
--description '<<description>>'
Example
The following example creates a DCR in the westeurope region using a local JSON definition file:
az monitor data-collection rule create `
--location 'westeurope' `
--resource-group 'resgroup-name' `
--name 'resgroup-name-la-dcr1' `
--rule-file '"C:\Users\johndoe\Desktop\dcr1.json"' `
--description 'DCR for log ingestion'
Required: A Data Collection Rule (DCR) JSON definition file (for example, C:\Users\johndoe\Desktop\dcr1.json).
Example DCR JSON definition
Use the following example as a starting point for your Data Collection Rule (DCR) JSON definition. This configuration defines a Direct DCR for ingesting custom OIS_CL data into a Log Analytics workspace.
{
"location": "westeurope",
"kind": "Direct",
"properties": {
"streamDeclarations": {
"Custom-<<log-analytics-tablename>>": {
"columns": [
{
"name": "Category",
"type": "string"
},
{
"name": "Component_s",
"type": "string"
},
{
"name": "ContextId_s",
"type": "string"
},
{
"name": "Context_s",
"type": "string"
},
{
"name": "CorrelationId",
"type": "string"
},
{
"name": "DateTime_t",
"type": "datetime"
},
{
"name": "ElapsedTime_d",
"type": "real"
},
{
"name": "EventId_d",
"type": "real"
},
{
"name": "Event_s",
"type": "string"
},
{
"name": "Exception_s",
"type": "string"
},
{
"name": "HostName_s",
"type": "string"
},
{
"name": "Layer_s",
"type": "string"
},
{
"name": "Level_d",
"type": "real"
},
{
"name": "Location_s",
"type": "string"
},
{
"name": "MayContainPII_b",
"type": "boolean"
},
{
"name": "Message",
"type": "string"
},
{
"name": "MetaData_s",
"type": "string"
},
{
"name": "TenantId_d",
"type": "real"
},
{
"name": "UserId_s",
"type": "string"
},
{
"name": "Username_s",
"type": "string"
}
]
}
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "<<log-analytics-workspaceresourceid>>",
"name": "<<log-analytics-name>>"
}
]
},
"dataFlows": [
{
"streams": [
"Custom-<<log-analytics-tablename>>"
],
"destinations": [
"<<log-analytics-name>>"
],
"transformKql": "source | project TimeGenerated=now(), Category, Component_s, ContextId_s, Context_s, CorrelationId, DateTime_t, ElapsedTime_d, EventId_d, Event_s, Exception_s, HostName_s, Layer_s, Level_d, Location_s, MayContainPII_b, Message, MetaData_s, TenantId_d, UserId_s, Username_s"
}
]
}
}
Update the workspaceResourceId, name, and destination name to match your specific Log Analytics workspace configuration.
Verify DCR Creation
After creating the Data Collection Rule (DCR), verify that it was created with the required properties.
Check DCR properties
- In the Azure Portal, go to Data Collection Rules and select [Your DCR Name].
- Select JSON view (top right).
- From the API version dropdown, select 2024-03-11.
Verify required properties
Ensure the following properties are present in the JSON definition:
- "kind": "Direct"
- The "streamDeclarations" section
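The portal check can also be scripted. The sketch below is an illustration, not an official tool: it takes the DCR document you see in the JSON view and reports whether the two required properties are present.

```python
import json

def check_dcr(dcr):
    """Return a list of problems; an empty list means the DCR looks correct."""
    problems = []
    if dcr.get("kind") != "Direct":
        problems.append('missing or wrong "kind" (expected "Direct")')
    if "streamDeclarations" not in dcr.get("properties", {}):
        problems.append('missing "properties.streamDeclarations" section')
    return problems

# Example: a minimal document that passes both checks.
dcr = json.loads(
    '{"kind": "Direct", "properties": {"streamDeclarations": {"Custom-OIS_CL": {}}}}'
)
print(check_dcr(dcr))  # an empty list means both properties are present
```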
Fix missing properties (if required)
If any of the required properties are missing:
- In the Azure Portal, go to Data Collection Rules → [DCR Name] > Export template.
- Select Deploy and then choose Edit template.
- Add the missing properties, leaving the rest of the template unchanged:
{
"kind": "Direct",
"streamDeclarations": { /* your stream declarations */ }
}
- Save the changes and complete the deployment.
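If you prefer to patch the exported template programmatically instead of editing it in the portal, the following sketch shows the idea. The resource dictionary and the abbreviated stream declaration are placeholders; use the full column list from your own DCR.

```python
# Sketch: add the missing "kind" and "streamDeclarations" properties to an
# exported DCR resource, leaving everything else unchanged.
def patch_dcr_resource(resource, stream_declarations):
    resource["kind"] = "Direct"
    resource.setdefault("properties", {})["streamDeclarations"] = stream_declarations
    return resource

# Hypothetical exported resource, abbreviated for illustration.
resource = {
    "type": "Microsoft.Insights/dataCollectionRules",
    "properties": {"description": "DCR for log ingestion"},
}
patched = patch_dcr_resource(resource, {"Custom-OIS_CL": {"columns": []}})
```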
Example ARM template for DCR deployment
Use this ARM template if the CLI-based DCR creation does not include the required kind: Direct and streamDeclarations properties. You can also merge these properties into your existing ARM or JSON definition to complete the DCR configuration before deployment.
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"dataCollectionRules_example_dcr1_name": {
"defaultValue": "<<dcr-name>>",
"type": "String"
},
"workspaces_example_la_externalid": {
"defaultValue": "<<log-analytics-workspaceresourceid>>",
"type": "String"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Insights/dataCollectionRules",
"apiVersion": "2023-03-11",
"name": "[parameters('dataCollectionRules_example_dcr1_name')]",
"location": "westeurope",
"kind": "Direct",
"properties": {
"description": "DCR for log ingestion",
"streamDeclarations": {
"Custom-OIS_CL": {
"columns": [
{
"name": "Category",
"type": "string"
},
{
"name": "Component_s",
"type": "string"
},
{
"name": "ContextId_s",
"type": "string"
},
{
"name": "Context_s",
"type": "string"
},
{
"name": "CorrelationId",
"type": "string"
},
{
"name": "DateTime_t",
"type": "datetime"
},
{
"name": "ElapsedTime_d",
"type": "real"
},
{
"name": "EventId_d",
"type": "real"
},
{
"name": "Event_s",
"type": "string"
},
{
"name": "Exception_s",
"type": "string"
},
{
"name": "HostName_s",
"type": "string"
},
{
"name": "Layer_s",
"type": "string"
},
{
"name": "Level_d",
"type": "real"
},
{
"name": "Location_s",
"type": "string"
},
{
"name": "MayContainPII_b",
"type": "boolean"
},
{
"name": "Message",
"type": "string"
},
{
"name": "MetaData_s",
"type": "string"
},
{
"name": "TenantId_d",
"type": "real"
},
{
"name": "UserId_s",
"type": "string"
},
{
"name": "Username_s",
"type": "string"
}
]
}
},
"destinations": {
"logAnalytics": [
{
"workspaceResourceId": "[parameters('workspaces_example_la_externalid')]",
"name": "log-analytics-name"
}
]
},
"dataFlows": [
{
"streams": [
"Custom-OIS_CL"
],
"destinations": [
"log-analytics-name"
],
"transformKql": "source | project TimeGenerated=now(), Category, Component_s, ContextId_s, Context_s, CorrelationId, DateTime_t, ElapsedTime_d, EventId_d, Event_s, Exception_s, HostName_s, Layer_s, Level_d, Location_s, MayContainPII_b, Message, MetaData_s, TenantId_d, UserId_s, Username_s"
}
]
}
}
]
}
Verify that all DCR information is visible in the JSON view and that API version 2024-03-11 is selected.
Configure RBAC permissions
Assign the required role to allow your application to send data using the Log Ingestion API:
- Role: Monitoring Metrics Publisher
- Scope: your Data Collection Rule (DCR)
- Principal: your service principal / application
Permission propagation may take 10–15 minutes.
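The assignment can also be made with the Azure CLI. This is a sketch; all << >> values are placeholders you must replace with your service principal's ID and the resource path of your DCR:

```shell
# Sketch: grant the Monitoring Metrics Publisher role on the DCR.
# Replace <<app-object-id>>, <<subscription-id>>, <<resource-group>>,
# and <<dcr-name>> with your values.
az role assignment create `
  --assignee <<app-object-id>> `
  --role "Monitoring Metrics Publisher" `
  --scope "/subscriptions/<<subscription-id>>/resourceGroups/<<resource-group>>/providers/Microsoft.Insights/dataCollectionRules/<<dcr-name>>"
```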
Update OIS log configuration
Add the following parameters to your azureLogAnalytics target in the log configuration:
| Parameter | Value Source | Example |
|---|---|---|
| dataCollectionRuleId | DCR JSON > properties.immutableId | dcr-abc123... |
| streamName | DCR JSON > properties.streamDeclarations.[stream_name] | Custom-OIS_CL |
| logsIngestionEndpoint | DCR JSON > properties.endpoints.logsIngestion | https://[endpoint].ingest.monitor.azure.com |
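The connector performs the actual upload; the sketch below only illustrates the shape of a record for the Custom-OIS_CL stream and checks it against the declared columns. Note that TimeGenerated is not part of the stream declaration, because the transformKql expression adds it at ingestion time. The sample record values are hypothetical.

```python
# Illustration: every field sent to the Custom-OIS_CL stream must match a
# column declared in the DCR's streamDeclarations; undeclared fields are
# dropped during ingestion.
STREAM_COLUMNS = {
    "Category", "Component_s", "ContextId_s", "Context_s", "CorrelationId",
    "DateTime_t", "ElapsedTime_d", "EventId_d", "Event_s", "Exception_s",
    "HostName_s", "Layer_s", "Level_d", "Location_s", "MayContainPII_b",
    "Message", "MetaData_s", "TenantId_d", "UserId_s", "Username_s",
}

# A hypothetical record, for illustration only.
record = {
    "Category": "Information",
    "Message": "Connector heartbeat",
    "Level_d": 2.0,
    "MayContainPII_b": False,
}

undeclared = set(record) - STREAM_COLUMNS
print(sorted(undeclared))  # an empty list means every field is declared
```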
XML configuration example for on-prem
Update your existing azureLogAnalytics target configuration by adding the three new parameters:
<target logTargetType="azureLogAnalytics" name="azureLog" disabled="false">
<targetParameters>
<targetParameter name="workspaceid" value="testWorkspaceId" />
<targetParameter name="logname" value="OIS" />
<targetParameter name="applicationid" value="testAppId" />
<targetParameter name="sharedkey" value="testSharedkey" />
<targetParameter name="clientsecret" value="testSecret" />
<targetParameter name="azuretenantid" value="testAzureTenant" />
<!-- Example of the 3 new parameters to configure logging through Log Ingestion API -->
<targetParameter name="dataCollectionRuleId" value="dcr-9e6d24a9cae24556baa07f1a4a21f0ff" />
<targetParameter name="streamName" value="Custom-OIS_CL" />
<targetParameter name="logsIngestionEndpoint" value="https://environment-la-dcr1-6261-westeurope.logs.z1.ingest.monitor.azure.com" />
</targetParameters>
</target>
Replace the example values shown above with the actual values from your DCR. For more information, refer to the table above.
XML configuration example for Cloud
Update your existing Azure Log Analytics configuration in the management portal by adding the three new parameters:
- Data Collection Rule Id = "<<dcr-id>>"
- Stream Name = "Custom-<<log-analytics-tablename>>"
- Log Ingestion Endpoint = "<<logingestion-endpoint>>"
Replace the values wrapped in << >> with the actual values from your DCR. For more information, refer to the table above.
Once the configuration steps above are completed, your connector is ready to use the Log Ingestion API.
Backward compatibility
During the transition period, both APIs are supported:
- If the Log Ingestion API parameters (dataCollectionRuleId, streamName, logsIngestionEndpoint) are configured, the connector sends logs using the Log Ingestion API.
- If these parameters are not configured, the connector continues to use the HTTP Data Collector API.
The HTTP Data Collector API is deprecated and will stop working after September 14, 2026. Configurations that still rely on it must be migrated before that date.