Version: On prem: 15.0.0

Event Logging configuration

DISCLAIMER


This page contains third-party references. We strive to keep our content up to date; however, content referring to external vendors may change independently of Omada. If you spot any inconsistency, please report it to our Helpdesk.

The logging framework provides many log entries that can be directed to several log targets.

info

In the default configuration, all logs except authentication-related ones go through the Omada Logging Framework. When the logging framework is enabled, a high number of logs is created. If you need to limit the number of logs for capacity reasons when performing an Omada Identity upgrade, disable the logging framework for the initial load using the Enable system event logging customer setting.

In the logging framework, the log of successful logons covers only:

  • OpenID
  • SAML
  • Forms logon

Currently, the logging framework does not log successful logons for:

  • Active Directory SSO
  • Basic Authentication

The following sections list the event types and categories that are logged:

Omada Logging Framework event types

Event | Event ID | Message
ServerConfiguration | 10002 | Failed loading code assembly: {fileName}
ServerConfiguration | 10002 | Error reading configuration object secrets from Vault Service
ServerConfiguration | 10002 | ServerConfiguration convert error ({key})
ServerConfiguration | 10002 | ReportViewerException: {ex.Message}.
LicenseError | 10003 | GetLicenseInfo
LicenseError | 10003 | License
ExecuteTimer | 10005 | Exception occurred when executing dataexchange
ExecuteTimer | 10005 | ExecuteDataObjectEventActions
ExecuteTimer | 10005 | ExecuteProcessEventAction
ExecuteTimer | 10005 | Execute Timer: {timerId}.
ExecuteTimer | 10005 | Execute Timer: {timerId} Failed.
ObjectNotFound | 10100 | Cannot display object with ID {dataObjectId} either because it does not exist or because you do not have access to it
FailureAudit | 10400 | Audit Failure for {customer}\{userName}. The password is invalid.
FailureAudit | 10400 | Audit Failure for {customer}\{userName}. The user is not found
SamlResponseFailed | 10401 | SamlResponse not base64 encoded
SamlResponseFailed | 10401 | SAMLResponse XML invalid: {responseXml}
SamlResponseFailed | 10401 | SAMLResponse not valid according to schema
SamlResponseFailed | 10401 | SAMLResponse InResponseTo not valid: {requestId}
SamlResponseFailed | 10401 | SAMLResponse audience not valid
SamlResponseFailed | 10401 | SAMLResponse expired
SamlResponseFailed | 10401 | SAMLResponse signature reference cannot be validated
SamlResponseFailed | 10401 | SAMLResponse signature cannot be validated
OpenIDResponseFailed | 10402 | Token is not a well-formed JWT
OpenIDResponseFailed | 10402 | SecurityToken is not a JWT.
OpenIDResponseFailed | 10402 | Error occurred validating JWT.
OpenIDResponseFailed | 10402 | Failed to discover openid configuration from {endpoint}
OpenIDResponseFailed | 10402 | {message}
FailureAudit | 10403 | Failed to set new password
FailureAudit | 10404 | Audit Failure for {customer}\{userName}. The user is inactive.
FailureAudit | 10405 | Audit Failure for {customer}\{userName}. Too many logon attempts ({logonAttempts}). The user has been deactivated.
FailureAudit | 10406 | Audit Failure for {customer}\{userName}. The password has expired ({userInfo2.LastPasswordChange.ToString("g")}).
SamlConfiguration | 10407 | SAML certificate with serial {IdpCertSerialNo} not found
SamlConfiguration | 10407 | SAML certificate does not have private key and will not be used for signing the logout request
SamlConfiguration | 10407 | Failed to discover SAML configuration from {endpoint}. Reverting to legacy configuration.
LogConfiguration | 10408 | GetConfigurationData
LogConfiguration | 10408 | Error in log configuration
LogConfiguration | 10408 | Unable to set up logging of changes to property {sysname}. Property does not exist.
SuccessAudit | 10409 | Logon successful for {customer}\{tbUserName.Text}
Logoff | 10410 | Logoff initiated for {customerName}\\{AppIdentity.Identity.UserInfo.UserName}
ClientAuth | 10420 | Unexpected error generating token.
CodeMethodException | 10500 | Tabular Model processing failed after {totalSeconds} seconds.
CodeMethodException | 10500 | StartImportProfile failed
CodeMethodException | 10500 | CheckForPotentiallyStateImport failed
CodeMethodException | 10500 | Survey loaded by id:{surveyId} and key {sourceKey} is null
CodeMethodMalConfiguration | 10502 | Activity: {activityToReassign.Id} was not assigned due to unmap of the assignee group for the specified tag
CodeMethodValidation | 10503 | CodeMethod {method} ({actionId}) on EventDefinition {eventDefinitionName} ({eventDefinitionId}) failed: {errorMessage}
AccessModifierValidation | 10504 | AccessModifier on {parentType} {parentName} is obsolete: {accessModifierClassInfoName}
AccessModifierValidation | 10504 | AccessModifier failed on {parentType} {parentName}: {e.Message}
AccessDataUploadError | 10505 | OIM_AccessDataUploadHandler
ReferencedObjectDeleted | 10506 | Referenced object name not found ({dataObjectId})
XmlSchemaFileUpdateError | 10550 | XML Schema file path not found: "{path}"
XmlSchemaFileUpdateError | 10550 | Error loading xml schema: "{path}": {ex.Message}
DatabaseScriptExecute | 10601 | Error loading SystemUpdateActions
DatabaseScriptExecute | 10601 | The following script failed to execute: {cmdStr}
AppStringsImported | 10602 | ImportConfigurationChanges - UpdateAllAppStrings
UpdateActionAssemblyError | 10603 | Cannot resolve types in assembly "{assembly.CodeBase}"
UpdateActionObjectUpdated | 10605 | Error Updating the Omada Identity System Default Queries
UpdateActionDataError | 10606 | Failed to migrate account resource: {accountResource.DisplayName}
UpdateActionDataError | 10606 | Error enabling Identity Queries for the Omada Identity System.
UpdateActionDataError | 10606 | Error Updating the Omada Identity System Default Queries.
MailTooManyReceivers | 10700 | Refuse to send event mails as there are more than {maxReceivers} receivers (eventdef.id: {eventDef.Id} actionid: {action.Id} actionobj.id {actionObject.Id} transitionid: {transitionId} receivers: {receivers.Count}"
MailTooManyReceivers | 10700 | Refused to send event mails as there are more than {maxReceivers} receivers ({receiverUserIds.Count})
MailNoValidEmailAddress | 10701 | Mail receiver does not have a valid email address
Mail sent failure | 10702 | Could not send mail
MailCannotBeQueued | 10703 | MailController.SendMail()
MailCannotDecryptVar | 10704 | DecryptFields
MailCannotRemoveMailFromQueue | 10705 | Could not remove mail from queue
MailCannotAddMailLog | 10706 | Failing to add mail log ({mailId})
MailCannotAddMailLog | 10706 | Failing to create mail log ({mailId})
Mail sent | 10707 | Mail sent
MailConnection | 10708 | Error when loading the NotificationSettings ConfigurationData
MailCheckQueue | 10710 | TimerService.CheckMailQueue (Customer {customer.Id})
MailSendMails | 10711 | TimerService.SendMailsFromQueue Queued: {queueCount}
ILMExportFailed | 10900 | UpdateDataObject failed with message: {e.Message} avp source is: {avpText}
WebRequest | 11001 | Failed to send email message to support.
SimulationHubRun | 12003 | RunViolationImpactSimulation
RunArchiveBatch | 12100 | ArchivingManager.DoExecute
WebserviceError | 13000 | Error occurred when trying to test if RoPE RemoteApi is alive
WebserviceError | 13000 | Cannot get last import status for system {sysId}
WebserviceError | 13000 | Cannot connect to Import service
WebserviceError | 13000 | Webservice exec error PushConfiguration()
WebserviceError | 13000 | Webservice exec error
WebPhishingAttempt | 13100 | not a local ULR
Password change | 13210 | Audit password change for {customerName}\\{AppIdentity.Identity.UserInfo.UserName}
PolicyCheckConfigurationError | 13350 | Configuration error in PeerAccessPolicyCheck
PolicyCheckExecutionError | 13351 | Error processing PeerAccessPolicyCheck
ODataError | 13400 | The user does not have read access for the 'Properties' Authorization role element
ODataError | 13400 | Error parsing ETag
ODataError | 13400 | BadRequestNoDataReceived
ODataError | 13400 | OData error
GraphQLError | 13500 | GraphQL query caused exception.
GraphQLError | 13500 | GraphQL query failed with errors.
GraphQLError | 13500 | Error deserializing value of {CustomerSettingKey.UiHomePageActions}
GraphQLError | 13500 | GraphQL error
DataExchangeError | 13600 | Error executing data exchange!
AjaxGridError | 14000 | DataProducerFactory - GetProducers error
SurveyFeatureInfo | 15000 | Survey data resend failed to complete for {Count} of {tasksLength} surveys.
SurveyFeatureInfo | 15000 | Survey data resend failed to complete
SurveyFeatureInfo | 15000 | Could not load survey template with id: {surveyTemplateId}
SurveyOwnershipNotAccepted | 15001 | Ownership not accepted for {dataObject.DisplayName}
ComplexDataSetBuilderError | 16000 | ComplexDataSetBuilderFactory - GetDataSetBuilders error
CrossDatabaseQueryDetected | 16001 | DataSource "{dataSourceName}" has one or more cross database joins which is not recommended from a performance and compatibility perspective
CrossDatabaseQueryDetected | 16001 | Survey "{surveyTemplate.Name}" has a data source "{dataSource.Name}" with one or more cross-database joins which is not recommended from a performance and compatibility perspective
KPIEvaluationError | 17000 | Error evaluating KPI '{kpi.Name}' ({kpi.Id}) counter and status
VirtualPropertyResolverError | 18000 | VirtualPropertyResolverFactory - GetVirtualPropertyResolvers error
Identity disabled | 23001 | A data object of type Identity was modified. Identity disabled
Identity locked | 23002 | A data object of type Identity was modified. Identity locked
Identity re-enabled | 23003 | A data object of type Identity was modified. Identity re-enabled
Request submitted | 23101 | A data object of type TRG_ACCESsREQUEST was created. Request submitted
Request approved | 23102 | A data object of type ResourceAssignment was modified. Request approved
Request rejected | 23103 | A data object of type ResourceAssignment was modified. Request rejected
Members added | 23201 | Memberships changed for user group "{UserGroup}". Members added ({Number}): {Identity}
Members removed | 23202 | Memberships changed for user group "{UserGroup}". Members removed ({Number}): {Identity}
Resource changed | 23203 | A data object of type Resource was modified. Resource changed
Approval configuration changed | 23204 | A data object of type ResourceFolder was modified. Approval configuration changed
Assignment policy changed | 23205 | A data object of type AssignmentPolicy was modified. Assignment policy changed
Constraint changed | 23206 | A data object of type Constraint was modified. Constraint changed
Prioritization policy changed | 23207 | A data object of type Prioritization policy was modified. Prioritization policy changed
Survey started | 23301 | A data object of type TRG_SURVEY was modified. Survey started
Survey completed | 23302 | A data object of type TRG_SURVEY was modified. Survey completed
System owner changed | 23303 | A data object of type System was modified. System owner(s) changed
System classification changed | 23304 | A data object of type System was modified. System classification changed
Identity created | 23305 | A data object of type Identity was created. Identity created
Identity modified | 23306 | A data object of type Identity was modified. Identity modified
Identity manager changed | 23307 | A data object of type Identity was modified. Identity manager(s) changed
Org. Unit manager changed | 23308 | A data object of type OrgUnit was modified. Org. unit manager(s) changed
New system onboarded | 23401 | A data object of type System was created. New system on-boarded
Technical identity requested | 23402 | A data object of type TECHIDENTREQUEST was created. Technical identity requested
Warehouse import succeeded | 23403 | A data object of type ODWIMPORTPROFILE was modified. Warehouse import successful
Warehouse import partially succeeded | 23404 | A data object of type ODWIMPORTPROFILE was modified. Warehouse import partially succeeded
Warehouse import failed | 23405 | A data object of type ODWIMPORTPROFILE was modified. Warehouse import failed
Omada Logging Framework categories

Note that some categories are used by more than one component; such categories appear under each component that uses them.

Component | Category | Description
Enterprise Server | AccessRequests | Anything related to requesting access, including review, approvals, denials, etc.
Enterprise Server | Authentication | Logging in and out of Omada Identity, failed login attempts, wrong password used.
Enterprise Server | Configuration | Configuration changes and issues, such as errors in the system onboarding configuration, changes to views, properties, workflows, etc.
Enterprise Server | Debug | Debug information, additional information to debug issues.
Enterprise Server | Exception | When the code doesn't know what to do anymore.
Enterprise Server | Governance | Reports generated, attestations, classifications, Risk, SoD matters.
Enterprise Server | Password | Events related to password resets and the password filter.
Enterprise Server | Security | Identity lock-outs, re-enabling identities.
Enterprise Server | SystemOperation | Events regarding timers that have executed, and system events and processes that have run or failed.
Enterprise Server | UserActions | Users performing searches, downloads.
Enterprise Server | Mail | Mail-related issues, mails being sent or not sent.
Enterprise Server | IdentityLifecycle | Requesting, adding, changing, disabling identities.
Enterprise Server | Business Alignment | Create, update, delete roles, policies, contexts.
Role and Policy Engine | Authentication | Logging in and out of Omada Identity, failed login attempts, wrong password used.
Role and Policy Engine | Calculation |
Role and Policy Engine | Configuration | Configuration changes and issues, such as errors in the system onboarding configuration, changes to views, properties, workflows, etc.
Role and Policy Engine | Data | Data that is processed in Omada Identity, e.g. identities, resources, and accounts, but excluding configuration data.
Role and Policy Engine | Debug | Debug information, additional information to debug issues.
Role and Policy Engine | Exception | When the code doesn't know what to do anymore.
Omada Data Warehouse | Data | Data that is processed in Omada Identity, e.g. identities, resources, and accounts, but excluding configuration data.
Omada Data Warehouse | Configuration | Configuration changes and issues, such as errors in the system onboarding configuration, changes to views, properties, workflows, etc.
Omada Data Warehouse | Connection | Connection-related events, e.g. URL, credentials, timeout.
Omada Data Warehouse | Progress | Progress of a running process or operation.
Omada Provisioning Service | Debug | Debug information, additional information to debug issues.
Omada Provisioning Service | SystemOperation | Events regarding timers that have executed, and system events and processes that have run or failed.
Omada Provisioning Service | Configuration | Configuration changes and issues, such as errors in the system onboarding configuration, changes to views, properties, workflows, etc.
Omada Provisioning Service | Provisioning | Everything concerning the actual provisioning.
Omada Provisioning Service | Performance | Logging containing the elapsed time for an operation.
Operation Data Store | CopsApiClient | The client responsible for retrieving data, such as environment information for a given environment identifier and the ES ODS persistence service URL and its token, from the COPS API service.
Operation Data Store | EnvironmentManager | The component that collects, caches, and provides information about the environment.
Operation Data Store | EventProcessingFunction | The component that processes messages incoming from the event hub. For each type of message, an appropriate processor is selected to handle the message.
Operation Data Store | MessageSerializer | The component used to serialize and deserialize the collection of messages incoming from the event hub.
Operation Data Store | ConnectionManager | The component that handles database connection and transaction functionality and disposes of them.
Operation Data Store | ODSManager | The component that provides ODS database operations (bulk insert, command and procedure execution) and supports transactions.
Operation Data Store | AccessRequestMessageProcessor | The component responsible for processing received messages of type AccessRequestMessage by putting them in ODS.
Operation Data Store | ApprovalQuestionMessageProcessor | The component responsible for processing received messages of type ApprovalQuestionMessage by putting them in ODS.
Operation Data Store | CalculatedAssignmentMessageProcessor | The component responsible for processing received messages of type CalculatedAssignmentMessage by putting them in ODS.
Operation Data Store | ClearAccessRequestsMessageProcessor | The component responsible for processing received messages of type ClearAccessRequestsMessage by cleaning up access requests data from ODS.
Operation Data Store | ClearSurveysMessageProcessor | The component responsible for processing received messages of type ClearSurveysMessage by cleaning up survey data in ODS.
Operation Data Store | ResourceMessageProcessor | The component responsible for processing received messages of type ResourceMessage by putting them in ODS.
Operation Data Store | SurveyMessageProcessor | The component responsible for processing received messages of type SurveyMessage by putting them in ODS.
Operation Data Store | SystemMessageProcessor | The component responsible for processing received messages of type SystemMessage by putting them in ODS.

Logging configuration

You can configure various aspects of the logs captured by Omada Identity using a dedicated XML configuration object. By default, the logging configuration sends log entries to the Windows Event Log.

info

Changing the logging configuration takes effect immediately, but only for the website instance from which the change was made. Other website instances, RoPE instances, and Timer Service instances must be restarted for the change to take effect.

The logging configuration can be found in Setup > Administration > More > Configuration objects > Log configuration.

The Log configuration object defines how logging is performed. The configuration XML consists of two main sections: targets and filters.

The targets section configures where log messages are sent, and the filters section configures which log messages are sent to which targets.

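As a rough orientation, a configuration with one target and one filter could look like the following sketch. The target, filter, and parameter elements follow the examples later on this page; the root element shown here is illustrative only and may differ in your installation.

<!-- Sketch only: the root element name is illustrative. -->
<logConfiguration>
  <targets>
    <target logTargetType="windowsEventLog" name="eventLog" disabled="false">
      <targetParameters>
        <targetParameter name="source" value="Omada Enterprise"/>
        <targetParameter name="log" value="Application"/>
      </targetParameters>
    </target>
  </targets>
  <filters>
    <!-- Without a filter, every entry down to debug level is sent to the target. -->
    <filter category="Governance" targets="eventLog"/>
  </filters>
</logConfiguration>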

When adding log targets, be sure to also explicitly set filters for the configured target. Configuration of filters is required as the default logging level for targets is debug, which may result in extensive logging if left with the default value.

Filters

Log filters filter on specific properties of a log entry to determine which targets should receive it. If a filter is defined for a particular target, only log entries that match the filter criteria are sent to it. If no filter is defined for a target, all log entries are sent to that target regardless of their properties.

The following attributes from the log entry can be used in the filter:

  • Log level - filtering on log level means that only log entries with a log level greater than or equal to the filtered level are sent to the target.

    • Available values are information, warning, error, fatal, trace, and debug.
  • Category - any text value.

  • Component - the name of the Omada Identity component from which the log entry originates.

  • Targets - a comma-separated string of log target names to which the log entry should be sent if it passes the filter.

example

<filter category="Governance" targets="eventLog,splunk"></filter>

The filter above will send all log entries with the category Governance to the log targets named eventLog and splunk.

Notice that these are the names of the targets as defined in the target configuration section, not the log target types.
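
As another sketch, a filter could combine a log level with a list of targets so that only warnings and more severe entries reach a given target. The attribute name logLevel used below is an assumption based on the attribute list above, so verify the exact attribute name against your Log configuration object.

<!-- Hypothetical log-level filter; the attribute name "logLevel" is an assumption. -->
<filter logLevel="warning" targets="eventLog"/>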

Internal log file

If you experience problems or errors related to the logging framework or its configuration, you may need to inspect the internal log file of the logging framework.

The default path of the ES internal log files is:

C:\Users\{USERNAME}\AppData\Local\Temp

where {USERNAME} is the user name of the user running a specific Omada Identity component.

tip

You can change the default path of the internal log files by setting the internalLogFileLocation in the Log configuration data object.
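
As a sketch only, the setting could be applied roughly as follows. Exactly where internalLogFileLocation sits in the Log configuration XML is an assumption here, so check the configuration object itself.

<!-- Assumption: internalLogFileLocation is an attribute on the configuration's root element. -->
<logConfiguration internalLogFileLocation="D:\OmadaLogs\Internal">
  <!-- targets and filters as described above -->
</logConfiguration>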

Log targets on-prem

Log targets are used to display, store, or pass log messages to another destination. This destination could, for instance, be the Windows event log or a SIEM tool such as Splunk. You can configure the same target type multiple times in the configuration, but each target name must be unique.
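
For example, two file log targets (described further below) could be configured side by side as long as their names differ; the names esFileLog and ropeFileLog here are arbitrary examples.

<targets>
  <target logTargetType="file" name="esFileLog">
    <targetParameters>
      <targetParameter name="filename" value="C:\logs\ES_${shortdate}.log"/>
    </targetParameters>
  </target>
  <target logTargetType="file" name="ropeFileLog">
    <targetParameters>
      <targetParameter name="filename" value="C:\logs\RoPE_${shortdate}.log"/>
    </targetParameters>
  </target>
</targets>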

The following attributes apply to all log targets:

Target attribute | Description
logTargetType | Available target types: windowsEventLog, splunk, azureLogAnalytics, and file (for logging to a file).
name | Unique name for the target.
disabled | Specifies whether the target should be disabled. Possible values: true, false.

See the sections below for information on the attributes specific to the different log target types.

Windows event log target

The Windows event log target can be used to write log entries to the Windows event log. The target parameters are:

  • source - value to be used as the event Source.
  • log - name of the Event Log to write to. This can be System, Application, or any user-defined name.
  • detailedLogging - specifies whether the log entries should contain details about the log event. The default value is false. If disabled, the log entries only contain the timestamp, the message, and the exceptions. If enabled, all details from the log entry are added. Possible values: true, false.

Example XML:

<target logTargetType="windowsEventLog" name="eventLog" disabled="false">
<targetParameters>
<targetParameter name="source" value="Omada Enterprise"/>
<targetParameter name="log" value="Application"/>
<targetParameter name="detailedLogging" value="true"/>
</targetParameters>
</target>

By default, the entries in the Windows Event Log only contain the timestamp, the log message and the exception details (if it is an exception being logged) of a log entry. This can be changed by setting the detailedLogging parameter of the Windows Event Log targets to true. Changing this parameter will output all details of the log entry with a new line for each property and value of the log entry.

File log target

The File Log target can be used for writing log entries to a file. The target parameter is:

  • filename - the absolute or relative path of the log file. You can include the current date in the file name by adding ${shortdate} to the filename, for example: "C:\logs\OISLogs_${shortdate}.log".

When using a relative path, each component in Omada Identity (ES, RoPE, OPS, and ODW) will create a file in the working directory of the component. For example, the log file for ES will be created in the website folder and the one for RoPE will be created in the services folder.

Thus, it is recommended that you use an absolute path instead, for example a file share that is accessible to all the servers where the Omada Identity components are installed.

Example of log target XML for file log target:

<target logTargetType="file" name="oisFileLog">
<targetParameters>
<targetParameter name="filename" value="C:\logs\OISLogs_${shortdate}.log"/>
</targetParameters>
</target>

Azure Log target

The Azure Log target can be used for writing log entries to Azure Log Analytics. You need the Workspace ID and the shared key from Azure Log Analytics to configure the log target. The following are target parameters:

  • workspaceid - the Workspace ID of the log analytics workspace in Azure.
  • sharedkey - the Shared Key or Primary Key for the log analytics workspace in Azure.
  • logname - the name of the custom log to save log entries in. Only letters are accepted. The log name in Azure Log Analytics will be postfixed with _CL. The log name is case-sensitive in Azure Log Analytics.

The following three parameters are only required if you wish to use the Event log list page in Enterprise Server to read the log entries from Azure Log Analytics. To configure these parameters, navigate to Setup -> Administration -> More… -> Configuration Objects and edit the Log Configuration object. In the log target named azureLog, configure the following parameters:

  • azuretenantid - the tenant ID for the Azure subscription.
  • applicationid - the Azure AD application ID that is configured with the Log Analytics Reader access to the Azure Log Analytics workspace.
  • clientsecret - the client secret from Azure AD.
info

For performance reasons, the credentials for the Azure Log Analytics workspace are cached for an hour. This means that changes to the azuretenantid, clientsecret, or applicationid parameters in the log configuration will not take effect before the authentication token has expired. To force a refresh of the credentials, perform a reset of the web server.
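
Putting the two parameter lists together, an azureLog target that should also be readable from the Event log list page could look roughly like this sketch (all values are placeholders):

<target logTargetType="azureLogAnalytics" name="azureLog">
<targetParameters>
<targetParameter name="workspaceid" value="{workspaceId}" />
<targetParameter name="sharedkey" value="{sharedKey}" />
<targetParameter name="logname" value="OISLogs" />
<!-- Only needed for reading log entries back via the Event log list page -->
<targetParameter name="azuretenantid" value="{tenantId}" />
<targetParameter name="applicationid" value="{applicationId}" />
<targetParameter name="clientsecret" value="{clientSecret}" />
</targetParameters>
</target>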

Retrieving Workspace ID and Shared Key from Azure Log Analytics

  1. In Azure, open Log analytics workspaces.
  2. Select the workspace.
  3. Select Agents management in the Settings section.
  4. Copy the values of the WORKSPACE ID and the PRIMARY KEY (Shared Key) fields.

Example of log target XML for Azure Log Analytics:

<target logTargetType="azureLogAnalytics" name="azureLog">
<targetParameters>
<targetParameter name="workspaceid" value="0aac010e-9e0a-4d55-8d0d-49db2dc795c3" />
<targetParameter name="sharedkey" value="WY6A9UO6/uWNSSgeWz7qyfJRmibqNH78GqPrR1ptVkZsWN06GUjYrUZvyrS9wcXvMGoVb2YnpkXxJEzG7NNuPU+6Vndw==" />
<targetParameter name="logname" value="OISLogs" />
</targetParameters>
</target>

For more information, see the Microsoft documentation.

Splunk target

The Splunk target can be used for writing log entries to Splunk.

You can use both Splunk Enterprise and Splunk Cloud as targets for event logging. Both Splunk solutions use the HTTP Event Collector (HEC) and the parameters listed below.

info

Splunk must be made available through a public IP address.

The following are the target parameters:

  • server - Splunk instance URL.
  • token - token from the Splunk HTTP Event collector.
  • channel - channel ID on the Splunk HTTP Event collector.
  • ignoreSslErrors - ignore SSL certificate errors. The default is true.
  • includeEventProperties - include event properties. The default is true.
  • retriesOnError - number of retries when an error occurs. The default is zero.

Example of log target XML for the Splunk target:

<target logTargetType="splunk" name="splunk">
<targetParameters>
<targetParameter name="server" value="https://prd-p-example.splunk.com:8088"/>
<targetParameter name="token" value="8429944d-1ffb-4267-857b-302485015b56"/>
<targetParameter name="channel" value="8429944d-1ffb-4267-857b-302485015b56"/>
<targetParameter name="ignoreSslErrors" value="true"/>
<targetParameter name="includeEventProperties" value="true"/>
<targetParameter name="retriesOnError" value="0"/>
</targetParameters>
</target>

Configure import to use the new version of Newtonsoft.Json

The NLog.Targets.Splunk package references an older version of Newtonsoft.Json, so you need to configure the import to use the newer version of Newtonsoft.Json. To do so, follow these steps:

  1. On the SSIS server, locate the DTExec.exe.config file that is used by the import.

    For a default installation of SQL Server 2017, this should be in C:\Program Files\Microsoft SQL Server\140\DTS\Binn.

  2. Edit DTExec.exe.config, and insert the following code snippet into the <configuration><runtime> element (a placement example follows these steps).

    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-13.0.0.0" newVersion="13.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  3. Save the file.
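
For orientation, the relevant part of DTExec.exe.config should end up looking roughly like this after the edit (other existing elements omitted):

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-13.0.0.0" newVersion="13.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>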

Setting up Splunk Enterprise HTTP Event collector (HEC)

To set up the HTTP Event collector for Splunk Enterprise solution, follow the steps below:

  1. Go to https://localhost:8001 (the Splunk Enterprise instance installed on your machine).
  2. Log in with the account information entered during installation.
  3. Click Settings -> Data -> Data inputs.
  4. Under Local Inputs you will find HTTP Event Collector. Select the Add new action on the right.
  5. Give the HEC a Name and click Next.
  6. In the Source Type, click Structured -> _json.
  7. As the App context, select splunk_httpinput.
  8. As the Index, select main.
  9. Click Review and Submit.
  10. Make a note of the created Token Value; this is the value you will need to configure Omada Identity with.
  11. Go back to Settings -> Data -> Data inputs.
  12. Click HTTP Event Collector and verify that your HEC has been added.
  13. Click the Global Settings button in the top right-hand corner.
  14. If you want to use HTTPS, check the Enable SSL box; otherwise, uncheck it.
  15. Select Enabled for All tokens, and then click Save.

Setting up Splunk Cloud HTTP Event collector (HEC)

  1. Go to https://splunk.com.
  2. Log in to your account using the user icon in the top right-hand corner.
  3. Click the user icon again and select Instances.
  4. From the list of instances select Access Instance.
  5. From the instance page select Settings -> Data -> Data inputs.
  6. Under Local Inputs you will find HTTP Event Collector. Select the Add new action on the right.
  7. Give the HEC a Name and click Next.
  8. In the Source Type, click Structured -> _json.
  9. As the App context, select splunk_httpinput.
  10. As the Index, select main.
  11. Click Review and Submit.
  12. Make a note of the created Token Value; this is the value you will need to configure Omada Identity with.
  13. Go back to Settings -> Data -> Data inputs.
  14. Click the HTTP Event Collector link and verify that your HEC has been added.
  15. Click the Global Settings button in the top right-hand corner.
  16. Select Enabled for All tokens, and then click Save.