# Logs Scraper
If you already forward your Entra ID logs to a log backend (via Azure Monitor Diagnostic Settings), use the Logs scraper to pull them into Mission Control.
## When to use
- Logs already forwarded to OpenSearch, Loki, or BigQuery
- Backend handles retention, indexing, and search
- No direct Graph API calls needed
- You want to query logs alongside non-Entra data in the same backend
## When NOT to use
- No existing log backend: use the HTTP Scraper instead
- You need near-real-time ingestion: use the Event Hub scraper
- You want to avoid maintaining an export pipeline (Diagnostic Settings plus a consumer)
## Prerequisites
- Azure Monitor Diagnostic Settings configured to export SignInLogs and AuditLogs to a supported backend
- An Entra ID P1 or P2 license (Microsoft does not generate these log categories without it)
- A supported log backend: OpenSearch, Loki, BigQuery, or GCP Cloud Logging
- Credentials for Mission Control to query the log backend
## Azure Setup
1. In the Azure Portal, go to Microsoft Entra ID > Diagnostic settings > Add diagnostic setting
2. Select log categories: `SignInLogs` and `AuditLogs` (and optionally `NonInteractiveUserSignInLogs` and `ServicePrincipalSignInLogs`)
3. Choose a destination: an Event Hub (for streaming to OpenSearch, Loki, etc.) or a Log Analytics workspace
4. Deploy a consumer that writes to your log backend (e.g. a function app or agent; consult your log backend's documentation for Event Hub ingestion)
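Whatever consumer you deploy, its first job is unwrapping the Azure Monitor export envelope: diagnostic settings batch log entries under a top-level `records` array, with each record carrying a `category` and the log entry itself under `properties`. The sketch below shows that parsing step in Python; the sample payload values are illustrative, not real export data.

```python
import json


def extract_signin_records(event_body: bytes) -> list[dict]:
    """Unwrap an Azure Monitor diagnostic-settings payload from Event Hub.

    Azure Monitor batches entries under a top-level "records" array; each
    record names its category (e.g. SignInLogs) and nests the log entry
    under "properties". Keep only the sign-in entries.
    """
    payload = json.loads(event_body)
    return [
        r["properties"]
        for r in payload.get("records", [])
        if r.get("category") == "SignInLogs"
    ]


# Illustrative payload shaped like an Azure Monitor export:
sample = json.dumps({
    "records": [
        {"category": "SignInLogs",
         "properties": {"userPrincipalName": "alice@example.com"}},
        {"category": "AuditLogs",
         "properties": {"activityDisplayName": "Add user"}},
    ]
}).encode()

print(extract_signin_records(sample))
# → [{'userPrincipalName': 'alice@example.com'}]
```

The extracted entries can then be written to your backend as-is; the scrape config below assumes the sign-in fields land unmodified in the index.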
## OpenSearch Example
entra-logs-opensearch.yaml

```yaml
apiVersion: configs.flanksource.com/v1
kind: ScrapeConfig
metadata:
  name: entra-signin-logs-opensearch
  namespace: mc
spec:
  schedule: "@every 15m"
  logs:
    - name: entra-signins
      type: OpenSearch
      opensearch:
        url: https://opensearch.example.com
        index: entra-signins-*
        username:
          valueFrom:
            secretKeyRef:
              name: opensearch-credentials
              key: username
        password:
          valueFrom:
            secretKeyRef:
              name: opensearch-credentials
              key: password
        query: |
          {
            "query": {
              "range": {
                "createdDateTime": {
                  "gte": "now-30m"
                }
              }
            },
            "sort": [{"createdDateTime": "desc"}],
            "size": 500
          }
      transform:
        expr: |
          dyn(config).map(e, {
            'config_type': 'Azure::SignIn',
            'id': e.id,
            'name': e.userPrincipalName + ' -> ' + e.appDisplayName,
            'config_access': [{
              'user': [e.userPrincipalName, e.userId],
              'config_name': e.appDisplayName,
              'config_type': 'Azure::EnterpriseApplication',
              'status': e.status.errorCode == 0 ? 'success' : 'failure',
              'ip': e.?ipAddress.orValue(''),
              'created_at': e.createdDateTime
            }]
          }).toJSON()
```
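To make the CEL transform concrete, here is an equivalent Python sketch applied to a single illustrative sign-in record (all field values are made up): each OpenSearch hit becomes a `Azure::SignIn` config item with one `config_access` entry linking the user to the application they signed in to.

```python
import json


def transform(entries: list[dict]) -> str:
    """Python equivalent of the CEL transform in the scrape config above."""
    out = []
    for e in entries:
        out.append({
            "config_type": "Azure::SignIn",
            "id": e["id"],
            "name": f'{e["userPrincipalName"]} -> {e["appDisplayName"]}',
            "config_access": [{
                "user": [e["userPrincipalName"], e["userId"]],
                "config_name": e["appDisplayName"],
                "config_type": "Azure::EnterpriseApplication",
                # errorCode 0 means the sign-in succeeded
                "status": "success" if e["status"]["errorCode"] == 0 else "failure",
                # ipAddress may be absent, mirroring e.?ipAddress.orValue('')
                "ip": e.get("ipAddress", ""),
                "created_at": e["createdDateTime"],
            }],
        })
    return json.dumps(out)


# Illustrative sign-in record:
sample = [{
    "id": "abc-123",
    "userPrincipalName": "alice@example.com",
    "userId": "u-1",
    "appDisplayName": "Salesforce",
    "status": {"errorCode": 0},
    "ipAddress": "203.0.113.7",
    "createdDateTime": "2024-05-01T12:00:00Z",
}]
result = json.loads(transform(sample))
print(result[0]["name"])
# → alice@example.com -> Salesforce
```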
See the Logs scraper reference for all backend options (Loki, BigQuery, GCP Cloud Logging).