Add this skill:

```bash
npx mdskills install sickn33/azure-monitor-ingestion-py
```
---
name: azure-monitor-ingestion-py
description: |
  Azure Monitor Ingestion SDK for Python. Use for sending custom logs to a Log Analytics workspace via the Logs Ingestion API.
  Triggers: "azure-monitor-ingestion", "LogsIngestionClient", "custom logs", "DCR", "data collection rule", "Log Analytics".
package: azure-monitor-ingestion
---

# Azure Monitor Ingestion SDK for Python

Send custom logs to an Azure Monitor Log Analytics workspace using the Logs Ingestion API.

## Installation

```bash
pip install azure-monitor-ingestion
pip install azure-identity
```

## Environment Variables

```bash
# Data Collection Endpoint (DCE)
AZURE_DCE_ENDPOINT=https://<dce-name>.<region>.ingest.monitor.azure.com

# Data Collection Rule (DCR) immutable ID
AZURE_DCR_RULE_ID=dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# Stream name from DCR
AZURE_DCR_STREAM_NAME=Custom-MyTable_CL
```

## Prerequisites

Before using this SDK, you need:

1. **Log Analytics Workspace** — target for your logs
2. **Data Collection Endpoint (DCE)** — ingestion endpoint
3. **Data Collection Rule (DCR)** — defines schema and destination
4. **Custom Table** — in Log Analytics (created via DCR or manually)

## Authentication

```python
from azure.monitor.ingestion import LogsIngestionClient
from azure.identity import DefaultAzureCredential
import os

client = LogsIngestionClient(
    endpoint=os.environ["AZURE_DCE_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Upload Custom Logs

```python
from azure.monitor.ingestion import LogsIngestionClient
from azure.identity import DefaultAzureCredential
import os

client = LogsIngestionClient(
    endpoint=os.environ["AZURE_DCE_ENDPOINT"],
    credential=DefaultAzureCredential()
)

rule_id = os.environ["AZURE_DCR_RULE_ID"]
stream_name = os.environ["AZURE_DCR_STREAM_NAME"]

logs = [
    {"TimeGenerated": "2024-01-15T10:00:00Z", "Computer": "server1", "Message": "Application started"},
    {"TimeGenerated": "2024-01-15T10:01:00Z", "Computer": "server1", "Message": "Processing request"},
    {"TimeGenerated": "2024-01-15T10:02:00Z", "Computer": "server2", "Message": "Connection established"}
]

client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
```

## Upload from JSON File

```python
import json

with open("logs.json", "r") as f:
    logs = json.load(f)

client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
```

## Custom Error Handling

Handle partial failures with a callback:

```python
failed_logs = []

def on_error(error):
    print(f"Upload failed: {error.error}")
    failed_logs.extend(error.failed_logs)

client.upload(
    rule_id=rule_id,
    stream_name=stream_name,
    logs=logs,
    on_error=on_error
)

# Retry failed logs
if failed_logs:
    print(f"Retrying {len(failed_logs)} failed logs...")
    client.upload(rule_id=rule_id, stream_name=stream_name, logs=failed_logs)
```

## Ignore Errors

```python
def ignore_errors(error):
    pass  # Silently ignore upload failures

client.upload(
    rule_id=rule_id,
    stream_name=stream_name,
    logs=logs,
    on_error=ignore_errors
)
```

## Async Client

```python
import asyncio
import os
from azure.monitor.ingestion.aio import LogsIngestionClient
from azure.identity.aio import DefaultAzureCredential

rule_id = os.environ["AZURE_DCR_RULE_ID"]
stream_name = os.environ["AZURE_DCR_STREAM_NAME"]
logs = [{"TimeGenerated": "2024-01-15T10:00:00Z", "Message": "Application started"}]

async def upload_logs():
    async with LogsIngestionClient(
        endpoint=os.environ["AZURE_DCE_ENDPOINT"],
        credential=DefaultAzureCredential()
    ) as client:
        await client.upload(
            rule_id=rule_id,
            stream_name=stream_name,
            logs=logs
        )

asyncio.run(upload_logs())
```

## Sovereign Clouds

```python
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Azure Government
credential = DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_GOVERNMENT)
client = LogsIngestionClient(
    endpoint="https://example.ingest.monitor.azure.us",
    credential=credential,
    credential_scopes=["https://monitor.azure.us/.default"]
)
```

## Batching Behavior

The SDK automatically:
- Splits logs into chunks of 1 MB or less
- Compresses each chunk with gzip
- Uploads chunks in parallel

No manual batching is needed for large log sets.

## Client Types

| Client | Purpose |
|--------|---------|
| `LogsIngestionClient` | Sync client for uploading logs |
| `LogsIngestionClient` (aio) | Async client for uploading logs |

## Key Concepts

| Concept | Description |
|---------|-------------|
| **DCE** | Data Collection Endpoint — ingestion URL |
| **DCR** | Data Collection Rule — defines schema, transformations, destination |
| **Stream** | Named data flow within a DCR |
| **Custom Table** | Target table in Log Analytics (ends with `_CL`) |

## DCR Stream Name Format

Stream names follow two patterns:
- `Custom-<TableName>_CL` — for custom tables
- `Microsoft-<TableName>` — for built-in tables

## Best Practices

1. **Use DefaultAzureCredential** for authentication
2. **Handle errors gracefully** — use the `on_error` callback for partial failures
3. **Include TimeGenerated** — required field for all logs
4. **Match DCR schema** — log fields must match DCR column definitions
5. **Use async client** for high-throughput scenarios
6. **Batch uploads** — the SDK handles batching, but send reasonable chunks
7. **Monitor ingestion** — check Log Analytics for ingestion status
8. **Use context manager** — ensures proper client cleanup
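As a sketch of the "send reasonable chunks" practice: the SDK batches internally, but when logs accumulate in memory over a long run, it can help to flush in bounded pieces yourself. The `chunk_logs` helper below is hypothetical (not part of the SDK) and only groups records by serialized JSON size; each resulting chunk would then be passed to `client.upload`.

```python
import json

def chunk_logs(logs, max_bytes=1_000_000):
    """Group log records into chunks whose total serialized JSON size
    stays under max_bytes. A rough client-side guard only; the SDK
    performs its own 1 MB batching and gzip compression on upload."""
    chunks, current, current_size = [], [], 0
    for record in logs:
        size = len(json.dumps(record).encode("utf-8"))
        if current and current_size + size > max_bytes:
            chunks.append(current)   # flush the full chunk
            current, current_size = [], 0
        current.append(record)
        current_size += size
    if current:
        chunks.append(current)       # flush the remainder
    return chunks

# Three small records fit in one chunk at the default limit,
# but split apart under an artificially tiny limit.
logs = [
    {"TimeGenerated": f"2024-01-15T10:0{i}:00Z", "Message": "x" * 50}
    for i in range(3)
]
print(len(chunk_logs(logs)))                 # 1
print(len(chunk_logs(logs, max_bytes=80)))   # 3
```

Each chunk could then be uploaded in its own `client.upload(...)` call, keeping memory use and retry granularity bounded.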