---
name: azure-monitor-ingestion-java
description: |
  Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE).
  Triggers: "LogsIngestionClient java", "azure monitor ingestion java", "custom logs java", "DCR java", "data collection rule java".
package: com.azure:azure-monitor-ingestion
---

# Azure Monitor Ingestion SDK for Java

Client library for sending custom logs to Azure Monitor using the Logs Ingestion API via Data Collection Rules.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-monitor-ingestion</artifactId>
    <version>1.2.11</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.azure</groupId>
            <artifactId>azure-sdk-bom</artifactId>
            <version>{bom_version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.azure</groupId>
        <artifactId>azure-monitor-ingestion</artifactId>
    </dependency>
</dependencies>
```

## Prerequisites

- Data Collection Endpoint (DCE)
- Data Collection Rule (DCR)
- Log Analytics workspace
- Target table (custom or built-in: CommonSecurityLog, SecurityEvent, Syslog, WindowsEvent)

## Environment Variables

```bash
DATA_COLLECTION_ENDPOINT=https://<dce-name>.<region>.ingest.monitor.azure.com
DATA_COLLECTION_RULE_ID=dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
STREAM_NAME=Custom-MyTable_CL
```

## Client Creation

### Synchronous Client

```java
import com.azure.identity.DefaultAzureCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.ingestion.LogsIngestionClient;
import com.azure.monitor.ingestion.LogsIngestionClientBuilder;

DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

LogsIngestionClient client = new LogsIngestionClientBuilder()
    .endpoint("<data-collection-endpoint>")
    .credential(credential)
    .buildClient();
```

### Asynchronous Client

```java
import com.azure.monitor.ingestion.LogsIngestionAsyncClient;

LogsIngestionAsyncClient asyncClient = new LogsIngestionClientBuilder()
    .endpoint("<data-collection-endpoint>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Data Collection Endpoint (DCE) | Ingestion endpoint URL for your region |
| Data Collection Rule (DCR) | Defines data transformation and routing to tables |
| Stream Name | Target stream in the DCR (e.g., `Custom-MyTable_CL`) |
| Log Analytics Workspace | Destination for ingested logs |

## Core Operations

### Upload Custom Logs

```java
import java.util.ArrayList;
import java.util.List;

List<Object> logs = new ArrayList<>();
logs.add(new MyLogEntry("2024-01-15T10:30:00Z", "INFO", "Application started"));
logs.add(new MyLogEntry("2024-01-15T10:30:05Z", "DEBUG", "Processing request"));

client.upload("<data-collection-rule-id>", "<stream-name>", logs);
System.out.println("Logs uploaded successfully");
```

### Upload with Concurrency

For large log collections, enable concurrent uploads:

```java
import com.azure.monitor.ingestion.models.LogsUploadOptions;
import com.azure.core.util.Context;

List<Object> logs = getLargeLogs(); // Large collection

LogsUploadOptions options = new LogsUploadOptions()
    .setMaxConcurrency(3);

client.upload("<data-collection-rule-id>", "<stream-name>", logs, options, Context.NONE);
```

### Upload with Error Handling

Handle partial upload failures gracefully:

```java
LogsUploadOptions options = new LogsUploadOptions()
    .setLogsUploadErrorConsumer(uploadError -> {
        System.err.println("Upload error: " + uploadError.getResponseException().getMessage());
        System.err.println("Failed logs count: " + uploadError.getFailedLogs().size());

        // Option 1: Log and continue
        // Option 2: Throw to abort remaining uploads
        // throw uploadError.getResponseException();
    });

client.upload("<data-collection-rule-id>", "<stream-name>", logs, options, Context.NONE);
```

### Async Upload with Reactor

```java
import reactor.core.publisher.Mono;

List<Object> logs = getLogs();

asyncClient.upload("<data-collection-rule-id>", "<stream-name>", logs)
    .doOnSuccess(v -> System.out.println("Upload completed"))
    .doOnError(e -> System.err.println("Upload failed: " + e.getMessage()))
    .subscribe();
```

## Log Entry Model Example

```java
public class MyLogEntry {
    private String timeGenerated;
    private String level;
    private String message;

    public MyLogEntry(String timeGenerated, String level, String message) {
        this.timeGenerated = timeGenerated;
        this.level = level;
        this.message = message;
    }

    // Getters required for JSON serialization
    public String getTimeGenerated() { return timeGenerated; }
    public String getLevel() { return level; }
    public String getMessage() { return message; }
}
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.upload(ruleId, streamName, logs);
} catch (HttpResponseException e) {
    System.err.println("HTTP Status: " + e.getResponse().getStatusCode());
    System.err.println("Error: " + e.getMessage());

    if (e.getResponse().getStatusCode() == 403) {
        System.err.println("Check DCR permissions and managed identity");
    } else if (e.getResponse().getStatusCode() == 404) {
        System.err.println("Verify DCE endpoint and DCR ID");
    }
}
```

## Best Practices

1. **Batch logs** — Upload in batches rather than one at a time
2. **Use concurrency** — Set `maxConcurrency` for large uploads
3. **Handle partial failures** — Use the error consumer to log failed entries
4. **Match DCR schema** — Log entry fields must match DCR transformation expectations
5. **Include TimeGenerated** — Most tables require a timestamp field
6. **Reuse the client** — Create it once and reuse it throughout the application
7. **Use async for high throughput** — `LogsIngestionAsyncClient` for reactive patterns

## Querying Uploaded Logs

Use [azure-monitor-query](../query/SKILL.md) to query ingested logs:

```java
// See the azure-monitor-query skill for LogsQueryClient usage
String query = "MyTable_CL | where TimeGenerated > ago(1h) | limit 10";
```

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-monitor-ingestion |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/monitor/azure-monitor-ingestion |
| Product Docs | https://learn.microsoft.com/azure/azure-monitor/logs/logs-ingestion-api-overview |
| DCE Overview | https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview |
| DCR Overview | https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-ingestion/TROUBLESHOOTING.md |
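
## Example: Client-Side Batching Sketch

The "batch logs" best practice above can be sketched as a small helper that splits a log collection into fixed-size chunks before each `upload` call. This is a hedged illustration, not SDK behavior: the SDK already splits oversized requests internally, the chunk size of 500 is an arbitrary example, and the commented-out `client.upload(...)` call site is where an application would plug in its own rule ID and stream name.

```java
import java.util.ArrayList;
import java.util.List;

public class LogBatcher {

    // Split a log collection into chunks of at most batchSize entries.
    // Copies each sublist so the chunks are independent of the source list.
    public static <T> List<List<T>> partition(List<T> logs, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < logs.size(); i += batchSize) {
            int end = Math.min(i + batchSize, logs.size());
            batches.add(new ArrayList<>(logs.subList(i, end)));
        }
        return batches;
    }

    public static void main(String[] args) {
        // 1050 placeholder entries stand in for real log objects.
        List<Integer> logs = new ArrayList<>();
        for (int i = 0; i < 1050; i++) {
            logs.add(i);
        }

        // 1050 entries with batchSize 500 yields chunks of 500, 500, and 50.
        List<List<Integer>> batches = partition(logs, 500);
        System.out.println(batches.size());        // 3
        System.out.println(batches.get(2).size()); // 50

        // Each chunk would then be sent separately, e.g.:
        // for (List<Object> batch : batches) {
        //     client.upload("<data-collection-rule-id>", "<stream-name>", batch);
        // }
    }
}
```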