Azure Monitor Ingestion client library for JavaScript

The Azure Monitor Ingestion client library is used to send custom logs to Azure Monitor using the Logs Ingestion API.

This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in your Log Analytics workspace. You can even extend the schema of built-in tables with custom columns.

Getting started

Prerequisites

To use this library, you need an Azure subscription, a Data Collection Endpoint, a Data Collection Rule, and a Log Analytics workspace.

Install the package

Install the Azure Monitor Ingestion client library for JavaScript with npm:

npm install @azure/monitor-ingestion

Authenticate the client

An authenticated client is required to ingest data. To authenticate, create an instance of a TokenCredential class (see @azure/identity for DefaultAzureCredential and other TokenCredential implementations). Pass it to the constructor of your client class.

The following example uses DefaultAzureCredential from the @azure/identity package:

import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";

import * as dotenv from "dotenv";
dotenv.config();

const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT || "logs_ingestion_endpoint";

const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential);

Configure client for Azure sovereign cloud

By default, the client is configured to use the Azure Public Cloud. To use a sovereign cloud instead, provide the correct endpoint and audience value when instantiating the client. For example:

import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";

import * as dotenv from "dotenv";
dotenv.config();

const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT || "logs_ingestion_endpoint";

const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential, {
  audience: "https://api.loganalytics.azure.cn/.default",
});

Key concepts

Data Collection Endpoint

Data Collection Endpoints (DCEs) allow you to uniquely configure ingestion settings for Azure Monitor. For an overview of data collection endpoints, including their contents and structure and how you can create and work with them, see Data collection endpoints in Azure Monitor.
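For illustration only, a DCE exposes a logs ingestion URI of the form https://<dce-name>.<region>.ingest.monitor.azure.com, and that URI is what you pass to LogsIngestionClient. The endpoint below is a hypothetical placeholder:

const { DefaultAzureCredential } = require("@azure/identity");
const { LogsIngestionClient } = require("@azure/monitor-ingestion");

// Hypothetical logs ingestion URI copied from a DCE's overview page.
const logsIngestionEndpoint = "https://my-dce-abc1.eastus-1.ingest.monitor.azure.com";
const client = new LogsIngestionClient(logsIngestionEndpoint, new DefaultAzureCredential());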

Data Collection Rule

Data collection rules (DCRs) define the data collected by Azure Monitor and specify how and where that data should be sent or stored. The REST API call must specify a DCR to use. A single DCE can support multiple DCRs, so you can specify a different DCR for different sources and target tables.

The DCR must understand the structure of the input data and the structure of the target table. If the two don't match, it can use a transformation to convert the source data to match the target table. You can also use the transformation to filter source data and perform any other calculations or conversions.

For more details, refer to Data collection rules in Azure Monitor. For information on how to retrieve a DCR ID, see this tutorial.
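As a hedged sketch of how these pieces fit together: the ruleId passed to the client is the DCR's immutable ID (immutable IDs typically start with dcr-), and the stream name is a stream declared in the DCR, commonly named Custom-<TableName> for custom tables. The identifiers, stream name, and log shape below are hypothetical placeholders, and client is a LogsIngestionClient constructed as shown earlier:

// Hypothetical DCR immutable ID and a custom stream declared in that DCR.
const ruleId = "dcr-00000000000000000000000000000000";
const streamName = "Custom-MyTable_CL";

async function uploadToStream(client) {
  // Each log object's properties must match the stream's declared columns,
  // or the DCR's transformation must convert them to match the target table.
  await client.upload(ruleId, streamName, [
    { Time: new Date().toISOString(), Computer: "Computer1", AdditionalContext: "context" },
  ]);
}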

Log Analytics workspace tables

Custom logs can send data to any custom table that you create and to certain built-in tables in your Log Analytics workspace. The target table must exist before you can send data to it. For the list of built-in tables that currently support ingestion, see the Logs Ingestion API documentation.

Examples

You can familiarize yourself with different APIs using Samples.

Upload custom logs

You can create a client and call the client's upload method. Take note of the data ingestion limits.

const { DefaultAzureCredential } = require("@azure/identity");
const { isAggregateLogsUploadError, LogsIngestionClient } = require("@azure/monitor-ingestion");

require("dotenv").config();

async function main() {
  const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT || "logs_ingestion_endpoint";
  const ruleId = process.env.DATA_COLLECTION_RULE_ID || "data_collection_rule_id";
  const streamName = process.env.STREAM_NAME || "data_stream_name";
  const credential = new DefaultAzureCredential();
  const client = new LogsIngestionClient(logsIngestionEndpoint, credential);
  const logs = [
    {
      Time: "2021-12-08T23:51:14.1104269Z",
      Computer: "Computer1",
      AdditionalContext: "context-2",
    },
    {
      Time: "2021-12-08T23:51:14.1104269Z",
      Computer: "Computer2",
      AdditionalContext: "context",
    },
  ];
  try {
    await client.upload(ruleId, streamName, logs);
  } catch (e) {
    const aggregateErrors = isAggregateLogsUploadError(e) ? e.errors : [];
    if (aggregateErrors.length > 0) {
      console.log("Some logs have failed to complete ingestion");
      for (const error of aggregateErrors) {
        console.log(`Error - ${JSON.stringify(error.cause)}`);
        console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
      }
    } else {
      console.log(e);
    }
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
  process.exit(1);
});

module.exports = { main };

Verify logs

You can verify that your data has been uploaded correctly by using the @azure/monitor-query library. Run the Upload custom logs sample before verifying the logs.

// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.

/**
 * @summary Demonstrates how to run a query against a Log Analytics workspace to verify that the logs were uploaded
 */

const { DefaultAzureCredential } = require("@azure/identity");
const { LogsQueryClient } = require("@azure/monitor-query");

require("dotenv").config();

const monitorWorkspaceId = process.env.MONITOR_WORKSPACE_ID || "workspace_id";
const tableName = process.env.TABLE_NAME || "table_name";

async function main() {
  const credential = new DefaultAzureCredential();
  const logsQueryClient = new LogsQueryClient(credential);
  const queriesBatch = [
    {
      workspaceId: monitorWorkspaceId,
      query: tableName + " | count;",
      timespan: { duration: "P1D" },
    },
  ];

  const result = await logsQueryClient.queryBatch(queriesBatch);
  if (result[0].status === "Success") {
    console.log("Table entry count: ", JSON.stringify(result[0].tables));
  } else {
    console.log(
      `Some error encountered while retrieving the count. Status = ${result[0].status}`,
      JSON.stringify(result[0])
    );
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
  process.exit(1);
});

module.exports = { main };

Uploading large batches of logs

When uploading more than 1 MB of logs in a single call to the upload method on LogsIngestionClient, the upload is split into several smaller batches, each no larger than 1 MB. By default, these batches are uploaded in parallel, with a maximum of 5 batches being uploaded concurrently. If memory usage is a concern, it may be desirable to decrease the maximum concurrency. The maximum number of concurrent uploads can be controlled using the maxConcurrency option, as shown in this example:

const { DefaultAzureCredential } = require("@azure/identity");
const { isAggregateLogsUploadError, LogsIngestionClient } = require("@azure/monitor-ingestion");

require("dotenv").config();

async function main() {
  const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT || "logs_ingestion_endpoint";
  const ruleId = process.env.DATA_COLLECTION_RULE_ID || "data_collection_rule_id";
  const streamName = process.env.STREAM_NAME || "data_stream_name";
  const credential = new DefaultAzureCredential();
  const client = new LogsIngestionClient(logsIngestionEndpoint, credential);

  // Constructing a large number of logs to ensure batching takes place
  const logs = [];
  for (let i = 0; i < 100000; ++i) {
    logs.push({
      Time: "2021-12-08T23:51:14.1104269Z",
      Computer: "Computer1",
      AdditionalContext: `context-${i}`,
    });
  }

  try {
    // Set the maximum concurrency to 1 to prevent concurrent requests entirely
    await client.upload(ruleId, streamName, logs, { maxConcurrency: 1 });
  } catch (e) {
    const aggregateErrors = isAggregateLogsUploadError(e) ? e.errors : [];
    if (aggregateErrors.length > 0) {
      console.log("Some logs have failed to complete ingestion");
      for (const error of aggregateErrors) {
        console.log(`Error - ${JSON.stringify(error.cause)}`);
        console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
      }
    } else {
      console.log(e);
    }
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
  process.exit(1);
});

module.exports = { main };

Retrieve logs

Logs uploaded using the Monitor Ingestion client library can be retrieved using the Monitor Query client library.
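For example, a single-workspace query with @azure/monitor-query can read back the rows you just ingested. This is a minimal sketch that assumes the hypothetical environment variables MONITOR_WORKSPACE_ID and TABLE_NAME and that the @azure/monitor-query package is installed:

const { DefaultAzureCredential } = require("@azure/identity");
const { LogsQueryClient } = require("@azure/monitor-query");

require("dotenv").config();

async function main() {
  const workspaceId = process.env.MONITOR_WORKSPACE_ID || "workspace_id";
  const tableName = process.env.TABLE_NAME || "table_name";
  const logsQueryClient = new LogsQueryClient(new DefaultAzureCredential());

  // Query the last day of data from the target table, newest rows first
  // (custom tables include a TimeGenerated column).
  const result = await logsQueryClient.queryWorkspace(
    workspaceId,
    `${tableName} | sort by TimeGenerated desc | take 10`,
    { duration: "P1D" }
  );

  if (result.status === "Success") {
    console.log(JSON.stringify(result.tables, undefined, 2));
  } else {
    console.log("The query returned a partial result:", JSON.stringify(result));
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
  process.exit(1);
});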

Troubleshooting

For details on diagnosing various failure scenarios, see our troubleshooting guide.
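Enabling Azure SDK client logging is often a good first step when diagnosing ingestion failures. A minimal sketch using the @azure/logger package (the log level can also be set through the AZURE_LOG_LEVEL environment variable):

const { setLogLevel } = require("@azure/logger");

// Emit informational client logs, including HTTP request and response details.
setLogLevel("info");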

Next steps

To learn more about Azure Monitor, see the Azure Monitor service documentation. Please take a look at the samples directory for detailed examples on how to use this library.

Contributing

If you'd like to contribute to this library, please read the contributing guide to learn more about how to build and test the code.
