azlogs

README

Azure Monitor Ingestion client module for Go

The Azure Monitor Ingestion client module is used to send custom logs to Azure Monitor using the Logs Ingestion API.

This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in Log Analytics workspaces. You can even extend the schema of built-in tables with custom columns.

Source code | Package (pkg.go.dev) | Product documentation | Samples

Getting started

Prerequisites

  • Go, version 1.18 or above
  • An Azure subscription
  • A Data Collection Endpoint
  • A Data Collection Rule
  • A Log Analytics workspace

Install the package

Install the azlogs and azidentity modules with go get:

go get github.com/Azure/azure-sdk-for-go/sdk/monitor/ingestion/azlogs
go get github.com/Azure/azure-sdk-for-go/sdk/azidentity

The azidentity module provides Azure Active Directory (Microsoft Entra ID) authentication when creating the client.

Authentication

An authenticated client object is required to upload logs. The examples demonstrate using azidentity.NewDefaultAzureCredential to authenticate; however, the client accepts any azidentity credential. See the azidentity documentation for more information about other credential types.

The client defaults to the Azure public cloud. For other cloud configurations, see the cloud package documentation.
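
For instance, a hedged sketch of pointing the client at Azure Government, reusing endpoint and cred from the client example below. Whether a preset cloud configuration includes an entry for the ingestion service can vary; see the cloud package documentation and the ServiceNameIngestion constant further down.

// import "github.com/Azure/azure-sdk-for-go/sdk/azcore"
// import "github.com/Azure/azure-sdk-for-go/sdk/azcore/cloud"
opts := &azlogs.ClientOptions{
	ClientOptions: azcore.ClientOptions{
		Cloud: cloud.AzureGovernment,
	},
}
client, err := azlogs.NewClient(endpoint, cred, opts)
if err != nil {
	//TODO: handle error
}
_ = client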

Create a client

Example client
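
A minimal sketch, mirroring the NewClient example further below; DATA_COLLECTION_ENDPOINT is an assumed environment variable holding your DCE's logs ingestion endpoint:

endpoint := os.Getenv("DATA_COLLECTION_ENDPOINT")

cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
	//TODO: handle error
}

client, err := azlogs.NewClient(endpoint, cred, nil)
if err != nil {
	//TODO: handle error
}
_ = client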

Key concepts

Data Collection Endpoint

Data Collection Endpoints (DCEs) allow you to uniquely configure ingestion settings for Azure Monitor. The Azure Monitor documentation provides an overview of data collection endpoints, including their contents and structure and how you can create and work with them.

Data Collection Rule

Data collection rules (DCRs) define the data collected by Azure Monitor and specify how and where that data should be sent or stored. Each REST API call must specify the DCR to use. A single DCE can support multiple DCRs, so you can specify a different DCR for different sources and target tables.

The DCR must understand the structure of the input data and the structure of the target table. If the two don't match, it can use a transformation to convert the source data to match the target table. You can also use the transformation to filter source data and perform other calculations or conversions.

For more information, see Data collection rules in Azure Monitor and the documentation on the structure of a DCR. For how to retrieve a DCR's immutable ID, see the Logs Ingestion API tutorial.

Log Analytics workspace tables

Custom logs can send data to any custom table that you create and to certain built-in tables in your Log Analytics workspace. The target table must exist before you can send data to it; only certain built-in tables are currently supported, as listed in the Azure Monitor documentation.

Logs retrieval

Logs uploaded with this module can be queried with the azquery module (Azure Monitor Query).
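
For instance, a hedged sketch using azquery; the LOG_ANALYTICS_WORKSPACE_ID environment variable and the MyTable_CL table name are illustrative assumptions:

package main

import (
	"context"
	"os"

	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	"github.com/Azure/azure-sdk-for-go/sdk/monitor/azquery"
)

func main() {
	workspaceID := os.Getenv("LOG_ANALYTICS_WORKSPACE_ID")

	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		//TODO: handle error
	}

	client, err := azquery.NewLogsClient(cred, nil)
	if err != nil {
		//TODO: handle error
	}

	// query the table the DCR writes to; MyTable_CL is a placeholder name
	res, err := client.QueryWorkspace(context.TODO(), workspaceID,
		azquery.Body{Query: to.Ptr("MyTable_CL | take 10")}, nil)
	if err != nil {
		//TODO: handle error
	}
	_ = res
}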

Examples

Get started with our examples.

Troubleshooting

See our troubleshooting guide for details on how to diagnose various failure scenarios.

Next steps

To learn more about Azure Monitor, see the Azure Monitor service documentation.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Documentation

Index

Examples

Constants

const (
	ServiceNameIngestion cloud.ServiceName = "ingestion/azlogs"
)

Cloud Service Names for Monitor Ingestion, used to identify the respective cloud.ServiceConfiguration
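
For example, a sketch of registering this service name in a custom cloud.Configuration, reusing endpoint and cred from the client example above; all hosts and audience values below are illustrative placeholders:

// import "github.com/Azure/azure-sdk-for-go/sdk/azcore"
// import "github.com/Azure/azure-sdk-for-go/sdk/azcore/cloud"
customCloud := cloud.Configuration{
	ActiveDirectoryAuthorityHost: "https://login.example.com/", // placeholder authority host
	Services: map[cloud.ServiceName]cloud.ServiceConfiguration{
		azlogs.ServiceNameIngestion: {
			Audience: "https://monitor.example.com", // placeholder token audience
		},
	},
}
opts := &azlogs.ClientOptions{
	ClientOptions: azcore.ClientOptions{Cloud: customCloud},
}
client, err := azlogs.NewClient(endpoint, cred, opts)
if err != nil {
	//TODO: handle error
}
_ = client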

Variables

This section is empty.

Functions

This section is empty.

Types

type Client

type Client struct {
	// contains filtered or unexported fields
}

Client contains the methods for the Client group. Don't use this type directly; use the NewClient constructor function instead.

func NewClient

func NewClient(endpoint string, credential azcore.TokenCredential, options *ClientOptions) (*Client, error)

NewClient creates a client to upload logs to Azure Monitor Ingestion.

Example
endpoint := os.Getenv("DATA_COLLECTION_ENDPOINT")
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
	//TODO: handle error
}

client, err := azlogs.NewClient(endpoint, cred, nil)
if err != nil {
	//TODO: handle error
}
_ = client

func (*Client) Upload

func (client *Client) Upload(ctx context.Context, ruleID string, streamName string, logs []byte, options *UploadOptions) (UploadResponse, error)

Upload - Ingestion API used to directly ingest data using Data Collection Rules. The maximum size of an API call is 1 MB. If the operation fails, it returns an *azcore.ResponseError type.

Generated from API version 2023-01-01

  • ruleID - The immutable ID of the Data Collection Rule resource.
  • streamName - The streamDeclaration name as defined in the Data Collection Rule.
  • logs - An array of objects matching the schema defined by the provided stream.
  • options - UploadOptions contains the optional parameters for the Client.Upload method.
Example
package main

import (
	"context"
	"encoding/json"
	"os"
	"strconv"
	"time"

	"github.com/Azure/azure-sdk-for-go/sdk/monitor/ingestion/azlogs"
)

var client azlogs.Client

type Computer struct {
	Time              time.Time
	Computer          string
	AdditionalContext string
}

func main() {
	// set necessary data collection rule variables
	ruleID := os.Getenv("DATA_COLLECTION_RULE_IMMUTABLE_ID")
	streamName := os.Getenv("DATA_COLLECTION_RULE_STREAM_NAME")

	// generating logs
	// logs should match the schema defined by the provided stream
	var data []Computer
	for i := 0; i < 10; i++ {
		data = append(data, Computer{
			Time:              time.Now().UTC(),
			Computer:          "Computer" + strconv.Itoa(i),
			AdditionalContext: "context",
		})
	}
	// Marshal data into []byte
	logs, err := json.Marshal(data)
	if err != nil {
		panic(err)
	}

	// upload logs
	_, err = client.Upload(context.TODO(), ruleID, streamName, logs, nil)
	if err != nil {
		//TODO: handle error
	}
}
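
Because each call is capped at 1 MB, larger datasets must be split across calls. A rough sketch of one approach, reusing the Computer type and imports from the example above; uploadInBatches and the fixed batch size are hypothetical, and a production version would track the marshaled payload size rather than a fixed entry count:

// uploadInBatches marshals fixed-size groups of entries and uploads
// each group as a separate call.
func uploadInBatches(ctx context.Context, client *azlogs.Client, ruleID, streamName string, data []Computer, batchSize int) error {
	for start := 0; start < len(data); start += batchSize {
		end := start + batchSize
		if end > len(data) {
			end = len(data)
		}
		logs, err := json.Marshal(data[start:end])
		if err != nil {
			return err
		}
		if _, err := client.Upload(ctx, ruleID, streamName, logs, nil); err != nil {
			return err
		}
	}
	return nil
}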

type ClientOptions

type ClientOptions struct {
	azcore.ClientOptions
}

ClientOptions contains optional settings for Client.

type UploadOptions

type UploadOptions struct {
	// If the bytes of the "logs" parameter are already gzipped, set ContentEncoding to "gzip"
	ContentEncoding *string
}

UploadOptions contains the optional parameters for the Client.Upload method.
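
For example, a sketch of uploading a pre-compressed payload, with client, ruleID, streamName, and logs as in the Upload example above; it requires the standard library bytes and compress/gzip packages:

var buf bytes.Buffer
gz := gzip.NewWriter(&buf)
if _, err := gz.Write(logs); err != nil {
	//TODO: handle error
}
if err := gz.Close(); err != nil {
	//TODO: handle error
}
contentEncoding := "gzip"
_, err := client.Upload(context.TODO(), ruleID, streamName, buf.Bytes(),
	&azlogs.UploadOptions{ContentEncoding: &contentEncoding})
if err != nil {
	//TODO: handle error
}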

type UploadResponse

type UploadResponse struct {
}

UploadResponse contains the response from method Client.Upload.
