sumconnector

package module
v0.114.0
Published: Nov 18, 2024 License: Apache-2.0 Imports: 22 Imported by: 1

README

Sum Connector

Status
Distributions: []
Code Owners: @greatestusername, @shalper2, @crobert-1

Supported Pipeline Types

Exporter Pipeline Type | Receiver Pipeline Type | Stability Level
-----------------------|------------------------|----------------
traces                 | metrics                | alpha
metrics                | metrics                | alpha
logs                   | metrics                | alpha

The sum connector can be used to sum attribute values from spans, span events, metrics, data points, and log records.

Configuration

If you are not already familiar with connectors, you may find it helpful to first visit the Connectors README.

Basic configuration

This example configuration sums numerical values found within the attribute attribute.with.numerical.value of any span telemetry routed to the connector. It then outputs a metric time series named my.example.metric.name containing those summed values.

Note: Values found within an attribute are converted to a float, regardless of their original type, before being summed and output as a metric value. Strings that cannot be converted are dropped and not included in the sum.

receivers:
  foo:
connectors:
  sum:
    spans:
      my.example.metric.name:
        source_attribute: attribute.with.numerical.value
exporters:
  bar:

service:
  pipelines:
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
    traces:
      receivers: [foo]
      exporters: [sum]
Required Settings

The sum connector has three required configuration settings and numerous optional settings.

  • Telemetry type: Nested below the sum: connector declaration. Declared as spans: in the Basic Example.
    • Can be any of spans, spanevents, metrics, datapoints, or logs.
    • For metrics use datapoints.
    • For traces use spans or spanevents.
  • Metric name: Nested below the telemetry type; this is the name of the metric to which the sum connector will output summed values. Declared as my.example.metric.name in the Basic Example.
  • source_attribute: A specific attribute to search for within the source telemetry fed to the connector. This attribute is where the connector looks for numerical values to sum into the output metric value. Declared as attribute.with.numerical.value in the Basic Example. A minimal sketch combining these three settings appears after this list.
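For instance, to sum metric data points instead of spans, the same three required settings nest under datapoints: instead. A minimal sketch, where the metric and attribute names are placeholders rather than part of the connector's API:

connectors:
  sum:
    datapoints:
      my.summed.metric:
        source_attribute: attribute.with.numerical.value
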
Optional Settings
  • conditions: OTTL syntax can be used to provide conditions for processing incoming telemetry. Conditions are ORed together, so if any condition is met the attribute's value will be included in the resulting sum.
  • attributes: Declaration of attributes to include. Each unique combination of values found for these attributes generates a separate sum, output as its own datapoint in the metric time series.
    • key: (required for attributes) the attribute name to match against
    • default_value: (optional for attributes) a default value for the attribute when no matches are found. The default_value value can be of type string, integer, or float.
Detailed Example Configuration

This example declares that the sum connector will ingest logs and create an output metric named checkout.total from numerical values found in the source_attribute total.payment.

It provides a condition to check that the attribute total.payment is not NULL. It also checks any incoming log telemetry for values present in the attribute payment.processor and creates a datapoint within the metric time series for each unique value. Any logs without values in payment.processor will be included in a datapoint with the default_value of unspecified_processor.

receivers:
  foo:
connectors:
  sum:
    logs:
      checkout.total:
        source_attribute: total.payment
        conditions:
          - attributes["total.payment"] != "NULL"
        attributes:
          - key: payment.processor
            default_value: unspecified_processor
exporters:
  bar:

service:
  pipelines:
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
    logs:
      receivers: [foo]
      exporters: [sum]

Note for Log to Metrics: If your logs contain all values in their body rather than in attributes (e.g., a JSON payload), use a transform processor in your pipeline to upsert parsed key/value pairs (in this case from JSON) into attributes attached to the log.

processors:
  transform/logs:
    log_statements:
      - context: log
        statements:
          - merge_maps(attributes, ParseJSON(body), "upsert")
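
In that case the transform processor sits between the receiver and the sum connector in the logs pipeline. A minimal sketch, reusing the foo/bar placeholders from the examples above:

service:
  pipelines:
    logs:
      receivers: [foo]
      processors: [transform/logs]
      exporters: [sum]
    metrics/sum:
      receivers: [sum]
      exporters: [bar]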

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func NewFactory

func NewFactory() connector.Factory

NewFactory returns a ConnectorFactory.

Types

type AttributeConfig

type AttributeConfig struct {
	Key          string `mapstructure:"key"`           // attribute name to match against
	DefaultValue any    `mapstructure:"default_value"` // value used when the attribute is not found
}

type Config

type Config struct {
	Spans      map[string]MetricInfo `mapstructure:"spans"`
	SpanEvents map[string]MetricInfo `mapstructure:"spanevents"`
	Metrics    map[string]MetricInfo `mapstructure:"metrics"`
	DataPoints map[string]MetricInfo `mapstructure:"datapoints"`
	Logs       map[string]MetricInfo `mapstructure:"logs"`
}

Config for the connector

func (*Config) Validate added in v0.106.0

func (c *Config) Validate() (combinedErrors error)

type MetricInfo

type MetricInfo struct {
	Description     string            `mapstructure:"description"`      // description for the output metric
	Conditions      []string          `mapstructure:"conditions"`       // OTTL conditions, ORed together
	Attributes      []AttributeConfig `mapstructure:"attributes"`       // attributes to split datapoints by
	SourceAttribute string            `mapstructure:"source_attribute"` // attribute containing the values to sum
}

MetricInfo for a data type
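
Each MetricInfo corresponds to one named metric in the YAML configuration, with keys matching the mapstructure tags above. A sketch covering every field, reusing the names from the Detailed Example; description is the only field not shown in the README examples, and its value here is a placeholder:

connectors:
  sum:
    logs:
      checkout.total:
        description: Sum of checkout payment values
        source_attribute: total.payment
        conditions:
          - attributes["total.payment"] != "NULL"
        attributes:
          - key: payment.processor
            default_value: unspecified_processor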

Directories

Path Synopsis
internal
