# Sum Connector
## Supported Pipeline Types

The `sum` connector can be used to sum attribute values from spans, span events, metrics, data points, and log records.
## Configuration
If you are not already familiar with connectors, you may find it helpful to first visit the Connectors README.
### Basic configuration
This example configuration will sum numerical values found within the attribute `attribute.with.numerical.value` of any span telemetry routed to the connector. It will then output a metric time series with the name `my.example.metric.name` containing those summed values.

Note: Values found within an attribute will be converted into a float, regardless of their original type, before being summed and output as a metric value. Strings that cannot be converted to a float will be dropped and not included in the sum.
```yaml
receivers:
  foo:

connectors:
  sum:
    spans:
      my.example.metric.name:
        source_attribute: attribute.with.numerical.value

exporters:
  bar:

service:
  pipelines:
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
    traces:
      receivers: [foo]
      exporters: [sum]
```
### Required Settings
The `sum` connector has three required configuration settings and numerous optional settings:
- Telemetry type: Nested below the `sum:` connector declaration. Declared as `spans:` in the Basic Example.
  - Can be any of `spans`, `spanevents`, `datapoints`, or `logs`.
  - For metrics, use `datapoints`.
  - For traces, use `spans` or `spanevents`.
- Metric name: Nested below the telemetry type; this is the metric name the `sum` connector will output summed values to. Declared as `my.example.metric.name` in the Basic Example.
- `source_attribute`: A specific attribute to search for within the source telemetry fed to the connector. This attribute is where the connector will look for numerical values to sum into the output metric value. Declared as `attribute.with.numerical.value` in the Basic Example.
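For instance, to sum values from metric data points instead of spans, the Basic Example's connector block could be written with `datapoints` as the telemetry type (the metric and attribute names below are carried over from the Basic Example for illustration):

```yaml
connectors:
  sum:
    datapoints:
      my.example.metric.name:
        source_attribute: attribute.with.numerical.value
```

The `traces` pipeline from the Basic Example would correspondingly become a `metrics` pipeline that exports to `sum`.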
### Optional Settings
- `conditions`: OTTL syntax can be used to provide conditions for processing incoming telemetry. Conditions are ORed together, so if any condition is met, the attribute's value will be included in the resulting sum.
- `attributes`: Declaration of attributes to include. Each unique combination of values for these attributes generates a separate sum, which is output as its own datapoint in the metric time series.
  - `key`: (required for `attributes`) the attribute name to match against.
  - `default_value`: (optional for `attributes`) a default value for the attribute when no matches are found. The `default_value` can be of type string, integer, or float.
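Because conditions are ORed, a telemetry item contributes to the sum as soon as any single condition matches. A minimal sketch with two conditions (the attribute names and values here are hypothetical):

```yaml
connectors:
  sum:
    logs:
      checkout.total:
        source_attribute: total.payment
        # The value of total.payment is summed if EITHER condition matches
        conditions:
          - attributes["payment.processor"] == "stripe"
          - attributes["payment.processor"] == "paypal"
```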
### Detailed Example Configuration
This example declares that the `sum` connector is going to be ingesting `logs` and creating an output metric named `checkout.total` with numerical values found in the `source_attribute` `total.payment`.

It provides a condition to check that the attribute `total.payment` is not `NULL`. It also checks any incoming log telemetry for values present in the attribute `payment.processor` and creates a datapoint within the metric time series for each unique value. Any logs without a value in `payment.processor` will be included in a datapoint with the `default_value` of `unspecified_processor`.
```yaml
receivers:
  foo:

connectors:
  sum:
    logs:
      checkout.total:
        source_attribute: total.payment
        conditions:
          - attributes["total.payment"] != "NULL"
        attributes:
          - key: payment.processor
            default_value: unspecified_processor

exporters:
  bar:

service:
  pipelines:
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
    logs:
      receivers: [foo]
      exporters: [sum]
```
Note for Log to Metrics: If your logs contain all values in their `body` rather than in attributes (e.g., a JSON payload), use a transform processor in your pipeline to upsert the parsed key/value pairs (in this case, parsed from JSON) into attributes attached to the log.
```yaml
processors:
  transform/logs:
    log_statements:
      - context: log
        statements:
          - merge_maps(attributes, ParseJSON(body), "upsert")
```
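A sketch of how this transform processor might be wired into the pipelines from the Detailed Example, so that the log body is parsed into attributes before the telemetry reaches the `sum` connector:

```yaml
service:
  pipelines:
    logs:
      receivers: [foo]
      processors: [transform/logs]  # parse the JSON body into attributes first
      exporters: [sum]
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
```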