automultilinedetection

Published: Dec 20, 2024 License: Apache-2.0 Imports: 15 Imported by: 0

Documentation

Overview

Package automultilinedetection contains auto multiline detection and aggregation logic.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Aggregator

type Aggregator struct {
	// contains filtered or unexported fields
}

Aggregator aggregates multiline logs with a given label.

func NewAggregator

func NewAggregator(outputFn func(m *message.Message), maxContentSize int, flushTimeout time.Duration, tagTruncatedLogs bool, tagMultiLineLogs bool, tailerInfo *status.InfoRegistry) *Aggregator

NewAggregator creates a new aggregator.

func (*Aggregator) Aggregate

func (a *Aggregator) Aggregate(msg *message.Message, label Label)

Aggregate aggregates a multiline log using a label.

func (*Aggregator) Flush

func (a *Aggregator) Flush()

Flush flushes the aggregator.

func (*Aggregator) FlushChan

func (a *Aggregator) FlushChan() <-chan time.Time

FlushChan returns the flush timer channel.
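
The aggregator is typically driven from a single loop: messages are labeled and aggregated as they arrive, and Flush is called whenever the timer channel fires. A minimal sketch, written as if inside the package; the inputs channel and the message.GetContent call are assumptions for illustration, not part of this API:

func runAggregator(a *Aggregator, l *Labeler, inputs <-chan *message.Message) {
	for {
		select {
		case msg := <-inputs:
			// Label the raw content, then hand the message and its
			// label to the aggregator. (GetContent is assumed here.)
			a.Aggregate(msg, l.Label(msg.GetContent()))
		case <-a.FlushChan():
			// The flush timeout elapsed: emit any partially built group.
			a.Flush()
		}
	}
}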

type DiagnosticRow

type DiagnosticRow struct {
	TokenString string
	LabelString string

	Count     int64
	LastIndex int64
	// contains filtered or unexported fields
}

DiagnosticRow is a struct that represents a diagnostic view of a row in the PatternTable.

type Heuristic

type Heuristic interface {
	// ProcessAndContinue processes a log message and annotates the context with a label. It returns false when processing of the message should stop.
	// Heuristic implementations may mutate the message context but must do so synchronously.
	ProcessAndContinue(*messageContext) bool
}

Heuristic is an interface representing a strategy to label log messages.
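
Because messageContext is unexported, Heuristic implementations live inside this package. A hypothetical sketch of one, assuming the context exposes the raw message and a mutable label (the field names here are illustrative, not the actual layout):

type emptyLineDetector struct{}

// ProcessAndContinue labels empty messages for aggregation and stops the
// chain; non-empty messages are passed on to later heuristics.
func (e *emptyLineDetector) ProcessAndContinue(ctx *messageContext) bool {
	if len(ctx.rawMessage) == 0 {
		ctx.label = aggregate // assumed unexported Label constant
		return false
	}
	return true
}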

type JSONDetector

type JSONDetector struct{}

JSONDetector is a heuristic to detect JSON messages.

func NewJSONDetector

func NewJSONDetector() *JSONDetector

NewJSONDetector returns a new JSON detection heuristic.

func (*JSONDetector) ProcessAndContinue

func (j *JSONDetector) ProcessAndContinue(context *messageContext) bool

ProcessAndContinue checks if a message is a JSON message. This implements the Heuristic interface, so processing stops (by returning false) when a JSON message is detected.

type Label

type Label uint32

Label is a label for a log message.

type Labeler

type Labeler struct {
	// contains filtered or unexported fields
}

Labeler labels log messages based on a set of heuristics. Each Heuristic operates on the output of the previous heuristic, mutating the message context. A label is chosen when a heuristic signals the labeler to stop or when all heuristics have been processed.

func NewLabeler

func NewLabeler(lablerHeuristics []Heuristic, analyticsHeuristics []Heuristic) *Labeler

NewLabeler creates a new labeler with the given heuristics. lablerHeuristics mutate the label of a log message, while analyticsHeuristics analyze the message and the labeling process for the status page and telemetry.

func (*Labeler) Label

func (l *Labeler) Label(rawMessage []byte) Label

Label labels a log message.
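
A minimal sketch of assembling a Labeler from the heuristics in this package, written as if inside the package. The argument values are illustrative, the ordering is one plausible chain (the Tokenizer must run early so later heuristics can see tokens), registry stands in for a *status.InfoRegistry obtained elsewhere, and analyticsHeuristics is left nil for brevity:

func newExampleLabeler(cfg model.Reader, registry *status.InfoRegistry) *Labeler {
	heuristics := []Heuristic{
		NewTokenizer(1024),                  // compute tokens for the heuristics below
		NewUserSamples(cfg),                 // user-defined samples
		NewJSONDetector(),                   // detect JSON messages
		NewTimestampDetector(0.5),           // detect leading timestamps
		NewPatternTable(20, 0.75, registry), // learn recurring patterns
	}
	return NewLabeler(heuristics, nil)
}

Labeling a raw line is then a single call: label := labeler.Label(rawMessage).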

type MatchContext

type MatchContext struct {
	// contains filtered or unexported fields
}

MatchContext is the context of a match.

type PatternTable

type PatternTable struct {
	// contains filtered or unexported fields
}

PatternTable is a table of patterns that occur over time from a log source. The pattern table is always sorted by the frequency of the patterns. When the table becomes full, the least recently updated pattern is evicted.

func NewPatternTable

func NewPatternTable(maxTableSize int, matchThreshold float64, tailerInfo *status.InfoRegistry) *PatternTable

NewPatternTable returns a new PatternTable heuristic.

func (*PatternTable) DumpTable

func (p *PatternTable) DumpTable() []DiagnosticRow

DumpTable returns a slice of DiagnosticRow structs that represent the current state of the table.

func (*PatternTable) Info

func (p *PatternTable) Info() []string

Info returns a breakdown of the patterns in the table.

func (*PatternTable) InfoKey

func (p *PatternTable) InfoKey() string

InfoKey returns a string representing the key for the pattern table.

func (*PatternTable) ProcessAndContinue

func (p *PatternTable) ProcessAndContinue(context *messageContext) bool

ProcessAndContinue adds a pattern to the table and updates its label based on its frequency. This implements the Heuristic interface, so processing stops if the label was changed due to pattern detection.
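
A minimal sketch of reading the diagnostic view, where table stands in for a *PatternTable built with NewPatternTable:

func dumpPatterns(table *PatternTable) {
	for _, row := range table.DumpTable() {
		// TokenString and LabelString are human-readable forms of the
		// stored pattern and its current label.
		fmt.Printf("pattern=%q label=%s count=%d last=%d\n",
			row.TokenString, row.LabelString, row.Count, row.LastIndex)
	}
}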

type TimestampDetector

type TimestampDetector struct {
	// contains filtered or unexported fields
}

TimestampDetector is a heuristic to detect timestamps.

func NewTimestampDetector

func NewTimestampDetector(matchThreshold float64) *TimestampDetector

NewTimestampDetector returns a new Timestamp detection heuristic.

func (*TimestampDetector) ProcessAndContinue

func (t *TimestampDetector) ProcessAndContinue(context *messageContext) bool

ProcessAndContinue checks if a message is likely to be a timestamp.

type TokenGraph

type TokenGraph struct {
	// contains filtered or unexported fields
}

TokenGraph is a directed cyclic graph of tokens that models the relationship between any two tokens. It is used to calculate the probability of an unknown sequence of tokens being represented by the graph.

func NewTokenGraph

func NewTokenGraph(minimumTokenLength int, inputData [][]tokens.Token) *TokenGraph

NewTokenGraph returns a new TokenGraph.

func (*TokenGraph) MatchProbability

func (m *TokenGraph) MatchProbability(ts []tokens.Token) MatchContext

MatchProbability returns the probability of a sequence of tokens being represented by the graph.
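
A minimal sketch of building and querying a graph. trainingData is assumed to be token sequences from known samples (for example, tokenized timestamps), and the minimum token length of 2 is illustrative; because MatchContext's fields are unexported, the score is consumed inside the package (for example, by TimestampDetector):

func scoreTokens(trainingData [][]tokens.Token, input []tokens.Token) MatchContext {
	// Model the token transitions seen in the training sequences, then ask
	// how likely the unknown sequence is under that model.
	graph := NewTokenGraph(2, trainingData)
	return graph.MatchProbability(input)
}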

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer is a heuristic to compute tokens from a log message. The tokenizer converts a log message (a string of bytes) into a list of tokens that represents the underlying structure of the log. The token string is a compact slice of bytes that can be used to compare the structure of log messages. A Tokenizer instance is not thread safe, as buffers are reused to avoid allocations.

func NewTokenizer

func NewTokenizer(maxEvalBytes int) *Tokenizer

NewTokenizer returns a new Tokenizer detection heuristic.

func (*Tokenizer) ProcessAndContinue

func (t *Tokenizer) ProcessAndContinue(context *messageContext) bool

ProcessAndContinue enriches the message context with tokens. This implements the Heuristic interface; this heuristic never stops processing.

type UserSample

type UserSample struct {
	// Sample is a raw log message sample used to aggregate logs.
	Sample string `mapstructure:"sample"`
	// MatchThreshold is the ratio of tokens that must match between the sample and the log message to consider it a match.
	// From a user perspective, this is how similar the log has to be to the sample to be considered a match.
	// Optional - Default value is 0.75.
	MatchThreshold *float64 `mapstructure:"match_threshold,omitempty"`
	// Regex is a pattern used to aggregate logs. NOTE that you can use either a sample or a regex, but not both.
	Regex string `mapstructure:"regex,omitempty"`
	// Label is the label to apply to the log message if it matches the sample.
	// Optional - Default value is "start_group".
	Label *string `mapstructure:"label,omitempty"`
	// contains filtered or unexported fields
}

UserSample represents a user-defined sample for auto multi-line detection.
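
UserSample values are normally decoded from configuration via the mapstructure tags above rather than constructed by hand. A hypothetical example of the decoded form; the sample text and the "no_aggregate" label value are illustrative assumptions:

threshold := 0.8
labelValue := "no_aggregate"
sample := UserSample{
	// Lines that are at least 80% token-similar to this sample match.
	Sample:         "2024-12-20 10:00:00 ERROR failed to connect",
	MatchThreshold: &threshold,
	// Regex could be set instead of Sample, but not both.
	Label: &labelValue,
}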

type UserSamples

type UserSamples struct {
	// contains filtered or unexported fields
}

UserSamples is a heuristic that represents a collection of user-defined samples for auto multi-line aggregation.

func NewUserSamples

func NewUserSamples(config model.Reader) *UserSamples

NewUserSamples creates a new UserSamples instance.

func (*UserSamples) ProcessAndContinue

func (j *UserSamples) ProcessAndContinue(context *messageContext) bool

ProcessAndContinue applies the user samples to a log message. If one matches, its label is assigned. This implements the Heuristic interface, so processing stops (by returning false) when a user pattern matches.

Directories

Path	Synopsis
tokens	Package tokens contains the token definitions for the tokenizer.
