collector

package
v0.0.0-...-28b5f68
Warning

This package is not in the latest version of its module.

Published: Jul 4, 2024 License: Apache-2.0 Imports: 16 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type NodeCollector

type NodeCollector struct {
	// contains filtered or unexported fields
}

NodeCollector is a collector that collects data using prometheus/node_exporter. Since prometheus returns an internal type, we have to wrap it with our own type.

func NewNodeCollector

func NewNodeCollector() (*NodeCollector, error)

NewNodeCollector creates a new prometheus NodeCollector
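
A minimal sketch of wiring a NodeCollector into a Prometheus registry and serving the metrics over HTTP. The import path is a placeholder (the module path is elided above), and the listen address is illustrative:

package main

import (
	"log"
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promhttp"

	// Placeholder import path; substitute this package's real module path.
	"example.com/yourmodule/collector"
)

func main() {
	// Build the node_exporter-backed collector.
	nc, err := collector.NewNodeCollector()
	if err != nil {
		log.Fatalf("creating node collector: %v", err)
	}

	// NodeCollector implements prometheus.Collector via Describe/Collect,
	// so it registers like any other collector.
	reg := prometheus.NewRegistry()
	reg.MustRegister(nc)

	http.Handle("/metrics", promhttp.HandlerFor(reg, promhttp.HandlerOpts{}))
	log.Fatal(http.ListenAndServe(":9100", nil))
}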

func (*NodeCollector) Collect

func (n *NodeCollector) Collect(ch chan<- prometheus.Metric)

Collect collects metrics using prometheus/node_exporter

func (*NodeCollector) Collectors

func (n *NodeCollector) Collectors() map[string]collector.Collector

Collectors returns the map of registered node_exporter collectors, keyed by name.
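
For example, the registered collectors can be listed by name (a sketch, assuming nc is the *NodeCollector created above and fmt is imported):

for name := range nc.Collectors() {
	fmt.Println("enabled node_exporter collector:", name)
}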

func (*NodeCollector) Describe

func (n *NodeCollector) Describe(ch chan<- *prometheus.Desc)

Describe describes the metrics collected using prometheus/node_exporter

func (*NodeCollector) Name

func (n *NodeCollector) Name() string

Name returns the name of this collector

type Option

type Option func(o *scraperOpts)

Option is used to configure optional scraper options.

func WithBearerToken

func WithBearerToken(token string) Option

WithBearerToken configures a scraper to use a bearer token

func WithBearerTokenFile

func WithBearerTokenFile(tokenFile string) Option

WithBearerTokenFile configures a scraper to use a bearer token read from a file

func WithLogLevel

func WithLogLevel(l log.Level) Option

WithLogLevel configures a custom log level for scraping.

func WithTimeout

func WithTimeout(d time.Duration) Option

WithTimeout configures a scraper with a timeout for scraping metrics.
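
Options are applied when constructing a Scraper (see the NewScraper sketch below). A hedged example of composing a few of them; the timeout value and token file path are purely illustrative:

opts := []collector.Option{
	collector.WithTimeout(15 * time.Second),
	collector.WithBearerTokenFile("/var/run/secrets/kubernetes.io/serviceaccount/token"),
}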

type Scraper

type Scraper struct {
	// contains filtered or unexported fields
}

Scraper is a remote metric scraper that collects metrics from HTTP endpoints.

func NewScraper

func NewScraper(name, metricsEndpoint string, extraMetricLabels []*dto.LabelPair, whitelist map[string]bool, opts ...Option) (*Scraper, error)

NewScraper creates a new scraper that scrapes metrics from the provided endpoint.
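
A sketch of constructing a Scraper and registering it with a Prometheus registry. Here dto is assumed to be github.com/prometheus/client_model/go; the name, endpoint, labels, and whitelist entries are illustrative; the whitelist is assumed to name the metrics to keep; and opts is the Option slice from the example above:

strPtr := func(s string) *string { return &s }

extraLabels := []*dto.LabelPair{
	{Name: strPtr("cluster"), Value: strPtr("prod-east")},
}

keep := map[string]bool{
	"process_cpu_seconds_total": true,
	"go_goroutines":             true,
}

s, err := collector.NewScraper("kubelet", "https://10.0.0.12:10250/metrics", extraLabels, keep, opts...)
if err != nil {
	log.Fatalf("creating scraper: %v", err)
}

reg := prometheus.NewRegistry()
reg.MustRegister(s) // Scraper implements prometheus.Collector via Describe/Collect.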

func (*Scraper) Collect

func (s *Scraper) Collect(ch chan<- prometheus.Metric)

Collect collects metrics from the remote endpoint and reports them to ch.

func (*Scraper) Describe

func (s *Scraper) Describe(ch chan<- *prometheus.Desc)

Describe describes this collector

func (*Scraper) FilterMetric

func (s *Scraper) FilterMetric(metricFamily *dto.MetricFamily) bool

FilterMetric returns true if the metric should be skipped (filtered out)
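
A small sketch of checking a metric family against the scraper's filter; the metric name is illustrative and the whitelist semantics are assumed as above:

name := "node_cpu_seconds_total"
mf := &dto.MetricFamily{Name: &name}
if s.FilterMetric(mf) {
	// Not whitelisted: this family would be skipped during Collect.
}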

func (*Scraper) Name

func (s *Scraper) Name() string

Name returns the name of this scraper
