telegraf-output-kinesis-data-firehose

module v1.4.2
Published: Jun 13, 2023 License: MIT


Telegraf Output Plugin for Amazon Kinesis Data Firehose


This plugin makes use of the Telegraf Output Exec plugin. It will batch up Points in one Put request to Amazon Kinesis Data Firehose.

The plugin also provides optional common formatting options, like normalizing keys and flattening the output. Such configuration can be used to provide data ingestion without the need of a data transformation function.

It expects the output configuration to ship data in InfluxDB line protocol format.
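To illustrate the optional formatting options described above, the following standalone Go sketch shows what 'normalize_keys' and 'flatten' conceptually do to a metric's tags and fields. This is not the plugin's actual implementation, only an assumption-laden sketch of the documented behavior (the function names are hypothetical):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeKey lowercases a key and replaces spaces with underscores,
// mirroring what the 'normalize_keys' option is documented to do.
func normalizeKey(k string) string {
	return strings.ReplaceAll(strings.ToLower(k), " ", "_")
}

// flatten merges tags and fields into a single top-level map,
// mirroring what the 'flatten' option is documented to do.
func flatten(name string, tags map[string]string, fields map[string]interface{}) map[string]interface{} {
	out := map[string]interface{}{"name": name}
	for k, v := range tags {
		out[normalizeKey(k)] = v
	}
	for k, v := range fields {
		out[normalizeKey(k)] = v
	}
	return out
}

func main() {
	rec := flatten("cpu",
		map[string]string{"Host Name": "web-01"},
		map[string]interface{}{"Usage Idle": 97.5},
	)
	fmt.Println(rec["host_name"], rec["usage_idle"]) // prints: web-01 97.5
}
```

With both options enabled, a point's tags and fields end up as top-level keys in the delivered record, so no downstream transformation function is needed.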


About Amazon Kinesis Data Firehose

It may be useful for users to review Amazon's official documentation, which is available here.

Usage

  • Create the plugin directory and extract the release archive:
mkdir /var/lib/telegraf/firehose
chown telegraf:telegraf /var/lib/telegraf/firehose
tar xf telegraf-output-kinesis-data-firehose-<LATEST_VERSION>-<OS>-<ARCH>.tar.gz -C /var/lib/telegraf/firehose
# e.g. tar xf telegraf-output-kinesis-data-firehose-v1.0.0-linux-amd64.tar.gz -C /var/lib/telegraf/firehose
  • Edit the plugin configuration as needed:
vi /var/lib/telegraf/firehose/plugin.conf
  • Add the plugin to /etc/telegraf/telegraf.conf or into a new file in /etc/telegraf/telegraf.d:
[[outputs.exec]]
  command = [ "/var/lib/telegraf/firehose/telegraf-output-kinesis-data-firehose", "-config", "/var/lib/telegraf/firehose/plugin.conf" ]
  data_format = "influx"
  • Restart or reload Telegraf.

AWS Authentication

This plugin uses a credential chain for authentication with the Amazon Kinesis Data Firehose API endpoint. The plugin attempts to authenticate in the following order:

  1. web identity provider credentials via STS if role_arn and web_identity_token_file are specified,
  2. assumed credentials via STS if the role_arn attribute is specified (source credentials are evaluated from subsequent rules),
  3. explicit credentials from the access_key and secret_key attributes,
  4. shared profile from the profile attribute,
  5. environment variables,
  6. shared credentials, and/or
  7. the EC2 instance profile.

If you are using credentials from a web identity provider, you can specify the session name using role_session_name. If left empty, the current timestamp will be used.
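As an example, web identity authentication (rule 1 above) could be configured with the documented attributes like this; the role ARN and token file path below are placeholders only:

```toml
## Illustrative placeholders -- substitute your own role ARN and token file.
role_arn = "arn:aws:iam::123456789012:role/firehose-writer"
web_identity_token_file = "/var/run/secrets/eks.amazonaws.com/serviceaccount/token"
role_session_name = "telegraf-firehose"
```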

Configuration

## AWS region
region = "eu-west-1"

## AWS credentials
#access_key = ""
#secret_key = ""
#role_arn = ""
#web_identity_token_file = ""
#role_session_name = ""
#profile = ""
#shared_credential_file = ""

## Endpoint to make request against, the correct endpoint is automatically
## determined and this option should only be set if you wish to override the
## default.
##   ex: endpoint_url = "http://localhost:8000"
#endpoint_url = ""

## Amazon Kinesis Data Firehose DeliveryStreamName must exist prior to starting telegraf.
streamname = "DeliveryStreamName"

## 'debug' will show upstream AWS messages.
debug = false

## 'format' provides formatting options
#[format]
  ## 'flatten' flattens all tags and fields into top-level keys
  #flatten = false
  ## 'normalize_keys' normalizes all keys to:
  ## 1/ convert to lower case, and
  ## 2/ replace spaces (' ') with underscores ('_')
  #normalize_keys = false
  ## 'name_key_rename' renames the 'name' field to the provided value
  #name_key_rename = ""
  ## 'timestamp_as_rfc3339' formats the timestamp as RFC3339 instead of a unix timestamp
  #timestamp_as_rfc3339 = false
  ## 'timestamp_units' defines the unix timestamp precision
  #timestamp_units = "1ms"

For this output plugin to function correctly the following variables must be configured:

  • region: the AWS region to connect to
  • streamname: used to send data to the correct stream (the stream MUST be pre-configured prior to starting this plugin!)
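Putting the required variables together, a minimal plugin.conf could look like the following; the stream name is a placeholder for your own pre-configured delivery stream:

```toml
## Minimal illustrative plugin.conf -- only the required keys are set.
region = "eu-west-1"
streamname = "my-delivery-stream"
```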

Development

The project uses Go modules; dependencies can be downloaded by running:

go mod download
Testing
  1. Install all dependencies as shown above.
  2. Run go test by:
make test
Linting and Code Style

The project uses golangci-lint as well as pre-commit.

  1. Install all dependencies as shown above.
  2. (Optional) Install pre-commit hooks:
pre-commit install
  3. Run the linter:
make lint
Commit Message

This project follows Conventional Commits, and your commit message must also adhere to the additional rules outlined in .conform.yaml.


Release

To draft a release, use standard-version:

standard-version
# alternatively
npx standard-version

Finally, push with tags:

git push --follow-tags

Note: releasing is automated for the main branch!


Contributions

Feel free to contribute, be it with Issues or Pull Requests! Please read the Contribution guidelines first.

Notes

The plugin was inspired by the Amazon Kinesis Data Stream Output Plugin.

Supporting

If you enjoy the application and want to support my efforts, please feel free to buy me a coffee. :)

