# cloudevent-recorder
This module provisions a regionalized cloudevents sink that consumes events of particular types from each of the regional brokers and writes them into a regional GCS bucket. From there, a periodic BigQuery Data Transfer Service job loads the events into a BigQuery table schematized for that event type. The intended usage of this module for recording events is something like this:
```hcl
// Create a network with several regional subnets
module "networking" {
  source = "chainguard-dev/common/infra//modules/networking"

  name       = "my-networking"
  project_id = var.project_id
  regions    = [...]
}

// Create the Broker abstraction.
module "cloudevent-broker" {
  source = "chainguard-dev/common/infra//modules/cloudevent-broker"

  name       = "my-broker"
  project_id = var.project_id
  regions    = module.networking.regional-networks
}

// Record cloudevents of type com.example.foo and com.example.bar
module "foo-emits-events" {
  source = "chainguard-dev/common/infra//modules/cloudevent-recorder"

  name       = "my-recorder"
  project_id = var.project_id
  regions    = module.networking.regional-networks
  broker     = module.cloudevent-broker.broker

  retention-period = 30 // keep around 30 days worth of event data

  types = {
    "com.example.foo" : {
      schema = file("${path.module}/foo.schema.json")
    }
    "com.example.bar" : {
      schema = file("${path.module}/bar.schema.json")
    }
  }
}
```
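The schema files referenced by `types` are standard BigQuery JSON table schemas. As a rough illustration (the column names below are invented for the example, not required by the module), an equivalent schema could also be built inline with `jsonencode` instead of the `file(...)` call:

```hcl
// Illustrative only: an inline stand-in for foo.schema.json using the
// standard BigQuery JSON table-schema format (name/type/mode per column).
// The columns shown are examples; use whatever fields your events carry.
locals {
  foo_schema = jsonencode([
    { name = "id", type = "STRING", mode = "REQUIRED" },
    { name = "when", type = "TIMESTAMP", mode = "NULLABLE" },
    { name = "payload", type = "JSON", mode = "NULLABLE" },
  ])
}
```

This could then be passed as `schema = local.foo_schema` in the `types` map above.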
## Requirements

No requirements.
## Providers

| Name | Version |
|------|---------|
| google | n/a |
| random | n/a |
## Modules

| Name | Source | Version |
|------|--------|---------|
| recorder-dashboard | ../dashboard/cloudevent-receiver | n/a |
| this | ../regional-go-service | n/a |
| triggers | ../cloudevent-trigger | n/a |
## Resources

| Name | Type |
|------|------|
| google_bigquery_data_transfer_config.import-job | resource |
| google_bigquery_dataset.this | resource |
| google_bigquery_table.types | resource |
| google_bigquery_table_iam_binding.import-writes-to-tables | resource |
| google_service_account.import-identity | resource |
| google_service_account.recorder | resource |
| google_service_account_iam_binding.bq-dts-assumes-import-identity | resource |
| google_service_account_iam_binding.provisioner-acts-as-import-identity | resource |
| google_storage_bucket.recorder | resource |
| google_storage_bucket_iam_binding.import-reads-from-gcs-buckets | resource |
| google_storage_bucket_iam_binding.recorder-writes-to-gcs-buckets | resource |
| random_id.suffix | resource |
| random_id.trigger-suffix | resource |
| google_project.project | data source |
## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|----------|
| broker | A map from each of the input region names to the name of the Broker topic in that region. | `map(string)` | n/a | yes |
| deletion_protection | Whether to enable deletion protection on data resources. | `bool` | `true` | no |
| location | The location to create the BigQuery dataset in, and in which to run the data transfer jobs from GCS. | `string` | `"US"` | no |
| name | n/a | `string` | n/a | yes |
| project_id | n/a | `string` | n/a | yes |
| provisioner | The identity as which this module will be applied (so it may be granted permission to 'act as' the DTS service account). This should be in the form expected by an IAM subject (e.g. user:sally@example.com). | `string` | n/a | yes |
| regions | A map from region names to a network and subnetwork. A recorder service and cloud storage bucket (into which the service writes events) will be created in each region. | `map(object({…}))` | n/a | yes |
| retention-period | The number of days to retain data in BigQuery. | `number` | n/a | yes |
| types | A map from cloudevent types to the BigQuery schema associated with them, as well as an alert threshold and a list of notification channels. | `map(object({…}))` | n/a | yes |
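For reference, the `regions` input is normally wired from `module.networking.regional-networks`, as in the example above. A hand-written equivalent might look roughly like the sketch below; the `network`/`subnet` attribute names and the resource paths are illustrative assumptions, not taken from this README:

```hcl
// A hand-written stand-in for module.networking.regional-networks.
// Attribute names and values here are illustrative assumptions.
locals {
  regions = {
    "us-central1" = {
      network = "projects/my-project/global/networks/my-networking"
      subnet  = "projects/my-project/regions/us-central1/subnetworks/my-networking-us-central1"
    }
  }
}
```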
## Outputs

No outputs.