This module provisions a regionalized CloudEvents sink that consumes events of
particular types from each of the regional brokers and writes them into a
regional GCS bucket, from which a periodic BigQuery Data Transfer Service job
loads them into a BigQuery table schematized for that event type. The
intended usage of this module looks something like this:
```hcl
// Create a network with several regional subnets
module "networking" {
  source = "chainguard-dev/common/infra//modules/networking"

  name       = "my-networking"
  project_id = var.project_id
  regions    = [...]
}

// Create the Broker abstraction.
module "cloudevent-broker" {
  source = "chainguard-dev/common/infra//modules/cloudevent-broker"

  name       = "my-broker"
  project_id = var.project_id
  regions    = module.networking.regional-networks
}

// Record cloudevents of type com.example.foo and com.example.bar
module "foo-emits-events" {
  source = "chainguard-dev/common/infra//modules/cloudevent-recorder"

  name       = "my-recorder"
  project_id = var.project_id
  regions    = module.networking.regional-networks
  broker     = module.cloudevent-broker.broker

  retention-period = 30 // keep around 30 days worth of event data

  provisioner = "user:sally@chainguard.dev"

  types = {
    "com.example.foo" : {
      schema = file("${path.module}/foo.schema.json")
    }
    "com.example.bar" : {
      schema = file("${path.module}/bar.schema.json")
    }
  }
}
```
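Each event type's `schema` is a BigQuery table schema (a JSON array of field definitions) that the Data Transfer Service job uses when loading recorded events. As a minimal sketch, a file like `foo.schema.json` might look like the following; the field names here are hypothetical and should match the payload of your `com.example.foo` events:

```json
[
  {
    "name": "id",
    "type": "STRING",
    "mode": "REQUIRED"
  },
  {
    "name": "occurred_at",
    "type": "TIMESTAMP",
    "mode": "NULLABLE"
  }
]
```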
Key inputs:

- `provisioner`: The identity as which this module will be applied (so it may be granted permission to "act as" the DTS service account). This should be in the form expected by an IAM subject (e.g. `user:sally@example.com`).
- `regions`: A map from region names to a network and subnetwork. A recorder service and a Cloud Storage bucket (into which the service writes events) are created in each region.
- `types`: A map from CloudEvent types to the BigQuery schema associated with them, as well as an alert threshold and a list of notification channels (for subscription-level issues).
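For illustration, a type entry with alerting configured might look like the sketch below. The `alert_threshold` and `notification_channels` attribute names and the threshold value are assumptions for this example, so check the module's `variables.tf` for the authoritative shape of the `types` object:

```hcl
types = {
  "com.example.foo" : {
    schema = file("${path.module}/foo.schema.json")

    // Hypothetical attribute name: alert when the subscription's backlog
    // of undelivered events exceeds this count.
    alert_threshold = 50000

    // Hypothetical attribute name: monitoring channels to notify about
    // subscription-level issues for this event type.
    notification_channels = [google_monitoring_notification_channel.oncall.id]
  }
}
```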