README
Amazon S3 blobstore
Configuration
See https://docs.aws.amazon.com/sdk-for-go/v1/developer-guide/configuring-sdk.html#specifying-credentials for how to set up authentication against S3.
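The archiver relies on the AWS SDK for Go (v1) credential chain described at that link (environment variables, shared credentials file, or an IAM role). As a hedged sanity check before enabling archival, a small standalone program along these lines confirms that credentials resolve; the region value is only an example.

package main

import (
    "fmt"
    "log"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
)

func main() {
    // Use the same default credential chain that aws-sdk-go v1 consults.
    sess, err := session.NewSession(&aws.Config{Region: aws.String("us-east-1")})
    if err != nil {
        log.Fatal(err)
    }
    // Get resolves the chain and reports which provider supplied the credentials.
    creds, err := sess.Config.Credentials.Get()
    if err != nil {
        log.Fatal("no AWS credentials resolved: ", err)
    }
    fmt.Println("credentials resolved via:", creds.ProviderName)
}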
Enable archival with the configuration below. The region and the bucket URI are required.
archival:
  history:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"
  visibility:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"

domainDefaults:
  archival:
    history:
      status: "enabled"
      URI: "s3://<bucket-name>"
    visibility:
      status: "enabled"
      URI: "s3://<bucket-name>"
Visibility query syntax
You can query the visibility store using the cadence workflow listarchived command.
The query syntax is based on SQL.
Supported column names are:
- WorkflowID String
- WorkflowTypeName String
- StartTime Date
- CloseTime Date
- SearchPrecision String - Day, Hour, Minute, Second
Either WorkflowID or WorkflowTypeName is required. To filter on a date, use StartTime or CloseTime in combination with SearchPrecision.
Searches are performed in the UTC timezone.
SearchPrecision specifies the range of records to search. For example, with StartTime = '2020-01-21T00:00:00Z' and SearchPrecision = 'Day', all records from 2020-01-21T00:00:00Z to 2020-01-21T23:59:59Z are searched.
Limitations
- The only supported operator is =, due to how records are stored in S3.
Example
Search for all records from the day 2020-01-21 with the specified workflow ID:
./cadence --do samples-domain workflow listarchived -q "StartTime = '2020-01-21T00:00:00Z' AND WorkflowID='workflow-id' AND SearchPrecision='Day'"
Storage in S3
Workflow runs are stored in S3 using the following structure:
s3://<bucket-name>/<domain-id>/
  history/<workflow-id>/<run-id>
  visibility/
    workflowTypeName/<workflow-type-name>/
      startTimeout/2020-01-21T16:16:11Z/<run-id>
      closeTimeout/2020-01-21T16:16:11Z/<run-id>
    workflowID/<workflow-id>/
      startTimeout/2020-01-21T16:16:11Z/<run-id>
      closeTimeout/2020-01-21T16:16:11Z/<run-id>
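Given that layout, archived records can be inspected directly with the AWS SDK. The following is a minimal, hedged sketch (not part of the package) that lists the visibility keys stored under one workflow ID; the bucket, domain ID, and workflow ID are placeholders.

package main

import (
    "fmt"
    "log"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
)

func main() {
    sess, err := session.NewSession(&aws.Config{Region: aws.String("us-east-1")})
    if err != nil {
        log.Fatal(err)
    }
    // Keys under closeTimeout/ follow the layout described above:
    // <domain-id>/visibility/workflowID/<workflow-id>/closeTimeout/<close-time>/<run-id>
    out, err := s3.New(sess).ListObjectsV2(&s3.ListObjectsV2Input{
        Bucket: aws.String("<bucket-name>"),
        Prefix: aws.String("<domain-id>/visibility/workflowID/<workflow-id>/closeTimeout/"),
    })
    if err != nil {
        log.Fatal(err)
    }
    for _, obj := range out.Contents {
        fmt.Println(*obj.Key)
    }
}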
Using localstack for local development
- Install the AWS CLI (awscli)
- Install localstack
- Launch localstack with
  SERVICES=s3 localstack start
- Create a bucket using
  aws --endpoint-url=http://localhost:4572 s3 mb s3://cadence-development
- Configure archival and domainDefaults with the following configuration
archival:
  history:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"
        endpoint: "http://127.0.0.1:4572"
        s3ForcePathStyle: true
  visibility:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"
        endpoint: "http://127.0.0.1:4572"
        s3ForcePathStyle: true

domainDefaults:
  archival:
    history:
      status: "enabled"
      URI: "s3://cadence-development"
    visibility:
      status: "enabled"
      URI: "s3://cadence-development"
Documentation
Overview
Package s3store implements archival of workflow histories and visibility records to Amazon S3. It also contains a generated GoMock package for its query parser.
Index
- Constants
- func NewHistoryArchiver(container *archiver.HistoryBootstrapContainer, config *config.S3Archiver) (archiver.HistoryArchiver, error)
- func NewVisibilityArchiver(container *archiver.VisibilityBootstrapContainer, config *config.S3Archiver) (archiver.VisibilityArchiver, error)
- type MockQueryParser
- type MockQueryParserMockRecorder
- type QueryParser
Constants
const (
    WorkflowTypeName = "WorkflowTypeName"
    WorkflowID       = "WorkflowID"
    StartTime        = "StartTime"
    CloseTime        = "CloseTime"
    SearchPrecision  = "SearchPrecision"
)
All allowed fields for filtering
const (
    PrecisionDay    = "Day"
    PrecisionHour   = "Hour"
    PrecisionMinute = "Minute"
    PrecisionSecond = "Second"
)
Precision specific values
const (
// URIScheme is the scheme for the s3 implementation
URIScheme = "s3"
)
Variables
This section is empty.
Functions
func NewHistoryArchiver
func NewHistoryArchiver(
    container *archiver.HistoryBootstrapContainer,
    config *config.S3Archiver,
) (archiver.HistoryArchiver, error)
NewHistoryArchiver creates a new archiver.HistoryArchiver based on s3
func NewVisibilityArchiver
func NewVisibilityArchiver(
    container *archiver.VisibilityBootstrapContainer,
    config *config.S3Archiver,
) (archiver.VisibilityArchiver, error)
NewVisibilityArchiver creates a new archiver.VisibilityArchiver based on s3
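A minimal, hedged sketch of wiring both constructors from a single S3 config. The bootstrap containers come from the surrounding Cadence server setup and are passed in as placeholders here; the Region field name and the config import path are assumptions that may differ between Cadence versions.

import (
    "github.com/uber/cadence/common/archiver"
    "github.com/uber/cadence/common/archiver/s3store"
    "github.com/uber/cadence/common/config" // assumed path for config.S3Archiver; may differ by version
)

// newS3Archivers is a hypothetical helper that builds both archivers from one
// S3 configuration; the containers are supplied by the Cadence server setup.
func newS3Archivers(
    historyContainer *archiver.HistoryBootstrapContainer,
    visibilityContainer *archiver.VisibilityBootstrapContainer,
) (archiver.HistoryArchiver, archiver.VisibilityArchiver, error) {
    cfg := &config.S3Archiver{Region: "us-east-1"} // assumed field name mirroring the "region" YAML key

    historyArchiver, err := s3store.NewHistoryArchiver(historyContainer, cfg)
    if err != nil {
        return nil, nil, err
    }
    visibilityArchiver, err := s3store.NewVisibilityArchiver(visibilityContainer, cfg)
    if err != nil {
        return nil, nil, err
    }
    return historyArchiver, visibilityArchiver, nil
}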
Types
type MockQueryParser
type MockQueryParser struct {
// contains filtered or unexported fields
}
MockQueryParser is a mock of QueryParser interface.
func NewMockQueryParser
func NewMockQueryParser(ctrl *gomock.Controller) *MockQueryParser
NewMockQueryParser creates a new mock instance.
func (*MockQueryParser) EXPECT
func (m *MockQueryParser) EXPECT() *MockQueryParserMockRecorder
EXPECT returns an object that allows the caller to indicate expected use.
func (*MockQueryParser) Parse
func (m *MockQueryParser) Parse(query string) (*parsedQuery, error)
Parse mocks base method.
type MockQueryParserMockRecorder
type MockQueryParserMockRecorder struct {
// contains filtered or unexported fields
}
MockQueryParserMockRecorder is the mock recorder for MockQueryParser.
func (*MockQueryParserMockRecorder) Parse
func (mr *MockQueryParserMockRecorder) Parse(query interface{}) *gomock.Call
Parse indicates an expected call of Parse.
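A hedged sketch of how the generated mock is typically exercised with gomock. Because Parse returns the unexported *parsedQuery, code like this lives inside package s3store itself (for example in its tests); the test name, query string, and gomock import path are illustrative assumptions.

package s3store

import (
    "errors"
    "testing"

    "github.com/golang/mock/gomock"
)

func TestMockQueryParserSketch(t *testing.T) {
    ctrl := gomock.NewController(t)
    defer ctrl.Finish()

    // Stub Parse to fail for this particular query.
    parser := NewMockQueryParser(ctrl)
    parser.EXPECT().
        Parse("WorkflowID = 'workflow-id' AND SearchPrecision = 'Day'").
        Return(nil, errors.New("parse failed"))

    if _, err := parser.Parse("WorkflowID = 'workflow-id' AND SearchPrecision = 'Day'"); err == nil {
        t.Fatal("expected the stubbed error")
    }
}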
type QueryParser
QueryParser parses a limited SQL where clause into a struct
func NewQueryParser
func NewQueryParser() QueryParser
NewQueryParser creates a new query parser for the s3 store
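For completeness, a hedged in-package sketch of the parser in use; exampleParseVisibilityQuery is a hypothetical helper, and because parsedQuery is unexported this only compiles inside package s3store.

package s3store

import "log"

// exampleParseVisibilityQuery parses a query that follows the syntax rules
// described in the README above.
func exampleParseVisibilityQuery() {
    parser := NewQueryParser()
    parsed, err := parser.Parse(
        "WorkflowID = 'workflow-id' AND CloseTime = '2020-01-21T00:00:00Z' AND SearchPrecision = 'Day'",
    )
    if err != nil {
        log.Fatal(err) // the query violated the syntax rules above
    }
    _ = parsed // carries the extracted workflow ID, time filter, and precision
}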