Documentation ¶
Overview ¶
Implements importer triggering driven by SNS messages. Incoming SNS messages are decoded and the referenced files are extracted, ready for the importer code to run.
Exposes the interface of the dataset importer (aka converter) and selects one automatically based on what files are in the folder being imported. The converter supports various formats as delivered by GDS or test instruments, and is intended to be extendable to other lab instruments and devices in future.
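The typical flow is that a trigger message arrives (either an S3 event forwarded through SNS when the pipeline delivers a zip, or a manual reprocess request) and the raw message bytes are handed to ImportForTrigger (documented below), which decodes the trigger and runs the appropriate import. A minimal sketch, assuming the surrounding service already has the bucket names, Mongo connection, logger and remote file access configured; handleImportTrigger is a hypothetical wrapper, not part of this package:

func handleImportTrigger(
    msgBody []byte,
    envName string,
    configBucket string,
    datasetBucket string,
    manualBucket string,
    db *mongo.Database,
    iLog logger.ILogger,
    remoteFS fileaccess.FileAccess) error {
    result, err := ImportForTrigger(msgBody, envName, configBucket, datasetBucket, manualBucket, db, iLog, remoteFS)
    if result.Logger != nil {
        // Per ImportResult: the logger must be closed, otherwise the last few log events may be lost
        defer result.Logger.Close()
    }
    if err != nil {
        return err
    }

    fmt.Printf("Imported dataset %v (%v), changed: %v, isUpdate: %v\n", result.DatasetID, result.DatasetTitle, result.WhatChanged, result.IsUpdate)
    return nil
}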
Example (DecodeImportTrigger_Manual) ¶
Trigger for a manual dataset regeneration (user clicks save button on dataset edit page)
trigger := `{
    "datasetID": "189137412",
    "jobID": "dataimport-zmzddoytch2krd7n"
}`

sourceBucket, sourceFilePath, datasetID, jobId, err := decodeImportTrigger([]byte(trigger))
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, jobId, err)

Output:

Source Bucket: ""
Source file: ""
Dataset: "189137412"
Job: "dataimport-zmzddoytch2krd7n"
Err: "<nil>"
Example (DecodeImportTrigger_ManualBadDatasetID) ¶
trigger := `{
    "datasetID": "",
    "jobID": "dataimport-zmzddoytch2krd7n"
}`

sourceBucket, sourceFilePath, datasetID, jobID, err := decodeImportTrigger([]byte(trigger))
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, jobID, err)

Output:

Source Bucket: ""
Source file: ""
Dataset: ""
Job: ""
Err: "Failed to find dataset ID in reprocess trigger"
Example (DecodeImportTrigger_ManualBadLogID) ¶
trigger := `{
    "datasetID": "qwerty"
}`

sourceBucket, sourceFilePath, datasetID, jobID, err := decodeImportTrigger([]byte(trigger))
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, jobID, err)

Output:

Source Bucket: ""
Source file: ""
Dataset: ""
Job: ""
Err: "Failed to find job ID in reprocess trigger"
Example (DecodeImportTrigger_ManualBadMsg) ¶
trigger := `{
    "weird": "message"
}`

sourceBucket, sourceFilePath, datasetID, jobID, err := decodeImportTrigger([]byte(trigger))
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, jobID, err)

Output:

Source Bucket: ""
Source file: ""
Dataset: ""
Job: ""
Err: "Unexpected or no message type embedded in triggering SNS message"
Example (DecodeImportTrigger_OCS) ¶
Trigger from when a new zip arrives from the pipeline
trigger := `{
    "Records": [
        {
            "eventVersion": "2.1",
            "eventSource": "aws:s3",
            "awsRegion": "us-east-1",
            "eventTime": "2022-09-16T09:10:28.417Z",
            "eventName": "ObjectCreated:CompleteMultipartUpload",
            "userIdentity": {
                "principalId": "AWS:AIDA6AOWGDOHF37MOKWLS"
            },
            "requestParameters": {
                "sourceIPAddress": "81.154.57.137"
            },
            "responseElements": {
                "x-amz-request-id": "G3QWWT0BAYKP81QK",
                "x-amz-id-2": "qExUWHHDE1nL+UP3zim1XA7FIXRUoKxlIrJt/7ULAtn08/+EvRCt4sChLhCGEqMo7ny4CU/KufMNmOcyZsDPKGWHT2ukMbo+"
            },
            "s3": {
                "s3SchemaVersion": "1.0",
                "configurationId": "OTBjMjZmYzAtYThlOC00OWRmLWIwMzUtODkyZDk0YmRhNzkz",
                "bucket": {
                    "name": "prodpipeline-rawdata202c7bd0-o40ktu17o2oj",
                    "ownerIdentity": {
                        "principalId": "AP902Y0PI20DF"
                    },
                    "arn": "arn:aws:s3:::prodpipeline-rawdata202c7bd0-o40ktu17o2oj"
                },
                "object": {
                    "key": "189137412-07-09-2022-10-07-57.zip",
                    "size": 54237908,
                    "eTag": "b21ebca14f67255be1cd28c01d494508-7",
                    "sequencer": "0063243D6858D568F0"
                }
            }
        }
    ]
}`

sourceBucket, sourceFilePath, datasetID, logID, err := decodeImportTrigger([]byte(trigger))

// NOTE: we're only checking the length of the log ID because it's a timestamp+random chars.
// Other code has this stubbed out but here it's probably sufficient
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nLog Str Len: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, len(logID), err)

Output:

Source Bucket: "prodpipeline-rawdata202c7bd0-o40ktu17o2oj"
Source file: "189137412-07-09-2022-10-07-57.zip"
Dataset: "189137412"
Log Str Len: "43"
Err: "<nil>"
Example (DecodeImportTrigger_OCS2) ¶
Trigger from when a new zip arrives from the pipeline
trigger := `{
    "Records": [
        {
            "eventVersion": "2.1",
            "eventSource": "aws:s3",
            "awsRegion": "us-east-1",
            "eventTime": "2022-09-25T14:33:49.456Z",
            "eventName": "ObjectCreated:Put",
            "userIdentity": {
                "principalId": "AWS:AIDA6AOWGDOHF37MOKWLS"
            },
            "requestParameters": {
                "sourceIPAddress": "3.12.95.94"
            },
            "responseElements": {
                "x-amz-request-id": "K811ZDJ52EYBJ8P2",
                "x-amz-id-2": "R7bGQ2fOjvSZHkHez700w3wRVpn32nmr6jVPVYhKtNE2c2KYOmgm9hjmOA5WSQFh8faLRe6fHAmANKSTNRhwCq7Xgol0DgX4"
            },
            "s3": {
                "s3SchemaVersion": "1.0",
                "configurationId": "OTBjMjZmYzAtYThlOC00OWRmLWIwMzUtODkyZDk0YmRhNzkz",
                "bucket": {
                    "name": "prodpipeline-rawdata202c7bd0-o40ktu17o2oj",
                    "ownerIdentity": {
                        "principalId": "AP902Y0PI20DF"
                    },
                    "arn": "arn:aws:s3:::prodpipeline-rawdata202c7bd0-o40ktu17o2oj"
                },
                "object": {
                    "key": "197329413-25-09-2022-14-33-39.zip",
                    "size": 1388,
                    "eTag": "932bda7d32c05d90ecc550d061862994",
                    "sequencer": "00633066CD68A4BF43"
                }
            }
        }
    ]
}`

sourceBucket, sourceFilePath, datasetID, jobID, err := decodeImportTrigger([]byte(trigger))

// NOTE: we're only checking the length of the log ID because it's a timestamp+random chars.
// Other code has this stubbed out but here it's probably sufficient
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob Str Len: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, len(jobID), err)

Output:

Source Bucket: "prodpipeline-rawdata202c7bd0-o40ktu17o2oj"
Source file: "197329413-25-09-2022-14-33-39.zip"
Dataset: "197329413"
Job Str Len: "43"
Err: "<nil>"
Example (DecodeImportTrigger_OCS_BadEventType) ¶
trigger := `{
    "Records": [
        {
            "eventVersion": "2.1",
            "eventSource": "aws:sqs",
            "awsRegion": "us-east-1",
            "eventTime": "2022-09-16T09:10:28.417Z",
            "eventName": "ObjectCreated:CompleteMultipartUpload",
            "userIdentity": {
                "principalId": "AWS:AIDA6AOWGDOHF37MOKWLS"
            },
            "requestParameters": {
                "sourceIPAddress": "81.154.57.137"
            },
            "responseElements": {
                "x-amz-request-id": "G3QWWT0BAYKP81QK",
                "x-amz-id-2": "qExUWHHDE1nL+UP3zim1XA7FIXRUoKxlIrJt/7ULAtn08/+EvRCt4sChLhCGEqMo7ny4CU/KufMNmOcyZsDPKGWHT2ukMbo+"
            }
        }
    ]
}`

sourceBucket, sourceFilePath, datasetID, jobID, err := decodeImportTrigger([]byte(trigger))
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, jobID, err)

Output:

Source Bucket: ""
Source file: ""
Dataset: ""
Job: ""
Err: "Failed to decode dataset import trigger: Failed to decode sqs body to an S3 event: unexpected end of JSON input"
Example (DecodeImportTrigger_OCS_Error) ¶
trigger := `{
    "Records": []
}`

sourceBucket, sourceFilePath, datasetID, jobID, err := decodeImportTrigger([]byte(trigger))
fmt.Printf("Source Bucket: \"%v\"\nSource file: \"%v\"\nDataset: \"%v\"\nJob: \"%v\"\nErr: \"%v\"\n", sourceBucket, sourceFilePath, datasetID, jobID, err)

Output:

Source Bucket: ""
Source file: ""
Dataset: ""
Job: ""
Err: "Unexpected or no message type embedded in triggering SNS message"
Example (GetUpdateType_Drive) ¶
newSummary := protos.ScanItem{
    Meta: map[string]string{"DriveID": "997"},
}
oldSummary := protos.ScanItem{
    Meta: map[string]string{"DriveID": "0"},
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: housekeeping|<nil>
Example (GetUpdateType_LessContextImages) ¶
newSummary := protos.ScanItem{
    DataTypes: []*protos.ScanItem_ScanTypeCount{
        &protos.ScanItem_ScanTypeCount{
            DataType: protos.ScanDataType_SD_IMAGE,
            Count:    3,
        },
    },
}
oldSummary := protos.ScanItem{
    DataTypes: []*protos.ScanItem_ScanTypeCount{
        &protos.ScanItem_ScanTypeCount{
            DataType: protos.ScanDataType_SD_IMAGE,
            Count:    5,
        },
    },
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: image|<nil>
Example (GetUpdateType_MoreContextImages) ¶
newSummary := protos.ScanItem{
    DataTypes: []*protos.ScanItem_ScanTypeCount{
        &protos.ScanItem_ScanTypeCount{
            DataType: protos.ScanDataType_SD_IMAGE,
            Count:    3,
        },
    },
}
oldSummary := protos.ScanItem{
    DataTypes: []*protos.ScanItem_ScanTypeCount{
        &protos.ScanItem_ScanTypeCount{
            DataType: protos.ScanDataType_SD_IMAGE,
            Count:    0,
        },
    },
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: image|<nil>
Example (GetUpdateType_NormalSpectra) ¶
newSummary := protos.ScanItem{
    ContentCounts: map[string]int32{"NormalSpectra": 100},
}
oldSummary := protos.ScanItem{
    ContentCounts: map[string]int32{"NormalSpectra": 10},
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: spectra|<nil>
Example (GetUpdateType_RTT) ¶
newSummary := protos.ScanItem{
    Meta: map[string]string{"RTT": "1234"},
}
oldSummary := protos.ScanItem{
    Meta: map[string]string{"RTT": "123"},
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: unknown|<nil>
Example (GetUpdateType_SameContextImages) ¶
newSummary := protos.ScanItem{
    DataTypes: []*protos.ScanItem_ScanTypeCount{
        &protos.ScanItem_ScanTypeCount{
            DataType: protos.ScanDataType_SD_IMAGE,
            Count:    3,
        },
    },
}
oldSummary := protos.ScanItem{
    DataTypes: []*protos.ScanItem_ScanTypeCount{
        &protos.ScanItem_ScanTypeCount{
            DataType: protos.ScanDataType_SD_IMAGE,
            Count:    3,
        },
    },
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: unknown|<nil>
Example (GetUpdateType_Title) ¶
newSummary := protos.ScanItem{
    Title: "Analysed rock",
}
oldSummary := protos.ScanItem{
    Title: "Freshly downloaded rock",
}

upd, err := getUpdateType(&newSummary, &oldSummary)
fmt.Printf("%v|%v\n", upd, err)
Output: housekeeping|<nil>
Index ¶
- func ImportDataset(localFS fileaccess.FileAccess, remoteFS fileaccess.FileAccess, ...) (string, *protos.ScanItem, string, bool, error)
- func ImportFromLocalFileSystem(localFS fileaccess.FileAccess, remoteFS fileaccess.FileAccess, ...) (string, error)
- func TriggerDatasetReprocessViaSNS(snsSvc awsutil.SNSInterface, jobId string, scanId string, snsTopic string) (*sns.PublishOutput, error)
- type DatasetCustomMeta
- type ImportResult
Examples ¶
- Package (DecodeImportTrigger_Manual)
- Package (DecodeImportTrigger_ManualBadDatasetID)
- Package (DecodeImportTrigger_ManualBadLogID)
- Package (DecodeImportTrigger_ManualBadMsg)
- Package (DecodeImportTrigger_OCS)
- Package (DecodeImportTrigger_OCS2)
- Package (DecodeImportTrigger_OCS_BadEventType)
- Package (DecodeImportTrigger_OCS_Error)
- Package (GetUpdateType_Drive)
- Package (GetUpdateType_LessContextImages)
- Package (GetUpdateType_MoreContextImages)
- Package (GetUpdateType_NormalSpectra)
- Package (GetUpdateType_RTT)
- Package (GetUpdateType_SameContextImages)
- Package (GetUpdateType_Title)
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func ImportDataset ¶
func ImportDataset(
    localFS fileaccess.FileAccess,
    remoteFS fileaccess.FileAccess,
    configBucket string,
    manualUploadBucket string,
    datasetBucket string,
    db *mongo.Database,
    datasetID string,
    log logger.ILogger,
    justArchived bool,
) (string, *protos.ScanItem, string, bool, error)
ImportDataset - Imports from the dataset archive area. Calls ImportFromLocalFileSystem.

Returns:
- Working directory
- Saved dataset summary structure
- What changed (as a string), so the caller knows what kind of notification to send (if any)
- IsUpdate flag
- Error (if any)
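A minimal usage sketch, assuming the caller already holds the file access, database and logger values; reimportArchivedDataset and the working-directory cleanup are illustrative assumptions, not part of this package:

func reimportArchivedDataset(
    localFS fileaccess.FileAccess,
    remoteFS fileaccess.FileAccess,
    configBucket string,
    manualUploadBucket string,
    datasetBucket string,
    db *mongo.Database,
    datasetID string,
    iLog logger.ILogger,
    justArchived bool) error {
    workingDir, summary, whatChanged, isUpdate, err := ImportDataset(localFS, remoteFS, configBucket, manualUploadBucket, datasetBucket, db, datasetID, iLog, justArchived)
    if err != nil {
        return err
    }

    // Assumption: the caller is free to clean up the working directory once done with it
    defer os.RemoveAll(workingDir)

    fmt.Printf("Imported \"%v\": changed=%v, isUpdate=%v\n", summary.Title, whatChanged, isUpdate)
    return nil
}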
func ImportFromLocalFileSystem ¶
func ImportFromLocalFileSystem(
    localFS fileaccess.FileAccess,
    remoteFS fileaccess.FileAccess,
    db *mongo.Database,
    workingDir string,
    localImportPath string,
    localPseudoIntensityRangesPath string,
    datasetBucket string,
    datasetID string,
    log logger.ILogger) (string, error)
ImportFromLocalFileSystem - As the name says, imports from a directory on the local file system.

Returns:
- Dataset ID (in case it was modified during conversion)
- Error (if there was one)
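A short sketch of calling it directly; importLocalDataset is a hypothetical wrapper, and the key point is that the returned dataset ID should be used from here on, since conversion may change it:

func importLocalDataset(
    localFS fileaccess.FileAccess,
    remoteFS fileaccess.FileAccess,
    db *mongo.Database,
    workingDir string,
    localImportPath string,
    localPseudoIntensityRangesPath string,
    datasetBucket string,
    datasetID string,
    iLog logger.ILogger) (string, error) {
    savedID, err := ImportFromLocalFileSystem(localFS, remoteFS, db, workingDir, localImportPath, localPseudoIntensityRangesPath, datasetBucket, datasetID, iLog)
    if err != nil {
        return "", err
    }

    if savedID != datasetID {
        // The ID may be modified during conversion, so use the returned one from here on
        fmt.Printf("Dataset ID changed during conversion: %v -> %v\n", datasetID, savedID)
    }
    return savedID, nil
}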
func TriggerDatasetReprocessViaSNS ¶
func TriggerDatasetReprocessViaSNS(snsSvc awsutil.SNSInterface, jobId string, scanId string, snsTopic string) (*sns.PublishOutput, error)
Fires a trigger message. Anything calling this triggers a dataset reimport via a lambda function.
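A hedged sketch of firing the trigger, for instance from an API handler when a user requests a reprocess; requestReprocess and how the job ID is generated are assumptions, only the TriggerDatasetReprocessViaSNS call itself comes from this package:

func requestReprocess(snsSvc awsutil.SNSInterface, jobId string, scanId string, snsTopic string) error {
    // Publishes the reprocess trigger message for the import lambda to pick up
    // (presumably the manual trigger shape shown in the DecodeImportTrigger_Manual example above)
    pubOut, err := TriggerDatasetReprocessViaSNS(snsSvc, jobId, scanId, snsTopic)
    if err != nil {
        return err
    }

    fmt.Printf("Reprocess requested for scan %v, SNS message ID: %v\n", scanId, *pubOut.MessageId)
    return nil
}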
Types ¶
type DatasetCustomMeta ¶
type ImportResult ¶
type ImportResult struct {
    WorkingDir   string         // so it can be cleaned up by caller if needed
    WhatChanged  string         // what changed between this import and a previous one, for notification reasons
    IsUpdate     bool           // IsUpdate flag
    DatasetTitle string         // Name of the dataset that was imported
    DatasetID    string         // ID of the dataset that was imported
    Logger       logger.ILogger // Caller must call Close() on it, otherwise we may lose the last few log events
}
Structure returned after importing. NOTE: the logger must have Close() called on it, otherwise we may lose the last few log events.
func ImportForTrigger ¶
func ImportForTrigger(
    triggerMessage []byte,
    envName string,
    configBucket string,
    datasetBucket string,
    manualBucket string,
    db *mongo.Database,
    log logger.ILogger,
    remoteFS fileaccess.FileAccess) (ImportResult, error)
ImportForTrigger - Parses a trigger message (from SNS) and decides what to import.

Returns:
- Result struct - NOTE: the logger must have Close() called on it, otherwise we may lose the last few log events
- Error (or nil)
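As a follow-on, a sketch of acting on the result: the WhatChanged values seen in the getUpdateType examples above ("housekeeping", "image", "spectra", "unknown") can drive what kind of notification a caller sends. notifyForImport, the notify callback and the handling of brand new datasets are assumptions, not part of this package:

func notifyForImport(result ImportResult, notify func(datasetID string, title string, change string)) {
    if !result.IsUpdate {
        // Assumption: a brand new dataset always warrants a notification
        notify(result.DatasetID, result.DatasetTitle, "new")
        return
    }

    switch result.WhatChanged {
    case "spectra", "image":
        // Significant data change
        notify(result.DatasetID, result.DatasetTitle, result.WhatChanged)
    case "housekeeping", "unknown":
        // Minor or indeterminate change - assume no user-facing notification is needed
    }
}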
Directories ¶

Path | Synopsis
---|---
internal |
internal/datasetArchive | Implements archiving/retrieval of dataset source zip files as delivered by GDS.
internal/output | Allows outputting (in PIXLISE protobuf dataset format) of in-memory representation of PIXL data that importer has read.