inline

package
v1.3.2
Warning: This package is not in the latest version of its module.
Published: Jun 4, 2024 License: Apache-2.0 Imports: 14 Imported by: 0

Documentation

Overview

Package inline provides a storage abstraction that stores data for use by Bacalhau jobs within the storage spec itself, without needing any connection to an external storage provider.

It does this (currently) by encoding the data as an RFC 2397 "data:" URL in Base64. The data may be transparently compressed with gzip if the storage system decides this would be sensible.
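The core mechanism is just base64 encoding wrapped in a "data:" URL. A minimal stdlib-only sketch of that encoding step (the helper name `encodeInline` and the media type are illustrative, not this package's actual API):

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// encodeInline wraps raw bytes in an RFC 2397 "data:" URL using
// standard base64 encoding. This mirrors the idea described above;
// the real package may choose media types and encodings differently.
func encodeInline(data []byte) string {
	return "data:application/octet-stream;base64," +
		base64.StdEncoding.EncodeToString(data)
}

func main() {
	fmt.Println(encodeInline([]byte("hello")))
}
```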

This helps us meet a number of use cases:

  1. Providing "context" to jobs from the local filesystem as a more convenient way of sharing data with jobs than having to upload to IPFS first. This is useful for e.g. sharing a script to be executed by a generic job.
  2. When we support encryption, it will be safer to transmit encrypted secrets inline with the job spec itself rather than committing them to a public storage space like IPFS. (They could be redacted in job listings.)
  3. For clients running the SDK or in constrained (e.g. IoT) environments, it will be easier to interact with just the Bacalhau SDK than to first persist data to storage and wait for that to complete. For example, an IoT client could submit data it has collected directly to the requester node.

The storage system doesn't enforce any maximum size of the stored data. It is up to the rest of the system to pick a limit it thinks is suitable and enforce it. This is so that e.g. a requester node can decide that an inline payload is too large and commit the data to IPFS instead, which would be out of the scope of this package.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type InlineStorage

type InlineStorage struct{}

func NewStorage

func NewStorage() *InlineStorage

func (*InlineStorage) CleanupStorage

As PrepareStorage writes the data to the local filesystem, CleanupStorage just needs to remove that temporary directory.

func (*InlineStorage) GetVolumeSize

func (i *InlineStorage) GetVolumeSize(_ context.Context, spec models.InputSource) (uint64, error)

For inline storage, we define the volume size as the uncompressed data size, since that is how much resource a job using the storage will consume.

func (*InlineStorage) HasStorageLocally

func (*InlineStorage) HasStorageLocally(context.Context, models.InputSource) (bool, error)

The storage is always local because the data is contained within the SpecConfig.

func (*InlineStorage) IsInstalled

func (*InlineStorage) IsInstalled(context.Context) (bool, error)

The storage is always installed because it has no external dependencies.

func (*InlineStorage) PrepareStorage

func (i *InlineStorage) PrepareStorage(_ context.Context, storageDirectory string, spec models.InputSource) (storage.StorageVolume, error)

PrepareStorage extracts the data from the "data:" URL and writes it to a temporary directory. If the data was a compressed tarball, it decompresses it into a directory structure.
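The extraction step is the inverse of the encoding: split the "data:" URL at the comma and base64-decode the payload. A hedged stdlib-only sketch of that decode (the helper `decodeInline` is hypothetical and omits the tarball-handling and temp-directory logic PrepareStorage performs):

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

// decodeInline recovers the raw payload from a base64 "data:" URL.
// It checks that the header before the comma declares base64 content.
func decodeInline(dataURL string) ([]byte, error) {
	i := strings.Index(dataURL, ",")
	if i < 0 || !strings.Contains(dataURL[:i], "base64") {
		return nil, fmt.Errorf("not a base64 data: URL")
	}
	return base64.StdEncoding.DecodeString(dataURL[i+1:])
}

func main() {
	b, err := decodeInline("data:text/plain;base64,aGVsbG8=")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}
```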

func (*InlineStorage) StoreBytes added in v1.3.1

func (*InlineStorage) StoreBytes(data []byte) models.SpecConfig

StoreBytes returns the passed data embedded as a "data:" URL in a SpecConfig. The input is never compressed.

func (*InlineStorage) Upload

func (*InlineStorage) Upload(ctx context.Context, path string) (models.SpecConfig, error)

Upload stores the data into the returned SpecConfig. If the path points to a directory, the directory will be made into a tarball. The data might be compressed and will always be base64-encoded using a URL-safe method.
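The compression-plus-encoding pipeline Upload describes can be sketched with the stdlib alone: gzip the bytes, then apply URL-safe base64. This is an illustration of the described behavior, not the package's implementation, and it omits the tarball step for directories:

```go
package main

import (
	"bytes"
	"compress/gzip"
	"encoding/base64"
	"fmt"
)

// encodeCompressed gzips data and wraps it in a URL-safe base64
// "data:" URL, mirroring the pipeline described for Upload.
func encodeCompressed(data []byte) (string, error) {
	var buf bytes.Buffer
	zw := gzip.NewWriter(&buf)
	if _, err := zw.Write(data); err != nil {
		return "", err
	}
	if err := zw.Close(); err != nil {
		return "", err
	}
	return "data:application/gzip;base64," +
		base64.URLEncoding.EncodeToString(buf.Bytes()), nil
}

func main() {
	u, err := encodeCompressed([]byte("hello hello hello"))
	if err != nil {
		panic(err)
	}
	fmt.Println(u)
}
```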

type Source added in v1.0.4

type Source struct {
	URL string `json:"URL"`
}

func DecodeSpec added in v1.0.4

func DecodeSpec(spec *models.SpecConfig) (Source, error)

func (Source) ToMap added in v1.0.4

func (c Source) ToMap() map[string]interface{}

func (Source) Validate added in v1.0.4

func (c Source) Validate() error
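A minimal sketch of how the Source type might be used, assuming (hypothetically) that Validate checks the URL is a "data:" URL and ToMap flattens the struct to a map; the real validation rules may differ:

```go
package main

import (
	"fmt"
	"strings"
)

// Source mirrors the package's Source type. Validate and ToMap below
// are illustrative reimplementations, not the package's actual logic.
type Source struct {
	URL string `json:"URL"`
}

// Validate checks that URL is a "data:" URL (assumed rule).
func (c Source) Validate() error {
	if !strings.HasPrefix(c.URL, "data:") {
		return fmt.Errorf("inline source: URL must be a data: URL")
	}
	return nil
}

// ToMap flattens the source into a generic map.
func (c Source) ToMap() map[string]interface{} {
	return map[string]interface{}{"URL": c.URL}
}

func main() {
	s := Source{URL: "data:text/plain;base64,aGVsbG8="}
	fmt.Println(s.Validate() == nil)
}
```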
