Documentation ¶
Overview ¶
Package azureblob provides a blob implementation that uses Azure Storage’s BlockBlob. Use OpenBucket to construct a *blob.Bucket.
URLs ¶
For blob.OpenBucket, azureblob registers for the scheme "azblob". The default URL opener will use credentials from the environment variables AZURE_STORAGE_ACCOUNT, AZURE_STORAGE_KEY, and AZURE_STORAGE_SAS_TOKEN. AZURE_STORAGE_ACCOUNT is required, along with one of the other two. To customize the URL opener, or for more details on the URL format, see URLOpener. See https://godoc.org/gocloud.dev#hdr-URLs for background information.
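The scheme/host mapping described above can be illustrated with a small stdlib-only sketch. Note that containerFromURL is a hypothetical helper for illustration, not part of the azureblob API; the real URLOpener does this parsing for you inside blob.OpenBucket.

```go
package main

import (
	"fmt"
	"net/url"
)

// containerFromURL extracts the container name from an "azblob://..." URL.
// The URL host is used as the container (bucket) name.
func containerFromURL(rawurl string) (string, error) {
	u, err := url.Parse(rawurl)
	if err != nil {
		return "", err
	}
	if u.Scheme != "azblob" {
		return "", fmt.Errorf("unsupported scheme %q", u.Scheme)
	}
	return u.Host, nil
}

func main() {
	c, err := containerFromURL("azblob://mycontainer")
	if err != nil {
		panic(err)
	}
	fmt.Println(c) // mycontainer
}
```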
Escaping ¶
Go CDK supports all UTF-8 strings; to make this work with providers lacking full UTF-8 support, strings must be escaped (during writes) and unescaped (during reads). The following escapes are performed for azureblob:
- Blob keys: ASCII characters 0-31, 92 ("\"), and 127 are escaped to "__0x<hex>__". Additionally, the "/" in "../" and a trailing "/" in a key (e.g., "foo/") are escaped in the same way.
- Metadata keys: Per https://docs.microsoft.com/en-us/azure/storage/blobs/storage-properties-metadata, Azure only allows C# identifiers as metadata keys. Therefore, characters outside of "[a-zA-Z0-9_]" are escaped using "__0x<hex>__". In addition, a digit "[0-9]" is escaped when it is the first character of the key. URL encoding is not usable here since "%" is not a valid character in a C# identifier.
- Metadata values: Escaped using URL encoding.
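The "__0x<hex>__" scheme above can be sketched for metadata keys as follows. This is an illustrative, stdlib-only approximation of the documented behavior, not the package's actual implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// escapeMetadataKey mimics the documented scheme: any character outside
// [a-zA-Z0-9_], and a digit in the leading position, is replaced with
// "__0x<hex>__". Illustrative sketch only.
func escapeMetadataKey(key string) string {
	var b strings.Builder
	for i, r := range key {
		isAlpha := (r >= 'a' && r <= 'z') || (r >= 'A' && r <= 'Z')
		isDigit := r >= '0' && r <= '9'
		if isAlpha || r == '_' || (isDigit && i > 0) {
			b.WriteRune(r)
		} else {
			fmt.Fprintf(&b, "__0x%x__", r)
		}
	}
	return b.String()
}

func main() {
	fmt.Println(escapeMetadataKey("my-key")) // my__0x2d__key
	fmt.Println(escapeMetadataKey("1key"))   // __0x31__key
}
```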
As ¶
azureblob exposes the following types for As:
- Bucket: *azblob.ContainerURL
- Error: azblob.StorageError
- ListObject: azblob.BlobItem for objects, azblob.BlobPrefix for "directories"
- ListOptions.BeforeList: *azblob.ListBlobsSegmentOptions
- Reader: azblob.DownloadResponse
- Attributes: azblob.BlobGetPropertiesResponse
- WriterOptions.BeforeWrite: *azblob.UploadStreamToBlockBlobOptions
Example ¶
package main

import (
	"context"
	"log"

	"github.com/Azure/azure-storage-blob-go/azblob"
	"gocloud.dev/blob/azureblob"
)

func main() {
	const (
		// Your Azure Storage Account and Access Key.
		accountName = azureblob.AccountName("my-account")
		accountKey  = azureblob.AccountKey("my-account-key")
		// The storage container to access.
		containerName = "mycontainer"
	)

	// Create a credentials object.
	credential, err := azureblob.NewCredential(accountName, accountKey)
	if err != nil {
		log.Fatal(err)
	}

	// Create a Pipeline, using whatever PipelineOptions you need.
	pipeline := azureblob.NewPipeline(credential, azblob.PipelineOptions{})

	// Create a *blob.Bucket.
	// The credential Option is required if you're going to use blob.SignedURL.
	ctx := context.Background()
	b, err := azureblob.OpenBucket(ctx, pipeline, accountName, containerName,
		&azureblob.Options{Credential: credential})
	if err != nil {
		log.Fatal(err)
	}
	defer b.Close()

	// Now we can use b to read or write files to the container.
	data, err := b.ReadAll(ctx, "my-key")
	if err != nil {
		log.Fatal(err)
	}
	_ = data
}
Output:
Example (OpenBucket) ¶
package main

import (
	"context"
	"log"

	"gocloud.dev/blob"

	// Blank-import the azureblob driver so it registers its URLOpener.
	_ "gocloud.dev/blob/azureblob"
)

func main() {
	ctx := context.Background()

	// OpenBucket creates a *blob.Bucket from a URL.
	// This URL will open the container "mycontainer" using default
	// credentials found in the environment variables
	// AZURE_STORAGE_ACCOUNT plus at least one of AZURE_STORAGE_KEY
	// and AZURE_STORAGE_SAS_TOKEN.
	b, err := blob.OpenBucket(ctx, "azblob://mycontainer")
	if err != nil {
		log.Fatal(err)
	}
	defer b.Close()
}
Output:
Example (SasToken) ¶
package main

import (
	"context"
	"log"

	"github.com/Azure/azure-storage-blob-go/azblob"
	"gocloud.dev/blob/azureblob"
)

func main() {
	const (
		// Your Azure Storage Account and SASToken.
		accountName = azureblob.AccountName("my-account")
		sasToken    = azureblob.SASToken("my-SAS-token")
		// The storage container to access.
		containerName = "mycontainer"
	)

	// Since we're using a SASToken, we can use anonymous credentials.
	credential := azblob.NewAnonymousCredential()

	// Create a Pipeline, using whatever PipelineOptions you need.
	pipeline := azureblob.NewPipeline(credential, azblob.PipelineOptions{})

	// Create a *blob.Bucket.
	// Note that we're not supplying azureblob.Options.Credential, so SignedURL
	// won't work. To use SignedURL, you need a real credential (see the other
	// example).
	ctx := context.Background()
	b, err := azureblob.OpenBucket(ctx, pipeline, accountName, containerName,
		&azureblob.Options{SASToken: sasToken})
	if err != nil {
		log.Fatal(err)
	}
	defer b.Close()

	// Now we can use b to read or write files to the container.
	data, err := b.ReadAll(ctx, "my-key")
	if err != nil {
		log.Fatal(err)
	}
	_ = data
}
Output:
Index ¶
- Constants
- Variables
- func NewCredential(accountName AccountName, accountKey AccountKey) (*azblob.SharedKeyCredential, error)
- func NewPipeline(credential azblob.Credential, opts azblob.PipelineOptions) pipeline.Pipeline
- func OpenBucket(ctx context.Context, pipeline pipeline.Pipeline, accountName AccountName, ...) (*blob.Bucket, error)
- type AccountKey
- type AccountName
- type Options
- type SASToken
- type URLOpener
Examples ¶
Constants ¶
const Scheme = "azblob"
Scheme is the URL scheme azureblob registers its URLOpener under on blob.DefaultMux.
Variables ¶
var DefaultIdentity = wire.NewSet(
	DefaultAccountName,
	DefaultAccountKey,
	NewCredential,
	wire.Bind(new(azblob.Credential), new(azblob.SharedKeyCredential)),
	wire.Value(azblob.PipelineOptions{}),
)
DefaultIdentity is a Wire provider set that provides an Azure storage account name, key, and SharedKeyCredential from environment variables.
var SASTokenIdentity = wire.NewSet(
	DefaultAccountName,
	DefaultSASToken,
	azblob.NewAnonymousCredential,
	wire.Value(azblob.PipelineOptions{}),
)
SASTokenIdentity is a Wire provider set that provides an Azure storage account name, SASToken, and anonymous credential from environment variables.
Functions ¶
func NewCredential ¶
func NewCredential(accountName AccountName, accountKey AccountKey) (*azblob.SharedKeyCredential, error)
NewCredential creates a SharedKeyCredential.
func NewPipeline ¶
func NewPipeline(credential azblob.Credential, opts azblob.PipelineOptions) pipeline.Pipeline
NewPipeline creates a Pipeline for making HTTP requests to Azure.
func OpenBucket ¶
func OpenBucket(ctx context.Context, pipeline pipeline.Pipeline, accountName AccountName, containerName string, opts *Options) (*blob.Bucket, error)
OpenBucket returns a *blob.Bucket backed by a container in an Azure storage account. See the package documentation for an example and https://godoc.org/github.com/Azure/azure-storage-blob-go/azblob for more details.
Types ¶
type AccountKey ¶
type AccountKey string
AccountKey is an Azure storage account key (primary or secondary).
func DefaultAccountKey ¶
func DefaultAccountKey() (AccountKey, error)
DefaultAccountKey loads the Azure storage account key (primary or secondary) from the AZURE_STORAGE_KEY environment variable.
type AccountName ¶
type AccountName string
AccountName is an Azure storage account name.
func DefaultAccountName ¶
func DefaultAccountName() (AccountName, error)
DefaultAccountName loads the Azure storage account name from the AZURE_STORAGE_ACCOUNT environment variable.
type Options ¶
type Options struct {
	// Credential represents the authorizer for SignedURL.
	// Required to use SignedURL.
	Credential *azblob.SharedKeyCredential

	// SASToken can be provided along with anonymous credentials to use
	// delegated privileges.
	// See https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1#shared-access-signature-parameters.
	SASToken SASToken
}
Options sets options for constructing a *blob.Bucket backed by Azure Block Blob.
type SASToken ¶
type SASToken string
SASToken is an Azure shared access signature. https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
func DefaultSASToken ¶
func DefaultSASToken() (SASToken, error)
DefaultSASToken loads an Azure SAS token from the AZURE_STORAGE_SAS_TOKEN environment variable.
type URLOpener ¶ added in v0.10.0
type URLOpener struct {
	// AccountName must be specified.
	AccountName AccountName

	// Pipeline must be set to a non-nil value.
	Pipeline pipeline.Pipeline

	// Options specifies the options to pass to OpenBucket.
	Options Options
}
URLOpener opens Azure URLs like "azblob://mybucket".
The URL host is used as the bucket name.
No query parameters are supported.