cloud

package module
v0.4.0
Published: Nov 1, 2016    License: Apache-2.0

README

Google Cloud for Go


import "cloud.google.com/go"

Go packages for Google Cloud Platform services.

NOTE: These packages are under development, and may occasionally make backwards-incompatible changes.

NOTE: This GitHub repo is a mirror of https://code.googlesource.com/gocloud.

News

October 27, 2016

Breaking change to bigquery: NewGCSReference is now a function, not a method on Client.

New bigquery feature: Table.LoaderFrom now accepts a ReaderSource, enabling loading data into a table from a file or any io.Reader.
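
For illustration, a minimal sketch of loading rows from a local file via a ReaderSource (the file, dataset and table names here are placeholders, not part of the release note):

f, err := os.Open("data.csv") // placeholder file
if err != nil {
    // TODO: Handle error.
}
rs := bigquery.NewReaderSource(f)
loader := client.Dataset("dataset").Table("table").LoaderFrom(rs)
job, err := loader.Run(ctx)
if err != nil {
    // TODO: Handle error.
}
_ = job // Poll or wait on the returned job as appropriate.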

October 21, 2016

Breaking change to pubsub: removed pubsub.Done.

Use iterator.Done instead, where iterator is the package google.golang.org/api/iterator.
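
The replacement loop looks roughly like this (a sketch; the subscription name is a placeholder):

it, err := client.Subscription("subscription1").Pull(ctx)
if err != nil {
    // TODO: Handle error.
}
defer it.Stop()
for {
    msg, err := it.Next()
    if err == iterator.Done {
        break
    }
    if err != nil {
        // TODO: Handle error.
    }
    // TODO: Use msg.
    msg.Done(true)
}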

October 19, 2016

Breaking changes to cloud.google.com/go/bigquery:

  • Client.Table and Client.OpenTable have been removed. Replace

    client.OpenTable("project", "dataset", "table")
    

    with

    client.DatasetInProject("project", "dataset").Table("table")
    
  • Client.CreateTable has been removed. Replace

    client.CreateTable(ctx, "project", "dataset", "table")
    

    with

    client.DatasetInProject("project", "dataset").Table("table").Create(ctx)
    
  • Dataset.ListTables has been replaced by Dataset.Tables. Replace

    tables, err := ds.ListTables(ctx)
    

    with

    it := ds.Tables(ctx)
    for {
        table, err := it.Next()
        if err == iterator.Done {
            break
        }
        if err != nil {
            // TODO: Handle error.
        }
        // TODO: use table.
    }
    
  • Client.Read has been replaced with Job.Read, Table.Read and Query.Read. Replace

    it, err := client.Read(ctx, job)
    

    with

    it, err := job.Read(ctx)
    

    and similarly for reading from tables or queries.

  • The iterator returned from the Read methods is now named RowIterator. Its behavior is closer to the other iterators in these libraries. It no longer supports the Schema method; see the next item. Replace

    for it.Next(ctx) {
        var vals ValueList
        if err := it.Get(&vals); err != nil {
            // TODO: Handle error.
        }
        // TODO: use vals.
    }
    if err := it.Err(); err != nil {
        // TODO: Handle error.
    }
    

    with

    for {
        var vals ValueList
        err := it.Next(&vals)
        if err == iterator.Done {
            break
        }
        if err != nil {
            // TODO: Handle error.
        }
        // TODO: use vals.
    }
    

    Instead of the RecordsPerRequest(n) option, write

    it.PageInfo().MaxSize = n
    

    Instead of the StartIndex(i) option, write

    it.StartIndex = i
    
  • ValueLoader.Load now takes a Schema in addition to a slice of Values. Replace

    func (vl *myValueLoader) Load(v []bigquery.Value)
    

    with

    func (vl *myValueLoader) Load(v []bigquery.Value, s bigquery.Schema)
    
  • Table.Patch is replaced by Table.Update. Replace

    p := table.Patch()
    p.Description("new description")
    metadata, err := p.Apply(ctx)
    

    with

    metadata, err := table.Update(ctx, bigquery.TableMetadataToUpdate{
        Description: "new description",
    })
    
  • Client.Copy is replaced by separate methods for each of its four functions. All options have been replaced by struct fields.

    • To load data from Google Cloud Storage into a table, use Table.LoaderFrom.

      Replace

      client.Copy(ctx, table, gcsRef)
      

      with

      table.LoaderFrom(gcsRef).Run(ctx)
      

      Instead of passing options to Copy, set fields on the Loader:

      loader := table.LoaderFrom(gcsRef)
      loader.WriteDisposition = bigquery.WriteTruncate
      
    • To extract data from a table into Google Cloud Storage, use Table.ExtractorTo. Set fields on the returned Extractor instead of passing options.

      Replace

      client.Copy(ctx, gcsRef, table)
      

      with

      table.ExtractorTo(gcsRef).Run(ctx)
      
    • To copy data into a table from one or more other tables, use Table.CopierFrom. Set fields on the returned Copier instead of passing options.

      Replace

      client.Copy(ctx, dstTable, srcTable)
      

      with

      dstTable.CopierFrom(srcTable).Run(ctx)
      
    • To start a query job, create a Query and call its Run method. Set fields on the query instead of passing options.

      Replace

      client.Copy(ctx, table, query)
      

      with

      query.Run(ctx)
      
  • Table.NewUploader has been renamed to Table.Uploader. Instead of options, configure an Uploader by setting its fields. Replace

    u := table.NewUploader(bigquery.UploadIgnoreUnknownValues())
    

    with

    u := table.Uploader()
    u.IgnoreUnknownValues = true
    

October 10, 2016

Breaking changes to cloud.google.com/go/storage:

  • AdminClient replaced by methods on Client. Replace

    adminClient.CreateBucket(ctx, bucketName, attrs)
    

    with

    client.Bucket(bucketName).Create(ctx, projectID, attrs)
    
  • BucketHandle.List replaced by BucketHandle.Objects. Replace

    for query != nil {
        objs, err := bucket.List(d.ctx, query)
        if err != nil { ... }
        query = objs.Next
        for _, obj := range objs.Results {
            fmt.Println(obj)
        }
    }
    

    with

    iter := bucket.Objects(d.ctx, query)
    for {
        obj, err := iter.Next()
        if err == iterator.Done {
            break
        }
        if err != nil { ... }
        fmt.Println(obj)
    }
    

    (The iterator package is at google.golang.org/api/iterator.)

    Replace Query.Cursor with ObjectIterator.PageInfo().Token.

    Replace Query.MaxResults with ObjectIterator.PageInfo().MaxSize.

  • ObjectHandle.CopyTo replaced by ObjectHandle.CopierFrom. Replace

    attrs, err := src.CopyTo(ctx, dst, nil)
    

    with

    attrs, err := dst.CopierFrom(src).Run(ctx)
    

    Replace

    attrs, err := src.CopyTo(ctx, dst, &storage.ObjectAttrs{ContentType: "text/html"})
    

    with

    c := dst.CopierFrom(src)
    c.ContentType = "text/html"
    attrs, err := c.Run(ctx)
    
  • ObjectHandle.ComposeFrom replaced by ObjectHandle.ComposerFrom. Replace

    attrs, err := dst.ComposeFrom(ctx, []*storage.ObjectHandle{src1, src2}, nil)
    

    with

    attrs, err := dst.ComposerFrom(src1, src2).Run(ctx)
    
  • ObjectHandle.Update's ObjectAttrs argument replaced by ObjectAttrsToUpdate. Replace

    attrs, err := obj.Update(ctx, &storage.ObjectAttrs{ContentType: "text/html"})
    

    with

    attrs, err := obj.Update(ctx, storage.ObjectAttrsToUpdate{ContentType: "text/html"})
    
  • ObjectHandle.WithConditions replaced by ObjectHandle.If. Replace

    obj.WithConditions(storage.Generation(gen), storage.IfMetaGenerationMatch(mgen))
    

    with

    obj.Generation(gen).If(storage.Conditions{MetagenerationMatch: mgen})
    

    Replace

    obj.WithConditions(storage.IfGenerationMatch(0))
    

    with

    obj.If(storage.Conditions{DoesNotExist: true})
    
  • storage.Done replaced by iterator.Done (from package google.golang.org/api/iterator).

October 6, 2016

Package preview/logging deleted. Use logging instead.

September 27, 2016

Logging client replaced by the preview version (see the September 8 entry below).

September 8, 2016

  • New clients for some of Google's Machine Learning APIs: Vision, Speech, and Natural Language.

  • Preview version of a new Stackdriver Logging client in cloud.google.com/go/preview/logging. This client uses gRPC as its transport layer, and supports log reading, sinks and metrics. It will replace the current client at cloud.google.com/go/logging shortly.

Supported APIs

Google API   Status         Package
Datastore    beta           cloud.google.com/go/datastore
Storage      beta           cloud.google.com/go/storage
Pub/Sub      experimental   cloud.google.com/go/pubsub
Bigtable     beta           cloud.google.com/go/bigtable
BigQuery     experimental   cloud.google.com/go/bigquery
Logging      experimental   cloud.google.com/go/logging
Vision       experimental   cloud.google.com/go/vision
Language     experimental   cloud.google.com/go/language/apiv1beta1
Speech       experimental   cloud.google.com/go/speech/apiv1beta1

Experimental status: the API is still being actively developed. As a result, it might change in backward-incompatible ways and is not recommended for production use.

Beta status: the API is largely complete, but still has outstanding features and bugs to be addressed. There may be minor backwards-incompatible changes where necessary.

Stable status: the API is mature and ready for production use. We will continue addressing bugs and feature requests.

Documentation and examples are available at https://godoc.org/cloud.google.com/go

Visit or join the google-api-go-announce group for updates on these packages.

Go Versions Supported

We support the two most recent major versions of Go. If Google App Engine uses an older version, we support that as well. You can see which versions are currently supported by looking at the lines following go: in .travis.yml.

Authorization

By default, each API uses Google Application Default Credentials when authorizing calls to API endpoints. This allows your application to run in many environments without requiring explicit configuration.

To authorize using a JSON key file, pass option.WithServiceAccountFile to the NewClient function of the desired package. For example:

client, err := storage.NewClient(ctx, option.WithServiceAccountFile("path/to/keyfile.json"))

You can exert more control over authorization by using the golang.org/x/oauth2 package to create an oauth2.TokenSource. Then pass option.WithTokenSource to the NewClient function:

tokenSource := ...
client, err := storage.NewClient(ctx, option.WithTokenSource(tokenSource))
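
One way to construct such a TokenSource from a service account key, using golang.org/x/oauth2/google (a sketch; the key path and scope are illustrative):

jsonKey, err := ioutil.ReadFile("/path/to/keyfile.json")
if err != nil {
    // TODO: Handle error.
}
conf, err := google.JWTConfigFromJSON(jsonKey, storage.ScopeReadWrite)
if err != nil {
    // TODO: Handle error.
}
client, err := storage.NewClient(ctx, option.WithTokenSource(conf.TokenSource(ctx)))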

Cloud Datastore GoDoc

Example Usage

First create a datastore.Client to use throughout your application:

client, err := datastore.NewClient(ctx, "my-project-id")
if err != nil {
	log.Fatal(err)
}

Then use that client to interact with the API:

type Post struct {
	Title       string
	Body        string `datastore:",noindex"`
	PublishedAt time.Time
}
keys := []*datastore.Key{
	datastore.NewKey(ctx, "Post", "post1", 0, nil),
	datastore.NewKey(ctx, "Post", "post2", 0, nil),
}
posts := []*Post{
	{Title: "Post 1", Body: "...", PublishedAt: time.Now()},
	{Title: "Post 2", Body: "...", PublishedAt: time.Now()},
}
if _, err := client.PutMulti(ctx, keys, posts); err != nil {
	log.Fatal(err)
}
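
The stored entities can be read back with the same client; a brief sketch using GetMulti:

got := make([]Post, len(keys))
if err := client.GetMulti(ctx, keys, got); err != nil {
	log.Fatal(err)
}
for _, p := range got {
	fmt.Println(p.Title)
}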

Cloud Storage GoDoc

Example Usage

First create a storage.Client to use throughout your application:

client, err := storage.NewClient(ctx)
if err != nil {
	log.Fatal(err)
}
// Read the contents of object1 from the bucket.
rc, err := client.Bucket("bucket").Object("object1").NewReader(ctx)
if err != nil {
	log.Fatal(err)
}
defer rc.Close()
body, err := ioutil.ReadAll(rc)
if err != nil {
	log.Fatal(err)
}
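
Writing an object is similar; a sketch reusing the bucket and object names from above:

wc := client.Bucket("bucket").Object("object1").NewWriter(ctx)
if _, err := wc.Write([]byte("hello world")); err != nil {
	log.Fatal(err)
}
if err := wc.Close(); err != nil {
	log.Fatal(err)
}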

Cloud Pub/Sub GoDoc

Example Usage

First create a pubsub.Client to use throughout your application:

client, err := pubsub.NewClient(ctx, "project-id")
if err != nil {
	log.Fatal(err)
}
// Publish "hello world" on topic1.
topic := client.Topic("topic1")
msgIDs, err := topic.Publish(ctx, &pubsub.Message{
	Data: []byte("hello world"),
})
if err != nil {
	log.Fatal(err)
}

// Create an iterator to pull messages via subscription1.
it, err := client.Subscription("subscription1").Pull(ctx)
if err != nil {
	log.Fatal(err)
}
defer it.Stop()

// Consume N messages from the iterator.
for i := 0; i < N; i++ {
	msg, err := it.Next()
	if err == iterator.Done {
		break
	}
	if err != nil {
		log.Fatalf("Failed to retrieve message: %v", err)
	}

	fmt.Printf("Message %d: %s\n", i, msg.Data)
	msg.Done(true) // Acknowledge that we've consumed the message.
}

Cloud BigQuery GoDoc

Example Usage

First create a bigquery.Client to use throughout your application:

c, err := bigquery.NewClient(ctx, "my-project-ID")
if err != nil {
    // TODO: Handle error.
}

Then use that client to interact with the API:

// Construct a query.
q := c.Query(`
    SELECT year, SUM(number)
    FROM [bigquery-public-data:usa_names.usa_1910_2013]
    WHERE name = "William"
    GROUP BY year
    ORDER BY year
`)
// Execute the query.
it, err := q.Read(ctx)
if err != nil {
    // TODO: Handle error.
}
// Iterate through the results.
for {
    var values bigquery.ValueList
    err := it.Next(&values)
    if err == iterator.Done {
        break
    }
    if err != nil {
        // TODO: Handle error.
    }
    fmt.Println(values)
}

Stackdriver Logging GoDoc

Example Usage

First create a logging.Client to use throughout your application:

ctx := context.Background()
client, err := logging.NewClient(ctx, "my-project")
if err != nil {
    // TODO: Handle error.
}

Usually, you'll want to add log entries to a buffer to be periodically flushed (automatically and asynchronously) to the Stackdriver Logging service.

logger := client.Logger("my-log")
logger.Log(logging.Entry{Payload: "something happened!"})
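
Entries can also carry a severity and a structured payload; a brief sketch (the payload struct here is arbitrary):

logger.Log(logging.Entry{
    Severity: logging.Error,
    Payload:  struct{ User, Action string }{"alice", "delete"},
})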

Close your client before your program exits, to flush any buffered log entries.

err = client.Close()
if err != nil {
    // TODO: Handle error.
}

Contributing

Contributions are welcome. Please see the CONTRIBUTING document for details. We use Gerrit for our code reviews. Please don't open pull requests against this repo; new pull requests will be automatically closed.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.

Documentation

Overview

Package cloud is the root of the packages used to access Google Cloud Services. See https://godoc.org/cloud.google.com/go for a full list of sub-packages.

This package documents how to authorize and authenticate the sub-packages.

Example (ApplicationDefaultCredentials)
package main

import (
	"cloud.google.com/go/datastore"
	"golang.org/x/net/context"
)

func main() {
	ctx := context.Background()
	// Use Google Application Default Credentials to authorize and authenticate the client.
	// More information about Application Default Credentials and how to enable them is at
	// https://developers.google.com/identity/protocols/application-default-credentials.
	//
	// This is the recommended way of authorizing and authenticating.
	//
	// Note: The example uses the datastore client, but the same steps apply to
	// the other client libraries underneath this package.
	client, err := datastore.NewClient(ctx, "project-id")
	if err != nil {
		// TODO: handle error.
	}
	// Use the client.
	_ = client
}
Output:

Example (ServiceAccountFile)
package main

import (
	"cloud.google.com/go/datastore"
	"golang.org/x/net/context"
	"google.golang.org/api/option"
)

func main() {
	// Warning: The better way to use service accounts is to set GOOGLE_APPLICATION_CREDENTIALS
	// and use the Application Default Credentials.
	ctx := context.Background()
	// Use a JSON key file associated with a Google service account to
	// authenticate and authorize.
	// Go to https://console.developers.google.com/permissions/serviceaccounts to create
	// and download a service account key for your project.
	//
	// Note: The example uses the datastore client, but the same steps apply to
	// the other client libraries underneath this package.
	client, err := datastore.NewClient(ctx,
		"project-id",
		option.WithServiceAccountFile("/path/to/service-account-key.json"))
	if err != nil {
		// TODO: handle error.
	}
	// Use the client.
	_ = client
}
Output:

Directories

Path Synopsis

bigquery
Package bigquery provides a client for the BigQuery service.

bigtable
Package bigtable is an API to Google Cloud Bigtable.

bigtable/bttest
Package bttest contains test helpers for working with the bigtable package.

bigtable/cmd/cbt
Cbt is a tool for doing basic interactions with Cloud Bigtable.

bigtable/cmd/emulator
cbtemulator launches the in-memory Cloud Bigtable server on the given address.

bigtable/cmd/loadtest
Loadtest does some load testing through the Go client library for Cloud Bigtable.

bigtable/cmd/scantest
Scantest does scan-related load testing against Cloud Bigtable.

bigtable/internal/cbtrc
Package cbtrc encapsulates common code for reading .cbtrc files.

bigtable/internal/gax
This is a snapshot from github.com/googleapis/gax-go with minor modifications.

bigtable/internal/option
Package option contains common code for dealing with client options.

cmd/go-cloud-debug-agent/internal/breakpoints
Package breakpoints handles breakpoint requests we get from the user through the Debuglet Controller, and manages corresponding breakpoints set in the code.

cmd/go-cloud-debug-agent/internal/controller
Package debuglet is a library for interacting with the Google Cloud Debugger's Debuglet Controller service.

cmd/go-cloud-debug-agent/internal/valuecollector
Package valuecollector is used to collect the values of variables in a program.

compute/metadata
Package metadata provides access to Google Compute Engine (GCE) metadata and API service accounts.

container
Package container contains a Google Container Engine client.

datastore
Package datastore provides a client for Google Cloud Datastore.

debugger/apiv2
Package debugger is an experimental, auto-generated package for the debugger API.

errorreporting/apiv1beta1
Package errorreporting is an experimental, auto-generated package for the errorreporting API.

errors
Package errors is a Google Stackdriver Error Reporting library.

examples/bigquery/concat_table
concat_table is an example client of the bigquery client library.

examples/bigquery/load
load is an example client of the bigquery client library.

examples/bigquery/query
query is an example client of the bigquery client library.

examples/bigquery/read
read is an example client of the bigquery client library.

examples/bigtable/helloworld
Hello world is a sample program demonstrating use of the Bigtable client library to perform basic CRUD operations.

examples/bigtable/search
Search is a sample web server that uses Cloud Bigtable as the storage layer for a simple document-storage and full-text-search service.

examples/bigtable/usercounter
User counter is a program that tracks how often a user has visited the index page.

examples/storage/appengine
Package gcsdemo is an example App Engine app using the Google Cloud Storage API.

examples/storage/appenginevm
Package main is an example Managed VM app using the Google Cloud Storage API.

iam
Package iam supports the resource-specific operations of Google Cloud IAM (Identity and Access Management) for the Google Cloud Libraries.

iam/admin/apiv1
Package admin is an experimental, auto-generated package for the admin API.

internal
Package internal provides support for the cloud packages.

internal/optional
Package optional provides versions of primitive types that can be nil.

internal/pretty
Package pretty implements a simple pretty-printer.

internal/testutil
Package testutil contains helper functions for writing tests.

language/apiv1beta1
Package language is an experimental, auto-generated package for the language API.

logging
Package logging contains a Stackdriver Logging client suitable for writing logs.

logging/apiv2
Package logging is an experimental, auto-generated package for the logging API.

logging/internal/testing
Package testing provides support for testing the logging client.

logging/logadmin
Package logadmin contains a Stackdriver Logging client that can be used for reading logs and working with sinks, metrics and monitored resources.

longrunning
Package longrunning supports Long Running Operations for the Google Cloud Libraries.

monitoring/apiv3
Package monitoring is an experimental, auto-generated package for the monitoring API.

pubsub
Package pubsub provides an easy way to publish and receive Google Cloud Pub/Sub messages, hiding the details of the underlying server RPCs.

pubsub/apiv1
Package pubsub is an experimental, auto-generated package for the pubsub API.

speech/apiv1beta1
Package speech is an experimental, auto-generated package for the speech API.

storage
Package storage contains a Google Cloud Storage client.

trace
Package trace is a Google Stackdriver Trace library.

trace/apiv1
Package trace is an experimental, auto-generated package for the trace API.

vision
Package vision provides a client for the Google Cloud Vision API.

vision/apiv1
Package vision is an experimental, auto-generated package for the vision API.
