Documentation ¶
Index ¶
- Variables
- type FetchOptions
- type Fetcher
- func (f *Fetcher) Fetch(u url.URL, dest *os.File, opts FetchOptions) error
- func (f *Fetcher) FetchToBuffer(u url.URL, opts FetchOptions) ([]byte, error)
- func (f *Fetcher) RewriteCAsWithDataUrls(cas []types.Resource) error
- func (f *Fetcher) UpdateHttpTimeoutsAndCAs(timeouts types.Timeouts, cas []types.Resource, proxy types.Proxy) error
- type HttpClient
Constants ¶
This section is empty.
Variables ¶
var (
	ErrTimeout         = errors.New("unable to fetch resource in time")
	ErrPEMDecodeFailed = errors.New("unable to decode PEM block")
)
var (
	ErrSchemeUnsupported      = errors.New("unsupported source scheme")
	ErrPathNotAbsolute        = errors.New("path is not absolute")
	ErrNotFound               = errors.New("resource not found")
	ErrFailed                 = errors.New("failed to fetch resource")
	ErrCompressionUnsupported = errors.New("compression is not supported with that scheme")
	ErrNeedNet                = errors.New("resource requires networking")
)
Functions ¶
This section is empty.
Types ¶
type FetchOptions ¶
type FetchOptions struct {
	// Headers are the http headers that will be used when fetching http(s)
	// resources. They have no effect on other fetching schemes.
	Headers http.Header

	// Hash is the hash to use when calculating a fetched resource's hash. If
	// left as nil, no hash will be calculated.
	Hash hash.Hash

	// The expected sum to be produced by the given hasher. If the Hash field
	// is nil, this field is ignored.
	ExpectedSum []byte

	// Compression specifies the type of compression to use when decompressing
	// the fetched object. If left empty, no decompression will be used.
	Compression string

	// HTTPVerb is an HTTP request method to indicate the desired action to
	// be performed for a given resource.
	HTTPVerb string
}
type Fetcher ¶
type Fetcher struct {
	// The logger object to use when logging information.
	Logger *log.Logger

	// The AWS Session to use when fetching resources from S3. If left nil,
	// the first S3 object that is fetched will initialize the field. This
	// can be used to set credentials.
	AWSSession *session.Session

	// The region where the AWS machine trying to fetch is.
	// This is used as a hint to fetch the S3 bucket from the right partition
	// and region.
	S3RegionHint string

	// GCSSession is a client for interacting with Google Cloud Storage.
	// It is used when fetching resources from GCS.
	GCSSession *storage.Client

	// Whether to only attempt fetches which can be performed offline. This
	// currently only includes the "data" scheme. Other schemes will result
	// in ErrNeedNet. In the future, we can improve on this by dropping this
	// field and just making sure that we canonicalize all "insufficient
	// network"-related errors to ErrNeedNet. That way, distro integrators
	// could distinguish between "partial" and full network bring-up.
	Offline bool
	// contains filtered or unexported fields
}
Fetcher holds settings for fetching resources from URLs.
func (*Fetcher) Fetch ¶
Fetch calls the appropriate FetchFrom* function based on the scheme of the given URL. The results will be decompressed if compression is set in opts, and written into dest. If opts.Hash is set, the data stream will also be hashed and compared against opts.ExpectedSum, and any mismatch will result in an error being returned.
Fetch expects dest to be an empty file and for the cursor in the file to be at the beginning. Since some url schemes (ex: s3) use chunked downloads and fetch chunks out of order, Fetch's behavior when dest is not an empty file is undefined.
func (*Fetcher) FetchToBuffer ¶
FetchToBuffer will fetch the given url into a temporary file, and then read in the contents of the file and delete it. It will return the downloaded contents, or an error if one was encountered.
func (*Fetcher) RewriteCAsWithDataUrls ¶
RewriteCAsWithDataUrls will modify the passed-in slice of CA references to contain the actual CA file, embedded via a data URL in each reference's source field.
type HttpClient ¶
type HttpClient struct {
// contains filtered or unexported fields
}
HttpClient is a simple wrapper around the Go HTTP client that standardizes the process and logging of fetching payloads.