dataset

package module v0.1.0
Published: Jun 3, 2019 License: MIT Imports: 11 Imported by: 43

README

dataset

Dataset contains the qri ("query") dataset document definition. This package contains the base definition, as well as a number of subpackages that build from this base to add functionality as necessary. Datasets take inspiration from HTML documents, delineating semantic purpose through predefined tags of the document, but instead of orienting around presentational markup, dataset documents emphasize interoperability and composition. The principal encoding format for a dataset document is JSON.

Subpackage Overview
  • compression: defines supported types of compression for interpreting a dataset
  • detect: dataset structure & schema inference
  • dsfs: "datasets on a content-addressed file system" tools to work with datasets stored with the cafs interface: github.com/qri-io/qfs/cafs
  • dsgraph: expressing relationships between and within datasets as graphs
  • dsio: io primitives for working with dataset bodies as readers, writers, buffers, oriented around row-like "entries".
  • dstest: utility functions for working with tests that need datasets
  • dsutil: utility functions that avoid dataset bloat
  • generate: io primitives for generating data
  • use_generate: small package that uses generate to create test data
  • validate: dataset validation & checking functions
  • vals: data type mappings & definitions

Getting Involved

We would love involvement from more people! If you notice any errors or would like to submit changes, please see our Contributing Guidelines.

Documentation

Overview

Package dataset contains the qri ("query") dataset document definition. This package contains the base definition, as well as a number of subpackages that build from this base to add functionality as necessary. Datasets take inspiration from HTML documents, delineating semantic purpose through predefined tags of the document, but instead of orienting around presentational markup, dataset documents emphasize interoperability and composition. The principal encoding format for a dataset document is JSON.

Alpha-Keys: Dataset documents are designed to produce consistent checksums when encoded for storage & transmission. To keep hashing consistent, map keys are sorted lexicographically for encoding. This applies to all fields of a dataset document except the body of a dataset, where users may need to dictate the ordering of map keys.
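
For illustration, the key-sorting guarantee falls out of Go's standard library: encoding/json marshals map keys in sorted order, so the same document always produces the same bytes & hash. A minimal sketch using only the standard library:

func Example_sortedKeys() {
	doc := map[string]interface{}{"structure": 3, "commit": 1, "meta": 2}
	data, _ := json.Marshal(doc)
	fmt.Println(string(data))
	// Output: {"commit":1,"meta":2,"structure":3}
}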

Pod ("Plain old Data") Pattern: To maintain high interoperability, dataset documents must support encoding & decoding ("coding", or "serialization") to and from many formats, fields of dataset documents that leverage "exotic" custom types are acommpanied by a "Plain Old Data" variant, denoted by a "Pod" suffix in their name Plain-Old-Data variants use only basic go types: string, bool, int, float64, []interface{}, etc. and have methods for clean encoding and decoding to their exotic forms

Index

Constants

View Source
const (
	// KindDataset is the current kind for datasets
	KindDataset = Kind("ds:" + CurrentSpecVersion)
	// KindMeta is the current kind for metadata
	KindMeta = Kind("md:" + CurrentSpecVersion)
	// KindStructure is the current kind for dataset structures
	KindStructure = Kind("st:" + CurrentSpecVersion)
	// KindTransform is the current kind for dataset transforms
	KindTransform = Kind("tf:" + CurrentSpecVersion)
	// KindCommit is the current kind for dataset commits
	KindCommit = Kind("cm:" + CurrentSpecVersion)
	// KindViz is the current kind for dataset viz components
	KindViz = Kind("vz:" + CurrentSpecVersion)
)
View Source
const CurrentSpecVersion = "0"

CurrentSpecVersion is the current version of the dataset spec
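
A sketch of how these constants decompose; the exact return values of Type and Version are assumptions based on the [type]:[version] format described under Kind below:

func ExampleKind() {
	k := dataset.KindDataset
	fmt.Println(k.String())  // "ds:0" while CurrentSpecVersion is "0"
	fmt.Println(k.Type())    // assumed to be the type prefix, "ds"
	fmt.Println(k.Version()) // assumed to be the version suffix, "0"
}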

Variables

View Source
var (
	// ErrInlineBody is the error for attempting to generate a body file when
	// body data is stored as native go types
	ErrInlineBody = fmt.Errorf("dataset body is inlined")
	// ErrNoResolver is an error for missing-but-needed resolvers
	ErrNoResolver = fmt.Errorf("no resolver available to fetch path")
)
View Source
var (
	// BaseSchemaArray is a minimum schema to constitute a dataset, specifying
	// the top level of the document is an array
	BaseSchemaArray = map[string]interface{}{"type": "array"}
	// BaseSchemaObject is a minimum schema to constitute a dataset, specifying
	// the top level of the document is an object
	BaseSchemaObject = map[string]interface{}{"type": "object"}
)
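
A sketch of declaring a minimal Structure (documented below) with one of the base schemas; the "json" format string is assumed to match DataFormat's string form:

func ExampleBaseSchemaArray() {
	// a JSON-formatted body whose top level is an array
	st := &dataset.Structure{
		Format: "json",
		Schema: dataset.BaseSchemaArray,
	}
	fmt.Println(st.DataFormat()) // json
}
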
View Source
var ErrUnknownDataFormat = fmt.Errorf("Unknown Data Format")

ErrUnknownDataFormat is the expected error for when a data format is missing or unknown

Functions

func AbstractColumnName

func AbstractColumnName(i int) string

AbstractColumnName returns the "base26" name for a column index, producing short, sql-valid, deterministic column names

func AccuralDuration

func AccuralDuration(p string) time.Duration

AccuralDuration takes an ISO 8601 periodicity measure & returns a time.Duration. Invalid periodicities return time.Duration(0)
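
A usage sketch. The "R/P1W" value follows the ISO 8601 repeating-duration convention used by Meta.AccrualPeriodicity, though the exact forms this function accepts are an assumption here:

func ExampleAccuralDuration() {
	d := dataset.AccuralDuration("R/P1W") // assumed weekly periodicity
	fmt.Println(d)

	d = dataset.AccuralDuration("not a period")
	fmt.Println(d) // invalid periodicities return time.Duration(0): "0s"
}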

func CompareCommits

func CompareCommits(a, b *Commit) error

CompareCommits checks if all fields of a Commit are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property

func CompareDatasets

func CompareDatasets(a, b *Dataset) error

CompareDatasets checks if all fields of a dataset are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property
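
A comparison sketch, assuming exported non-path fields participate in the comparison; the error message format is up to the package:

func ExampleCompareDatasets() {
	a := &dataset.Dataset{Qri: dataset.KindDataset.String()}
	b := &dataset.Dataset{Qri: dataset.KindDataset.String()}
	fmt.Println(dataset.CompareDatasets(a, b)) // <nil>: all fields equal

	b.Peername = "example"
	if err := dataset.CompareDatasets(a, b); err != nil {
		fmt.Println("datasets differ:", err)
	}
}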

func CompareLicenses

func CompareLicenses(a, b *License) error

CompareLicenses checks if all fields in two License pointers are equal, returning an error if unequal

func CompareMetas

func CompareMetas(a, b *Meta) error

CompareMetas checks if all fields of a metadata struct are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property

func CompareSchemas

func CompareSchemas(a, b map[string]interface{}) error

CompareSchemas checks if all fields of two Schema pointers are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property

func CompareStringSlices

func CompareStringSlices(a, b []string) error

CompareStringSlices confirms two string slices are the same size and contain the same values in the same order

func CompareStructures

func CompareStructures(a, b *Structure) error

CompareStructures checks if all fields of two structure pointers are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property

func CompareTransformResources

func CompareTransformResources(a, b *TransformResource) error

CompareTransformResources checks if all fields are equal in both resources

func CompareTransforms

func CompareTransforms(a, b *Transform) error

CompareTransforms checks if all fields of two transform pointers are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property

func CompareVizs

func CompareVizs(a, b *Viz) error

CompareVizs checks if all fields of two Viz pointers are equal, returning an error on the first difference, nil if equal. Note that comparison does not examine the internal path property

func HashBytes

func HashBytes(data []byte) (hash string, err error)

HashBytes generates the base-58 encoded SHA-256 hash of a byte slice. It's important to note that this is *NOT* the same as an IPFS hash. These hash functions should be used for other things like checksumming, in-memory content-addressing, etc.

func JSONHash

func JSONHash(m json.Marshaler) (hash string, err error)

JSONHash calculates the hash of a json.Marshaler. It's important to note that this is *NOT* the same as an IPFS hash. These hash functions should be used for other things like checksumming, in-memory content-addressing, etc.
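
A sketch of hashing raw bytes; the resulting string is a base58-encoded SHA-256 checksum, so no exact output is asserted here:

func ExampleHashBytes() {
	hash, err := dataset.HashBytes([]byte("hello"))
	if err != nil {
		panic(err)
	}
	fmt.Println(hash) // base58-encoded SHA-256; *not* an IPFS hash
}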

Types

type CSVOptions

type CSVOptions struct {
	// HeaderRow specifies whether this csv file has a header row or not
	HeaderRow bool `json:"headerRow"`
	// If LazyQuotes is true, a quote may appear in an unquoted field and a
	// non-doubled quote may appear in a quoted field.
	LazyQuotes bool `json:"lazyQuotes"`
	// Separator is the field delimiter.
	// It is set to comma (',') by NewReader.
	// Comma must be a valid rune and must not be \r, \n,
	// or the Unicode replacement character (0xFFFD).
	Separator rune `json:"separator,omitempty"`
	// VariadicFields permits records to have a variable number of fields.
	// Avoid using this.
	VariadicFields bool `json:"variadicFields"`
}

CSVOptions specifies configuration details for csv files. This will expand in the future to interoperate with the okfn csv spec

func NewCSVOptions

func NewCSVOptions(opts map[string]interface{}) (*CSVOptions, error)

NewCSVOptions creates a CSVOptions pointer from a map
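
A construction sketch; the map keys are assumed to mirror the JSON field tags shown above:

func ExampleNewCSVOptions() {
	opts, err := dataset.NewCSVOptions(map[string]interface{}{
		"headerRow":  true,
		"lazyQuotes": true,
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(opts.HeaderRow, opts.LazyQuotes) // true true
}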

func (*CSVOptions) Format

func (*CSVOptions) Format() DataFormat

Format announces the CSV Data Format for the FormatConfig interface

func (*CSVOptions) Map

func (o *CSVOptions) Map() map[string]interface{}

Map returns a map[string]interface representation of the configuration

type Citation

type Citation struct {
	Name  string `json:"name,omitempty"`
	URL   string `json:"url,omitempty"`
	Email string `json:"email,omitempty"`
}

Citation is a place that this dataset drew its information from

func (*Citation) Decode

func (c *Citation) Decode(val interface{}) (err error)

Decode reads json.Unmarshal-style data into a Citation

type Commit

type Commit struct {
	// Author of this commit
	Author *User `json:"author,omitempty"`
	// Message is an optional commit message
	Message string `json:"message,omitempty"`
	// Path is the location of this commit, transient
	Path string `json:"path,omitempty"`
	// Qri is this commit's qri kind
	Qri string `json:"qri,omitempty"`
	// Signature is a base58 encoded privateKey signing of Title
	Signature string `json:"signature,omitempty"`
	// Time this dataset was created. Required.
	Timestamp time.Time `json:"timestamp"`
	// Title of the commit. Required.
	Title string `json:"title"`
}

Commit encapsulates information about changes to a dataset in relation to other entries in a given history. Commit is directly analogous to the concept of a Commit Message in the git version control system. A full commit defines the administrative metadata of a dataset, answering "who made this dataset, when, and why"

func NewCommitRef

func NewCommitRef(path string) *Commit

NewCommitRef creates an empty struct with its internal path set

func UnmarshalCommit

func UnmarshalCommit(v interface{}) (*Commit, error)

UnmarshalCommit tries to extract a dataset type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Commit) Assign

func (cm *Commit) Assign(msgs ...*Commit)

Assign collapses all properties of a set of Commits onto one. This is directly inspired by JavaScript's Object.assign

func (*Commit) DropTransientValues

func (cm *Commit) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Commit) IsEmpty

func (cm *Commit) IsEmpty() bool

IsEmpty checks to see if any fields are filled out other than Path and Qri

func (*Commit) MarshalJSON

func (cm *Commit) MarshalJSON() ([]byte, error)

MarshalJSON implements the json.Marshaler interface for Commit. Empty Commit instances with a non-empty path marshal to their path value; otherwise, Commit marshals to an object

func (*Commit) MarshalJSONObject

func (cm *Commit) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if the Commit is empty or a reference

func (*Commit) UnmarshalJSON

func (cm *Commit) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler for Commit

type DataFormat

type DataFormat int

DataFormat represents different types of data formats. Formats specified here have some degree of support within the dataset packages. TODO - consider placing this in a subpackage: dataformats

const (
	// UnknownDataFormat is the default dataformat, meaning
	// that a data format should always be specified when
	// using the DataFormat type
	UnknownDataFormat DataFormat = iota
	// CSVDataFormat specifies comma separated value-formatted data
	CSVDataFormat
	// JSONDataFormat specifies JavaScript Object Notation-formatted data
	JSONDataFormat
	// CBORDataFormat specifies RFC 7049 Concise Binary Object Representation
	// read more at cbor.io
	CBORDataFormat
	// XMLDataFormat specifies eXtensible Markup Language-formatted data
	// currently not supported.
	XMLDataFormat
	// XLSXDataFormat specifies Microsoft Excel-formatted data
	XLSXDataFormat
)

func ParseDataFormatString

func ParseDataFormatString(s string) (df DataFormat, err error)

ParseDataFormatString takes a string representation of a data format. TODO (b5): trim "." prefix, remove prefixed map keys

func SupportedDataFormats

func SupportedDataFormats() []DataFormat

SupportedDataFormats gives a slice of data formats that are expected to work with this dataset package. As we work through support for different formats, the last step of providing full support to a format will be an addition to this slice
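
A parsing sketch. Whether ParseDataFormatString accepts the bare name, a "."-prefixed extension, or both is an assumption; the TODO above suggests prefix handling is in flux:

func ExampleParseDataFormatString() {
	df, err := dataset.ParseDataFormatString("csv") // bare-name form assumed
	if err != nil {
		panic(err)
	}
	fmt.Println(df == dataset.CSVDataFormat) // true

	for _, f := range dataset.SupportedDataFormats() {
		fmt.Println(f.String())
	}
}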

func (DataFormat) MarshalJSON

func (f DataFormat) MarshalJSON() ([]byte, error)

MarshalJSON satisfies the json.Marshaler interface

func (DataFormat) String

func (f DataFormat) String() string

String implements stringer interface for DataFormat

func (*DataFormat) UnmarshalJSON

func (f *DataFormat) UnmarshalJSON(data []byte) error

UnmarshalJSON satisfies the json.Unmarshaler interface

type Dataset

type Dataset struct {

	// Body represents dataset data with native go types.
	// Datasets have at most one body. Body, BodyBytes, and BodyPath
	// work together, often with only one field used at a time
	Body interface{} `json:"body,omitempty"`
	// BodyBytes is for representing dataset data as a slice of bytes
	BodyBytes []byte `json:"bodyBytes,omitempty"`
	// BodyPath is the path to the hash of raw data as it resolves on the network
	BodyPath string `json:"bodyPath,omitempty"`

	// Commit contains author & change message information that describes this
	// version of a dataset
	Commit *Commit `json:"commit,omitempty"`
	// Meta contains all human-readable meta about this dataset intended to aid
	// in discovery and organization of this document
	Meta *Meta `json:"meta,omitempty"`

	// name reference for this dataset, transient
	Name string `json:"name,omitempty"`
	// Location of this dataset, transient
	Path string `json:"path,omitempty"`
	// Peername of dataset owner, transient
	Peername string `json:"peername,omitempty"`
	// PreviousPath connects datasets to form a historical merkle-DAG of snapshots
	// of this document, creating a version history
	PreviousPath string `json:"previousPath,omitempty"`
	// ProfileID of dataset owner, transient
	ProfileID string `json:"profileID,omitempty"`
	// Number of versions this dataset has, transient
	NumVersions int `json:"numVersions,omitempty"`
	// Qri is a key for both identifying this document type, and versioning the
	// dataset document definition itself.
	Qri string `json:"qri"`
	// Structure of this dataset
	Structure *Structure `json:"structure,omitempty"`
	// Transform is a path to the transformation that generated this resource
	Transform *Transform `json:"transform,omitempty"`
	// Viz stores configuration data related to representing a dataset as
	// a visualization
	Viz *Viz `json:"viz,omitempty"`
	// contains filtered or unexported fields
}

Dataset is a document for describing & storing structured data. Dataset documents are designed to satisfy the FAIR principle of being Findable, Accessible, Interoperable, and Reproducible, in relation to other dataset documents and related-but-separate technologies such as data catalogs, HTTP APIs, and data package formats. Datasets are designed to be stored and distributed on content-addressed (identify-by-hash) systems. The dataset document definition is built from a research-first principle, valuing direct interoperability with existing standards over novel definitions or specifications

func NewDatasetRef

func NewDatasetRef(path string) *Dataset

NewDatasetRef creates a Dataset pointer with the internal path property specified, and no other fields.

func UnmarshalDataset

func UnmarshalDataset(v interface{}) (*Dataset, error)

UnmarshalDataset tries to extract a dataset type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Dataset) Assign

func (ds *Dataset) Assign(datasets ...*Dataset)

Assign collapses all properties of a group of datasets onto one. This is directly inspired by JavaScript's Object.assign

func (*Dataset) BodyFile

func (ds *Dataset) BodyFile() qfs.File

BodyFile exposes bodyFile if one is set. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Dataset) DropTransientValues

func (ds *Dataset) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs. Note that DropTransientValues does *not* drop the transient values of child components of a dataset; each component's DropTransientValues method must be called separately

func (*Dataset) IsEmpty

func (ds *Dataset) IsEmpty() bool

IsEmpty checks to see if dataset has any fields other than the Path & Qri fields

func (*Dataset) MarshalJSON

func (ds *Dataset) MarshalJSON() ([]byte, error)

MarshalJSON uses a map to combine meta & standard fields. Marshalling a map[string]interface{} automatically alpha-sorts the keys.

func (*Dataset) OpenBodyFile

func (ds *Dataset) OpenBodyFile(resolver qfs.PathResolver) (err error)

OpenBodyFile sets the byte stream of file data, prioritizing:
  • erroring when the body is inline
  • creating an in-place file from bytes
  • passing BodyPath to the resolver
Once resolved, the file is set to an internal field, which is accessible via the BodyFile method. Separating into two steps decouples loading from access
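
A sketch of the set/open/read cycle. qfs.NewMemfileBytes is assumed from github.com/qri-io/qfs, and the body contents are illustrative:

func ExampleDataset_BodyFile() {
	ds := &dataset.Dataset{}
	ds.SetBodyFile(qfs.NewMemfileBytes("body.json", []byte(`[[1,2,3]]`)))

	body := ds.BodyFile()
	defer body.Close() // consume the entire file, then Close

	data, err := ioutil.ReadAll(body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(data)) // [[1,2,3]]
}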

func (*Dataset) SetBodyFile

func (ds *Dataset) SetBodyFile(file qfs.File)

SetBodyFile assigns the bodyFile.

func (*Dataset) SignableBytes

func (ds *Dataset) SignableBytes() ([]byte, error)

SignableBytes produces the portion of a commit message used for signing. The format for signable bytes is:
  • commit timestamp in RFC3339 format, UTC timezone
  • newline character
  • dataset structure checksum string
The checksum string should be a base58-encoded multihash of the dataset data

func (*Dataset) UnmarshalJSON

func (ds *Dataset) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler

type FormatConfig

type FormatConfig interface {
	// Format gives the data format being configured
	Format() DataFormat
	// Map gives an object of configuration details
	Map() map[string]interface{}
}

FormatConfig is the interface for data format configurations

func NewXLSXOptions

func NewXLSXOptions(opts map[string]interface{}) (FormatConfig, error)

NewXLSXOptions creates a XLSXOptions pointer from a map

func ParseFormatConfigMap

func ParseFormatConfigMap(f DataFormat, opts map[string]interface{}) (FormatConfig, error)

ParseFormatConfigMap returns a FormatConfig implementation for a given data format and options map, often used in decoding from recorded formats like, say, JSON
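
A decoding sketch; the options map keys are assumed to mirror the JSON field tags of the format's options type:

func ExampleParseFormatConfigMap() {
	cfg, err := dataset.ParseFormatConfigMap(dataset.CSVDataFormat, map[string]interface{}{
		"headerRow": true,
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(cfg.Format()) // csv (string form assumed)
}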

type JSONOptions

type JSONOptions struct {
}

JSONOptions specifies configuration details for json file format

func NewJSONOptions

func NewJSONOptions(opts map[string]interface{}) (*JSONOptions, error)

NewJSONOptions creates a JSONOptions pointer from a map

func (*JSONOptions) Format

func (*JSONOptions) Format() DataFormat

Format announces the JSON Data Format for the FormatConfig interface

func (*JSONOptions) Map

func (o *JSONOptions) Map() map[string]interface{}

Map returns a map[string]interface representation of the configuration

type Kind

type Kind string

Kind is a short identifier for all types of qri dataset objects. Kind does three things:
  1. Distinguish qri datasets from other formats
  2. Distinguish different types (Dataset/Structure/Transform/etc.)
  3. Distinguish between versions of the dataset spec
Kind is a string in the format [type]:[version], for example ds:[version] for a dataset

func (Kind) String

func (k Kind) String() string

String implements the stringer interface

func (Kind) Type

func (k Kind) Type() string

Type returns the type identifier

func (*Kind) UnmarshalJSON

func (k *Kind) UnmarshalJSON(data []byte) error

UnmarshalJSON implements the json.Unmarshaler interface, rejecting any strings that are not a valid kind

func (Kind) Valid

func (k Kind) Valid() error

Valid checks to see if a kind string is valid

func (Kind) Version

func (k Kind) Version() string

Version returns the version portion of the kind identifier

type License

type License struct {
	Type string `json:"type,omitempty"`
	URL  string `json:"url,omitempty"`
}

License represents a legal licensing agreement

func (*License) Decode

func (l *License) Decode(val interface{}) (err error)

Decode reads json.Unmarshal-style data into a License

type Meta

type Meta struct {

	// URL to access the dataset
	AccessURL string `json:"accessURL,omitempty"`
	// The frequency with which dataset changes. Must be an ISO 8601 repeating
	// duration
	AccrualPeriodicity string `json:"accrualPeriodicity,omitempty"`
	// Citations is a slice of assets used to build this dataset
	Citations []*Citation `json:"citations"`
	// Contributors to this dataset
	Contributors []*User `json:"contributors,omitempty"`
	// Description follows the DCAT sense of the word, it should be around a
	// paragraph of human-readable text
	Description string `json:"description,omitempty"`
	// URL that should / must lead directly to the data itself
	DownloadURL string `json:"downloadURL,omitempty"`
	// HomeURL is a path to a "home" resource
	HomeURL string `json:"homeURL,omitempty"`
	// Identifier is for *other* data catalog specifications. Identifier should
	// not be used or relied on to be unique, because this package does not
	// enforce any of these rules.
	Identifier string `json:"identifier,omitempty"`
	// Keywords associated with this dataset
	Keywords []string `json:"keywords,omitempty"`
	// Languages this dataset is written in
	Language []string `json:"language,omitempty"`
	// License will automatically parse to & from a string value if provided as a
	// raw string
	License *License `json:"license,omitempty"`
	// path is the location of meta, transient
	Path string `json:"path,omitempty"`
	// Kind is required, must be md:[version]
	Qri string `json:"qri,omitempty"`
	// path to dataset readme file, not part of the DCAT spec, but a common
	// convention in software dev
	ReadmeURL string `json:"readmeURL,omitempty"`
	// Title of this dataset
	Title string `json:"title,omitempty"`
	// "Category" for
	Theme []string `json:"theme,omitempty"`
	// Version is the version identifier for this dataset
	Version string `json:"version,omitempty"`
	// contains filtered or unexported fields
}

Meta contains human-readable descriptive metadata that qualifies and distinguishes a dataset. Well-defined Meta should aid in making datasets Findable by describing a dataset in generalizable taxonomies that can aggregate across other dataset documents. Because dataset documents are intended to interoperate with many other data storage and cataloging systems, meta fields and conventions are derived from existing metadata formats whenever possible

func NewMetaRef

func NewMetaRef(path string) *Meta

NewMetaRef creates a Meta pointer with the internal path property specified, and no other fields.

func UnmarshalMeta

func UnmarshalMeta(v interface{}) (*Meta, error)

UnmarshalMeta tries to extract a metadata type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Meta) Assign

func (md *Meta) Assign(metas ...*Meta)

Assign collapses all properties of a group of metadata structs onto one. This is directly inspired by JavaScript's Object.assign

func (*Meta) DropTransientValues

func (md *Meta) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Meta) IsEmpty

func (md *Meta) IsEmpty() bool

IsEmpty checks to see if the metadata has any fields other than the internal path

func (*Meta) MarshalJSON

func (md *Meta) MarshalJSON() ([]byte, error)

MarshalJSON uses a map to combine meta & standard fields. Marshalling a map[string]interface{} automatically alpha-sorts the keys.

func (*Meta) MarshalJSONObject

func (md *Meta) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if meta is empty or a reference

func (*Meta) Meta

func (md *Meta) Meta() map[string]interface{}

Meta gives access to additional metadata not covered by dataset metadata

func (*Meta) Set

func (md *Meta) Set(key string, val interface{}) (err error)

Set writes value to key in metadata, erroring if the type is invalid. Input values are expected to be json.Unmarshal types
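
A sketch of setting values; the lower-case key names are assumed to follow the JSON field tags:

func ExampleMeta_Set() {
	md := &dataset.Meta{}
	if err := md.Set("title", "city populations"); err != nil {
		panic(err)
	}
	fmt.Println(md.Title) // city populations

	// a value of the wrong type should error
	if err := md.Set("title", 42); err != nil {
		fmt.Println("invalid type:", err)
	}
}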

func (*Meta) SetArbitrary

func (md *Meta) SetArbitrary(key string, val interface{}) (err error)

SetArbitrary is for implementing the ArbitrarySetter interface defined by base/fill_struct.go

func (*Meta) UnmarshalJSON

func (md *Meta) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler

type Structure

type Structure struct {
	// Checksum is a base58-encoded multihash checksum of the entire data
	// file this structure points to. This is different from IPFS
	// hashes, which are calculated after breaking the file into blocks
	Checksum string `json:"checksum,omitempty"`
	// Compression specifies any compression on the source data,
	// if empty assume no compression
	Compression string `json:"compression,omitempty"`
	// Maximum nesting level of composite types in the dataset. eg: depth 1 == [], depth 2 == [[]]
	Depth int `json:"depth,omitempty"`
	// Encoding specifies character encoding; assume utf-8 if not specified
	Encoding string `json:"encoding,omitempty"`
	// ErrCount is the number of errors returned by validating data
	// against this schema. required
	ErrCount int `json:"errCount"`
	// Entries is the number of top-level entries in the dataset. With tabular data
	// this is the same as the number of "rows"
	Entries int `json:"entries,omitempty"`
	// Format specifies the format of the raw data MIME type
	Format string `json:"format"`
	// FormatConfig removes as much ambiguity as possible about how
	// to interpret the specified format.
	// FormatConfig FormatConfig `json:"formatConfig,omitempty"`
	FormatConfig map[string]interface{} `json:"formatConfig,omitempty"`

	// Length is the length of the data object in bytes.
	// must always match & be present
	Length int `json:"length,omitempty"`
	// location of this structure, transient
	Path string `json:"path,omitempty"`
	// Qri should always be KindStructure
	Qri string `json:"qri"`
	// Schema contains the schema definition for the underlying data, schemas
	// are defined using the IETF json-schema specification. for more info
	// on json-schema see: https://json-schema.org
	Schema map[string]interface{} `json:"schema,omitempty"`
	// Strict requires schema validation to pass without error. Datasets with
	// strict: true can have additional functionality and performance speedups
	// that comes with being able to assume that all data is valid
	Strict bool `json:"strict,omitempty"`
}

Structure defines the characteristics of a dataset document necessary for a machine to interpret the dataset body. Structure fields are things like the encoding data format (JSON, CSV, etc.) and the length of the dataset body in bytes, stored in a rigid form intended for machine use. A well defined structure & accompanying software should allow the end user to spend more time focusing on the data itself. Two dataset documents that both have a defined structure will have some degree of natural interoperability, depending first on the amount of detail provided in a dataset's structure, and then by the natural comparability of the datasets
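
A sketch of a minimal Structure for a CSV body; the format string and config keys are assumed to follow the conventions shown elsewhere on this page:

func ExampleStructure() {
	st := &dataset.Structure{
		Format: "csv",
		FormatConfig: map[string]interface{}{
			"headerRow": true,
		},
		Schema: dataset.BaseSchemaArray,
	}
	fmt.Println(st.DataFormat()) // csv for a valid format string
}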

func NewStructureRef

func NewStructureRef(path string) *Structure

NewStructureRef creates an empty struct with its internal path set

func UnmarshalStructure

func UnmarshalStructure(v interface{}) (*Structure, error)

UnmarshalStructure tries to extract a structure type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Structure) Abstract

func (s *Structure) Abstract() *Structure

Abstract returns this structure instance in its "Abstract" form, stripping all nonessential values & renaming all schema field names to standard variable names

func (*Structure) Assign

func (s *Structure) Assign(structures ...*Structure)

Assign collapses all properties of a group of structures onto one. This is directly inspired by JavaScript's Object.assign

func (*Structure) DataFormat

func (s *Structure) DataFormat() DataFormat

DataFormat gives the structure's format as a DataFormat type, returning UnknownDataFormat in any case where the Format string is invalid

func (*Structure) DropTransientValues

func (s *Structure) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Structure) Hash

func (s *Structure) Hash() (string, error)

Hash gives the hash of this structure

func (*Structure) IsEmpty

func (s *Structure) IsEmpty() bool

IsEmpty checks to see if structure has any fields other than the internal path

func (*Structure) JSONSchema

func (s *Structure) JSONSchema() (*jsonschema.RootSchema, error)

JSONSchema parses the Schema field into a json-schema

func (Structure) MarshalJSON

func (s Structure) MarshalJSON() (data []byte, err error)

MarshalJSON satisfies the json.Marshaler interface

func (Structure) MarshalJSONObject

func (s Structure) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if the structure is empty or a reference

func (*Structure) UnmarshalJSON

func (s *Structure) UnmarshalJSON(data []byte) (err error)

UnmarshalJSON satisfies the json.Unmarshaler interface

type Theme

type Theme struct {
	Description     string `json:"description,omitempty"`
	DisplayName     string `json:"display_name,omitempty"`
	ImageDisplayURL string `json:"image_display_url,omitempty"`
	ID              string `json:"id,omitempty"`
	Name            string `json:"name,omitempty"`
	Title           string `json:"title,omitempty"`
}

Theme is pulled from the Project Open Data Schema version 1.1

type Transform

type Transform struct {
	// Config outlines any configuration that would affect the resulting hash
	Config map[string]interface{} `json:"config,omitempty"`
	// location of the transform object, transient
	Path string `json:"path,omitempty"`
	// Kind should always equal KindTransform
	Qri string `json:"qri,omitempty"`
	// Resources is a map of all datasets referenced in this transform, with
	// alphabetical keys generated by datasets in order of appearance within the
	// transform
	Resources map[string]*TransformResource `json:"resources,omitempty"`

	// ScriptBytes is for representing a script as a slice of bytes, transient
	ScriptBytes []byte `json:"scriptBytes,omitempty"`
	// ScriptPath is the path to the script that produced this transformation.
	ScriptPath string `json:"scriptPath,omitempty"`
	// Secrets is a map of secret values used in the transformation, transient.
	// TODO (b5): make this not-transient by censoring the values used, but not keys
	Secrets map[string]string `json:"secrets,omitempty"`
	// Syntax this transform was written in
	Syntax string `json:"syntax,omitempty"`
	// SyntaxVersion is an identifier for the application and version number that
	// produced the result
	SyntaxVersion string `json:"syntaxVersion,omitempty"`
	// contains filtered or unexported fields
}

Transform is a record of executing a transformation on data. Transforms can theoretically be anything from an SQL query, a Jupyter notebook, or the state of an ETL pipeline, so long as the input is zero or more datasets and the output is a single dataset. Ideally, transforms should contain all the machine-necessary bits to deterministically execute the algorithm referenced in "ScriptPath".

func NewTransformRef

func NewTransformRef(path string) *Transform

NewTransformRef creates a Transform pointer with the internal path property specified, and no other fields.

func UnmarshalTransform

func UnmarshalTransform(v interface{}) (*Transform, error)

UnmarshalTransform tries to extract a resource type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Transform) Assign

func (q *Transform) Assign(qs ...*Transform)

Assign collapses all properties of a group of transforms onto one. This is directly inspired by JavaScript's Object.assign

func (*Transform) DropTransientValues

func (q *Transform) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Transform) IsEmpty

func (q *Transform) IsEmpty() bool

IsEmpty checks to see if transform has any fields other than the internal path

func (Transform) MarshalJSON

func (q Transform) MarshalJSON() ([]byte, error)

MarshalJSON satisfies the json.Marshaler interface

func (Transform) MarshalJSONObject

func (q Transform) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if the transform is empty or a reference

func (*Transform) OpenScriptFile

func (q *Transform) OpenScriptFile(resolver qfs.PathResolver) (err error)

OpenScriptFile generates a byte stream of script data, prioritizing an in-place file created from ScriptBytes when defined, and fetching from the passed-in resolver otherwise

func (*Transform) ScriptFile

func (q *Transform) ScriptFile() qfs.File

ScriptFile gives the internal file, if any. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Transform) SetScriptFile

func (q *Transform) SetScriptFile(file qfs.File)

SetScriptFile assigns the scriptFile

func (*Transform) UnmarshalJSON

func (q *Transform) UnmarshalJSON(data []byte) error

UnmarshalJSON satisfies the json.Unmarshaler interface

type TransformResource

type TransformResource struct {
	Path string `json:"path"`
}

TransformResource describes an external data dependency. The prime use case is for importing other datasets, but in the future this may be expanded to include details that specify resources other than datasets (urls?), and details for interpreting the resource (eg. a selector to specify that only a subset of a resource is required)

func (*TransformResource) UnmarshalJSON

func (r *TransformResource) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler, allowing both string and object representations
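
A sketch showing both accepted representations decoding to the same value; the /ipfs path is an illustrative placeholder:

func ExampleTransformResource_UnmarshalJSON() {
	var a, b dataset.TransformResource
	if err := json.Unmarshal([]byte(`"/ipfs/QmExample"`), &a); err != nil {
		panic(err)
	}
	if err := json.Unmarshal([]byte(`{"path":"/ipfs/QmExample"}`), &b); err != nil {
		panic(err)
	}
	fmt.Println(a.Path == b.Path) // true
}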

type User

type User struct {
	ID       string `json:"id,omitempty"`
	Fullname string `json:"name,omitempty"`
	Email    string `json:"email,omitempty"`
}

User is a placeholder for talking about people, groups, organizations

func (*User) Decode

func (u *User) Decode(val interface{}) (err error)

Decode reads json.Unmarshal-style data into a User

type Viz

type Viz struct {
	// Format designates the visualization configuration syntax. currently the
	// only supported syntax is "html"
	Format string `json:"format,omitempty"`
	// path is the location of a viz, transient
	Path string `json:"path,omitempty"`
	// Qri should always be "vz:0"
	Qri string `json:"qri,omitempty"`

	// ScriptBytes is for representing a script as a slice of bytes, transient
	ScriptBytes []byte `json:"scriptBytes,omitempty"`
	// ScriptPath is the path to the script that created this
	ScriptPath string `json:"scriptPath,omitempty"`
	// RenderedPath is the path to the file rendered using the viz script and the body
	RenderedPath string `json:"renderedPath,omitempty"`
	// contains filtered or unexported fields
}

Viz stores configuration data related to representing a dataset as a visualization

func NewVizRef

func NewVizRef(path string) *Viz

NewVizRef creates an empty struct with its internal path set

func UnmarshalViz

func UnmarshalViz(v interface{}) (*Viz, error)

UnmarshalViz tries to extract a resource type from an empty interface. Pairs nicely with datastore.Get() from github.com/ipfs/go-datastore

func (*Viz) Assign

func (v *Viz) Assign(visConfigs ...*Viz)

Assign collapses all properties of a group of Viz configs onto one. This is directly inspired by JavaScript's Object.assign

func (*Viz) DropTransientValues

func (v *Viz) DropTransientValues()

DropTransientValues removes values that cannot be recorded when the dataset is rendered immutable, usually by storing it in a cafs

func (*Viz) IsEmpty

func (v *Viz) IsEmpty() bool

IsEmpty checks to see if Viz has any fields other than the internal path

func (*Viz) MarshalJSON

func (v *Viz) MarshalJSON() ([]byte, error)

MarshalJSON satisfies the json.Marshaler interface

func (*Viz) MarshalJSONObject

func (v *Viz) MarshalJSONObject() ([]byte, error)

MarshalJSONObject always marshals to a json Object, even if Viz is empty or a reference

func (*Viz) OpenRenderedFile

func (v *Viz) OpenRenderedFile(resolver qfs.PathResolver) (err error)

OpenRenderedFile generates a byte stream of the rendered data

func (*Viz) OpenScriptFile

func (v *Viz) OpenScriptFile(resolver qfs.PathResolver) (err error)

OpenScriptFile generates a byte stream of script data, prioritizing an in-place file created from ScriptBytes when defined, and fetching from the passed-in resolver otherwise

func (*Viz) RenderedFile

func (v *Viz) RenderedFile() qfs.File

RenderedFile exposes renderedFile if one is set. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Viz) ScriptFile

func (v *Viz) ScriptFile() qfs.File

ScriptFile exposes scriptFile if one is set. Callers that use the file in any way (eg. by calling Read) should consume the entire file and call Close

func (*Viz) SetRenderedFile

func (v *Viz) SetRenderedFile(file qfs.File)

SetRenderedFile assigns the unexported renderedFile

func (*Viz) SetScriptFile

func (v *Viz) SetScriptFile(file qfs.File)

SetScriptFile assigns the unexported scriptFile

func (*Viz) UnmarshalJSON

func (v *Viz) UnmarshalJSON(data []byte) error

UnmarshalJSON satisfies the json.Unmarshaler interface

type XLSXOptions

type XLSXOptions struct {
	SheetName string `json:"sheetName,omitempty"`
}

XLSXOptions specifies configuration details for the xlsx file format

func (*XLSXOptions) Format

func (*XLSXOptions) Format() DataFormat

Format announces the XLSX data format for the FormatConfig interface

func (*XLSXOptions) Map

func (o *XLSXOptions) Map() map[string]interface{}

Map structures XLSXOptions as a map of string keys to values

Directories

Path Synopsis
compression	Package compression is a horrible hack & should be replaced as soon as humanly possible
dsfs	Package dsfs glues datasets to cafs (content-addressed-file-system)
dsgraph	Package dsgraph is a placeholder package for linking queries, resources, and metadata until proper packaging & architectural decisions can be made
dsio	Package dsio defines writers & readers for operating on "container" data structures (objects and arrays)
dsio/replacecr	Package replacecr defines a wrapper for replacing solo carriage return characters (\r) with carriage-return + line feed (\r\n)
dstest	Package dstest defines an interface for reading test cases from static files, leveraging directories of test dataset input files & expected output files
dsutil	Package dsutil includes dataset util funcs, placed here to avoid dataset package bloat. TODO - consider merging this package with the dsfs package, as most of the functions in here rely on a Filestore argument
dsviz	Package dsviz renders the viz component of a dataset, returning a qfs.File of data. HTML rendering uses go's html/template package to generate html documents from an input dataset.
generate	Package generate is for generating random data from given structures
subset	Package subset provides methods for extracting defined abbreviations of a dataset document.
