importer

package
v0.0.0-...-e4696f9
Published: Nov 23, 2014 License: MIT Imports: 8 Imported by: 0

Documentation

Overview

Package importer implements utilities used to create IPFS DAGs from files and readers.

Index

Constants

This section is empty.

Variables

var BlockSizeLimit = int64(1048576) // 1 MB

BlockSizeLimit specifies the maximum size an imported block can have.

var ErrSizeLimitExceeded = fmt.Errorf("object size limit exceeded")

ErrSizeLimitExceeded signals that a block is larger than BlockSizeLimit.
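For illustration, a caller or splitter could enforce the limit with a check like the one below; this helper is hypothetical and not part of the package's exported API.

// checkBlockSize is a hypothetical helper: it rejects data that would
// produce a block larger than BlockSizeLimit.
func checkBlockSize(data []byte) error {
	if int64(len(data)) > BlockSizeLimit {
		return ErrSizeLimitExceeded
	}
	return nil
}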

Functions

func BuildDagFromFile

func BuildDagFromFile(fpath string, ds dag.DAGService, mp pin.ManualPinner) (*dag.Node, error)

BuildDagFromFile builds a DAG from the given file, writing blocks to disk as they are created.
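A minimal usage sketch, assuming the caller already has a dag.DAGService and a pin.ManualPinner from its node setup; the file path and the go-ipfs import paths shown are placeholders/assumptions for this 2014 layout.

import (
	"github.com/jbenet/go-ipfs/importer"      // import paths are assumptions
	dag "github.com/jbenet/go-ipfs/merkledag"
	"github.com/jbenet/go-ipfs/pin"
)

// importFile chunks the file, persists each block through ds as it is
// created, and returns the root node of the resulting DAG.
func importFile(ds dag.DAGService, mp pin.ManualPinner) (*dag.Node, error) {
	return importer.BuildDagFromFile("/tmp/example.dat", ds, mp) // placeholder path
}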

func BuildDagFromReader

func BuildDagFromReader(r io.Reader, ds dag.DAGService, mp pin.ManualPinner, spl chunk.BlockSplitter) (*dag.Node, error)

BuildDagFromReader builds a DAG from the data in the given reader, writing blocks to disk as they are created.
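The same operation for arbitrary in-memory data, sketched with bytes.NewReader and the sibling chunk package's default splitter; chunk.DefaultSplitter is an assumption about that package at this version, and ds and mp are again supplied by the caller.

import (
	"bytes"

	"github.com/jbenet/go-ipfs/importer"
	"github.com/jbenet/go-ipfs/importer/chunk"
	dag "github.com/jbenet/go-ipfs/merkledag"
	"github.com/jbenet/go-ipfs/pin"
)

// importBytes chunks data with the given splitter and persists each
// block through ds as it is produced.
func importBytes(data []byte, ds dag.DAGService, mp pin.ManualPinner) (*dag.Node, error) {
	return importer.BuildDagFromReader(bytes.NewReader(data), ds, mp, chunk.DefaultSplitter)
}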

func NewDagFromFile

func NewDagFromFile(fpath string) (*dag.Node, error)

NewDagFromFile constructs a Merkle DAG from the file at the given path.
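A minimal in-memory sketch, reusing the imports from the sketches above; the path is a placeholder and no DAGService is needed because nothing is persisted.

nd, err := importer.NewDagFromFile("/tmp/example.dat") // placeholder path
if err != nil {
	// handle the read or chunking error
}
_ = nd // nd is the in-memory root of the resulting Merkle DAG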

func NewDagFromReader

func NewDagFromReader(r io.Reader) (*dag.Node, error)

NewDagFromReader constructs a Merkle DAG from the given io.Reader.
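The reader-based variant, sketched with strings.NewReader; default chunking behavior is assumed.

nd, err := importer.NewDagFromReader(strings.NewReader("hello ipfs"))
if err != nil {
	// handle error
}
_ = nd // in-memory root node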

func NewDagFromReaderWithSplitter

func NewDagFromReaderWithSplitter(r io.Reader, spl chunk.BlockSplitter) (*dag.Node, error)

NewDagFromReaderWithSplitter creates an in-memory DAG from the data in the given reader, using the provided block splitter.
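A sketch with an explicit splitter; chunk.SizeSplitter with a Size field is an assumption about the sibling chunk package's API at this version.

spl := &chunk.SizeSplitter{Size: 4096}                    // assumed splitter type; 4096-byte chunks
nd, err := importer.NewDagFromReaderWithSplitter(r, spl)  // r is any io.Reader
if err != nil {
	// handle error
}
_ = nd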

Types

This section is empty.

Directories

Path	Synopsis
chunk	package chunk implements streaming block splitters
