Documentation ¶
Overview ¶
Package inference allows users to run inference through tflite (with tf, pytorch, etc. planned for the future).
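A minimal usage sketch, assuming a tflite model at a hypothetical path and input tensors already assembled as an ml.Tensors map (tensor construction is elided; this is not a complete program):

```go
// Load a model with the default loader, run one inference, and clean up.
loader, err := inference.NewDefaultTFLiteModelLoader()
if err != nil {
	return err
}
model, err := loader.Load("/path/to/model.tflite") // hypothetical path
if err != nil {
	return err
}
defer model.Close() // all models must be closed when done

outTensors, err := model.Infer(inTensors) // inTensors is a prepared ml.Tensors map
```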
Index ¶
- Constants
- func FailedToGetError(name string) error
- func FailedToLoadError(name string) error
- func MetadataDoesNotExistError() error
- func TFliteTensorToGorgoniaTensor(t tflite.TensorType) tensor.Dtype
- type InTensorType
- type Interpreter
- type MLModel
- type TFLiteInfo
- type TFLiteModelLoader
- type TFLiteStruct
Constants ¶
const (
	UInt8   = InTensorType("UInt8")
	Float32 = InTensorType("Float32")
)
UInt8 and Float32 are the currently supported input tensor types.
Variables ¶
This section is empty.
Functions ¶
func FailedToGetError ¶
func FailedToGetError(name string) error
FailedToGetError is the default error message for when fetching expected information fails.
func FailedToLoadError ¶
func FailedToLoadError(name string) error
FailedToLoadError is the default error message for when expected resources for inference fail to load.
func MetadataDoesNotExistError ¶
func MetadataDoesNotExistError() error
MetadataDoesNotExistError returns a metadata does not exist error.
func TFliteTensorToGorgoniaTensor ¶ added in v0.8.0
func TFliteTensorToGorgoniaTensor(t tflite.TensorType) tensor.Dtype
TFliteTensorToGorgoniaTensor converts a tflite tensor type constant into the corresponding Gorgonia tensor type.
Types ¶
type InTensorType ¶
type InTensorType string
InTensorType is a wrapper around a string that details the allowed input tensor types.
type Interpreter ¶
type Interpreter interface {
	AllocateTensors() tflite.Status
	Invoke() tflite.Status
	GetOutputTensorCount() int
	GetInputTensorCount() int
	GetInputTensor(i int) *tflite.Tensor
	GetOutputTensor(i int) *tflite.Tensor
	Delete()
}
Interpreter interface holds methods used by a tflite interpreter.
type MLModel ¶
type MLModel interface {
	// Infer takes an already ordered input tensor map,
	// makes an inference on the model, and returns an output tensor map.
	Infer(inputTensors ml.Tensors) (ml.Tensors, error)

	// Metadata gets the entire model metadata structure from file.
	Metadata() (interface{}, error)

	// Close closes the model and interpreter that allow inferences to be made,
	// freeing the associated memory. All models must be closed when done being used.
	Close() error
}
MLModel represents a trained machine learning model.
type TFLiteInfo ¶
type TFLiteInfo struct {
	InputHeight       int
	InputWidth        int
	InputChannels     int
	InputShape        []int
	InputTensorType   InTensorType
	InputTensorCount  int
	OutputTensorCount int
	OutputTensorTypes []string
}
TFLiteInfo holds information about a model that is useful for creating input tensor bytes.
type TFLiteModelLoader ¶
type TFLiteModelLoader struct {
// contains filtered or unexported fields
}
TFLiteModelLoader holds functions that set up a tflite model to be used.
func NewDefaultTFLiteModelLoader ¶
func NewDefaultTFLiteModelLoader() (*TFLiteModelLoader, error)
NewDefaultTFLiteModelLoader returns the default loader when using tflite.
func NewTFLiteModelLoader ¶
func NewTFLiteModelLoader(numThreads int) (*TFLiteModelLoader, error)
NewTFLiteModelLoader returns a loader that allows you to set the number of threads used by tflite.
func (TFLiteModelLoader) Load ¶
func (loader TFLiteModelLoader) Load(modelPath string) (*TFLiteStruct, error)
Load returns a TFLite struct that is ready to be used for inferences.
type TFLiteStruct ¶
type TFLiteStruct struct {
	Info *TFLiteInfo
	// contains filtered or unexported fields
}
TFLiteStruct holds the information, model, and interpreter of a tflite model in Go.
func (*TFLiteStruct) Close ¶
func (model *TFLiteStruct) Close() error
Close should be called when finished with the interpreter; it deletes the related model and interpreter.
func (*TFLiteStruct) Infer ¶
func (model *TFLiteStruct) Infer(inputTensors ml.Tensors) (ml.Tensors, error)
Infer takes an input map of tensors and returns an output map of tensors.
func (*TFLiteStruct) Metadata ¶ added in v0.1.0
func (model *TFLiteStruct) Metadata() (*metadata.ModelMetadataT, error)
Metadata provides the metadata information based on the model flatbuffer file.