num

package
v0.0.0-...-c5f678a
Published: Mar 17, 2019 License: GPL-3.0 Imports: 9 Imported by: 0

Documentation

Overview

Package num contains numeric Array processing routines such as optimised matrix multiplication.
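
A minimal usage sketch (not part of the original package docs) showing the typical flow: create a device and queue, write host data into arrays, queue a few Functions and read the result back. The import path is a placeholder, and it is assumed that Read accepts a destination slice directly as the doc comment suggests.

package main

import (
	"fmt"

	num "github.com/example/deepnet/num" // hypothetical import path; substitute the real module path
)

func main() {
	dev := num.NewDevice(false) // CPU device; pass true to use the GPU
	q := dev.NewQueue()
	defer q.Shutdown()

	x := dev.NewArray(num.Float32, 4)
	y := dev.NewArray(num.Float32, 4)
	res := make([]float32, 4)

	q.Call(
		num.Write(x, []float32{1, 2, 3, 4}), // copy host data into x
		num.Fill(y, 10),                     // set every element of y to 10
		num.Axpy(2, x, y),                   // y <- 2*x + y
		num.Read(y, res),                    // copy y back into the res slice
	)
	q.Finish() // wait for the asynchronous calls to complete

	fmt.Println(res) // expect [12 14 16 18]
	num.Release(x, y)
}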

Index

Constants

This section is empty.

Variables

var (
	PrintThreshold = 12
	PrintEdgeitems = 4
)

Parameters for array printing
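
For instance, given an Array a and a Queue q as in the overview sketch, the thresholds can be lowered before printing to keep large arrays compact. The exact summarisation behaviour is an assumption based on the field names:

num.PrintThreshold = 6 // assumed: summarise any dimension longer than 6 elements
num.PrintEdgeitems = 2 // assumed: show 2 items at each edge of a summarised dimension
fmt.Println(a.String(q))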

Functions

func Bytes

func Bytes(arr ...*Array) (bytes int)

Total size of one or more arrays in bytes

func Prod

func Prod(arr []int) int

Product of elements of an integer array. Zero dimension array (scalar) has size 1.

func Release

func Release(arr ...Buffer)

Release one or more arrays or buffers

func SameShape

func SameShape(xd, yd []int) bool

Check if two array shapes are the same

Types

type Array

type Array struct {
	Buffer
	Dtype DataType
	Dims  []int
}

Array is a general n dimensional tensor similar to a numpy ndarray. Data is stored internally in column major order and may reside either on the CPU or on the GPU depending on the buffer type.

func NewArray

func NewArray(buf Buffer, dtype DataType, dims ...int) *Array

Allocate a new array using the provided buffer

func (*Array) Reshape

func (a *Array) Reshape(dims ...int) *Array

Reshape returns a new array of the same size with a view on the same data but with a different shape
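
For example (continuing the overview sketch, with dev defined there), a length 12 vector can be viewed as a 3 x 4 matrix without copying:

v := dev.NewArray(num.Float32, 12) // vector of 12 float32 values
m := v.Reshape(3, 4)               // same underlying buffer viewed with new dims
fmt.Println(m.Size(), m.Dims)      // 12 [3 4]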

func (*Array) Size

func (a *Array) Size() int

Size of data in array in words

func (*Array) String

func (a *Array) String(q Queue) string

String returns pretty printed output

type BatchNormLayer

type BatchNormLayer interface {
	ParamLayer
	InitParams(q Queue)
	Stats() (runMean, runVar *Array)
}

BatchNorm layer has extra parameters

func NewBatchNormLayer

func NewBatchNormLayer(q Queue, opts LayerOpts, avgFactor, epsilon float64, shape []int) BatchNormLayer

Create new batch normalisation layer

type Buffer

type Buffer interface {
	// pointer to data
	Data() unsafe.Pointer
	// size of buffer in 32 bit words
	Capacity() int
	// release frees the memory
	Release()
}

Buffer interface type represents the underlying data for an array

type ConvLayer

type ConvLayer interface {
	ParamLayer
	Algorithm() string
}

Convolutional network layer type

func NewConvLayer

func NewConvLayer(q Queue, opts LayerOpts, inShape []int, nFeats, size, stride int, pad bool) (ConvLayer, int)

Create new convolution layer. Input shape is nBatch x depth x h x w. Returns the workspace needed in 32 bit words.
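
A sketch of creating a convolution layer and its workspace buffer (dev and q as in the overview sketch). The shape and option values are illustrative, and combining the LayerOpts flags with a bitwise or is an assumption based on their power-of-two values:

inShape := []int{32, 3, 28, 28} // nBatch x depth x h x w
conv, wsize := num.NewConvLayer(q, num.BpropData|num.BpropWeights, inShape, 16, 5, 1, true)
work := dev.NewBuffer(wsize) // workspace sized in 32 bit words
defer num.Release(work)
fmt.Println(conv.OutShape(), conv.Algorithm())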

type DataType

type DataType int

Data type of an element of the array

const (
	Int32   DataType = C.I32
	Float32 DataType = C.F32
)

type Device

type Device interface {
	// Setup new worker queue
	NewQueue() Queue
	// Allocate new n dimensional array
	NewArray(dtype DataType, dims ...int) *Array
	NewArrayLike(a *Array) *Array
	// Allocate memory with given size in 32 bit words
	NewBuffer(size int) Buffer
}

Device interface type

func NewDevice

func NewDevice(useGPU bool) Device

Initialise new CPU or GPU device

type Function

type Function struct {
	// contains filtered or unexported fields
}

Function which may be called via the queue

func Axpy

func Axpy(alpha float32, x, y *Array) Function

Array addition and scaling: y <- alpha*x + y

func Copy

func Copy(src, dst *Array) Function

Copy from src to dst, broadcasting a vector to a matrix if needed; the vector is tiled row wise

func Div

func Div(epsilon float32, a, b, c *Array) Function

Element wise array division: c = a / (b+epsilon)

func Fill

func Fill(a *Array, scalar float32) Function

Fill array with a scalar value

func Gemm

func Gemm(alpha float32, mA, mB, mC *Array, aTrans, bTrans TransType, incr bool) Function

Matrix matrix multiplication: mC <- alpha*dot(mA, mB) or mC <- alpha*dot(mA, mB) + mC if incr = true
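
For example (continuing the overview sketch), multiplying a 2 x 3 matrix of ones by a 3 x 4 matrix of twos gives a 2 x 4 matrix where every element is 6. Treating Dims as [rows, cols] here is an assumption:

a := dev.NewArray(num.Float32, 2, 3) // 2 x 3
b := dev.NewArray(num.Float32, 3, 4) // 3 x 4
c := dev.NewArray(num.Float32, 2, 4) // 2 x 4 result
q.Call(
	num.Fill(a, 1),
	num.Fill(b, 2),
	num.Gemm(1, a, b, c, num.NoTrans, num.NoTrans, false), // c <- 1*dot(a, b)
)
q.Finish()
fmt.Println(c.String(q)) // each element is 6 (three products of 1*2)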

func Gemv

func Gemv(alpha float32, mA, x, y *Array, aTrans TransType) Function

Matrix vector multiplication: y <- alpha * dot(mA,x)

func Max

func Max(a, b, c *Array) Function

Element wise maximum: c = max(a, b)

func Min

func Min(a, b, c *Array) Function

Element wise minimum: c = min(a, b)

func Mul

func Mul(a, b, c *Array) Function

Element wise array multiplication: c = a*b

func Neq

func Neq(x, y, res *Array) Function

Element wise != comparison

func Onehot

func Onehot(x, y *Array, classes int) Function

Convert to one hot representation

func QuadraticLoss

func QuadraticLoss(x, y, res *Array) Function

Quadratic loss function: (x-y)**2

func Read

func Read(a *Array, data interface{}) Function

Read data from array into a slice.

func Scale

func Scale(alpha float32, x *Array) Function

Scale array elementwise

func Softmax

func Softmax(x, res *Array) Function

Softmax activation function

func SoftmaxLoss

func SoftmaxLoss(x, y, res *Array) Function

Softmax loss function

func Sqrt

func Sqrt(x, y *Array) Function

Element wise square root: y <- sqrt(x)

func Square

func Square(x, y *Array) Function

Element wise square of array: y <- x**2

func Sum

func Sum(a, total *Array) Function

Calculate the scalar sum of the values in the array.

func Transpose

func Transpose(mA, mB *Array) Function

Transpose sets mB to a copy of mA with the data transposed.

func Unhot

func Unhot(x, y *Array) Function

Convert from OneHot format back to labels

func Write

func Write(a *Array, data interface{}) Function

Write data from a slice into the given array.

func WriteCol

func WriteCol(a *Array, col int, data interface{}) Function

Write to one column in the array

func (Function) String

func (f Function) String() string

type Layer

type Layer interface {
	InShape() []int
	OutShape() []int
	Fprop(q Queue, in *Array, work Buffer, trainMode bool) *Array
	Bprop(q Queue, grad, dsrc *Array, work [3]Buffer) *Array
	Output() *Array
	Memory() (weights, outputs, temp int)
	BpropData() bool
	Release()
}

Layer interface type represents an Activation or MaxPool layer

func NewActivationLayer

func NewActivationLayer(q Queue, typ string, shape []int) Layer

Create new activation layer, typ may be sigmoid, tanh or relu
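
A hedged sketch of a forward pass through a relu layer (dev and q as in the overview sketch). It assumes the temp value returned by Memory gives the required work buffer size in 32 bit words, and that a nil work buffer is acceptable when no temp space is reported:

shape := []int{10, 32} // illustrative shape
relu := num.NewActivationLayer(q, "relu", shape)
in := dev.NewArray(num.Float32, shape...)
_, _, temp := relu.Memory()
var work num.Buffer
if temp > 0 {
	work = dev.NewBuffer(temp) // assumed: temp is the Fprop workspace size
}
q.Call(num.Fill(in, -1))
out := relu.Fprop(q, in, work, false) // trainMode = false
q.Finish()
fmt.Println(out.String(q)) // relu clamps the negative inputs to zero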

func NewDropoutLayer

func NewDropoutLayer(q Queue, ratio float64, shape []int, seed int64) Layer

Create new dropout layer.

func NewPoolLayer

func NewPoolLayer(q Queue, opts LayerOpts, inShape []int, size, stride int, pad, average bool) Layer

Create new max pooling layer, prev layer should be a ConvLayer

type LayerOpts

type LayerOpts int

const (
	FpropOnly    LayerOpts = 0
	NoBias       LayerOpts = 1
	BpropData    LayerOpts = 2
	BpropWeights LayerOpts = 4
)

func (LayerOpts) String

func (l LayerOpts) String() string

type ParamLayer

type ParamLayer interface {
	Layer
	BiasShape() []int
	FilterShape() []int
	SetParamData(W, B, dW, dB *Array)
}

Param layer also has weights and biases

type Queue

type Queue interface {
	Device
	Dev() Device
	// Asynchronous function call
	Call(args ...Function) Queue
	// Wait for any pending requests to complete
	Finish()
	// Shutdown the queue and release any resources
	Shutdown()
	// Enable profiling
	Profiling(on bool, title string)
	Profile() string
}

A Queue processes a series of operations on a Device
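
Since Call returns the Queue, operations can be chained. A small sketch of profiling a batch of calls, with x and y as in the overview sketch:

q.Profiling(true, "axpy demo")
q.Call(num.Fill(y, 0)).Call(num.Axpy(1, x, y)) // Call returns the Queue, so calls chain
q.Finish()
fmt.Println(q.Profile())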

type TransType

type TransType int

TransType flag indicates if matrix is transposed

const (
	NoTrans TransType = C.CblasNoTrans
	Trans   TransType = C.CblasTrans
)

Directories

Path	Synopsis
cuda	Package cuda contains wrapper functions for the Cuda API
mkl	Package mkl wraps the Intel MKL DNN functions
