Package ag
v1.1.0
This package is not in the latest version of its module.
Published: Oct 30, 2023 License: BSD-2-Clause Imports: 13 Imported by: 23

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Abs

func Abs(x mat.Tensor) mat.Tensor

Abs returns a new operator node as a result of the `Abs` function.

func Add

func Add(x1 mat.Tensor, x2 mat.Tensor) mat.Tensor

Add returns a new operator node as a result of the gradfn.Add function. As a special case, the first node may be nil. This helps keep the code as concise as possible, e.g. during accumulation.

func AddScalar

func AddScalar(x1, x2 mat.Tensor) mat.Tensor

AddScalar returns a new operator node as a result of the gradfn.AddScalar function.

func Affine

func Affine(b, w1, x1 mat.Tensor, wxPairs ...mat.Tensor) mat.Tensor

Affine returns a new operator node as a result of the gradfn.Affine function.

func AppendRows

func AppendRows(x mat.Tensor, vs ...mat.Tensor) mat.Tensor

AppendRows returns a new operator node as a result of the gradfn.AppendRows function.

func At

func At(x mat.Tensor, indices ...int) mat.Tensor

At returns a new operator node as a result of the gradfn.At function.

func Backward

func Backward(xs ...mat.Tensor) error

Backward initiates back-propagation from the input tensors.

The function operates according to the following mutually exclusive rules:

  • If a tensor already has gradients (likely assigned externally via node.AccGrad()), those gradients are used.
  • If a tensor has no gradients assigned and is a scalar, the output gradient is assigned automatically by taking the derivative of the tensor with respect to itself (dy/dy = 1).
  • If a tensor has no gradients assigned and is not a scalar, an error is returned.

During the back-propagation process, the gradients of all tensors, except for the given ones, are summed into any existing gradients. Unless you intend otherwise, ensure that all tensors start with zero gradients.

func BiAffine

func BiAffine(w, u, v, b, x1, x2 mat.Tensor) mat.Tensor

BiAffine performs a biaffine transformation.

func BiLinear

func BiLinear(w, x1, x2 mat.Tensor) mat.Tensor

BiLinear performs a bilinear transformation of the type (x_1 W x_2)

func CELU

func CELU(x, alpha mat.Tensor) mat.Tensor

CELU returns a new operator node as a result of the gradfn.CELU function.

func ColView

func ColView(x mat.Tensor, column int) mat.Tensor

ColView returns a new operator node as a result of the gradfn.ColView function.

func ColViews

func ColViews(x mat.Tensor) []mat.Tensor

ColViews calls ColView for each column of x, returning a new slice of column-view Nodes.

func Concat

func Concat(xs ...mat.Tensor) mat.Tensor

Concat returns a new operator node as a result of the gradfn.Concat function.

func Copy added in v1.1.0

func Copy(x mat.Tensor) mat.Tensor

Copy returns a new operator node as a result of the gradfn.Copy function.

func Cos

func Cos(x mat.Tensor) mat.Tensor

Cos returns a new operator node as a result of the `Cos` function.

func Div

func Div(x1, x2 mat.Tensor) mat.Tensor

Div returns a new operator node as a result of the gradfn.Div function.

func DivScalar

func DivScalar(x1, x2 mat.Tensor) mat.Tensor

DivScalar returns a new operator node as a result of the gradfn.DivScalar function.

func Dot

func Dot(x1, x2 mat.Tensor) mat.Tensor

Dot returns a new operator node as a result of the gradfn.Dot function.

func Dropout

func Dropout(x mat.Tensor, p float64) mat.Tensor

Dropout returns a new operator node as a result of the gradfn.Dropout function. If the dropout probability is zero, the operator will not be created, so the input itself is returned directly.

func DropoutFunc

func DropoutFunc(p float64) func(x mat.Tensor) mat.Tensor

DropoutFunc returns a function to create a Dropout operator working with the given dropout probability.

func ELU

func ELU(x, alpha mat.Tensor) mat.Tensor

ELU returns a new operator node as a result of the gradfn.ELU function.

func Exp

func Exp(x mat.Tensor) mat.Tensor

Exp returns a new operator node as a result of the `Exp` function.

func Flatten

func Flatten(x mat.Tensor) mat.Tensor

Flatten returns a new operator node as a result of the gradfn.Flatten function.

func GELU

func GELU(x mat.Tensor) mat.Tensor

GELU returns a new operator node as a result of the gradfn.GELU function.

func HardSigmoid

func HardSigmoid(x mat.Tensor) mat.Tensor

HardSigmoid returns a new operator node as a result of the `HardSigmoid` function.

func HardTanh

func HardTanh(x mat.Tensor) mat.Tensor

HardTanh returns a new operator node as a result of the `HardTanh` function.

func LeakyReLU

func LeakyReLU(x, alpha mat.Tensor) mat.Tensor

LeakyReLU returns a new operator node as a result of the gradfn.LeakyReLU function.

func Log

func Log(x mat.Tensor) mat.Tensor

Log returns a new operator node as a result of the `Log` function.

func LogSoftmax

func LogSoftmax(x mat.Tensor) mat.Tensor

LogSoftmax returns a new operator node as a result of Log(Softmax(x)).

func LogSumExp

func LogSumExp(xs ...mat.Tensor) mat.Tensor

LogSumExp computes the log of the sum of exponentials of the input elements (the "LogSumExp trick"). When given a single input, it must be a vector; otherwise, the calculation is performed over the list of scalars.

func ManualSeed

func ManualSeed(seed uint64) *rand.LockedRand

ManualSeed sets the seed for generating random numbers.

func Map

func Map(mapping func(mat.Tensor) mat.Tensor, xs []mat.Tensor) []mat.Tensor

Map returns a transformed version of xs with all its components modified according to the mapping function. It is useful for applying an operator to a sequence of nodes. Keep in mind that using this function has an overhead because of the callback, however insignificant compared to mathematical computations.

func Map2

func Map2(mapping func(a mat.Tensor, b mat.Tensor) mat.Tensor, xs1 []mat.Tensor, xs2 []mat.Tensor) []mat.Tensor

Map2 applies a mapping function (which must take two arguments) to the items of the two node slices in parallel. It panics if one slice is shorter than the other.

func Max

func Max(x1, x2 mat.Tensor) mat.Tensor

Max returns a new operator node as a result of the gradfn.Max function.

func MaxPooling

func MaxPooling(x mat.Tensor, rows, columns int) mat.Tensor

MaxPooling returns a new operator node as a result of the gradfn.MaxPooling function.

func Maximum

func Maximum(xs []mat.Tensor) mat.Tensor

Maximum returns the value that describes the maximum of the sample.

func Mean

func Mean(xs []mat.Tensor) mat.Tensor

Mean returns the value that describes the average of the sample.

func Min

func Min(x1, x2 mat.Tensor) mat.Tensor

Min returns a new operator node as a result of the gradfn.Min function.

func Minimum

func Minimum(xs []mat.Tensor) mat.Tensor

Minimum returns the value that describes the minimum of the sample.

func Mish

func Mish(x mat.Tensor) mat.Tensor

Mish returns a new operator node as a result of the `Mish` function.

func Mul

func Mul(x1, x2 mat.Tensor) mat.Tensor

Mul returns a new operator node as a result of the gradfn.Mul function.

func MulT

func MulT(x1, x2 mat.Tensor) mat.Tensor

MulT returns a new operator node as a result of the gradfn.MulT function.

func Neg

func Neg(x mat.Tensor) mat.Tensor

Neg returns a new operator node as a result of the `Neg` function.

func Pad

func Pad(xs []mat.Tensor, seqLen int, padding func(i int) mat.Tensor) []mat.Tensor

Pad down/up samples the input to the given size.

func PositiveELU

func PositiveELU(x mat.Tensor) mat.Tensor

PositiveELU returns a new operator node as a result of ELU(x) + 1.

func Pow

func Pow(x mat.Tensor, power float64) mat.Tensor

Pow returns a new operator node as a result of the gradfn.Pow function.

func Prod

func Prod(x1, x2 mat.Tensor) mat.Tensor

Prod returns a new operator node as a result of the gradfn.Prod function.

func ProdScalar

func ProdScalar(x1, x2 mat.Tensor) mat.Tensor

ProdScalar returns a new operator node as a result of the gradfn.ProdScalar function.

func Rand added in v1.1.0

func Rand() *rand.LockedRand

Rand returns the global random number generator.

func ReLU

func ReLU(x mat.Tensor) mat.Tensor

ReLU returns a new operator node as a result of the `ReLU` function.

func Reciprocal

func Reciprocal(x mat.Tensor) mat.Tensor

Reciprocal returns a new operator node as a result of the `Reciprocal` function.

func ReduceMax

func ReduceMax(x mat.Tensor) mat.Tensor

ReduceMax returns a new operator node as a result of the gradfn.ReduceMax function.

func ReduceMean

func ReduceMean(x mat.Tensor) mat.Tensor

ReduceMean returns a new operator node as a result of the gradfn.ReduceMean function.

func ReduceSum

func ReduceSum(x mat.Tensor) mat.Tensor

ReduceSum returns a new operator node as a result of the gradfn.ReduceSum function.

func Reshape

func Reshape(x mat.Tensor, rows, columns int) mat.Tensor

Reshape returns a new operator node as a result of the gradfn.Reshape function.

func ReverseSub

func ReverseSub(x1, x2 mat.Tensor) mat.Tensor

ReverseSub returns a new operator node as a result of the fn.ReverseSub function.

func ReverseSubOne added in v1.1.0

func ReverseSubOne(x mat.Tensor) mat.Tensor

ReverseSubOne returns a new operator node as a result of applying reverse subtraction with 1.0 to the input using the fn.ReverseSub function.

func RotateR

func RotateR(x mat.Tensor, i int) mat.Tensor

RotateR performs the right circular shift. `i` is the number of places by which the elements are shifted.

func RowView

func RowView(x mat.Tensor, row int) mat.Tensor

RowView returns a new operator node as a result of the gradfn.RowView function.

func RowViews

func RowViews(x mat.Tensor) []mat.Tensor

RowViews calls RowView for each row of x, returning a new slice of row-view Nodes.

func SELU

func SELU(x, alpha mat.Tensor, scale mat.Tensor) mat.Tensor

SELU returns a new operator node as a result of the gradfn.SELU function.

func ScalarMax

func ScalarMax(xs []mat.Tensor) mat.Tensor

ScalarMax returns a new operator node as a result of the gradfn.ScalarMax function.

func Seed

func Seed() *rand.LockedRand

Seed sets the seed for generating random numbers to the current time (converted to uint64).

func SeparateMatrix

func SeparateMatrix(x mat.Tensor) [][]mat.Tensor

SeparateMatrix returns a matrix of Node(s), represented as a slice of slices, containing the elements extracted from the input. The dimensions of the resulting matrix are the same as the input's.

func SeparateVec

func SeparateVec(x mat.Tensor) []mat.Tensor

SeparateVec returns a slice of Node(s) containing the elements extracted from the input. The size of the vector equals the number of input elements. You can think of this method as the inverse of the ag.Concat operator.

func SetForceSyncExecution added in v1.1.0

func SetForceSyncExecution(enable bool)

SetForceSyncExecution enables or disables the forcing of synchronous execution for all operators. When enabled, the operators will run synchronously, regardless of the "async" flag in the Run() function. This setting can be particularly useful for debugging.

func SiLU

func SiLU(x mat.Tensor) mat.Tensor

SiLU returns a new operator node as a result of the fn.SiLU function.

func Sigmoid

func Sigmoid(x mat.Tensor) mat.Tensor

Sigmoid returns a new operator node as a result of the `Sigmoid` function.

func Sin

func Sin(x mat.Tensor) mat.Tensor

Sin returns a new operator node as a result of the `Sin` function.

func Slice

func Slice(x mat.Tensor, fromRow, fromCol, toRow, toCol int) mat.Tensor

Slice returns a new operator node as a result of the gradfn.Slice function.

func SoftPlus

func SoftPlus(x, beta, threshold mat.Tensor) mat.Tensor

SoftPlus returns a new operator node as a result of the gradfn.SoftPlus function.

func SoftShrink

func SoftShrink(x, lambda mat.Tensor) mat.Tensor

SoftShrink returns a new operator node as a result of the gradfn.SoftShrink function.

func Softmax

func Softmax(x mat.Tensor) mat.Tensor

Softmax returns a new operator node as a result of the gradfn.Softmax function.

func Softsign

func Softsign(x mat.Tensor) mat.Tensor

Softsign returns a new operator node as a result of the `SoftSign` function.

func SparseMax

func SparseMax(x mat.Tensor) mat.Tensor

SparseMax returns a new operator node as a result of the gradfn.SparseMax function.

func SparseMaxLoss

func SparseMaxLoss(x mat.Tensor) mat.Tensor

SparseMaxLoss returns a new operator node as a result of the gradfn.SparseMaxLoss function.

func SplitVec

func SplitVec(x mat.Tensor, chunks int) []mat.Tensor

SplitVec splits the x Node into multiple chunks.

func Sqrt

func Sqrt(x mat.Tensor) mat.Tensor

Sqrt returns a new operator node as a result of the `Sqrt` function.

func Square

func Square(x mat.Tensor) mat.Tensor

Square returns a new operator node as a result of the gradfn.Prod(x, x) function.

func Stack

func Stack(xs ...mat.Tensor) mat.Tensor

Stack returns a new operator node as a result of the gradfn.Stack function.

func StopGrad

func StopGrad(t mat.Tensor) mat.Tensor

StopGrad creates a new GradientBlocker that stops the accumulated gradients from flowing through the wrapped Node.

func Sub

func Sub(x1, x2 mat.Tensor) mat.Tensor

Sub returns a new operator node as a result of the gradfn.Sub function.

func SubScalar

func SubScalar(x1, x2 mat.Tensor) mat.Tensor

SubScalar returns a new operator node as a result of the gradfn.SubScalar function.

func Sum

func Sum(xs ...mat.Tensor) mat.Tensor

Sum returns the value that describes the sum of the sample. It panics if the input is empty.

func Swish

func Swish(x mat.Tensor) mat.Tensor

Swish returns a new operator node as a result of the gradfn.Swish function.

func SwishB

func SwishB(x, beta mat.Tensor) mat.Tensor

SwishB returns a new operator node as a result of the gradfn.SwishB function.

func T

func T(x mat.Tensor) mat.Tensor

T returns a new operator node as a result of the fn.T function.

func Tan

func Tan(x mat.Tensor) mat.Tensor

Tan returns a new operator node as a result of the `Tan` function.

func Tanh

func Tanh(x mat.Tensor) mat.Tensor

Tanh returns a new operator node as a result of the `Tanh` function.

func Threshold

func Threshold(x, threshold, k mat.Tensor) mat.Tensor

Threshold returns a new operator node as a result of the gradfn.Threshold function.

Types

type AutoGradFunction added in v1.1.0

type AutoGradFunction interface {
	// Forward computes the output of the function.
	Forward() (mat.Tensor, error)
	// Backward computes the backward pass given the gradient of the output.
	Backward(gy mat.Tensor) error
	// Operands returns the list of operands.
	Operands() []mat.Tensor
}

AutoGradFunction represents a function with automatic differentiation features. It's used to define a new operator.

type GradientBlocker added in v1.1.0

type GradientBlocker struct {
	mat.Tensor
}

GradientBlocker embeds any tensor implementation, disabling gradient handling and blocking gradient accumulation.

func (*GradientBlocker) AccGrad added in v1.1.0

func (r *GradientBlocker) AccGrad(_ mat.Tensor)

AccGrad has no effects on a GradientBlocker Node.

func (*GradientBlocker) Grad added in v1.1.0

func (r *GradientBlocker) Grad() mat.Tensor

Grad always returns nil on a GradientBlocker Node.

func (*GradientBlocker) HasGrad added in v1.1.0

func (r *GradientBlocker) HasGrad() bool

HasGrad always returns false on a GradientBlocker Node.

func (*GradientBlocker) RequiresGrad added in v1.1.0

func (r *GradientBlocker) RequiresGrad() bool

RequiresGrad always returns false on a GradientBlocker Node.

func (*GradientBlocker) ZeroGrad added in v1.1.0

func (r *GradientBlocker) ZeroGrad()

ZeroGrad has no effects on a GradientBlocker Node.

type Operator

type Operator struct {
	// contains filtered or unexported fields
}

Operator is a type of node. It's used to represent a function with automatic differentiation features.

func NewOperator

func NewOperator(f AutoGradFunction) *Operator

NewOperator creates a new operator with the given AutoGradFunction. Note that the operator's Value() can only be accessed after calling the Run() function.

func (*Operator) AccGrad

func (o *Operator) AccGrad(grad mat.Tensor)

AccGrad accumulates the gradients to the node itself.

func (*Operator) At added in v1.1.0

func (o *Operator) At(indices ...int) mat.Tensor

At returns the value at the given indices. It panics if the given indices are out of range.

func (*Operator) Data added in v1.1.0

func (o *Operator) Data() float.Slice

Data returns the underlying data of the tensor.

func (*Operator) Dims added in v1.1.0

func (o *Operator) Dims() int

Dims returns the number of dimensions.

func (*Operator) Grad

func (o *Operator) Grad() mat.Tensor

Grad returns the gradients accumulated during the backward pass.

func (*Operator) HasGrad

func (o *Operator) HasGrad() bool

HasGrad returns true if there are accumulated gradients.

func (*Operator) Item added in v1.1.0

func (o *Operator) Item() float.Float

Item returns the result's value as a scalar float.Float.

func (*Operator) Operands

func (o *Operator) Operands() []mat.Tensor

Operands returns the operands of the operator.

func (*Operator) RequiresGrad

func (o *Operator) RequiresGrad() bool

RequiresGrad returns true if the node requires gradients.

func (*Operator) Run added in v1.1.0

func (o *Operator) Run(async ...bool) *Operator

Run starts the execution of the operator, performing the forward pass. If the optional async argument is set to true, the forward pass will be executed in a separate goroutine. The function returns a pointer to the Operator, allowing for method chaining.

func (*Operator) SetAt added in v1.1.0

func (o *Operator) SetAt(m mat.Tensor, indices ...int)

SetAt sets the value at the given indices. It panics if the given indices are out of range.

func (*Operator) Shape added in v1.1.0

func (o *Operator) Shape() []int

Shape returns the size in each dimension.

func (*Operator) Size added in v1.1.0

func (o *Operator) Size() int

Size returns the total number of elements.

func (*Operator) Value

func (o *Operator) Value() mat.Tensor

Value returns the result of the function.

func (*Operator) ZeroGrad

func (o *Operator) ZeroGrad()

ZeroGrad clears the gradients.
