package attention

v1.1.0
Warning: This package is not in the latest version of its module.
Published: Oct 30, 2023 License: BSD-2-Clause Imports: 3 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func ScaledDotProductAttention

func ScaledDotProductAttention(q []mat.Tensor, k, v, scaleFactor mat.Tensor, useCausalMask bool) ([]mat.Tensor, []mat.Tensor)

ScaledDotProductAttention is a self-attention mechanism that relates different positions of a single sequence to compute a representation of that same sequence. It requires that the query, key, and value vectors have already been obtained from the input sequence. The scale factor is the square root of the dimension of the key vectors.
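
For reference, the computation is Attention(Q, K, V) = softmax(QKᵀ / scaleFactor) · V, where the scale factor is √dₖ and the optional causal mask prevents each position from attending to later positions. Below is a minimal, self-contained sketch of that computation in Go using plain float64 slices. It illustrates the math only; it does not use this package's mat.Tensor-based API, and all identifiers in it are hypothetical.

package main

import (
	"fmt"
	"math"
)

// scaledDotProductAttention is a hypothetical sketch of the computation:
// for each query, scores[j] = q·k[j] / scaleFactor over all keys,
// weights = softmax(scores), and output = Σ_j weights[j] * v[j].
// q, k, and v are slices of per-position vectors.
func scaledDotProductAttention(q, k, v [][]float64, scaleFactor float64, useCausalMask bool) (outputs, weights [][]float64) {
	for i, qi := range q {
		scores := make([]float64, len(k))
		for j, kj := range k {
			if useCausalMask && j > i {
				scores[j] = math.Inf(-1) // mask out future positions
				continue
			}
			var dot float64
			for d := range qi {
				dot += qi[d] * kj[d]
			}
			scores[j] = dot / scaleFactor
		}
		w := softmax(scores)
		out := make([]float64, len(v[0]))
		for j, wj := range w {
			for d := range out {
				out[d] += wj * v[j][d]
			}
		}
		outputs = append(outputs, out)
		weights = append(weights, w)
	}
	return outputs, weights
}

// softmax converts raw scores into a probability distribution,
// subtracting the max score first for numerical stability.
// Masked scores of -Inf become exact zeros in the output.
func softmax(scores []float64) []float64 {
	maxScore := math.Inf(-1)
	for _, s := range scores {
		if s > maxScore {
			maxScore = s
		}
	}
	out := make([]float64, len(scores))
	var sum float64
	for i, s := range scores {
		out[i] = math.Exp(s - maxScore)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	// Toy sequence of three positions with 2-dimensional q/k/v vectors.
	q := [][]float64{{1, 0}, {0, 1}, {1, 1}}
	k := [][]float64{{1, 0}, {0, 1}, {1, 1}}
	v := [][]float64{{1, 2}, {3, 4}, {5, 6}}
	scale := math.Sqrt(2) // square root of the key dimension

	out, w := scaledDotProductAttention(q, k, v, scale, true)
	fmt.Println("outputs:", out)
	fmt.Println("weights:", w)
}

Each row of weights is the softmax distribution over key positions for the corresponding query; with the causal mask enabled, entries above the diagonal are zero, so position i attends only to positions 0..i.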

Types

This section is empty.

Directories

This section is empty.
