Documentation ¶
Overview ¶
Package lshattention provides an implementation of the LSH-Attention model, as described in `Reformer: The Efficient Transformer` by N. Kitaev, Ł. Kaiser, A. Levskaya (https://arxiv.org/pdf/2001.04451.pdf). TODO: check compatibility with the LSH Attention implemented by Hugging Face (https://huggingface.co/transformers/model_doc/reformer.html).
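To give a feel for the bucketing step behind LSH-Attention, here is a minimal, self-contained sketch of the angular LSH scheme used in the Reformer paper: a vector is projected with a random matrix R and assigned the bucket argmax([xR; -xR]). The function names and the use of plain float64 slices are illustrative assumptions only, not this package's API; nHalf plays the role of Config.BucketSize (num of buckets / 2) described below.

package main

import (
	"fmt"
	"math"
	"math/rand"
)

// lshHash hashes x by projecting it with a random matrix r of shape
// [dim][nHalf] and taking the argmax over the concatenation [xR; -xR],
// yielding one of 2*nHalf buckets. Vectors with small angular distance
// tend to land in the same bucket, so attention can be restricted to
// keys sharing the query's bucket.
func lshHash(x []float64, r [][]float64) int {
	nHalf := len(r[0])
	bucket, best := 0, math.Inf(-1)
	for j := 0; j < nHalf; j++ {
		var dot float64
		for i, xi := range x {
			dot += xi * r[i][j]
		}
		// xR[j] competes for bucket j, -xR[j] for bucket j+nHalf.
		if dot > best {
			bucket, best = j, dot
		}
		if -dot > best {
			bucket, best = j+nHalf, -dot
		}
	}
	return bucket
}

func main() {
	const dim, nHalf = 4, 3 // hypothetical sizes; 2*nHalf = 6 buckets
	r := make([][]float64, dim)
	for i := range r {
		r[i] = make([]float64, nHalf)
		for j := range r[i] {
			r[i][j] = rand.NormFloat64()
		}
	}
	q := []float64{0.1, -0.7, 0.3, 0.9}
	fmt.Println("bucket:", lshHash(q, r)) // one of 0..2*nHalf-1
}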
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Config ¶
type Config struct {
	InputSize   int
	QuerySize   int
	ValueSize   int
	BucketSize  int // num of buckets / 2
	ScaleFactor mat.Float
}
Config provides configuration settings for an LSH-Attention Model.
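As a hypothetical usage sketch, a Config might be filled in as follows; the import path follows the spaGO repository layout and the concrete values are assumptions for illustration.

package main

import (
	"fmt"

	"github.com/nlpodyssey/spago/pkg/ml/nn/lshattention"
)

func main() {
	// Hypothetical values. BucketSize is half the number of LSH buckets,
	// so this configuration hashes into 2*8 = 16 buckets.
	config := lshattention.Config{
		InputSize:   64,
		QuerySize:   64,
		ValueSize:   64,
		BucketSize:  8,
		ScaleFactor: 0.125, // typically 1/sqrt(QuerySize)
	}
	fmt.Printf("%+v\n", config)
}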
type ContextProb ¶
type ContextProb struct {
	// Context encodings.
	Context []ag.Node
	// Prob attention scores.
	Prob []mat.Matrix
}
ContextProb is a pair of Context encodings and Prob attention scores.
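As a sketch of how the pair might be consumed, the helper below walks the two parallel slices and prints the shape of each context encoding next to its attention-score matrix. The import path and the Value()/Rows()/Columns() accessors are assumptions based on the spaGO repository layout; the helper itself is hypothetical.

package example

import (
	"fmt"

	"github.com/nlpodyssey/spago/pkg/ml/nn/lshattention"
)

// PrintShapes pairs each context encoding with its attention scores and
// reports their dimensions, mirroring the parallel layout of ContextProb.
func PrintShapes(cp lshattention.ContextProb) {
	for i, node := range cp.Context {
		v := node.Value()
		fmt.Printf("position %d: context %dx%d, scores %dx%d\n",
			i, v.Rows(), v.Columns(), cp.Prob[i].Rows(), cp.Prob[i].Columns())
	}
}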