Documentation
Overview
Package nru provides an implementation of the NRU (Non-Saturating Recurrent Unit) recurrent network, as described in "Towards Non-Saturating Recurrent Units for Modelling Long-Term Dependencies" by Chandar et al., 2019 (https://www.aaai.org/ojs/index.php/AAAI/article/view/4200/4078).
Unfortunately, this implementation is currently very inefficient because the auto-grad (ag) package still lacks some of the required functionality.
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Config
type Config struct {
	InputSize    int
	HiddenSize   int
	MemorySize   int
	K            int
	UseReLU      bool
	UseLayerNorm bool
}
Config provides configuration settings for an NRU Model.
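For illustration, a configuration might be filled in as follows. The values and the import path are hypothetical; only the field names and types come from the declaration above.

	import "github.com/nlpodyssey/spago/pkg/ml/nn/recurrent/nru" // import path assumed

	// Example settings only; they are not tuned recommendations.
	var cfg = nru.Config{
		InputSize:    32,
		HiddenSize:   64,
		MemorySize:   64,
		K:            4,
		UseReLU:      true,
		UseLayerNorm: true,
	}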
type Model
type Model struct {
	nn.BaseModel
	Config
	SqrtMemK        int
	Wx              nn.Param `spago:"type:weights"`
	Wh              nn.Param `spago:"type:weights"`
	Wm              nn.Param `spago:"type:weights"`
	B               nn.Param `spago:"type:biases"`
	Whm2alpha       nn.Param `spago:"type:weights"`
	Bhm2alpha       nn.Param `spago:"type:biases"`
	Whm2alphaVec    nn.Param `spago:"type:weights"`
	Bhm2alphaVec    nn.Param `spago:"type:biases"`
	Whm2beta        nn.Param `spago:"type:weights"`
	Bhm2beta        nn.Param `spago:"type:biases"`
	Whm2betaVec     nn.Param `spago:"type:weights"`
	Bhm2betaVec     nn.Param `spago:"type:biases"`
	HiddenLayerNorm *layernorm.Model
	States          []*State `spago:"scope:processor"`
}
Model contains the serializable parameters.
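Constructors are not part of this excerpt. Assuming the package follows the usual spago convention of a New(Config) *Model factory, a model would be built from a configuration roughly like this:

	// newModel builds an NRU model from its configuration.
	// The New(Config) *Model constructor is an assumption, not documented above.
	func newModel(cfg nru.Config) *nru.Model {
		return nru.New(cfg)
	}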
func (*Model) Forward
Forward performs the forward step for each input node and returns the result.
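As a sketch, a whole sequence can be processed with a single call, one step per input node. The variadic Forward(xs ...ag.Node) []ag.Node signature and the ag import path are assumptions based on spago's recurrent models; they are not reproduced in this excerpt.

	import "github.com/nlpodyssey/spago/pkg/ml/ag" // import path assumed

	// processSequence runs one forward step per input node and returns the
	// node produced at each step.
	func processSequence(m *nru.Model, xs []ag.Node) []ag.Node {
		return m.Forward(xs...)
	}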
func (*Model) LastState
LastState returns the last state of the recurrent network. It returns nil if there are no states.
func (*Model) SetInitialState
SetInitialState sets the initial state of the recurrent network. It panics if one or more states are already present.
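Putting the two state methods together: the initial state must be set before the first forward step, and the final state can be read back afterwards. The *State parameter and return types below are assumptions, since the State type is not included in this excerpt.

	// primeAndRun seeds the recurrence with an initial state, processes the
	// sequence, and returns the state left by the last step.
	// SetInitialState panics if the model already holds states, so it must be
	// called before any Forward step.
	func primeAndRun(m *nru.Model, s0 *nru.State, xs []ag.Node) *nru.State {
		m.SetInitialState(s0)
		m.Forward(xs...)
		return m.LastState() // nil only if no input was processed
	}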