Documentation ¶
Overview ¶
Package trainer is used for training networks. There is not much support for this yet. It will have vanilla and momentum trainers that run on a device. It is hard to build any kind of trainer on top of cuDNN; its OpTensor operation is fairly limited.
Index ¶
- func CreateTrainingMem(handle *cudnn.Handler, trainer Trainer, w *layers.Tensor) error
- func DebuggingAdam()
- func SetupAdamWandB(tctx *xtra.Handle, decay1, decay2 float32, batch int32) (*Adam, *Adam, error)
- func SetupAdamWandB2(handle *cudnn.Handler, rate, dwalpha, decay1, decay2 float32, batch int32) (*Adam, *Adam, error)
- type Adam
- func (a *Adam) Dims() []int32
- func (a *Adam) L1L2Loss() (float32, float32)
- func (a *Adam) SetBatch(batch float32)
- func (a *Adam) SetBeta1(beta1 float32)
- func (a *Adam) SetBeta2(beta2 float32)
- func (a *Adam) SetDecay1(decay1 float32)
- func (a *Adam) SetDecay2(decay2 float32)
- func (a *Adam) SetDecays(l1, l2 float32)
- func (a *Adam) SetEps(eps float32)
- func (a *Adam) SetRates(rate, dwalpha float32)
- func (a *Adam) SetTrainingMem(han *cudnn.Handler, w *layers.Tensor) error
- func (a *Adam) UpdateWeights(handle *cudnn.Handler, dw, w *layers.Tensor, batchsize, counter int) error
- type Momentum
- func (t *Momentum) L1L2Loss() (float32, float32)
- func (t *Momentum) SetDecays(l1, l2 float32)
- func (t *Momentum) SetRate(rate float32)
- func (t *Momentum) SetTrainingMem(handle *cudnn.Handler, w *layers.Tensor) error
- func (t *Momentum) UpdateWeights(handle *cudnn.Handler, dw, w *layers.Tensor, batch int) error
- type Settings
- type TSettings
- type Trainer
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func CreateTrainingMem ¶
CreateTrainingMem creates the training memory for the trainer.
func SetupAdamWandB ¶
SetupAdamWandB returns an Adam trainer for each of the weights (W) and the bias (B).
Types ¶
type Adam ¶
type Adam struct {
// contains filtered or unexported fields
}
Adam is a struct that holds the parameters for Adam optimization.
func (*Adam) SetTrainingMem ¶
SetTrainingMem creates the training memory for the Adam trainer.
type Momentum ¶
type Momentum struct {
// contains filtered or unexported fields
}
Momentum is a struct that is used for the momentum operation when updating weights.
func SetupMomentum ¶
SetupMomentum sets up the trainer. For the one and zero arguments, pass the cscalar values 1 and 0 in the matching data type (for example, gocudnn.CFloat(1.0) and gocudnn.CFloat(0.0)). This is a hack, but it has to be done for sanity's sake (my sanity, not yours :) ). I know of a way to fix this, but I am not able to do that right now. The reflect package might help (maybe not). The best alternative I can think of is a type switch, but I would have to cover every type, and I might add more types to the gocudnn package. Or I could implement some of the training code in C inside the gocudnn package.
func (*Momentum) SetTrainingMem ¶
SetTrainingMem will load the gsum values
type Settings ¶
type Settings struct {
	Beta1    float64 `json:"beta_1,omitempty"`
	Beta2    float64 `json:"beta_2,omitempty"`
	Decay1   float64 `json:"decay_1,omitempty"`
	Decay2   float64 `json:"decay_2,omitempty"`
	Rate     float64 `json:"rate,omitempty"`
	Momentum float64 `json:"momentum,omitempty"`
	Eps      float64 `json:"eps,omitempty"`
	Batch    float64 `json:"batch,omitempty"`
	Managed  bool    `json:"managed,omitempty"`
}
Settings contains the settings of a trainer
type Trainer ¶
type Trainer interface {
	UpdateWeights(ctx *cudnn.Handler, dw, w *layers.Tensor, batch, counter int) error
	L1L2Loss() (float32, float32)
	SetRates(rate, dwalpha float32)
	SetDecays(l1, l2 float32)
}
Trainer is used for updating weights. Only Momentum and Adam are available right now.