Documentation ¶
Overview ¶
Package softmax wraps the softmax functions that gocudnn exposes from cuDNN. It does not expose cuDNN's algorithm and mode flags directly; instead, the staging functions below select them.
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type OpInfo ¶
type OpInfo struct {
	Algo gocudnn.SoftMaxAlgorithm `json:"Algo"`
	Mode gocudnn.SoftMaxMode      `json:"Mode"`
}
OpInfo contains all the information needed to build a softmax op.
type Ops ¶
type Ops struct {
// contains filtered or unexported fields
}
Ops performs the softmax algorithm.
func StageAccuratePerChannel ¶
func StageAccuratePerChannel() *Ops
StageAccuratePerChannel stages the op to do an accurate softmax per channel.
func StageAccuratePerInstance ¶
func StageAccuratePerInstance() *Ops
StageAccuratePerInstance stages the op to do an accurate softmax per instance.
func StageFastPerChannel ¶
func StageFastPerChannel() *Ops
StageFastPerChannel stages the op to do a fast softmax per channel.
func StageFastPerInstance ¶
func StageFastPerInstance() *Ops
StageFastPerInstance stages the op to do a fast softmax per instance.
func StageLogPerChannel ¶
func StageLogPerChannel() *Ops
StageLogPerChannel stages the op to do a log softmax per channel.
func StageLogPerInstance ¶
func StageLogPerInstance() *Ops
StageLogPerInstance stages the op to do a log softmax per instance.
func (*Ops) BackProp ¶
func (s *Ops) BackProp(handle *cudnn.Handler, alpha float64, y, dy *tensor.Volume, beta float64, dx *tensor.Volume) error
BackProp performs the backward propagation of the softmax op, writing the input gradient into dx.