Documentation ¶
Index ¶
- Constants
- func LibVersion() fu.VersionType
- func LuckyObjectify(source iokit.InputOutput, collection ...string) model.PredictionModel
- func Objectify(source iokit.InputOutput, collection ...string) (fm model.PredictionModel, err error)
- func ObjectifyModel(c map[string]iokit.Input) (pm model.PredictionModel, err error)
- type Model
- type Param
- type Params
Constants ¶
const Binary = objective("binary:logistic")
const DartBoost = booster("dart")
const GammaRegress = objective("reg:gamma")
gamma regression with log-link. Output is the mean of the gamma distribution. It might be useful, e.g., for modeling insurance claims severity, or for any outcome that might be gamma-distributed.
const HingeBinary = objective("binary:hinge")
const Linear = objective("reg:linear")
const LinearBoost = booster("gblinear")
const Logistic = objective("reg:logistic")
const RawBinary = objective("binary:logitraw")
const Softmax = objective("multi:softmax")
set XGBoost to do multiclass classification using the softmax objective; you also need to set num_class (the number of classes)
const Softprob = objective("multi:softprob")
same as softmax, but outputs a vector of length ndata * nclass, which can be further reshaped to an ndata * nclass matrix. The result contains the predicted probability of each data point belonging to each class (a reshaping sketch follows at the end of this section).
const SquareLinear = objective("reg:squarederror")
const SqureLogistic = objective("reg:squaredlogerror")
const TreeBoost = booster("gbtree")
const Tweedie = objective("reg:tweedie")
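For the Softprob objective, the flat prediction vector can be viewed as an ndata x nclass matrix. The helper below is an illustrative sketch, not part of this package; it assumes the row-major layout implied by the description above (all class probabilities of one data point stored contiguously).

    // reshapeSoftprob views a flat ndata*nclass prediction vector as an
    // ndata x nclass matrix (one row of class probabilities per data point).
    func reshapeSoftprob(flat []float64, ndata, nclass int) [][]float64 {
        rows := make([][]float64, ndata)
        for i := 0; i < ndata; i++ {
            rows[i] = flat[i*nclass : (i+1)*nclass]
        }
        return rows
    }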
Variables ¶
This section is empty.
Functions ¶
func LibVersion ¶
func LibVersion() fu.VersionType
func LuckyObjectify ¶
func LuckyObjectify(source iokit.InputOutput, collection ...string) model.PredictionModel
LuckyObjectify is the errorless version of Objectify
func Objectify ¶
func Objectify(source iokit.InputOutput, collection ...string) (fm model.PredictionModel, err error)
Objectify creates an xgboost prediction object from an input
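A minimal usage sketch, not taken from the package examples: it assumes the package is imported as xgb and that iokit.File(path) yields an iokit.InputOutput for a local model file; check the iokit package for the actual constructor. LuckyObjectify can be used the same way when error handling is not needed.

    // Sketch only: "model.xgb" and iokit.File are illustrative assumptions.
    pm, err := xgb.Objectify(iokit.File("model.xgb"))
    if err != nil {
        log.Fatal(err)
    }
    _ = pm // a model.PredictionModel ready to produce predictions

    // errorless variant:
    // pm := xgb.LuckyObjectify(iokit.File("model.xgb"))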
Types ¶
type Model ¶
type Model struct {
    Algorithm booster
    Function  objective
    Seed      int    // random generator seed
    Predicted string // name of predicted value column

    // the minimum sum of weights of all observations required in a child
    MinChildWeight float64
    // Specifies the minimum loss reduction required to make a split.
    Gamma float64

    // Denotes the fraction of observations to be randomly sampled for each tree.
    // Typical values: 0.5-1
    Subsample float64

    Lambda float64 // L2 regularization
    Alpha  float64 // L1 regularization

    // Makes the model more robust by shrinking the weights on each step.
    // Typical values: 0.01-0.2
    LearningRate float64

    // The maximum depth of a tree.
    // Used to control over-fitting as higher depth will allow the model
    // to learn relations very specific to a particular sample.
    // Typical values: 3-10
    MaxDepth int

    Extra Params
}
Model is an XGBoost model definition
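A construction sketch using the fields and constants documented above; the field values are illustrative only, and training or feeding data to the model is outside the scope of this snippet.

    m := xgb.Model{
        Algorithm:    xgb.TreeBoost, // gradient boosted trees
        Function:     xgb.Binary,    // binary:logistic objective
        LearningRate: 0.1,           // typical values: 0.01-0.2
        MaxDepth:     6,             // typical values: 3-10
        Subsample:    0.8,           // typical values: 0.5-1
        Lambda:       1,             // L2 regularization
        Seed:         42,            // random generator seed
        Predicted:    "Label",       // illustrative name of the predicted value column
    }
    _ = m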
func (Model) Apply ¶
Apply applies parameters that specialize the model definition