xgb


README


package xgb_test

import (
	"fmt"
	"go-ml.dev/pkg/base/model"
	"go-ml.dev/pkg/dataset/mnist"
	"go-ml.dev/pkg/iokit"
	"go-ml.dev/pkg/xgb"
	"gotest.tools/assert"
	"testing"
)

// Test_mnistXgb trains a gradient-boosted-trees classifier on MNIST,
// verifies it reaches at least 96% accuracy, then reloads the saved
// model and re-checks accuracy on the held-out test set.
func Test_mnistXgb(t *testing.T) {
	modelFile := iokit.File(model.Path("mnist_test_xgb.zip"))
	report := xgb.Model{
		Algorithm:    xgb.TreeBoost,
		Function:     xgb.Softmax,
		LearningRate: 0.54,
		MaxDepth:     7,
		Extra:        map[string]interface{}{"tree_method": "hist"},
	}.Feed(model.Dataset{
		// flag a random 10% of rows (seed 42) as the test subset
		Source:   mnist.Data.RandomFlag(model.TestCol, 42, 0.1),
		Features: mnist.Features,
	}).LuckyTrain(model.Training{
		Iterations: 30,
		ModelFile:  modelFile,
		Metrics:    model.Classification{Accuracy: 0.96},
		Score:      model.AccuracyScore,
	})

	fmt.Println(report.TheBest, report.Score)
	fmt.Println(report.History.Round(5))
	assert.Assert(t, model.Accuracy(report.Test) >= 0.96)

	// restore the trained model from the archive and evaluate it
	// on the MNIST T10k test set
	pred := xgb.LuckyObjectify(modelFile)
	lr := model.LuckyEvaluate(mnist.T10k, model.LabelCol, pred, 32, model.Classification{})
	fmt.Println(lr.Round(5))
	assert.Assert(t, model.Accuracy(lr) >= 0.96)
}

Documentation

Constants

const Binary = objective("binary:logistic")
const DartBoost = booster("dart")
const GammaRegress = objective("reg:gamma")

Gamma regression with log-link. The output is the mean of a gamma distribution. It can be useful, e.g., for modeling insurance claim severity, or for any outcome that might be gamma-distributed.

const HingeBinary = objective("binary:hinge")
const Linear = objective("reg:linear")
const LinearBoost = booster("gblinear")
const Logistic = objective("reg:logistic")
const RawBinary = objective("binary:logitraw")
const Softmax = objective("multi:softmax")

Sets XGBoost to do multiclass classification using the softmax objective; you also need to set num_class (the number of classes).
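
For example, a hedged sketch of a multiclass configuration (the README test above relies on the wrapper to derive the class count from the dataset, so passing num_class explicitly through Extra is shown only as an assumption):

m := xgb.Model{
	Algorithm: xgb.TreeBoost,
	Function:  xgb.Softmax,
	MaxDepth:  6,
	// num_class is the raw XGBoost parameter; supplying it via Extra
	// is an assumption, as the README test omits it
	Extra: xgb.Params{"num_class": 10},
}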

const Softprob = objective("multi:softprob")

Same as softmax, but outputs a vector of ndata * nclass values, which can be reshaped to an ndata * nclass matrix. The result contains the predicted probability of each data point belonging to each class.
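
As an illustration of that layout (plain Go, not part of this package's API), the flat vector can be viewed as one row of class probabilities per data point:

// reshape views a flat softprob output of length ndata*nclass as
// per-data-point probability rows (illustrative helper only)
func reshape(probs []float32, nclass int) [][]float32 {
	rows := make([][]float32, len(probs)/nclass)
	for i := range rows {
		rows[i] = probs[i*nclass : (i+1)*nclass]
	}
	return rows
}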

const SquareLinear = objective("reg:squarederror")
const SqureLogistic = objective("reg:squaredlogerror")
const TreeBoost = booster("gbtree")
const Tweedie = objective("reg:tweedie")


Functions

func LibVersion

func LibVersion() fu.VersionType

func LuckyObjectify

func LuckyObjectify(source iokit.InputOutput, collection ...string) model.PredictionModel

LuckyObjectify is the errorless version of Objectify; it panics on failure instead of returning an error.

func Objectify

func Objectify(source iokit.InputOutput, collection ...string) (fm model.PredictionModel, err error)

Objectify creates an xgboost prediction object from an input.
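
A usage sketch mirroring the README test above; the archive path assumes a model saved by a previous training run:

package xgb_test

import (
	"fmt"
	"log"

	"go-ml.dev/pkg/base/model"
	"go-ml.dev/pkg/dataset/mnist"
	"go-ml.dev/pkg/iokit"
	"go-ml.dev/pkg/xgb"
)

func ExampleObjectify() {
	// load a previously trained model archive, handling the error explicitly
	pred, err := xgb.Objectify(iokit.File(model.Path("mnist_test_xgb.zip")))
	if err != nil {
		log.Fatal(err)
	}
	// evaluate the restored predictor on the MNIST test set
	lr := model.LuckyEvaluate(mnist.T10k, model.LabelCol, pred, 32, model.Classification{})
	fmt.Println(lr.Round(5))
}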

func ObjectifyModel

func ObjectifyModel(c map[string]iokit.Input) (pm model.PredictionModel, err error)

ObjectifyModel creates an xgboost predictor from a model collection.

Types

type Model

type Model struct {
	Algorithm booster
	Function  objective

	Seed      int    // random generator seed
	Predicted string // name of predicted value column

	MinChildWeight float64 // the minimum sum of weights of all observations required in a child.
	Gamma          float64 // Specifies the minimum loss reduction required to make a split.

	// Denotes the fraction of observations to be randomly sampled for each tree.
	// Typical values: 0.5-1
	Subsample float64

	Lambda float64 // L2 regularization
	Alpha  float64 // L1 regularization

	// Makes the model more robust by shrinking the weights on each step
	// Typical values: 0.01-0.2
	LearningRate float64

	// The maximum depth of a tree.
	// Used to control over-fitting as higher depth will allow model
	// to learn relations very specific to a particular sample.
	// Typical values: 3-10
	MaxDepth int

	Extra Params
}

Model is an XGBoost model definition.
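
An illustrative regression configuration using the typical ranges noted in the field comments (the values are examples, not recommendations):

m := xgb.Model{
	Algorithm:    xgb.TreeBoost,
	Function:     xgb.SquareLinear,
	LearningRate: 0.1, // typical 0.01-0.2
	MaxDepth:     6,   // typical 3-10
	Subsample:    0.8, // typical 0.5-1
	Lambda:       1.0, // L2 regularization
	Seed:         42,
}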

func (Model) Apply

func (m Model) Apply(p model.Params) Model

Apply applies the given parameters to the model definition.

func (Model) Feed

func (e Model) Feed(ds model.Dataset) model.FatModel

Feed feeds the model with a dataset.

func (Model) ModelFunc

func (m Model) ModelFunc(p model.Params) model.HungryModel

ModelFunc updates the xgboost model with parameters for hyper-optimization.
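
A hedged sketch of how Apply might drive a hyper-parameter search; it assumes model.Params maps names matching the Model fields to numeric values, and ds/training stand for a model.Dataset and model.Training as in the README test:

// apply a candidate hyper-parameter set to a base model definition
base := xgb.Model{Algorithm: xgb.TreeBoost, Function: xgb.Softmax}
tuned := base.Apply(model.Params{"MaxDepth": 7, "LearningRate": 0.3})
report := tuned.Feed(ds).LuckyTrain(training)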

type Param

type Param struct{ Name, Value string }

type Params

type Params map[string]interface{}

Params is a set of extra xgboost model parameters.
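
Extra entries are presumably passed through to the underlying XGBoost configuration; tree_method appears in the README test above, while nthread is a standard XGBoost parameter:

m := xgb.Model{
	Algorithm: xgb.TreeBoost,
	Function:  xgb.Binary,
	Extra: xgb.Params{
		"tree_method": "hist", // histogram-based split finding, as in the README test
		"nthread":     4,      // standard XGBoost parameter: worker thread count
	},
}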

