
NOutputGroups is always 0

Open dwbaron opened this issue 4 years ago • 9 comments

After I followed the example in the docs, I hit a strange error (screenshot attached: 截屏2020-02-25下午5 57 29)

dwbaron avatar Feb 25 '20 10:02 dwbaron

I also see this error

liuguiyangnwpu avatar May 23 '20 08:05 liuguiyangnwpu

also having this error

yunfei86 avatar May 30 '20 00:05 yunfei86

same issue. Did you guys find a solution?

training code:

import xgboost as xgb

train_labels = train['click']
train_features = train.drop('click', axis=1)

test_labels = test['click']
test_features = test.drop('click', axis=1)

# convert to data matrix
train_matrix = xgb.DMatrix(train_features, train_labels)
test_matrix = xgb.DMatrix(test_features, test_labels)

# group by query: train_group / test_group are the per-query group sizes
# (lists of ints summing to the row count; built elsewhere in the pipeline)
train_matrix.set_group(train_group)
test_matrix.set_group(test_group)

watchlist = [(train_matrix, 'train'), (test_matrix, 'eval')]

param = {
    'objective': 'rank:pairwise',
    'max_depth': 10,
    'eta': 0.1,
    'eval_metric': ['ndcg'],
    'colsample_bytree': 0.8,
    'subsample': 0.8,
    'tree_method': 'hist',
    'nthread': 64,
    'verbosity': 1
}
bst = xgb.train(param, train_matrix, 50, watchlist, early_stopping_rounds=10)
bst.save_model('xgboost.model')

prediction code:

package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// loading model
	model, err := leaves.XGEnsembleFromFile("/Users/anuragkyal/Downloads/xgboost.model", false)
	if err != nil {
		panic(err)
	}
	fmt.Printf("Name: %s\n", model.Name())
	fmt.Printf("NFeatures: %d\n", model.NFeatures())
	fmt.Printf("NOutputGroups: %d\n", model.NOutputGroups())

	test_features := []float64{0.09688013136288999,  1.0,  3.0,  2.0,  0.0,  11.0,  2.0,  0.0,  0.0,  0.0,  3.0,  0.0,  1.0,  0.0,  0.0,  0.0,  1.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  1.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0}
	fmt.Printf("feature size = %d\n", len(test_features))
	predictions := make([]float64, 1)
	if err := model.Predict(test_features, 50, predictions); err != nil {
		panic(err)
	}
	fmt.Printf("Prediction: %f\n", predictions)
}
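
Until the library rejects such a model at load time, a caller-side guard can turn the panic into an ordinary error. A minimal sketch, assuming only the two methods used above; the `ranker` interface, `safePredict` helper, and `brokenModel` stub are hypothetical names for illustration, not part of the leaves API:

```go
package main

import (
	"errors"
	"fmt"
)

// ranker captures the two leaves.Ensemble methods used above
// (hypothetical interface, for illustration only).
type ranker interface {
	NOutputGroups() int
	Predict(fvals []float64, nEstimators int, predictions []float64) error
}

// safePredict refuses to call Predict when the model reports zero
// output groups, which is what triggered the divide-by-zero panic.
func safePredict(m ranker, fvals []float64, nEstimators int) ([]float64, error) {
	g := m.NOutputGroups()
	if g <= 0 {
		return nil, fmt.Errorf("model reports %d output groups; objective may be unsupported", g)
	}
	out := make([]float64, g)
	if err := m.Predict(fvals, nEstimators, out); err != nil {
		return nil, err
	}
	return out, nil
}

// brokenModel mimics the state described in this issue.
type brokenModel struct{}

func (brokenModel) NOutputGroups() int { return 0 }
func (brokenModel) Predict([]float64, int, []float64) error {
	return errors.New("unreachable")
}

func main() {
	if _, err := safePredict(brokenModel{}, []float64{0.1, 1.0}, 50); err != nil {
		fmt.Println("guarded:", err) // an error value instead of a runtime panic
	}
}
```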

Output:

Name: xgboost.gbtree
NFeatures: 175
NOutputGroups: 0
My feature size = 175
panic: runtime error: integer divide by zero

goroutine 1 [running]:
github.com/dmitryikh/leaves.(*xgEnsemble).NEstimators(...)
	/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/xgensemble.go:22
github.com/dmitryikh/leaves.(*xgEnsemble).adjustNEstimators(0xc000090000, 0xa, 0x13)
	/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/xgensemble.go:42 +0x77
github.com/dmitryikh/leaves.(*Ensemble).Predict(0xc00000c020, 0xc00007e000, 0xaf, 0xaf, 0xa, 0xc000484010, 0x1, 0x1, 0x0, 0x0)
	/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/leaves.go:72 +0x114

anuragkyal avatar Jul 02 '20 05:07 anuragkyal

Hi all! I assume this happens because the model uses an objective function (e.g. 'objective': 'rank:ndcg') that isn't supported by leaves. In any case, leaves should not panic; it should report the error at the model-loading stage. I will try to fix it.
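
For context on the stack trace above: `NEstimators` divides the tree count by the number of output groups, so a model loaded with `NOutputGroups == 0` panics on that division. A load-stage check of the kind proposed here could surface the problem as an error instead. A sketch with hypothetical names (`xgEnsembleStub` and its fields only model the bookkeeping implied by the trace, not the actual leaves internals):

```go
package main

import "fmt"

// xgEnsembleStub models the bookkeeping implied by the stack trace
// (field names are hypothetical).
type xgEnsembleStub struct {
	nTrees        int
	nOutputGroups int
}

// nEstimators mirrors the nTrees / nOutputGroups division, the
// expression that panics when nOutputGroups is zero.
func (e *xgEnsembleStub) nEstimators() int {
	return e.nTrees / e.nOutputGroups
}

// validate is the kind of load-stage check proposed above: fail
// loading with a descriptive error rather than panic at predict time.
func (e *xgEnsembleStub) validate() error {
	if e.nOutputGroups < 1 {
		return fmt.Errorf("unexpected number of output groups: %d (unsupported objective?)", e.nOutputGroups)
	}
	return nil
}

func main() {
	e := &xgEnsembleStub{nTrees: 50, nOutputGroups: 0}
	if err := e.validate(); err != nil {
		fmt.Println("load failed:", err)
		return
	}
	fmt.Println("estimators:", e.nEstimators())
}
```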

dmitryikh avatar Jul 03 '20 07:07 dmitryikh

@dmitryikh got it, thanks for your response. Are any XGBoost ranking objectives supported, like rank:pairwise?

It would help if the list of objectives supported is added to the README.

anuragkyal avatar Jul 05 '20 05:07 anuragkyal

My objective function is binary:logistic, but I also get this error.

siyi8088 avatar Jan 20 '21 04:01 siyi8088

Name: xgboost.gbtree
NFeatures: 31
NOutputGroups: 1
panic: runtime error: integer divide by zero

siyi8088 avatar Jan 20 '21 12:01 siyi8088

same here with XGBClassifier().save_model()

Name: xgboost.gbtree
NFeatures: 8
NOutputGroups: 0
panic: runtime error: integer divide by zero

qjebbs avatar Apr 07 '21 06:04 qjebbs

I pinned the Python xgboost module to 0.90 instead of 1.0.0, and the error disappeared.

kasiss-liu avatar Oct 25 '22 06:10 kasiss-liu