panic: interface conversion: ... missing method CatToNum #47
It looks like applyforest thinks you're doing classification (it uses a stupid method to guess) and is trying to treat your target feature as categorical. You can give it the -mean flag to force it to vote numerically, but I should probably make it better about guessing for libsvm files.
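For readers following along, a minimal sketch of what that invocation might look like. The -mean flag is the one named above; the -fm (feature matrix) and -rfpred (saved forest) flag names are recalled from applyforest's usage text and may differ in your version.

```
# Force applyforest to vote numerically (regression-style) instead of
# letting it guess that the target is categorical.
# Flag names other than -mean are assumptions and may differ.
applyforest -fm test.libsvm -rfpred forest.sf -mean
```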
Ryan, thanks for suggesting the -mean flag. That works. I already converted any integers to floats for the target feature and assumed that would have been sufficient. That aside: your implementation works really well.
Hi @ryanbressler, I am trying to run the following code using your CloudForest. My input file is in libsvm format, and the 1st column is the target. The labels are 0, 1 and 2. I want classification, not regression. Thanks for your help.

#------------ code (elided)

#------------ error message (elided; it begins "goroutine 1 [running]:")
@mhfzsharmin, your params don't make sense... l1 and ordinal are regression methods and not compatible with gradient boosting. Also, since it is erroring at test time, you may just be able to test with applyforest.
@ryanbressler I tried without "-ordinal" and "-l1=true" as well. Still, I get the same error.
@ryanbressler

```
msharmin@nandi:~/cloudforest$ grep 'PRED=6' forest..binary.sf | wc -l
msharmin@nandi:~/cloudforest$ grep 'PRED=0' forest.binary.sf | wc -l
msharmin@nandi:~/cloudforest$ grep 'PRED=1' msforest.binary.sf | wc -l
msharmin@nandi:~/cloudforest$ grep 'PRED=2' msforest.binary.sf | wc -l
msharmin@nandi:~/cloudforest$ cut -f1 test.binary.libsvm | sort | uniq
msharmin@nandi:~/cloudforest$ cut -f1 train.binary.libsvm | sort | uniq
```
Gradient boosting can be used with (binary) classification, but you have to use it with the sum and expit options in applyforest, IIRC. 6 has no special meaning in the output. There may be a bug/quirk in my libsvm parsing code causing it to see that value or learn from the wrong column; you could try one of the other formats or using applyforest instead of the -test option. If you post the whole command and the output it gives, I may be able to tell if anything funky is going on; otherwise, sorry I can't be more helpful.
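A hedged sketch of that applyforest invocation for a boosted binary forest; the -sum and -expit spellings follow the option names mentioned above, and the -fm/-rfpred flag names are assumptions that may differ in your version.

```
# Sum the boosted trees' outputs and pass the sum through the logistic
# (expit) transform to get a binary-class probability.
# Flag names are assumptions based on the options named in this thread.
applyforest -fm test.libsvm -rfpred boosted.sf -sum -expit
```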
@ryanbressler using the *.afm format worked. I had 3 labels in my target column. To confirm my understanding: multi-class classification works in CloudForest, right? If this is a parsing issue with libsvm, maybe using A, B, C directly as categories instead of numeric categories (0, 1 and 2) would have worked in libsvm. If I get to that, I will post the result in case any future users find it helpful. Thanks for your response.
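For anyone unfamiliar with the *.afm (annotated feature matrix) format mentioned above, a rough illustrative sketch follows. The layout (tab-delimited, one feature per row, names prefixed with C: for categorical and N: for numeric) is recalled from the CloudForest documentation and should be checked against it; the feature names and values here are made up.

```
# Write a tiny illustrative .afm file (tab-separated; made-up names/values).
# "C:" marks a categorical (classification) target, "N:" a numeric feature.
{
  printf '.\tcase1\tcase2\tcase3\n'
  printf 'C:label\tA\tB\tC\n'
  printf 'N:feat1\t0.5\t1.2\t0.1\n'
  printf 'N:feat2\t3\t7\t2\n'
} > train.afm
```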
Multi-class classification works with the standard gini impurity random forest and the -entropy option. Gradient boosting relies on the gradient of the logistic loss function, which is intrinsically binary (IIRC the same is true of adaboost). You can, however, train one boosted forest for each class if you want to experiment with multi-class boosting.
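A hedged sketch of both suggestions. The -entropy flag is the one named above; -train, -target, and -rfpred are recalled from growforest's usage text and may differ in your version, and the boosting flag itself is deliberately not shown.

```
# Standard multi-class random forest with the entropy split criterion.
# Flag names other than -entropy are assumptions and may differ.
growforest -train train.afm -target C:label -rfpred forest.sf -entropy

# One-vs-rest boosted forests: prepare one binary training set per class
# by relabelling the libsvm target column (class k -> 1, everything else -> 0),
# then train a boosted forest on each; see growforest's help for the boosting flag.
for k in 0 1 2; do
  awk -v k="$k" '{ $1 = ($1 == k ? 1 : 0); print }' train.libsvm > train_vs_"$k".libsvm
done
```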
I run into an error using libsvm's sparse format, given the following files:

(training and test files elided)

Training goes well, but testing throws an error. Any idea what is going on?
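For context, the libsvm sparse format referred to here looks roughly like the snippet below; these are illustrative values, not the reporter's actual files. Each line is a target label followed by space-separated index:value pairs for the non-zero features.

```
# Illustrative libsvm-format lines (not the reporter's data): label first,
# then sparse index:value pairs for non-zero features.
cat <<'EOF' > train.libsvm
1 1:0.5 3:1.2 7:0.1
0 2:3.0 3:0.7
2 1:1.1 5:2.4 6:0.3
EOF
```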