Summary
I have searched for relevant information, but I could only find implementations of learning rate decay in LightGBM's original (native) interface, such as the following:
#129
https://lightgbm.readthedocs.io/en/latest/Parameters.html
https://www.kaggle.com/c/talkingdata-adtracking-fraud-detection/discussion/56158
And the parameter `refit_decay_rate` controls the `leaf_output`, which seems intended to help avoid overfitting. Sorry, I couldn't find much useful information about it, so I'm not sure of its exact purpose.
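For context, the decay schemes in the examples linked above are typically just a function mapping the boosting round to a learning rate. A minimal sketch of such a schedule (the constants here are illustrative, not taken from any of the links):

```python
# Hypothetical exponential decay schedule: start at 0.1, shrink by 1% per
# boosting round, and never drop below a floor of 0.01.
def learning_rate_decay(current_round: int) -> float:
    return max(0.1 * (0.99 ** current_round), 0.01)

print(learning_rate_decay(0))  # 0.1 at the first round
```

In the native interface, a function like this is what gets passed to `lgb.reset_parameter(learning_rate=...)` as a `lgb.train` callback.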
Motivation
So I want to know whether it is possible to implement learning rate decay in the sklearn interface, `LGBMClassifier()`, like the native lgb code examples above. Thanks in advance.