
How to set learning rate decay on sklearn interface like LGBMClassifier #2698

Closed
BovenPeng opened this issue Jan 17, 2020 · 1 comment

@BovenPeng

Summary

I searched for relevant information beforehand, but I only found implementations of learning rate decay for LightGBM's original interface, e.g.:
#129
https://lightgbm.readthedocs.io/en/latest/Parameters.html
https://www.kaggle.com/c/talkingdata-adtracking-fraud-detection/discussion/56158

gbm = lgb.train(
    params,
    lgb_train,
    num_boost_round=10,
    init_model=gbm,
    valid_sets=lgb_eval,
    # alpha and k are user-chosen decay constants; e is math.e (import math)
    callbacks=[lgb.reset_parameter(
        learning_rate=lambda current_round: alpha * math.e ** (k * current_round)
    )]
)
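As an aside (not taken from the issue): reset_parameter also accepts a precomputed list with one value per boosting round instead of a callable. A stdlib-only sketch, with hypothetical alpha and k values chosen for illustration:

```python
import math

# Hypothetical decay constants, chosen for illustration only.
alpha, k = 0.1, -0.05
num_boost_round = 10

# One learning rate per boosting round; equivalent to the lambda form
# but precomputed, which makes the schedule easy to inspect or log.
learning_rates = [alpha * math.e ** (k * t) for t in range(num_boost_round)]

print(learning_rates[0])  # 0.1 on the first round
```

The list could then be passed as lgb.reset_parameter(learning_rate=learning_rates), since reset_parameter is documented to accept either a callable of the current round or a list of per-round values.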

And the parameter refit_decay_rate controls the decay of leaf outputs when refitting, which similarly helps to avoid overfitting. Sorry, I couldn't find much documentation about it, so I am not sure how it is used.

Motivation

So I would like to know whether it is possible to implement learning rate decay with the sklearn interface, LGBMClassifier(), like in the original-API example above.
Thanks in advance.

@StrikerRUS
Collaborator

Hi @BovenPeng !

Sure, you can write learning rate decay for the sklearn wrapper just like for the original interface:

est = lgb.LGBMClassifier().fit(
    X, y,
    callbacks=[lgb.reset_parameter(
        learning_rate=lambda current_round: alpha * math.e ** (k * current_round)
    )]
)

callbacks : list of callback functions or None, optional (default=None)
List of callback functions that are applied at each iteration.
See Callbacks in Python API for more information.
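To make the decay concrete, here is a stdlib-only sketch of the values such a callable yields (alpha and k are hypothetical, with k chosen so the rate halves every 10 rounds):

```python
import math

# Hypothetical constants: start at 0.1 and halve the rate every 10 rounds.
alpha = 0.1
k = -math.log(2) / 10

def decay(current_round):
    # Same shape as the callable handed to lgb.reset_parameter above:
    # LightGBM calls it with the current round index and applies the result
    # as the learning_rate for that round.
    return alpha * math.e ** (k * current_round)

print(decay(0))   # 0.1
print(decay(10))  # ~0.05: halved after 10 rounds
```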


@lock lock bot locked as resolved and limited conversation to collaborators Mar 17, 2020