
Save model on every iteration #5178

Status: Closed
Opened by Qashqay on Apr 25, 2022 · 7 comments
@Qashqay commented Apr 25, 2022

Hello,

Is it possible to save the intermediate model on every boosting iteration with the standard train function? For example:

params = {'num_leaves': 10, 'objective': 'rmse', 'save_freq': 1, 'save_path': '/tmp/'}
model = train(params, data)

so that /tmp/ ends up with 10 files, each in the same format as the output of model.save_model('/tmp/model.txt').

Thank you

@jmoralez (Collaborator)
Hello @Qashqay, thank you for your interest in LightGBM. This isn't implemented in the library, but since train supports callbacks it's easy to achieve by writing your own. Every callback receives a CallbackEnv as its argument, so you can use something like the following:

import lightgbm as lgb
import numpy as np

# save the current booster to a text file named after the iteration number
def save_model_callback(env):
    env.model.save_model(f'booster_{env.iteration}.txt')

ds = lgb.Dataset(np.random.rand(100, 2), np.random.rand(100))
model = lgb.train({'num_leaves': 10}, ds, num_boost_round=5, callbacks=[save_model_callback])

After running this you should see 5 files in your current directory (one for each iteration). You can modify that function's logic to save only every x iterations or to write to a different path, as in the sketch below.
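For example, here is a minimal sketch of a callback factory that saves only every k-th iteration into a chosen directory (make_save_callback, output_dir, and save_every are illustrative names, not part of LightGBM's API):

import os
import lightgbm as lgb
import numpy as np

def make_save_callback(output_dir, save_every=1):
    # build a callback that saves the booster every `save_every` iterations
    os.makedirs(output_dir, exist_ok=True)

    def _callback(env):
        # env.iteration is zero-based, so iteration 0 is the first boosting round
        if (env.iteration + 1) % save_every == 0:
            env.model.save_model(os.path.join(output_dir, f'booster_{env.iteration}.txt'))

    return _callback

ds = lgb.Dataset(np.random.rand(100, 2), np.random.rand(100))
model = lgb.train(
    {'num_leaves': 10},
    ds,
    num_boost_round=10,
    callbacks=[make_save_callback('/tmp/boosters', save_every=2)],
)

With save_every=2 and 10 boosting rounds, this would leave 5 files (booster_1.txt, booster_3.txt, ..., booster_9.txt) under /tmp/boosters.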

Please let us know if this helps.

@Qashqay (Author) commented Apr 25, 2022

Great, thank you. That's exactly what I wanted.

@jmoralez (Collaborator)

Great. I'm closing this issue, but feel free to reopen it if you run into any problems using this.

@gaebw commented Sep 22, 2022

Running on Windows 10 with:
Name: lightgbm
Version: 3.1.1
Summary: LightGBM Python Package
Home-page: https://github.com/microsoft/LightGBM
Author: None
Author-email: None
License: The MIT License (Microsoft)
Location: c:\users\info\anaconda3\lib\site-packages
Requires: scipy, scikit-learn, wheel, numpy

Tried the above code:
import lightgbm as lgb
import numpy as np

def save_model_callback(env):
    env.model.save_model(f'booster_{env.iteration}.txt')

ds = lgb.Dataset(np.random.rand(100, 2), np.random.rand(100))
model = lgb.train({'num_leaves': 10}, ds, num_boost_round=5, callbacks=[save_model_callback])

Error:
AttributeError: function 'LGBM_DumpParamAliases' not found

@gaebw commented Sep 22, 2022

On Ubuntu it works!

@jameslamb (Collaborator) commented Sep 28, 2022

@gaebw LGBM_DumpParamAliases was introduced December 2, 2021 (#4829), so it's very surprising to see an error message about it given that you said you are using v3.1.1 (December 7, 2020).

I suspect you may have built a newer version of the Python package (using python setup.py install --precompile) but are linking against an older version of lib_lightgbm.so.
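One quick way to check for that kind of mismatch is to compare the installed Python package version with the path of the native library it actually loaded. This relies on lightgbm's internal basic._LIB handle and ctypes' _name attribute, so treat it as a debugging sketch rather than a supported API:

import lightgbm as lgb

# version of the installed Python package
print(lgb.__version__)

# path of the shared library the Python package loaded;
# if this points at an old lib_lightgbm, the two are out of sync
print(lgb.basic._LIB._name)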

Please try building LightGBM's development version from the source code on GitHub, following the steps at https://github.com/microsoft/LightGBM/releases/tag/v3.3.1.

@github-actions (bot)

This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.

github-actions bot locked as resolved and limited conversation to collaborators on Aug 19, 2023