
Which method is recommended for model online inference? #5257

Closed
hepengfei-ml opened this issue Jun 1, 2022 · 2 comments
@hepengfei-ml

Which method is recommended for online model inference? I need high-performance, concurrent predictions. Should I use PMML?

@StrikerRUS
Collaborator

It depends on your use case.

Initially, LightGBM wasn't optimized for fast predictions. Some time ago, the *SingleRowFast family of APIs was introduced in #2992. You can start with that.

Also, you can try converting the fitted model into if/else C++ code: https://lightgbm.readthedocs.io/en/latest/Parameters.html#convert_model.
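(To illustrate the idea behind convert_model: each fitted tree becomes hard-coded branches, so prediction is just cascaded comparisons with no model object in memory. The sketch below is purely hypothetical Python with made-up feature names, thresholds, and leaf values; the real convert_model output is generated C++ for the actual fitted trees.)

```python
# Illustrative only: one tiny decision tree compiled to if/else branches.
def predict_tree(f0: float, f1: float) -> float:
    if f0 <= 0.5:
        if f1 <= -1.0:
            return 0.12
        return 0.38
    if f1 <= 2.0:
        return 0.64
    return 0.91

def predict(f0: float, f1: float) -> float:
    # A boosted ensemble would sum the outputs of many such hard-coded trees.
    return predict_tree(f0, f1)
```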

In addition, you can find many third-party libraries for running inference on LightGBM models here: https://github.com/microsoft/LightGBM#external-unofficial-repositories.

I'd personally recommend starting with the treelite library.

@jameslamb jameslamb changed the title inference Which method is recommended for model online inference? Jun 5, 2022
@github-actions

This issue has been automatically locked since there has not been any recent activity after it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 19, 2023