[ML] Support for multi-parameter loss functions in derivative aggregation and finding best splits #993
Conversation
Looks good altogether. I added a few minor comments regarding understandability.
// We wrap the writer in another lambda which we know takes advantage
// of std::function small size optimization to avoid heap allocations.
Good idea 👍
Thanks for the review @valeriy42. I've addressed all your feedback; could you take another look and let me know if you're happy?
LGTM. Good job 👍
[ML] Support for multi-parameter loss functions in derivative aggregation and finding best splits (elastic#993)
This extends computing aggregate derivatives and solving for the highest quadratic gain split to support multi-parameter loss functions.
I experimented with various implementations. This version, which minimises the number of allocations by using memory mapped classes, proved to be by far the most efficient.
This is the next step towards #982.