Add precision_recall_fscore_support function #186
Conversation
93d3da6 to cc965d5
```diff
 :micro ->
-  {precision, recall, per_class_fscore}
+  {precision, recall, per_class_fscore, Nx.Constants.nan()}
```
Any idea on what to add here? The first try was to use `:none`, but it seems atoms can't be returned from `defn`.
Using NaNs seems reasonable
I would personally prefer something clearer indicating that this value contains nothing useful, but with the current constraints I thought this was OK(ish). Maybe a new type/struct could be created for these cases?
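To illustrate the constraint being discussed, here is a minimal sketch (module and function names are made up for illustration, not Scholar's actual code): a `defn` body must return tensors or containers of tensors, so an atom like `:none` cannot be part of the result, while a NaN tensor can.

```elixir
defmodule NanPlaceholderExample do
  import Nx.Defn

  # Returning {precision, recall, fscore, :none} from a defn would fail,
  # because atoms are not valid defn return values. A NaN tensor works as
  # the "no value" placeholder instead.
  defn micro_result(precision, recall, fscore) do
    {precision, recall, fscore, Nx.Constants.nan()}
  end
end
```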
```elixir
      f32
      NaN
    >}
iex> Scholar.Metrics.Classification.precision_recall_fscore_support(Nx.tensor([1, 0, 1, 0]), Nx.tensor([0, 1, 0, 1]), beta: 2, num_classes: 2, average: :none)
```
Please add new lines before `iex>` whenever you can, so ExDoc formats them as distinct examples. Also, breaking this line into several (for example, by assigning the tensors and opts to variables) would help readability.
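For instance, a sketch of how that doctest call could be split across lines (variable names are illustrative; argument order and option values are copied from the original call):

```elixir
iex> y_true = Nx.tensor([1, 0, 1, 0])
iex> y_pred = Nx.tensor([0, 1, 0, 1])
iex> opts = [beta: 2, num_classes: 2, average: :none]
iex> Scholar.Metrics.Classification.precision_recall_fscore_support(y_true, y_pred, opts)
```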
```elixir
check_shape(y_pred, y_true)
num_classes = check_num_classes(opts[:num_classes])
average = opts[:average]
check_beta(beta)
```
I think we can remove this. It has a default value and it cannot be nil as an option.
I'm not defining a default value for `beta` in `fbeta_score_schema`, only in `precision_recall_fscore_support_schema`, which was the reason to leave the check there.
If it is required there, why not make it an argument then?
That was the idea, but we've moved to this instead in order to be consistent with the other functions that use the beta parameter.
#186 (comment)
Maybe using the NimbleOptions schema with `required: true` would be better in this case?
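A minimal sketch of that idea with NimbleOptions (the type spec and doc text are assumptions here, not Scholar's actual schema):

```elixir
# beta has no default, but NimbleOptions will raise if it is omitted
fbeta_score_schema = [
  beta: [
    type: {:or, [:float, :integer]},
    required: true,
    doc: "Weight of precision in the harmonic mean of precision and recall."
  ]
]

NimbleOptions.validate!([beta: 2, num_classes: 2], fbeta_score_schema)
```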
Honestly, I would prefer to keep it as an argument, but it is not a strong opinion. Or we remove the function and make it an option on `f1_score`.
In another kind of library I would agree, but for these kinds of data science/statistical learning libraries, I'm partial to having a consistent/predictable API.
Hi @0urobor0s! For `fbeta_score`, I am still not convinced we should have an option as a required value. I think we should either give it a default or expect it as an argument. I can think of two options (both sketched below):

- Rename the function to `fscore`, to mirror `fscore` in "precision_recall", and make `beta` optional
- Keep `beta` a required argument in `fbeta_score` (as the name says)
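A rough sketch of the two call shapes being compared (function signatures and the default for `beta` are illustrative only, not the final API):

```elixir
y_true = Nx.tensor([1, 0, 1, 0])
y_pred = Nx.tensor([0, 1, 0, 1])

# Option 1: rename to fscore and make beta an option with a default
Scholar.Metrics.Classification.fscore(y_true, y_pred, num_classes: 2, beta: 2)

# Option 2: keep fbeta_score and take beta as a required positional argument
Scholar.Metrics.Classification.fbeta_score(y_true, y_pred, 2, num_classes: 2)
```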
Hi @josevalim, on the topic of renaming, I think
Co-authored-by: Mateusz Sluszniak <[email protected]>
💚 💙 💜 💛 ❤️
Noticed that in #122 there was also the issue for `precision_recall_fscore_support`, which was mainly done in #185.