
Add precision_recall_fscore_support function #186

Merged

Conversation

0urobor0s (author):

Noticed that #122 also asked for precision_recall_fscore_support, most of which was already done in #185.

0urobor0s force-pushed the ouro/precision_recall_fscore_support branch from 93d3da6 to cc965d5 on October 11, 2023.
0urobor0s changed the title from "Draft: Add precision_recall_fscore_support function" to "Add precision_recall_fscore_support function" on October 11, 2023.

      :micro ->
-       {precision, recall, per_class_fscore}
+       {precision, recall, per_class_fscore, Nx.Constants.nan()}
0urobor0s (author):

Any idea what to add here? My first try was to use :none, but it seems atoms can't be returned from defn.

Contributor:

Using NaNs seems reasonable
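
A minimal sketch of the constraint under discussion (toy module, not the Scholar source): a function compiled with defn can only return tensors, numbers, or tuples/containers of them, so an "empty" slot has to be encoded as a tensor value such as NaN rather than an atom like :none.

    defmodule DefnReturnSketch do
      import Nx.Defn

      # Returning an atom such as :none from defn raises, as noted above;
      # a NaN constant works because it is an ordinary f32 tensor.
      defn with_placeholder_support(precision, recall, fscore) do
        {precision, recall, fscore, Nx.Constants.nan()}
      end
    end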

0urobor0s (author):

I would personally prefer something that signals more clearly that this value contains "nothing" useful, but with the current constraints I thought this was ok(ish). Maybe a new type/struct could be created for these cases?

(Doctest excerpt; the first lines are the tail of the previous example's output.)

      f32
      NaN
    >}

    iex> Scholar.Metrics.Classification.precision_recall_fscore_support(Nx.tensor([1, 0, 1, 0]), Nx.tensor([0, 1, 0, 1]), beta: 2, num_classes: 2, average: :none)
Contributor:

Please add new lines before iex whenever you can, so ExDoc formats them as distinct examples. Also, breaking this line into several (for example, by assigning the tensors and opts to variables) can help with readability.
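
One way to apply that suggestion (a hypothetical reworking of the doctest above; the variable names are illustrative):

    iex> y_true = Nx.tensor([1, 0, 1, 0])
    iex> y_pred = Nx.tensor([0, 1, 0, 1])
    iex> opts = [beta: 2, num_classes: 2, average: :none]
    iex> Scholar.Metrics.Classification.precision_recall_fscore_support(y_true, y_pred, opts)

Since opts is the trailing keyword list, passing it as a bound variable is equivalent to writing the options inline, and each line stays short enough to read comfortably.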

    check_shape(y_pred, y_true)
    num_classes = check_num_classes(opts[:num_classes])
    average = opts[:average]
    check_beta(beta)
Contributor:

I think we can remove this. It has a default value and it cannot be nil as an option.

0urobor0s (author):

I'm not defining a default value for beta in fbeta_score_schema, only in precision_recall_fscore_support_schema, which is why I left the check there.
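
For context, a sketch of the schema split being described (the option type and the default value are illustrative, not Scholar's actual definitions):

    # beta gets a default in one schema...
    precision_recall_fscore_support_schema = [
      beta: [type: {:or, [:integer, :float]}, default: 1]
    ]

    # ...but not in the other, hence the runtime check_beta(beta) guard.
    fbeta_score_schema = [
      beta: [type: {:or, [:integer, :float]}]
    ]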

Contributor:

If it is required there, why not make it an argument then?

0urobor0s (author):

That was the idea, but we've moved to this instead in order to be consistent with the other functions that use the beta parameter (see #186 (comment)).

0urobor0s (author):

Maybe using the NimbleOptions schema with required set to true would be better in this case?
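
A minimal sketch of that idea, assuming a plain NimbleOptions schema (the option type is illustrative):

    schema = [
      beta: [type: {:or, [:integer, :float]}, required: true]
    ]

    # Raises NimbleOptions.ValidationError when :beta is not given.
    opts = NimbleOptions.validate!([beta: 2], schema)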

Contributor:

Honestly, I would prefer to keep it as an argument, but it is not a strong opinion. Or we remove the function and make it an option on f1_score.

0urobor0s (author):

In another kind of library I would agree, but for these kinds of data science/statistical learning libraries, I'm partial to having a consistent/predictable API.

@josevalim (Contributor) left a review:

Hi @0urobor0s! For fbeta_score, I am still not convinced we should have an option as a required value. I think we should either give it a default or expect it as an argument.

I can think of two options:

  1. Rename the function to fscore, to mirror the fscore in precision_recall_fscore_support, and make beta optional.

  2. Keep beta as a required argument in fbeta_score (as the name says).
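
A toy sketch of option 2's shape, for binary 0/1 labels only (hypothetical module, not Scholar's implementation):

    defmodule FbetaSketch do
      import Nx.Defn

      # beta is a required positional argument, as the function name implies.
      defn fbeta_score(y_true, y_pred, beta) do
        tp = Nx.sum(y_true * y_pred)
        fp = Nx.sum((1 - y_true) * y_pred)
        fneg = Nx.sum(y_true * (1 - y_pred))

        precision = tp / (tp + fp)
        recall = tp / (tp + fneg)
        beta2 = beta * beta

        (1 + beta2) * precision * recall / (beta2 * precision + recall)
      end
    end

Called as FbetaSketch.fbeta_score(y_true, y_pred, 2), forgetting beta surfaces as a wrong-arity error instead of a silently defaulted option.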

0urobor0s (author):

Hi @josevalim,
I'm not sure about the ergonomics of the API in this case either, but I understand the rationale behind your thinking. I've made the necessary changes.

On the topic of renaming, I think precision_recall_fscore_support would be better called precision_recall_fbeta_score_support. However, there might be value in keeping the same name as scikit-learn uses.

Co-authored-by: Mateusz Sluszniak <[email protected]>
@josevalim merged commit b36df2f into elixir-nx:main on October 17, 2023 (1 of 2 checks passed).
@josevalim (Contributor):

💚 💙 💜 💛 ❤️
