[Outreachy applications] Learning from misclassifications #63

Closed
dzeber opened this issue Mar 13, 2020 · 0 comments · Fixed by #74, #98, #112 or #160

dzeber commented Mar 13, 2020

When training a classification model, it is common to look at accuracy and the confusion matrix, which give a summary view of misclassifications. By themselves, these metrics are informative but not very actionable.
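For concreteness, here is a minimal sketch of those two summary views using scikit-learn. The dataset is synthetic (standing in for whatever dataset the model is actually trained on), and the choice of logistic regression is just an illustrative assumption:

```python
# Sketch: accuracy and confusion matrix as summary views of
# misclassification (synthetic data; any classifier would do).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

X, y = make_classification(
    n_samples=500, n_classes=3, n_informative=5, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = model.predict(X_test)

# Accuracy collapses performance to a single number; the confusion
# matrix breaks errors down by (true class, predicted class) pair,
# but still says nothing about individual misclassified points.
print("accuracy:", accuracy_score(y_test, y_pred))
print(confusion_matrix(y_test, y_pred))
```

Both outputs summarize *how many* points were misclassified and between which classes, which is exactly the limitation this issue is about.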

Develop a metric or visualization that reveals something more about each misclassified point (beyond just the fact that it was misclassified) that can be used to improve the model.

Some example metrics: the classification probability scores for the different classes, which indicate whether a misclassified point was close to the decision boundary; or the distance from the class mean in feature space, which indicates whether the misclassified points are outliers.
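The two example metrics above can be sketched as follows. This is only one possible formulation (synthetic data again; the "margin" definition and the Euclidean distance to the training-set class mean are assumptions, not a prescribed method):

```python
# Sketch of two per-point diagnostics for misclassified examples:
# (1) the margin between the top-two predicted class probabilities,
# (2) the distance from the point's true-class mean in feature space.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(
    n_samples=500, n_classes=3, n_informative=5, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

y_pred = model.predict(X_test)
probs = model.predict_proba(X_test)
mis = y_pred != y_test  # mask of misclassified test points

# Small margin between the two highest class probabilities suggests
# the point sat close to the decision boundary.
sorted_probs = np.sort(probs, axis=1)
margin = sorted_probs[:, -1] - sorted_probs[:, -2]

# Large distance from the (training) mean of the point's true class
# suggests the point is an outlier for that class.
class_means = {
    c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)
}
dist = np.array(
    [np.linalg.norm(x - class_means[c]) for x, c in zip(X_test, y_test)]
)

print("misclassified:", mis.sum())
print("margin (mis vs correct):", margin[mis].mean(), margin[~mis].mean())
print("dist (mis vs correct):", dist[mis].mean(), dist[~mis].mean())
```

Comparing the distributions of these quantities between misclassified and correctly classified points (e.g. as overlaid histograms) is one way to make the diagnostics actionable.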

A good place to start is to study the misclassifications you got from your model for task #2. What do they tell you about how to improve your model?

KaairaGupta added a commit to KaairaGupta/PRESC that referenced this issue Mar 14, 2020
mlopatka pushed a commit that referenced this issue Mar 20, 2020
* WIP: created function to plot classification probabilities for misclassified data points

* completed first attempt at #63

* minor updates in histogram plots in function plot_misclassified_probabilities

* plot correct classes
mlopatka pushed a commit that referenced this issue Mar 20, 2020
* adds a module to visualize misclassification and tests it on winequality.csv

* adds .ipynb and .py for learning from misclassifications

* Delete visualize_misclass.py

* Delete winequality.ipynb

* Delete winequality_modules.py
@mlopatka mlopatka reopened this Mar 20, 2020
@dzeber dzeber reopened this Mar 26, 2020
@mlopatka mlopatka reopened this Mar 27, 2020
@dzeber dzeber changed the title Learning from misclassifications [Outreachy applications] Learning from misclassifications Jul 14, 2020