
Scan: Add a robustness detector to the scan that perturbs numerical values #1846

Status: Open · opened by kevinmessiaen on Mar 14, 2024 · 6 comments · May be fixed by #2040
Labels: enhancement (New feature or request), good first issue (Good for newcomers)

@kevinmessiaen (Member)

🚀 Feature Request

Add a robustness detector to the scan that perturbs numerical values.

The scan should generate a set of issues that capture the minimum amount of perturbation (within the bounds of the feature distribution) needed on a single numerical feature to (see the sketch after this list):

  • (a) change the predicted label (classification)
  • (b) change the prediction by an amount that exceeds a certain threshold (regression)
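
As a rough illustration of the search for case (a), a minimal sketch follows. It is not the actual detector API: the sklearn-style `model.predict` call, the feature bounds `lo`/`hi`, and the step count are assumptions.

```python
import numpy as np
import pandas as pd


def minimal_label_flip(model, row: pd.Series, feature: str, lo: float, hi: float,
                       steps: int = 100):
    """Return the smallest absolute shift of `feature` (kept within [lo, hi])
    that changes the predicted label for `row`, or None if no shift does."""
    original = model.predict(row.to_frame().T)[0]
    base = row[feature]
    # Candidate shift magnitudes, from smallest to largest, bounded by the
    # feature's distribution range so perturbed values stay plausible.
    max_delta = max(hi - base, base - lo)
    for delta in np.linspace(0.0, max_delta, steps)[1:]:
        for candidate in (base + delta, base - delta):
            if not lo <= candidate <= hi:
                continue
            perturbed = row.copy()
            perturbed[feature] = candidate
            if model.predict(perturbed.to_frame().T)[0] != original:
                return abs(candidate - base)
    return None
```

Case (b) would use the same search, except the stopping condition compares the change in the predicted value against the regression threshold instead of checking for a label change.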

🔈 Motivation

Currently, the scan does not include any numerical perturbation detector.

@kevinmessiaen added the enhancement (New feature or request) and good first issue (Good for newcomers) labels on Mar 14, 2024
@pranavm7

Hey @kevinmessiaen! This seems to be a duplicate of #1847
PS: I'd love to contribute to the tool! I'll be on the lookout for new issues/improvements :)

@kevinmessiaen (Member, Author)

@pranavm7 Hey, it's not exactly the same: one is for numerical values and the other is for categorical ones, which differ a bit.

We would be happy to have you contribute to this tool. Do you have any improvement ideas in mind?

@Kranium2002 (Contributor)

@kevinmessiaen I would like to work on this if this is still open.

@kevinmessiaen (Member, Author)

@Kranium2002 Sure, we appreciate that. I've assigned you to the issue. Let me know if you have any questions or need any help!

@Kranium2002 (Contributor)

I'll be working on adding a numerical perturbation detector that tests model robustness by tweaking numerical features by around 1% and checking how much the model's predictions shift. For classification models, it'll flag cases where the predicted label changes; for regression, it'll detect when predictions differ beyond a threshold (e.g. 5%). I'll integrate this into the existing scan framework so it reports any significant sensitivity issues, and I'll build out tests to ensure it's flexible across model types and datasets.

PS: Should I make the 1% and 5% thresholds configurable by the user, or keep them fixed?
Your thoughts? @kevinmessiaen
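
A minimal sketch of exposing those values as parameters rather than fixed constants is shown below. Function and argument names are hypothetical, not the final Giskard API, and the model is assumed to expose a sklearn-style `predict`.

```python
import numpy as np
import pandas as pd


def numerical_perturbation_issues(model, dataset: pd.DataFrame, features,
                                  is_regression: bool,
                                  perturbation_fraction: float = 0.01,
                                  regression_threshold: float = 0.05):
    """Report, per feature, how often a perturbation_fraction shift changes the
    prediction: a label change for classification, or a relative change above
    regression_threshold for regression."""
    baseline = np.asarray(model.predict(dataset))
    issues = []
    for feature in features:
        perturbed = dataset.copy()
        perturbed[feature] = perturbed[feature] * (1.0 + perturbation_fraction)
        new_preds = np.asarray(model.predict(perturbed))
        if is_regression:
            # Relative change in prediction, guarded against division by zero.
            rel_change = np.abs(new_preds - baseline) / np.maximum(np.abs(baseline), 1e-12)
            fail_rate = float(np.mean(rel_change > regression_threshold))
        else:
            fail_rate = float(np.mean(new_preds != baseline))
        if fail_rate > 0:
            issues.append({"feature": feature, "fail_rate": fail_rate})
    return issues
```

Keeping `perturbation_fraction=0.01` and `regression_threshold=0.05` as keyword defaults would preserve the 1% / 5% behaviour while still letting users tune them.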

@Kranium2002 linked pull request #2040 on Oct 13, 2024 that will close this issue
@Kranium2002 (Contributor)

Working on this in #2040
