
ignore FallbackClassifier predictions during evaluation #7203

Merged
merged 6 commits from fallback-classifier-improvement into master on Nov 10, 2020

Conversation

@wochinge (Contributor) commented on Nov 6, 2020

Proposed changes:

  • Fixes Improvements to FallbackClassifier #6285
  • FallbackClassifier predictions are now ignored when evaluating the NLU model, since the nlu_fallback intent would otherwise shadow the actual misclassifications (see the sketch after this list).
  • Add some missing type annotations to the test_evaluation module.
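
The change itself lives in rasa/nlu/test.py. As a rough illustration only (not the PR's code, and the PR's exact mechanism may differ), the sketch below uses an invented helper, drop_fallback_prediction, to show one way nlu_fallback predictions could be kept from shadowing the real misclassification: substitute the next-ranked intent before scoring.

```python
from typing import Any, Dict, Optional

# "nlu_fallback" is the intent name the FallbackClassifier predicts when it
# takes over; this constant name is local to this sketch.
NLU_FALLBACK_INTENT = "nlu_fallback"


def drop_fallback_prediction(parse_result: Dict[str, Any]) -> Dict[str, Any]:
    """Replace an nlu_fallback prediction with the next-best ranked intent.

    During evaluation we want to score the intent the model actually ranked
    highest before the FallbackClassifier overrode it; otherwise every
    low-confidence error is reported as nlu_fallback.
    """
    intent = parse_result.get("intent", {})
    if intent.get("name") != NLU_FALLBACK_INTENT:
        return parse_result

    ranking = parse_result.get("intent_ranking", [])
    # Take the first ranked intent that is not the fallback intent, if any.
    next_best: Optional[Dict[str, Any]] = next(
        (i for i in ranking if i.get("name") != NLU_FALLBACK_INTENT), None
    )
    if next_best is not None:
        parse_result = {**parse_result, "intent": next_best}
    return parse_result


# Example: a message the FallbackClassifier caught at low confidence.
result = {
    "intent": {"name": "nlu_fallback", "confidence": 0.4},
    "intent_ranking": [
        {"name": "nlu_fallback", "confidence": 0.4},
        {"name": "greet", "confidence": 0.35},
    ],
}
print(drop_fallback_prediction(result)["intent"]["name"])  # -> "greet"
```

With this kind of substitution, the evaluation report counts the example as a greet misclassification rather than an nlu_fallback prediction, which is the behavior the PR description asks for.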

Status (please check what you already did):

  • added some tests for the functionality
  • updated the documentation
  • updated the changelog (please check changelog for instructions)
  • reformat files using black (please check Readme for instructions)

@wochinge marked this pull request as ready for review on November 6, 2020 15:53
rasa/nlu/test.py (review comment, marked outdated and resolved)
@rasabot merged commit e2f6d02 into master on Nov 10, 2020
@rasabot deleted the fallback-classifier-improvement branch on November 10, 2020 09:12
Successfully merging this pull request may close these issues: Improvements to FallbackClassifier (#6285)