Training a core-only model and testing on test stories with entity annotations causes evaluation to crash. #8386
This also happens if you run on data with both the intent label and the user text annotated in the test stories. For example, in the retail-demo starter pack there are stories like this:

```yaml
- story: faq
  steps:
  - intent: greet
    user: |-
      hi
  - action: utter_greet
  - intent: faq
    user: |
      what kind of payment you take?
  - action: utter_faq
```

If you remove the `|`
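As an aside on the two block-scalar markers in the snippet above: the only difference between `|` and `|-` is the trailing newline of the resulting string. A small sketch (assuming PyYAML is installed; the keys are illustrative):

```python
import yaml

# YAML literal block scalars: `|` (clip chomping) keeps a single
# trailing newline, while `|-` (strip chomping) removes it.
clipped = yaml.safe_load("user: |\n  what kind of payment you take?\n")
stripped = yaml.safe_load("user: |-\n  hi\n")

print(repr(clipped["user"]))   # 'what kind of payment you take?\n'
print(repr(stripped["user"]))  # 'hi'
```

So whether the annotation uses `|` or `|-` only changes whether the user text ends in a newline, not its visible content.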
@TyDunn I'd like to lobby for this one because the cost/benefit ratio is, I think, very favourable. Hacking around the bug took me ~3 lines of code (though a clean solution can still be complex, who knows). Once fixed, this would unblock our Core regression testing on the big Sara test set (which is the best Core dataset that we've got right now).
I also just ran into this issue while working on a bot for IntentTED evaluation.
Would also be helpful for sanity checking carbon bot (not crucial, but nice to have) |
We have estimated this and are picking it up next sprint.
Hi @wochinge, I started having a look into this issue. I think this has to do with the fact that the … The question I have is whether the wrong method was used here or if an implementation is required for …
@joejuzl implemented that to allow testing the extraction of entities by end-to-end policies (…)
The body is only …
Bugfix PR merged 😃 |
Rasa version: 2.4.2
Rasa SDK version (if used & relevant):
Rasa X version (if used & relevant):
Python version: 3.7.6
Operating system (windows, osx, ...): macos, Darwin-20.2.0-x86_64-i386-64bit
Issue:
Training a non-e2e, core-only model and evaluating it on test stories with entity annotations causes the evaluation to crash. We would like to be able to add starter-pack datasets like the insurance-demo to the model regression tests, but this bug prevents us from doing regression tests on core policies in isolation.
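A minimal reproduction under this setup might look like the following (Rasa 2.x CLI; the test-stories path is an assumption, not from the report):

```shell
# Train only the dialogue policies, without an NLU pipeline
# (i.e. a non-end-to-end, core-only model).
rasa train core

# Evaluate the core model against test stories. If the stories
# contain entity annotations, this step crashes on the affected versions.
rasa test core --stories tests/test_stories.yml
```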
Error (including full traceback):
Command or request that led to error:
Content of configuration file (config.yml) (if relevant):
Content of domain file (domain.yml) (if relevant):