Invalid argument: ConcatOp : Dimensions of inputs should match #1140
This last warning looks very suspicious:
If it mixes up the dims here and marks them incorrectly as the same, this could explain such an error.
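For illustration, a minimal standalone TensorFlow sketch (with made-up shapes, not the actual RETURNN code) of how two axes being wrongly treated as the same dim leads to exactly this error:

```python
import tensorflow as tf

# Two tensors whose time axes really differ (7 vs. 5 frames),
# but which a dim tag mix-up would mark as the same dim.
enc = tf.zeros((3, 7, 4))  # [batch, enc_time, feature]
dec = tf.zeros((3, 5, 4))  # [batch, dec_time, feature]

# Concatenating on the feature axis requires all other axes to match,
# so this raises (message approximate):
#   InvalidArgumentError: ConcatOp : Dimensions of inputs should match:
#   shape[0] = [3,7,4] vs. shape[1] = [3,5,4]
out = tf.concat([enc, dec], axis=-1)
```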
I temporarily made that dim tag warning an exception (see also #1141), and this is the stacktrace:
You can see the problem from that log. Specifically this:
Hm, this is tricky. We have a case where the original config was clearly incorrect, but due to bugs (or rather, missing extra checks) it worked, because in the end all the broken parts were not used (although the layers were still constructed). So, what should we do? Be strict about it, which in this case breaks an existing config, or make sure that existing configs keep working? Or rather, it's already broken, so the question is more whether to fix RETURNN to make it work again or not. Specifically, the
So in training, all layers are moved out of the loop, and this is the target seq. But the loop goes over the enc seq, so it has a different seq len. So the
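As a conceptual sketch of where the two different time axes come from (hypothetical lengths, not values from the log):

```python
import tensorflow as tf

# Hypothetical per-sequence lengths in one batch of 3:
enc_seq_len = tf.constant([7, 6, 7])  # the rec loop iterates over these
dec_seq_len = tf.constant([5, 5, 4])  # lengths of the target sequences

# The loop output is padded to max(enc_seq_len); the layers moved out of
# the loop produce output padded to max(dec_seq_len):
loop_out = tf.zeros((3, 7, 4))   # [batch, enc_time, dim]
moved_out = tf.zeros((3, 5, 4))  # [batch, dec_time, dim]

# If the dim tags wrongly mark enc_time and dec_time as the same dim,
# any op relying on that shared axis (masking, concat, elementwise
# combination) sees 7 vs. 5 frames and fails as above.
```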
Full log and partial config are here.
I don't really know what the problem is yet.
In the feed dict, the Neg_1 entry looks suspicious: this could be some dim tag mix-up.
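One way to spot such a mismatch (a hypothetical debugging helper, not part of RETURNN) is to dump the shape of every feed-dict entry and check which sequence-length values disagree:

```python
def dump_feed_dict(feed_dict):
    """Print name, shape and dtype of each feed-dict entry, to spot a
    value (e.g. one coming from a Neg_1 op) whose length does not match
    the dim tag it is supposed to correspond to."""
    for placeholder, value in feed_dict.items():
        print("%s: shape=%r dtype=%r"
              % (placeholder.name, getattr(value, "shape", None),
                 getattr(value, "dtype", None)))
```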
In another run, I also saw this:
And:
Net construction log: