
Fix the bug of dropout and prenet. #205

Merged 2 commits into keithito:master on Aug 31, 2018

Conversation

npuichigo (Contributor)

I have tested that with both of the bugs fixed, good alignments can be learned with reduction factor 2 easily.

We should put DecoderPrenetWrapper outside AttentionWrapper. Otherwise, AttentionWrapper concatenates the input and the previous attention vector before calling its internal cell, so the dropout in DecoderPrenetWrapper drops 50% of the information in the previous attention vector.
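Below is a minimal before/after sketch of the cell ordering described above, using the TF 1.x contrib APIs and the repo's DecoderPrenetWrapper. The GRU/attention sizes and the exact constructor arguments are illustrative and may not match tacotron.py exactly; encoder_outputs and is_training are assumed to be in scope.

```python
from tensorflow.contrib.rnn import GRUCell
from tensorflow.contrib.seq2seq import AttentionWrapper, BahdanauAttention
from models.rnn_wrappers import DecoderPrenetWrapper  # wrapper defined in this repo

# Before (buggy): the prenet is the AttentionWrapper's inner cell, so the wrapper
# first concatenates [decoder input; previous attention context] and only then
# calls the prenet -- its 50% dropout therefore also zeroes half of the context.
buggy_cell = AttentionWrapper(
    DecoderPrenetWrapper(GRUCell(256), is_training),
    BahdanauAttention(256, encoder_outputs),
    alignment_history=True,
    output_attention=False)

# After (fixed): the prenet wraps the AttentionWrapper from the outside, so
# dropout touches only the decoder input; the attention context is concatenated
# afterwards, untouched.
fixed_cell = DecoderPrenetWrapper(
    AttentionWrapper(
        GRUCell(256),
        BahdanauAttention(256, encoder_outputs),
        alignment_history=True,
        output_attention=False),
    is_training)
```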
@begeekmyfriend commented Aug 31, 2018

Hi, I would like to ask: if we cut off half of the samples in metadata.csv of LJSpeech, can we still get alignment with your fix?

@npuichigo (Contributor, Author)

LJSpeech is about 23 hours, and I think maybe 10+ hours of data is enough to learn the alignments.

@keithito (Owner)

Thanks for the fixes!

keithito merged commit 231d6d7 into keithito:master on Aug 31, 2018
@LearnedVector

@npuichigo have you tested this with a reduction factor of 5 (outputs_per_step=5)? With this change, my model doesn't want to align. It usually aligns at about 10k steps; it's currently at 30k and still not aligning correctly.

[alignment image: step-30000-align]

Change found here:
https://github.com/MycroftAI/mimic2/blob/master/models/tacotron.py#L51
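For context, the reduction factor r discussed in this thread is the repo's outputs_per_step hyperparameter: the decoder emits r output frames per decoder step. A minimal sketch of how it is set follows; the real hparams.py defines many more fields, and the value shown simply mirrors the r=5 run mentioned above.

```python
from tensorflow.contrib.training import HParams

# Sketch only -- the actual hparams.py in the repo defines many more settings.
hparams = HParams(
    outputs_per_step=5,  # reduction factor r: frames generated per decoder step
)
```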

@npuichigo (Contributor, Author)

@LearnedVector I tested on reduction factor 2 with LJSpeech and alignments can be learned after about 20k steps.
Here's the alignment at 32k steps with r=2:
[alignment image]
