Inconsistent experiment reproduction results #7
Comments
@wljcode Relevant logs are uploaded for comparison.
@wljcode
Thank you for your reply. Due to GPU memory limitations, we have not continued the reproduction work recently. We will finish reproducing your work once the equipment is ready!
Hi, @ycmin95, thanks for your great work.
Hi, @sunke123, thanks for your attention to our work. The parameter label_smoothing was used in our early experiments on iterative training, and I forgot to delete it. I will correct this in the next update.
@ycmin95
Hi, @sunke123,
Hello, I downloaded the code and retrained the model, but after several epochs the Dev WER is still 100%. I set the batch size to 1 and the learning rate to 0.000010. Could you give me some advice? Thanks.
@herochen7372 |
@ycmin95 |
I noticed in your log that there are both Dev WER and Test WER for each training epoch, but mine shows only Dev WER.
Hello, @ycmin95. I noticed "# ConvCTC: 1.0" in the baseline.yaml file. I uncommented ConvCTC: 1.0 and retrained for 80 epochs, but the results were even worse.
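For readers unfamiliar with how a config weight like ConvCTC: 1.0 typically enters training, below is a minimal, hypothetical PyTorch sketch of combining a main sequence-level CTC loss with an auxiliary convolutional-feature CTC loss via per-loss weights. The names SeqCTC/ConvCTC and the shapes are assumptions for illustration, not the repo's actual code:

```python
import torch
import torch.nn as nn

ctc = nn.CTCLoss(blank=0, zero_infinity=True)

T, B, C = 50, 2, 10  # time steps, batch size, vocab size (incl. blank)
# Log-probabilities from two heads: the main sequence head and an
# auxiliary head on the convolutional features (both hypothetical here).
log_probs_main = torch.randn(T, B, C, requires_grad=True).log_softmax(-1)
log_probs_conv = torch.randn(T, B, C, requires_grad=True).log_softmax(-1)
targets = torch.randint(1, C, (B, 8))              # label indices (no blank)
input_lengths = torch.full((B,), T, dtype=torch.long)
target_lengths = torch.full((B,), 8, dtype=torch.long)

# Weights as they might appear in a yaml config (assumed names).
weights = {"SeqCTC": 1.0, "ConvCTC": 1.0}

loss = (weights["SeqCTC"] * ctc(log_probs_main, targets, input_lengths, target_lengths)
        + weights["ConvCTC"] * ctc(log_probs_conv, targets, input_lengths, target_lengths))
loss.backward()  # gradients flow into both heads
print(loss.item())
```

Setting a weight to 0.0 disables that term, which is one way to check whether the auxiliary loss itself is what hurt the 80-epoch run.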
Hi, @kido1412y2y |
home/pickleball/公共的/oldli/VAC_CSLR-main |
Hello, we downloaded your code and re-ran the proposed VAC for 50 epochs (without BN); the best result was only 35.1% WER. We also adjusted the loss weights in the code and ran the baseline (without BN), and found that those results also differ considerably from the paper. Could this be a code version mismatch, or is our training time too short?
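Since the thread compares WER numbers (100%, 35.1%), here is a minimal self-contained sketch of how word error rate is computed as a Levenshtein distance over word sequences. This is illustrative only, not the repository's evaluation script (which follows the standard PHOENIX-style sclite-like protocol):

```python
def wer(ref: str, hyp: str) -> float:
    """Word error rate: edit distance between word sequences,
    normalized by reference length. Illustrative sketch only."""
    r, h = ref.split(), hyp.split()
    # d[i][j] = edit distance between r[:i] and h[:j]
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i  # deletions
    for j in range(len(h) + 1):
        d[0][j] = j  # insertions
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1  # substitution cost
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)
    return d[len(r)][len(h)] / max(len(r), 1)

print(wer("a b c d", "a x c"))  # 1 substitution + 1 deletion over 4 words -> 0.5
```

A 100% Dev WER early in training usually means the model is emitting only blanks (or empty hypotheses), so every reference word counts as a deletion; that is a different failure mode from converging to a WER a few points above the paper's.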