batch average ctc loss (#567)
* When the loss is divided by the batch size, with an adjusted learning rate and more epochs, the loss decreases further and the CER is lower than before.
* Since the loss decreases more when it is divided by the batch size, a smaller LM alpha can work better.
* A smaller LM alpha reduces the CER further.
* alpha 2.2, CER 0.077478
* alpha 1.9, CER 0.077249
* Larger LibriSpeech learning rate for the batch-averaged CTC loss.
* Since the loss decreases and the model is more confident, a smaller LM alpha is used.

pull/570/head
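The core of this change, dividing the summed per-utterance CTC loss by the batch size, can be sketched in plain Python. This is a minimal illustration, not the repository's actual API; the function name and inputs are hypothetical.

```python
# Hedged sketch of batch-averaging a CTC loss (names are hypothetical,
# not the repository's actual code). Dividing the summed loss by the
# batch size rescales the gradient by 1/batch_size, which is why the
# learning rate had to be retuned alongside this change.
def batch_average_ctc_loss(per_utterance_losses):
    """Return the mean of per-utterance CTC losses over the batch."""
    batch_size = len(per_utterance_losses)
    return sum(per_utterance_losses) / batch_size

# Example: a batch of three utterances.
print(batch_average_ctc_loss([12.0, 8.0, 10.0]))  # 10.0
```

Because the averaged loss (and its gradient) is smaller by a factor of the batch size, a comparable effective step requires a larger learning rate, matching the "large librispeech lr" adjustment in this commit.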
parent 258307df9b
commit e0a87a5ab1
@@ -1,7 +1,7 @@
 # Aishell-1
 
 ## CTC
 
-| Model | Config | Test set | CER |
-| --- | --- | --- | --- |
-| DeepSpeech2 | conf/deepspeech2.yaml | test | 0.078977 |
-| DeepSpeech2 | release 1.8.5 | test | 0.080447 |
+| Model | Config | Test Set | CER | Valid Loss |
+| --- | --- | --- | --- | --- |
+| DeepSpeech2 | conf/deepspeech2.yaml | test | 0.077249 | 7.036566 |
+| DeepSpeech2 | release 1.8.5 | test | 0.087004 | 8.575452 |
@@ -1,7 +1,7 @@
 # LibriSpeech
 
 ## CTC
 
-| Model | Config | Test set | WER |
-| --- | --- | --- | --- |
-| DeepSpeech2 | conf/deepspeech2.yaml | test-clean | 0.073973 |
-| DeepSpeech2 | release 1.8.5 | test-clean | 0.074939 |
+| Model | Config | Test Set | WER | Valid Loss |
+| --- | --- | --- | --- | --- |
+| DeepSpeech2 | conf/deepspeech2.yaml | test-clean | 0.069357 | 15.078561 |
+| DeepSpeech2 | release 1.8.5 | test-clean | 0.074939 | 15.351633 |