PaddleSpeech/examples/librispeech/s2
README.md

# LibriSpeech

## Transformer

| Model | Params | GPUS | Averaged Model | Config | Augmentation | Loss |
| --- | --- | --- | --- | --- | --- | --- |
| transformer | 32.52 M | 8 Tesla V100-SXM2-32GB | 10-best val_loss | conf/transformer.yaml | spec_aug | 6.3197922706604 |
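The "10-best val_loss" averaged model refers to averaging the parameters of the 10 checkpoints with the lowest validation loss before decoding. A minimal sketch of that idea, using plain-Python state dicts (the real recipe averages saved Paddle checkpoints; the function name and toy data below are illustrative):

```python
# Sketch of N-best checkpoint averaging: element-wise mean of the
# parameters of the checkpoints with the lowest validation loss.

def average_checkpoints(checkpoints):
    """Average a list of state dicts mapping name -> list of floats."""
    n = len(checkpoints)
    avg = {}
    for name in checkpoints[0]:
        # zip the corresponding parameter values across checkpoints
        avg[name] = [sum(vals) / n for vals in zip(*(ck[name] for ck in checkpoints))]
    return avg

# Toy example: two "checkpoints" with a single two-element parameter.
ck1 = {"w": [1.0, 2.0]}
ck2 = {"w": [3.0, 4.0]}
print(average_checkpoints([ck1, ck2]))  # {'w': [2.0, 3.0]}
```

Averaging several good checkpoints typically yields a slightly better single model than any individual checkpoint, at no extra inference cost.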

| Test Set | Decode Method | #Snt | #Wrd | Corr | Sub | Del | Ins | Err | S.Err |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| test-clean | attention | 2620 | 52576 | 96.4 | 2.5 | 1.1 | 0.4 | 4.0 | 34.7 |
| test-clean | ctc_greedy_search | 2620 | 52576 | 95.9 | 3.7 | 0.4 | 0.5 | 4.6 | 48.0 |
| test-clean | ctc_prefix_beamsearch | 2620 | 52576 | 95.9 | 3.7 | 0.4 | 0.5 | 4.6 | 47.6 |
| test-clean | attention_rescore | 2620 | 52576 | 96.8 | 2.9 | 0.3 | 0.4 | 3.7 | 38.0 |

## JoinCTC

| Test Set | Decode Method | #Snt | #Wrd | Corr | Sub | Del | Ins | Err | S.Err |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| test-clean | join_ctc_only_att | 2620 | 52576 | 96.1 | 2.5 | 1.4 | 0.4 | 4.4 | 34.7 |
| test-clean | join_ctc_w/o_lm | 2620 | 52576 | 97.2 | 2.6 | 0.3 | 0.4 | 3.2 | 34.9 |
| test-clean | join_ctc_w_lm | 2620 | 52576 | 97.9 | 1.8 | 0.2 | 0.3 | 2.4 | 27.8 |
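The joint-CTC rows combine the attention decoder score with the CTC score during decoding, and the w_lm row additionally mixes in an external language model score. A hedged sketch of the usual weighted-sum combination; the weight values below are illustrative defaults, not the ones used in this recipe:

```python
# Sketch of joint CTC/attention hypothesis scoring: a weighted sum of
# the attention log-prob, the CTC log-prob, and (optionally) an
# external LM log-prob. Weights are illustrative assumptions.

def joint_score(att, ctc, lm=0.0, ctc_weight=0.3, lm_weight=0.5):
    """Return the combined log-score for one hypothesis."""
    return (1.0 - ctc_weight) * att + ctc_weight * ctc + lm_weight * lm

# "w/o_lm": the LM term contributes nothing (lm defaults to 0.0)
score_no_lm = joint_score(att=-2.0, ctc=-3.0)
# "w_lm": the same hypothesis with an LM log-prob added in
score_with_lm = joint_score(att=-2.0, ctc=-3.0, lm=-1.0)
print(score_no_lm, score_with_lm)
```

As the table shows, each added score source (CTC alongside attention, then the external LM) lowers the error rate on test-clean.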

Compared with ESPnet, we use 8 GPUs, but our model size (aheads=4, adim=256) is smaller than ESPnet's.