# Released Models
## Acoustic Model Released
Acoustic Model | Training Data | Token-based | Size | Descriptions | CER or WER | Hours of speech
:-------------:| :------------:| :-----: | -----: | :----------------- | :---------- | :---------
[Ds2 Online Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/aishell.s0.ds_online.5rnn.debug.tar.gz) | Aishell Dataset | Char-based | 345 MB | 2 Conv + 5 LSTM layers with only forward direction | 0.0824 | 151 h
[Ds2 Offline Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/aishell.s0.ds2.offline.cer6p65.release.tar.gz)| Aishell Dataset | Char-based | 306 MB | 2 Conv + 3 bidirectional GRU layers | 0.065 | 151 h
[Conformer Online Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s1/aishell.chunk.release.tar.gz) | Aishell Dataset | Char-based | 283 MB | Encoder: Conformer, Decoder: Transformer, Decoding method: Attention + CTC | 0.0594 | 151 h
[Conformer Offline Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s1/aishell.release.tar.gz) | Aishell Dataset | Char-based | 284 MB | Encoder: Conformer, Decoder: Transformer, Decoding method: Attention | 0.0547 | 151 h
[Conformer Librispeech Model](https://deepspeech.bj.bcebos.com/release2.1/librispeech/s1/conformer.release.tar.gz) | Librispeech Dataset | Word-based | 287 MB | Encoder: Conformer, Decoder: Transformer, Decoding method: Attention | 0.0325 | 960 h
[Transformer Librispeech Model](https://deepspeech.bj.bcebos.com/release2.1/librispeech/s1/transformer.release.tar.gz) | Librispeech Dataset | Word-based | 195 MB | Encoder: Transformer, Decoder: Transformer, Decoding method: Attention | 0.0544 | 960 h
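The error column mixes metrics: the Aishell models report CER while the Librispeech models report WER, so the numbers are only comparable within a dataset. As a minimal sketch of reading the table programmatically, the snippet below picks the Aishell release with the lowest CER; the dictionary values are copied from the table above, and the helper function is purely illustrative (not part of the toolkit):

```python
# CER figures for the Aishell releases, copied from the table above.
# This lookup and helper are illustrative only, not a toolkit API.
aishell_cer = {
    "Ds2 Online Aishell Model": 0.0824,
    "Ds2 Offline Aishell Model": 0.065,
    "Conformer Online Aishell Model": 0.0594,
    "Conformer Offline Aishell Model": 0.0547,
}

def lowest_cer(models):
    """Return the (name, CER) pair with the smallest character error rate."""
    return min(models.items(), key=lambda kv: kv[1])

print(lowest_cer(aishell_cer))  # the offline Conformer has the lowest CER, 0.0547
```

Note that the same comparison across the Librispeech rows would rank by WER instead, and a lower CER on Aishell says nothing about relative quality on English data.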
## Language Model Released