SpeechX -- All in One Speech Task Inference

Environment

We develop under:

  • docker - registry.baidubce.com/paddlepaddle/paddle:2.2.2-gpu-cuda10.2-cudnn7
  • os - Ubuntu 16.04.7 LTS
  • gcc/g++/gfortran - 8.2.0
  • cmake - 3.16.0
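
A quick way to double-check the toolchain inside the container is to print the versions and compare them against the list above (a minimal sketch; these are standard commands):

gcc --version
g++ --version
gfortran --version
cmake --version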

We make sure everything works fine under docker, and recommend using it for development and deployment.

Build

  1. First, launch the docker container.
docker run --privileged  --net=host --ipc=host -it --rm -v $PWD:/workspace --name=dev registry.baidubce.com/paddlepaddle/paddle:2.2.2-gpu-cuda10.2-cudnn7 /bin/bash
  • You can find more Paddle docker images here.

  • If you only want to work on CPU, please download the corresponding CPU image and use docker instead of nvidia-docker, for example as sketched below.
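
For reference, a CPU-only launch might look like the following. The CPU image tag here is an assumption; check the Paddle registry for the tag that matches your Paddle version:

docker run --net=host --ipc=host -it --rm -v $PWD:/workspace --name=dev registry.baidubce.com/paddlepaddle/paddle:2.2.2 /bin/bash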

  2. Build speechx and examples.

Do not source venv.

pushd /path/to/speechx
./build.sh
  3. Go to examples and have fun.

For more details, please see the README.md under examples.
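
As a rough sketch of that workflow (the example directory name and the run.sh entry point are assumptions; use whatever recipe you find under examples):

pushd /path/to/speechx/examples/some_example
./run.sh
popd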

Valgrind (Optional)

If you are using docker, please make sure --privileged is set when running docker run.

  • If you see Fatal error at startup: a function redirection which is mandatory for this platform-tool combination cannot be set up, install libc6-dbg:
apt-get install libc6-dbg
  • Install
pushd tools
./setup_valgrind.sh
popd
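
After installation, a typical memcheck run looks like this (the binary path below is only an illustration):

valgrind --tool=memcheck --leak-check=full ./path/to/your_binary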

TODO

Deepspeech2 with linear feature

  • DecibelNormalizer: there is a small difference between offline and online db norm. The online db norm reads the feature chunk by chunk, so the number of samples it normalizes over differs from the offline db norm. In normalizer.cc:73, samples.size() is therefore different, which causes the difference in results.