diff --git a/.readthedocs.yml b/.readthedocs.yml
index dc38a20fc..e922891e1 100644
--- a/.readthedocs.yml
+++ b/.readthedocs.yml
@@ -21,5 +21,6 @@ python:
version: 3.7
install:
- requirements: docs/requirements.txt
-
-
+ - method: setuptools
+ path: .
+ system_packages: true
\ No newline at end of file
diff --git a/MANIFEST.in b/MANIFEST.in
new file mode 100644
index 000000000..7c9f4165c
--- /dev/null
+++ b/MANIFEST.in
@@ -0,0 +1,2 @@
+include paddlespeech/t2s/exps/*.txt
+include paddlespeech/t2s/frontend/*.yaml
\ No newline at end of file
diff --git a/README.md b/README.md
index c9d4796c8..e35289e2b 100644
--- a/README.md
+++ b/README.md
@@ -1,3 +1,4 @@
+
([简体中文](./README_cn.md)|English)
@@ -24,14 +25,16 @@
| Documents
| Models List
| AIStudio Courses
- | Paper
+ | NAACL2022 Best Demo Award Paper
| Gitee
------------------------------------------------------------------------------------
-**PaddleSpeech** is an open-source toolkit on [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) platform for a variety of critical tasks in speech and audio, with the state-of-art and influential models.
+**PaddleSpeech** is an open-source toolkit on [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) platform for a variety of critical tasks in speech and audio, with the state-of-art and influential models.
+
+**PaddleSpeech** won the [NAACL2022 Best Demo Award](https://2022.naacl.org/blog/best-demo-award/). Please check out our paper on [arXiv](https://arxiv.org/abs/2205.12007).
##### Speech Recognition
@@ -176,7 +179,7 @@ Via the easy-to-use, efficient, flexible and scalable implementation, our vision
## Installation
-We strongly recommend our users to install PaddleSpeech in **Linux** with *python>=3.7*.
+We strongly recommend that users install PaddleSpeech on **Linux** with *python>=3.7* and *paddlepaddle>=2.3.1*.
Up to now, **Linux** supports CLI for the all our tasks, **Mac OSX** and **Windows** only supports PaddleSpeech CLI for Audio Classification, Speech-to-Text and Text-to-Speech. To install `PaddleSpeech`, please see [installation](./docs/source/install.md).
@@ -494,6 +497,14 @@ PaddleSpeech supports a series of most popular models. They are summarized in [r
ge2e-fastspeech2-aishell3
+
+ End-to-End
+ VITS
+ CSMSC
+
+ VITS-csmsc
+
+
@@ -688,6 +699,7 @@ You are warmly welcome to submit questions in [discussions](https://github.com/P
## Acknowledgement
+- Many thanks to [BarryKCL](https://github.com/BarryKCL) for improving the TTS Chinese frontend based on [G2PW](https://github.com/GitYCC/g2pW).
- Many thanks to [yeyupiaoling](https://github.com/yeyupiaoling)/[PPASR](https://github.com/yeyupiaoling/PPASR)/[PaddlePaddle-DeepSpeech](https://github.com/yeyupiaoling/PaddlePaddle-DeepSpeech)/[VoiceprintRecognition-PaddlePaddle](https://github.com/yeyupiaoling/VoiceprintRecognition-PaddlePaddle)/[AudioClassification-PaddlePaddle](https://github.com/yeyupiaoling/AudioClassification-PaddlePaddle) for years of attention, constructive advice and great help.
- Many thanks to [mymagicpower](https://github.com/mymagicpower) for the Java implementation of ASR upon [short](https://github.com/mymagicpower/AIAS/tree/main/3_audio_sdks/asr_sdk) and [long](https://github.com/mymagicpower/AIAS/tree/main/3_audio_sdks/asr_long_audio_sdk) audio files.
- Many thanks to [JiehangXie](https://github.com/JiehangXie)/[PaddleBoBo](https://github.com/JiehangXie/PaddleBoBo) for developing Virtual Uploader(VUP)/Virtual YouTuber(VTuber) with PaddleSpeech TTS function.
@@ -696,6 +708,8 @@ You are warmly welcome to submit questions in [discussions](https://github.com/P
- Many thanks to [awmmmm](https://github.com/awmmmm) for contributing fastspeech2 aishell3 conformer pretrained model.
- Many thanks to [phecda-xu](https://github.com/phecda-xu)/[PaddleDubbing](https://github.com/phecda-xu/PaddleDubbing) for developing a dubbing tool with GUI based on PaddleSpeech TTS model.
- Many thanks to [jerryuhoo](https://github.com/jerryuhoo)/[VTuberTalk](https://github.com/jerryuhoo/VTuberTalk) for developing a GUI tool based on PaddleSpeech TTS and code for making datasets from videos based on PaddleSpeech ASR.
+- Many thanks to [vpegasus](https://github.com/vpegasus)/[xuesebot](https://github.com/vpegasus/xuesebot) for developing a Rasa chatbot, which is able to speak and listen thanks to PaddleSpeech.
+- Many thanks to [chenkui164](https://github.com/chenkui164)/[FastASR](https://github.com/chenkui164/FastASR) for the C++ inference implementation of PaddleSpeech ASR.
Besides, PaddleSpeech depends on a lot of open source repositories. See [references](./docs/source/reference.md) for more information.
diff --git a/README_cn.md b/README_cn.md
index c751b061d..1c6a949fd 100644
--- a/README_cn.md
+++ b/README_cn.md
@@ -1,3 +1,4 @@
+
(简体中文|[English](./README.md))
@@ -19,13 +20,14 @@
@@ -34,6 +36,11 @@
------------------------------------------------------------------------------------
**PaddleSpeech** 是基于飞桨 [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) 的语音方向的开源模型库,用于语音和音频中的各种关键任务的开发,包含大量基于深度学习前沿和有影响力的模型,一些典型的应用示例如下:
+
+**PaddleSpeech** 荣获 [NAACL2022 Best Demo Award](https://2022.naacl.org/blog/best-demo-award/),论文请访问 [Arxiv](https://arxiv.org/abs/2205.12007)。
+
+### 效果展示
+
##### 语音识别
@@ -150,7 +157,7 @@
本项目采用了易用、高效、灵活以及可扩展的实现,旨在为工业应用、学术研究提供更好的支持,实现的功能包含训练、推断以及测试模块,以及部署过程,主要包括
- 📦 **易用性**: 安装门槛低,可使用 [CLI](#quick-start) 快速开始。
- 🏆 **对标 SoTA**: 提供了高速、轻量级模型,且借鉴了最前沿的技术。
-- 🏆 **流式ASR和TTS系统**:工业级的端到端流式识别、流式合成系统。
+- 🏆 **流式 ASR 和 TTS 系统**:工业级的端到端流式识别、流式合成系统。
- 💯 **基于规则的中文前端**: 我们的前端包含文本正则化和字音转换(G2P)。此外,我们使用自定义语言规则来适应中文语境。
- **多种工业界以及学术界主流功能支持**:
- 🛎️ 典型音频任务: 本工具包提供了音频任务如音频分类、语音翻译、自动语音识别、文本转语音、语音合成、声纹识别、KWS等任务的实现。
@@ -159,6 +166,7 @@
### 近期更新
+
- 👑 2022.05.13: PaddleSpeech 发布 [PP-ASR](./docs/source/asr/PPASR_cn.md) 流式语音识别系统、[PP-TTS](./docs/source/tts/PPTTS_cn.md) 流式语音合成系统、[PP-VPR](docs/source/vpr/PPVPR_cn.md) 全链路声纹识别系统
- 👏🏻 2022.05.06: PaddleSpeech Streaming Server 上线! 覆盖了语音识别(标点恢复、时间戳),和语音合成。
- 👏🏻 2022.05.06: PaddleSpeech Server 上线! 覆盖了声音分类、语音识别、语音合成、声纹识别,标点恢复。
@@ -177,61 +185,195 @@
+
## 安装
我们强烈建议用户在 **Linux** 环境下,*3.7* 以上版本的 *python* 上安装 PaddleSpeech。
-目前为止,**Linux** 支持声音分类、语音识别、语音合成和语音翻译四种功能,**Mac OSX、 Windows** 下暂不支持语音翻译功能。 想了解具体安装细节,可以参考[安装文档](./docs/source/install_cn.md)。
+
+### 相关依赖
++ gcc >= 4.8.5
++ paddlepaddle >= 2.3.1
++ python >= 3.7
++ linux(推荐)、mac、windows
+
+PaddleSpeech 依赖于 paddlepaddle,安装可以参考 [paddlepaddle 官网](https://www.paddlepaddle.org.cn/),请根据自己机器的情况选择合适的版本。这里给出 CPU 版本示例,其它版本可参考官网说明进行安装。
+
+```shell
+pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple
+```
+
+PaddleSpeech 快速安装方式有两种:一种是 pip 安装,一种是源码编译(推荐)。
+
+### pip 安装
+```shell
+pip install pytest-runner
+pip install paddlespeech
+```
+
+### 源码编译
+```shell
+git clone https://github.com/PaddlePaddle/PaddleSpeech.git
+cd PaddleSpeech
+pip install pytest-runner
+pip install .
+```
+
+更多安装问题,如 conda 环境、librosa 依赖的系统库、gcc 环境问题、kaldi 安装等,可以参考这篇[安装文档](docs/source/install_cn.md);如安装时遇到问题,可以在 [#2150](https://github.com/PaddlePaddle/PaddleSpeech/issues/2150) 上留言以及查找相关问题。
## 快速开始
-安装完成后,开发者可以通过命令行快速开始,改变 `--input` 可以尝试用自己的音频或文本测试。
+安装完成后,开发者可以通过命令行或者 Python 快速开始,命令行模式下改变 `--input` 可以尝试用自己的音频或文本测试,支持 16k 采样率的 wav 格式音频。
-**声音分类**
+你也可以在 `aistudio` 中快速体验 👉🏻 [PaddleSpeech API Demo](https://aistudio.baidu.com/aistudio/projectdetail/4281335?shared=1)。
+
+测试音频示例下载
```shell
-paddlespeech cls --input input.wav
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
```
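命令行与 Python API 均要求输入为 16k 采样率的 wav 音频。下面给出一个用 Python 标准库检查采样率的小示例(示意代码,非 PaddleSpeech 自带工具,文件名为假设):

```python
import wave

def check_sample_rate(path, expected=16000):
    """检查 wav 文件采样率是否符合模型要求(示意代码)。"""
    with wave.open(path, "rb") as f:
        return f.getframerate() == expected

# 用法示例(假设 zh.wav 已按上文下载到当前目录):
# check_sample_rate("zh.wav")
```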
-**声纹识别**
+
+### 语音识别
+ (点击可展开)开源中文语音识别
+
+命令行一键体验
+
```shell
-paddlespeech vector --task spk --input input_16k.wav
+paddlespeech asr --lang zh --input zh.wav
+```
+
+Python API 一键预测
+
+```python
+>>> from paddlespeech.cli.asr.infer import ASRExecutor
+>>> asr = ASRExecutor()
+>>> result = asr(audio_file="zh.wav")
+>>> print(result)
+我认为跑步最重要的就是给我带来了身体健康
```
-**语音识别**
+
+
+### 语音合成
+
+ 开源中文语音合成
+
+输出 24k 采样率 wav 格式音频
+
+
+命令行一键体验
+
```shell
-paddlespeech asr --lang zh --input input_16k.wav
+paddlespeech tts --input "你好,欢迎使用百度飞桨深度学习框架!" --output output.wav
```
-**语音翻译** (English to Chinese)
+
+Python API 一键预测
+
+```python
+>>> from paddlespeech.cli.tts.infer import TTSExecutor
+>>> tts = TTSExecutor()
+>>> tts(text="今天天气十分不错。", output="output.wav")
+```
+- 语音合成的 web demo 已经集成进了 [Huggingface Spaces](https://huggingface.co/spaces). 请参考: [TTS Demo](https://huggingface.co/spaces/KPatrick/PaddleSpeechTTS)
+
+
+
+### 声音分类
+
+ 适配多场景的开放领域声音分类工具
+
+基于 AudioSet 数据集 527 个类别的声音分类模型
+
+命令行一键体验
+
```shell
-paddlespeech st --input input_16k.wav
+paddlespeech cls --input zh.wav
+```
+
+Python API 一键预测
+
+```python
+>>> from paddlespeech.cli.cls.infer import CLSExecutor
+>>> cls = CLSExecutor()
+>>> result = cls(audio_file="zh.wav")
+>>> print(result)
+Speech 0.9027186632156372
```
-**语音合成**
+
+
+
+### 声纹提取
+
+ 工业级声纹提取工具
+
+命令行一键体验
+
```shell
-paddlespeech tts --input "你好,欢迎使用百度飞桨深度学习框架!" --output output.wav
+paddlespeech vector --task spk --input zh.wav
+```
+
+Python API 一键预测
+
+```python
+>>> from paddlespeech.cli.vector import VectorExecutor
+>>> vec = VectorExecutor()
+>>> result = vec(audio_file="zh.wav")
+>>> print(result) # 187 维向量
+[ -0.19083306 9.474295 -14.122263 -2.0916545 0.04848729
+ 4.9295826 1.4780062 0.3733844 10.695862 3.2697146
+ -4.48199 -0.6617882 -9.170393 -11.1568775 -1.2358263 ...]
```
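拿到声纹向量后,通常用余弦相似度判断两段音频是否来自同一说话人。下面是一个纯 Python 的计算示例(示意代码,非 PaddleSpeech 的实际实现):

```python
import math

def cosine_similarity(a, b):
    """计算两个声纹向量的余弦相似度,值越接近 1 越可能是同一说话人。"""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```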
-- 语音合成的 web demo 已经集成进了 [Huggingface Spaces](https://huggingface.co/spaces). 请参考: [TTS Demo](https://huggingface.co/spaces/akhaliq/paddlespeech)
-**文本后处理**
- - 标点恢复
- ```bash
- paddlespeech text --task punc --input 今天的天气真不错啊你下午有空吗我想约你一起去吃饭
- ```
+
-**批处理**
+### 标点恢复
+
+ 一键恢复文本标点,可与 ASR 模型配合使用
+
+命令行一键体验
+
+```shell
+paddlespeech text --task punc --input 今天的天气真不错啊你下午有空吗我想约你一起去吃饭
```
-echo -e "1 欢迎光临。\n2 谢谢惠顾。" | paddlespeech tts
+
+Python API 一键预测
+
+```python
+>>> from paddlespeech.cli.text.infer import TextExecutor
+>>> text_punc = TextExecutor()
+>>> result = text_punc(text="今天的天气真不错啊你下午有空吗我想约你一起去吃饭")
+今天的天气真不错啊!你下午有空吗?我想约你一起去吃饭。
```
-**Shell管道**
-ASR + Punc:
+
+
+### 语音翻译
+
+ 端到端英译中语音翻译工具
+
+使用预编译的 kaldi 相关工具,只支持在 Ubuntu 系统中体验
+
+命令行一键体验
+
+```shell
+paddlespeech st --input en.wav
```
-paddlespeech asr --input ./zh.wav | paddlespeech text --task punc
+
+Python API 一键预测
+
+```python
+>>> from paddlespeech.cli.st.infer import STExecutor
+>>> st = STExecutor()
+>>> result = st(audio_file="en.wav")
+['我 在 这栋 建筑 的 古老 门上 敲门 。']
```
-更多命令行命令请参考 [demos](https://github.com/PaddlePaddle/PaddleSpeech/tree/develop/demos)
-> Note: 如果需要训练或者微调,请查看[语音识别](./docs/source/asr/quick_start.md), [语音合成](./docs/source/tts/quick_start.md)。
+
+
+
## 快速使用服务
-安装完成后,开发者可以通过命令行快速使用服务。
+安装完成后,开发者可以通过命令行一键启动语音识别、语音合成、音频分类三种服务。
**启动服务**
```shell
@@ -480,6 +622,15 @@ PaddleSpeech 的 **语音合成** 主要包含三个模块:文本前端、声
ge2e-fastspeech2-aishell3
+
+
+ 端到端
+ VITS
+ CSMSC
+
+ VITS-csmsc
+
+
@@ -600,6 +751,7 @@ PaddleSpeech 的 **语音合成** 主要包含三个模块:文本前端、声
语音合成模块最初被称为 [Parakeet](https://github.com/PaddlePaddle/Parakeet),现在与此仓库合并。如果您对该任务的学术研究感兴趣,请参阅 [TTS 研究概述](https://github.com/PaddlePaddle/PaddleSpeech/tree/develop/docs/source/tts#overview)。此外,[模型介绍](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/tts/models_introduction.md) 是了解语音合成流程的一个很好的指南。
+
## ⭐ 应用案例
- **[PaddleBoBo](https://github.com/JiehangXie/PaddleBoBo): 使用 PaddleSpeech 的语音合成模块生成虚拟人的声音。**
@@ -681,6 +833,7 @@ PaddleSpeech 的 **语音合成** 主要包含三个模块:文本前端、声
## 致谢
+- 非常感谢 [BarryKCL](https://github.com/BarryKCL) 基于 [G2PW](https://github.com/GitYCC/g2pW) 对 TTS 中文文本前端的优化。
- 非常感谢 [yeyupiaoling](https://github.com/yeyupiaoling)/[PPASR](https://github.com/yeyupiaoling/PPASR)/[PaddlePaddle-DeepSpeech](https://github.com/yeyupiaoling/PaddlePaddle-DeepSpeech)/[VoiceprintRecognition-PaddlePaddle](https://github.com/yeyupiaoling/VoiceprintRecognition-PaddlePaddle)/[AudioClassification-PaddlePaddle](https://github.com/yeyupiaoling/AudioClassification-PaddlePaddle) 多年来的关注和建议,以及在诸多问题上的帮助。
- 非常感谢 [mymagicpower](https://github.com/mymagicpower) 采用PaddleSpeech 对 ASR 的[短语音](https://github.com/mymagicpower/AIAS/tree/main/3_audio_sdks/asr_sdk)及[长语音](https://github.com/mymagicpower/AIAS/tree/main/3_audio_sdks/asr_long_audio_sdk)进行 Java 实现。
- 非常感谢 [JiehangXie](https://github.com/JiehangXie)/[PaddleBoBo](https://github.com/JiehangXie/PaddleBoBo) 采用 PaddleSpeech 语音合成功能实现 Virtual Uploader(VUP)/Virtual YouTuber(VTuber) 虚拟主播。
@@ -690,7 +843,8 @@ PaddleSpeech 的 **语音合成** 主要包含三个模块:文本前端、声
- 非常感谢 [phecda-xu](https://github.com/phecda-xu)/[PaddleDubbing](https://github.com/phecda-xu/PaddleDubbing) 基于 PaddleSpeech 的 TTS 模型搭建带 GUI 操作界面的配音工具。
- 非常感谢 [jerryuhoo](https://github.com/jerryuhoo)/[VTuberTalk](https://github.com/jerryuhoo/VTuberTalk) 基于 PaddleSpeech 的 TTS GUI 界面和基于 ASR 制作数据集的相关代码。
-
+- 非常感谢 [vpegasus](https://github.com/vpegasus)/[xuesebot](https://github.com/vpegasus/xuesebot) 基于 PaddleSpeech 的 ASR 与 TTS 设计的可听、说对话机器人。
+- 非常感谢 [chenkui164](https://github.com/chenkui164)/[FastASR](https://github.com/chenkui164/FastASR) 对 PaddleSpeech 的 ASR 进行 C++ 推理实现。
此外,PaddleSpeech 依赖于许多开源存储库。有关更多信息,请参阅 [references](./docs/source/reference.md)。
diff --git a/dataset/aidatatang_200zh/README.md b/dataset/aidatatang_200zh/README.md
index e6f1eefbd..addc323a6 100644
--- a/dataset/aidatatang_200zh/README.md
+++ b/dataset/aidatatang_200zh/README.md
@@ -1,4 +1,4 @@
-# [Aidatatang_200zh](http://www.openslr.org/62/)
+# [Aidatatang_200zh](http://openslr.elda.org/62/)
Aidatatang_200zh is a free Chinese Mandarin speech corpus provided by Beijing DataTang Technology Co., Ltd under Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License.
The contents and the corresponding descriptions of the corpus include:
diff --git a/dataset/aishell/README.md b/dataset/aishell/README.md
index 6770cd207..a7dd0cf32 100644
--- a/dataset/aishell/README.md
+++ b/dataset/aishell/README.md
@@ -1,3 +1,3 @@
-# [Aishell1](http://www.openslr.org/33/)
+# [Aishell1](http://openslr.elda.org/33/)
This Open Source Mandarin Speech Corpus, AISHELL-ASR0009-OS1, is 178 hours long. It is a part of AISHELL-ASR0009, of which utterance contains 11 domains, including smart home, autonomous driving, and industrial production. The whole recording was put in quiet indoor environment, using 3 different devices at the same time: high fidelity microphone (44.1kHz, 16-bit,); Android-system mobile phone (16kHz, 16-bit), iOS-system mobile phone (16kHz, 16-bit). Audios in high fidelity were re-sampled to 16kHz to build AISHELL- ASR0009-OS1. 400 speakers from different accent areas in China were invited to participate in the recording. The manual transcription accuracy rate is above 95%, through professional speech annotation and strict quality inspection. The corpus is divided into training, development and testing sets. ( This database is free for academic research, not in the commerce, if without permission. )
diff --git a/dataset/aishell/aishell.py b/dataset/aishell/aishell.py
index 7431fc083..ec43104db 100644
--- a/dataset/aishell/aishell.py
+++ b/dataset/aishell/aishell.py
@@ -31,7 +31,7 @@ from utils.utility import unpack
DATA_HOME = os.path.expanduser('~/.cache/paddle/dataset/speech')
-URL_ROOT = 'http://www.openslr.org/resources/33'
+URL_ROOT = 'http://openslr.elda.org/resources/33'
# URL_ROOT = 'https://openslr.magicdatatech.com/resources/33'
DATA_URL = URL_ROOT + '/data_aishell.tgz'
MD5_DATA = '2f494334227864a8a8fec932999db9d8'
diff --git a/dataset/librispeech/librispeech.py b/dataset/librispeech/librispeech.py
index 65cab2490..2d6f1763d 100644
--- a/dataset/librispeech/librispeech.py
+++ b/dataset/librispeech/librispeech.py
@@ -31,7 +31,7 @@ import soundfile
from utils.utility import download
from utils.utility import unpack
-URL_ROOT = "http://www.openslr.org/resources/12"
+URL_ROOT = "http://openslr.elda.org/resources/12"
#URL_ROOT = "https://openslr.magicdatatech.com/resources/12"
URL_TEST_CLEAN = URL_ROOT + "/test-clean.tar.gz"
URL_TEST_OTHER = URL_ROOT + "/test-other.tar.gz"
diff --git a/dataset/magicdata/README.md b/dataset/magicdata/README.md
index 083aee97b..4641a21d6 100644
--- a/dataset/magicdata/README.md
+++ b/dataset/magicdata/README.md
@@ -1,4 +1,4 @@
-# [MagicData](http://www.openslr.org/68/)
+# [MagicData](http://openslr.elda.org/68/)
MAGICDATA Mandarin Chinese Read Speech Corpus was developed by MAGIC DATA Technology Co., Ltd. and freely published for non-commercial use.
The contents and the corresponding descriptions of the corpus include:
diff --git a/dataset/mini_librispeech/mini_librispeech.py b/dataset/mini_librispeech/mini_librispeech.py
index 730c73a8b..0eb80bf8f 100644
--- a/dataset/mini_librispeech/mini_librispeech.py
+++ b/dataset/mini_librispeech/mini_librispeech.py
@@ -30,7 +30,7 @@ import soundfile
from utils.utility import download
from utils.utility import unpack
-URL_ROOT = "http://www.openslr.org/resources/31"
+URL_ROOT = "http://openslr.elda.org/resources/31"
URL_TRAIN_CLEAN = URL_ROOT + "/train-clean-5.tar.gz"
URL_DEV_CLEAN = URL_ROOT + "/dev-clean-2.tar.gz"
diff --git a/dataset/musan/musan.py b/dataset/musan/musan.py
index 2ac701bed..ae3430b2a 100644
--- a/dataset/musan/musan.py
+++ b/dataset/musan/musan.py
@@ -34,7 +34,7 @@ from utils.utility import unpack
DATA_HOME = os.path.expanduser('~/.cache/paddle/dataset/speech')
-URL_ROOT = 'https://www.openslr.org/resources/17'
+URL_ROOT = 'https://openslr.elda.org/resources/17'
DATA_URL = URL_ROOT + '/musan.tar.gz'
MD5_DATA = '0c472d4fc0c5141eca47ad1ffeb2a7df'
diff --git a/dataset/primewords/README.md b/dataset/primewords/README.md
index a4f1ed65d..dba51cec7 100644
--- a/dataset/primewords/README.md
+++ b/dataset/primewords/README.md
@@ -1,4 +1,4 @@
-# [Primewords](http://www.openslr.org/47/)
+# [Primewords](http://openslr.elda.org/47/)
This free Chinese Mandarin speech corpus set is released by Shanghai Primewords Information Technology Co., Ltd.
The corpus is recorded by smart mobile phones from 296 native Chinese speakers. The transcription accuracy is larger than 98%, at the confidence level of 95%. It is free for academic use.
diff --git a/dataset/rir_noise/rir_noise.py b/dataset/rir_noise/rir_noise.py
index 009175e5b..b1d475584 100644
--- a/dataset/rir_noise/rir_noise.py
+++ b/dataset/rir_noise/rir_noise.py
@@ -34,7 +34,7 @@ from utils.utility import unzip
DATA_HOME = os.path.expanduser('~/.cache/paddle/dataset/speech')
-URL_ROOT = '--no-check-certificate http://www.openslr.org/resources/28'
+URL_ROOT = '--no-check-certificate https://us.openslr.org/resources/28'
DATA_URL = URL_ROOT + '/rirs_noises.zip'
MD5_DATA = 'e6f48e257286e05de56413b4779d8ffb'
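As a quick sanity check on how the download URL is composed in `rir_noise.py` (a sketch; the `--no-check-certificate` wget flag embedded in `URL_ROOT` is omitted here): since `DATA_URL` appends the archive name, the root URL must not already contain it.

```python
# Sketch of the URL composition in rir_noise.py (wget flag prefix omitted).
URL_ROOT = 'https://us.openslr.org/resources/28'
DATA_URL = URL_ROOT + '/rirs_noises.zip'

# The archive name must appear exactly once in the final URL.
assert DATA_URL.count('rirs_noises.zip') == 1
```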
diff --git a/dataset/st-cmds/README.md b/dataset/st-cmds/README.md
index c7ae50e59..bbf85c3e7 100644
--- a/dataset/st-cmds/README.md
+++ b/dataset/st-cmds/README.md
@@ -1 +1 @@
-# [FreeST](http://www.openslr.org/38/)
+# [FreeST](http://openslr.elda.org/38/)
diff --git a/dataset/thchs30/README.md b/dataset/thchs30/README.md
index 6b59d663a..b488a3551 100644
--- a/dataset/thchs30/README.md
+++ b/dataset/thchs30/README.md
@@ -1,4 +1,4 @@
-# [THCHS30](http://www.openslr.org/18/)
+# [THCHS30](http://openslr.elda.org/18/)
This is the *data part* of the `THCHS30 2015` acoustic data
& scripts dataset.
diff --git a/dataset/thchs30/thchs30.py b/dataset/thchs30/thchs30.py
index cdfc0a75c..d41c0e175 100644
--- a/dataset/thchs30/thchs30.py
+++ b/dataset/thchs30/thchs30.py
@@ -32,7 +32,7 @@ from utils.utility import unpack
DATA_HOME = os.path.expanduser('~/.cache/paddle/dataset/speech')
-URL_ROOT = 'http://www.openslr.org/resources/18'
+URL_ROOT = 'http://openslr.elda.org/resources/18'
# URL_ROOT = 'https://openslr.magicdatatech.com/resources/18'
DATA_URL = URL_ROOT + '/data_thchs30.tgz'
TEST_NOISE_URL = URL_ROOT + '/test-noise.tgz'
diff --git a/demos/README.md b/demos/README.md
index 2a306df6b..72b70b237 100644
--- a/demos/README.md
+++ b/demos/README.md
@@ -12,6 +12,7 @@ This directory contains many speech applications in multiple scenarios.
* speech recognition - recognize text of an audio file
* speech server - Server for Speech Task, e.g. ASR,TTS,CLS
* streaming asr server - receive audio stream from websocket, and recognize to transcript.
+* streaming tts server - receive text from http or websocket, and stream back the synthesized audio data.
* speech translation - end to end speech translation
* story talker - book reader based on OCR and TTS
* style_fs2 - multi style control for FastSpeech2 model
diff --git a/demos/README_cn.md b/demos/README_cn.md
index 471342127..04fc1fa7d 100644
--- a/demos/README_cn.md
+++ b/demos/README_cn.md
@@ -10,8 +10,9 @@
* 元宇宙 - 基于语音合成的 2D 增强现实。
* 标点恢复 - 通常作为语音识别的文本后处理任务,为一段无标点的纯文本添加相应的标点符号。
* 语音识别 - 识别一段音频中包含的语音文字。
-* 语音服务 - 离线语音服务,包括ASR、TTS、CLS等
-* 流式语音识别服务 - 流式输入语音数据流识别音频中的文字
+* 语音服务 - 离线语音服务,包括ASR、TTS、CLS等。
+* 流式语音识别服务 - 流式输入语音数据流识别音频中的文字。
+* 流式语音合成服务 - 根据待合成文本流式生成合成音频数据流。
* 语音翻译 - 实时识别音频中的语言,并同时翻译成目标语言。
* 会说话的故事书 - 基于 OCR 和语音合成的会说话的故事书。
* 个性化语音合成 - 基于 FastSpeech2 模型的个性化语音合成。
diff --git a/demos/audio_searching/requirements.txt b/demos/audio_searching/requirements.txt
index 057c6ab92..9d0f6419b 100644
--- a/demos/audio_searching/requirements.txt
+++ b/demos/audio_searching/requirements.txt
@@ -2,7 +2,7 @@ diskcache==5.2.1
dtaidistance==2.3.1
fastapi
librosa==0.8.0
-numpy==1.21.0
+numpy==1.22.0
pydantic
pymilvus==2.0.1
pymysql
diff --git a/demos/custom_streaming_asr/setup_docker.sh b/demos/custom_streaming_asr/setup_docker.sh
old mode 100644
new mode 100755
diff --git a/demos/keyword_spotting/README.md b/demos/keyword_spotting/README.md
new file mode 100644
index 000000000..6544cf71e
--- /dev/null
+++ b/demos/keyword_spotting/README.md
@@ -0,0 +1,79 @@
+([简体中文](./README_cn.md)|English)
+# KWS (Keyword Spotting)
+
+## Introduction
+KWS (Keyword Spotting) is a technique to recognize keywords in a given speech audio.
+
+This demo is an implementation to recognize keywords from a specific audio file. It can be done with a single command or a few lines of Python using `PaddleSpeech`.
+
+## Usage
+### 1. Installation
+see [installation](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/install.md).
+
+You can choose one way from easy, medium and hard to install paddlespeech.
+
+### 2. Prepare Input File
+The input of this demo should be a WAV file (`.wav`), and the sample rate must be the same as the model's.
+
+Here are sample files for this demo that can be downloaded:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/kws/hey_snips.wav https://paddlespeech.bj.bcebos.com/kws/non-keyword.wav
+```
+
+### 3. Usage
+- Command Line(Recommended)
+ ```bash
+ paddlespeech kws --input ./hey_snips.wav
+ paddlespeech kws --input ./non-keyword.wav
+ ```
+
+ Usage:
+ ```bash
+ paddlespeech kws --help
+ ```
+ Arguments:
+ - `input`(required): Audio file to recognize.
+ - `threshold`: Score threshold for kws. Default: `0.8`.
+ - `model`: Model type of kws task. Default: `mdtc_heysnips`.
+ - `config`: Config of kws task. Use pretrained model when it is None. Default: `None`.
+ - `ckpt_path`: Model checkpoint. Use pretrained model when it is None. Default: `None`.
+ - `device`: Choose device to execute model inference. Default: default device of paddlepaddle in current environment.
+ - `verbose`: Show the log information.
+
+ Output:
+ ```bash
+ # Input file: ./hey_snips.wav
+ Score: 1.000, Threshold: 0.8, Is keyword: True
+ # Input file: ./non-keyword.wav
+ Score: 0.000, Threshold: 0.8, Is keyword: False
+ ```
+
+- Python API
+ ```python
+ import paddle
+ from paddlespeech.cli.kws import KWSExecutor
+
+ kws_executor = KWSExecutor()
+ result = kws_executor(
+ audio_file='./hey_snips.wav',
+ threshold=0.8,
+ model='mdtc_heysnips',
+ config=None,
+ ckpt_path=None,
+ device=paddle.get_device())
+ print('KWS Result: \n{}'.format(result))
+ ```
+
+ Output:
+ ```bash
+ KWS Result:
+ Score: 1.000, Threshold: 0.8, Is keyword: True
+ ```
+
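The keyword decision itself is just a threshold comparison on the model score. A minimal sketch of how the result line above could be produced (illustrative only, not PaddleSpeech's actual implementation):

```python
def kws_decision(score: float, threshold: float = 0.8) -> str:
    # Mirror the CLI output format shown above (illustrative).
    is_keyword = score >= threshold
    return f"Score: {score:.3f}, Threshold: {threshold}, Is keyword: {is_keyword}"

print(kws_decision(1.0))  # Score: 1.000, Threshold: 0.8, Is keyword: True
```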
+### 4. Pretrained Models
+
+Here is a list of pretrained models released by PaddleSpeech that can be used by command and python API:
+
+| Model | Language | Sample Rate |
+| :--- | :---: | :---: |
+| mdtc_heysnips | en | 16k |
diff --git a/demos/keyword_spotting/README_cn.md b/demos/keyword_spotting/README_cn.md
new file mode 100644
index 000000000..0d8f44a53
--- /dev/null
+++ b/demos/keyword_spotting/README_cn.md
@@ -0,0 +1,76 @@
+(简体中文|[English](./README.md))
+
+# 关键词识别
+## 介绍
+关键词识别(KWS)是一项用于识别一段语音内是否包含特定关键词的技术。
+
+这个 demo 是一个从给定音频文件中识别特定关键词的实现,可以通过 `PaddleSpeech` 的单条命令或 Python 中的几行代码来实现。
+## 使用方法
+### 1. 安装
+请看[安装文档](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/install_cn.md)。
+
+你可以从 easy、medium、hard 三种方式中选择一种方式安装。
+
+### 2. 准备输入
+这个 demo 的输入应该是一个 WAV 文件(`.wav`),并且采样率必须与模型的采样率相同。
+
+可以下载此 demo 的示例音频:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/kws/hey_snips.wav https://paddlespeech.bj.bcebos.com/kws/non-keyword.wav
+```
+### 3. 使用方法
+- 命令行 (推荐使用)
+ ```bash
+ paddlespeech kws --input ./hey_snips.wav
+ paddlespeech kws --input ./non-keyword.wav
+ ```
+
+ 使用方法:
+ ```bash
+ paddlespeech kws --help
+ ```
+ 参数:
+ - `input`(必须输入):用于识别关键词的音频文件。
+ - `threshold`:用于判别是包含关键词的得分阈值,默认值:`0.8`。
+ - `model`:KWS 任务的模型,默认值:`mdtc_heysnips`。
+ - `config`:KWS 任务的参数文件,若不设置则使用预训练模型中的默认配置,默认值:`None`。
+ - `ckpt_path`:模型参数文件,若不设置则下载预训练模型使用,默认值:`None`。
+ - `device`:执行预测的设备,默认值:当前系统下 paddlepaddle 的默认 device。
+ - `verbose`: 如果使用,显示 logger 信息。
+
+ 输出:
+ ```bash
+ # 输入为 ./hey_snips.wav
+ Score: 1.000, Threshold: 0.8, Is keyword: True
+ # 输入为 ./non-keyword.wav
+ Score: 0.000, Threshold: 0.8, Is keyword: False
+ ```
+
+- Python API
+ ```python
+ import paddle
+ from paddlespeech.cli.kws import KWSExecutor
+
+ kws_executor = KWSExecutor()
+ result = kws_executor(
+ audio_file='./hey_snips.wav',
+ threshold=0.8,
+ model='mdtc_heysnips',
+ config=None,
+ ckpt_path=None,
+ device=paddle.get_device())
+ print('KWS Result: \n{}'.format(result))
+ ```
+
+ 输出:
+ ```bash
+ KWS Result:
+ Score: 1.000, Threshold: 0.8, Is keyword: True
+ ```
+
+### 4. 预训练模型
+以下是 PaddleSpeech 提供的可以被命令行和 python API 使用的预训练模型列表:
+
+| 模型 | 语言 | 采样率 |
+| :--- | :---: | :---: |
+| mdtc_heysnips | en | 16k |
diff --git a/demos/keyword_spotting/run.sh b/demos/keyword_spotting/run.sh
new file mode 100755
index 000000000..7f9e0ebba
--- /dev/null
+++ b/demos/keyword_spotting/run.sh
@@ -0,0 +1,7 @@
+#!/bin/bash
+
+wget -c https://paddlespeech.bj.bcebos.com/kws/hey_snips.wav https://paddlespeech.bj.bcebos.com/kws/non-keyword.wav
+
+# kws
+paddlespeech kws --input ./hey_snips.wav
+paddlespeech kws --input non-keyword.wav
diff --git a/demos/speaker_verification/run.sh b/demos/speaker_verification/run.sh
old mode 100644
new mode 100755
diff --git a/demos/speech_recognition/run.sh b/demos/speech_recognition/run.sh
old mode 100644
new mode 100755
index 19ce0ebb3..e48ff3e96
--- a/demos/speech_recognition/run.sh
+++ b/demos/speech_recognition/run.sh
@@ -1,6 +1,7 @@
#!/bin/bash
-wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
# asr
paddlespeech asr --input ./zh.wav
@@ -8,3 +9,18 @@ paddlespeech asr --input ./zh.wav
# asr + punc
paddlespeech asr --input ./zh.wav | paddlespeech text --task punc
+
+
+# asr help
+paddlespeech asr --help
+
+
+# english asr
+paddlespeech asr --lang en --model transformer_librispeech --input ./en.wav
+
+# model stats
+paddlespeech stats --task asr
+
+
+# paddlespeech help
+paddlespeech --help
diff --git a/demos/speech_server/README.md b/demos/speech_server/README.md
index 14a88f078..e400f7e74 100644
--- a/demos/speech_server/README.md
+++ b/demos/speech_server/README.md
@@ -5,13 +5,19 @@
## Introduction
This demo is an implementation of starting the voice service and accessing the service. It can be achieved with a single command using `paddlespeech_server` and `paddlespeech_client` or a few lines of code in python.
+For service interface definition, please check:
+- [PaddleSpeech Server RESTful API](https://github.com/PaddlePaddle/PaddleSpeech/wiki/PaddleSpeech-Server-RESTful-API)
+
## Usage
### 1. Installation
see [installation](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/install.md).
-It is recommended to use **paddlepaddle 2.2.2** or above.
-You can choose one way from meduim and hard to install paddlespeech.
+It is recommended to use **paddlepaddle 2.3.1** or above.
+
+You can choose one way from easy, medium and hard to install paddlespeech.
+
+**If you install in easy mode, you need to prepare the yaml file yourself; you can refer to the yaml files in the conf directory.**
### 2. Prepare config File
The configuration file can be found in `conf/application.yaml` .
@@ -20,14 +26,6 @@ At present, the speech tasks integrated by the service include: asr (speech reco
Currently the engine type supports two forms: python and inference (Paddle Inference)
**Note:** If the service can be started normally in the container, but the client access IP is unreachable, you can try to replace the `host` address in the configuration file with the local IP address.
-
-The input of ASR client demo should be a WAV file(`.wav`), and the sample rate must be the same as the model.
-
-Here are sample files for thisASR client demo that can be downloaded:
-```bash
-wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
-```
-
### 3. Server Usage
- Command Line (Recommended)
@@ -46,7 +44,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `log_file`: log file. Default: ./log/paddlespeech.log
Output:
- ```bash
+ ```text
[2022-02-23 11:17:32] [INFO] [server.py:64] Started server process [6384]
INFO: Waiting for application startup.
[2022-02-23 11:17:32] [INFO] [on.py:26] Waiting for application startup.
@@ -54,7 +52,6 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
[2022-02-23 11:17:32] [INFO] [on.py:38] Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
[2022-02-23 11:17:32] [INFO] [server.py:204] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
-
```
- Python API
@@ -68,7 +65,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
```
Output:
- ```bash
+ ```text
INFO: Started server process [529]
[2022-02-23 14:57:56] [INFO] [server.py:64] Started server process [529]
INFO: Waiting for application startup.
@@ -77,11 +74,19 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
[2022-02-23 14:57:56] [INFO] [on.py:38] Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
[2022-02-23 14:57:56] [INFO] [server.py:204] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
-
```
### 4. ASR Client Usage
+
+The input of ASR client demo should be a WAV file(`.wav`), and the sample rate must be the same as the model.
+
+Here are sample files for this ASR client demo that can be downloaded:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
+```
+
**Note:** The response time will be slightly longer when using the client for the first time
- Command Line (Recommended)
@@ -105,16 +110,14 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `audio_format`: Audio format. Default: "wav".
Output:
- ```bash
- [2022-02-23 18:11:22,819] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'transcription': '我认为跑步最重要的就是给我带来了身体健康'}}
- [2022-02-23 18:11:22,820] [ INFO] - time cost 0.689145 s.
-
+ ```text
+ [2022-08-01 07:54:01,646] [ INFO] - ASR result: 我认为跑步最重要的就是给我带来了身体健康
+ [2022-08-01 07:54:01,646] [ INFO] - Response time 4.898965 s.
```
- Python API
```python
from paddlespeech.server.bin.paddlespeech_client import ASRClientExecutor
- import json
asrclient_executor = ASRClientExecutor()
res = asrclient_executor(
@@ -124,12 +127,11 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
sample_rate=16000,
lang="zh_cn",
audio_format="wav")
- print(res.json())
+ print(res)
```
-
Output:
- ```bash
- {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'transcription': '我认为跑步最重要的就是给我带来了身体健康'}}
+ ```text
+ 我认为跑步最重要的就是给我带来了身体健康
```
### 5. TTS Client Usage
@@ -157,12 +159,10 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `output`: Output wave filepath. Default: None, which means not to save the audio to the local.
Output:
- ```bash
- [2022-02-23 15:20:37,875] [ INFO] - {'description': 'success.'}
+ ```text
[2022-02-23 15:20:37,875] [ INFO] - Save synthesized audio successfully on output.wav.
[2022-02-23 15:20:37,875] [ INFO] - Audio duration: 3.612500 s.
[2022-02-23 15:20:37,875] [ INFO] - Response time: 0.348050 s.
-
```
- Python API
@@ -188,20 +188,25 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
```
Output:
- ```bash
+ ```text
{'description': 'success.'}
Save synthesized audio successfully on ./output.wav.
Audio duration: 3.612500 s.
-
```
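The reported `Audio duration` can be double-checked against the saved file. A small sketch with the standard-library `wave` module (the `output.wav` path is taken from the example output above; this is an illustration, not part of the PaddleSpeech API):

```python
import wave

def wav_duration(wav_path: str) -> float:
    """Duration of a WAV file in seconds: frame count divided by sample rate."""
    with wave.open(wav_path, "rb") as f:
        return f.getnframes() / f.getframerate()
```

Running `wav_duration("./output.wav")` on the synthesized file should agree with the `Audio duration: 3.612500 s.` line in the log.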
### 6. CLS Client Usage
+
+A sample audio file for this CLS client demo can be downloaded:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
+```
+
**Note:** The response time will be slightly longer when using the client for the first time
- Command Line (Recommended)
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
paddlespeech_client cls --server_ip 127.0.0.1 --port 8090 --input ./zh.wav
```
@@ -217,11 +222,9 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `topk`: topk scores of classification result.
Output:
- ```bash
+ ```text
[2022-03-09 20:44:39,974] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'topk': 1, 'results': [{'class_name': 'Speech', 'prob': 0.9027184844017029}]}}
[2022-03-09 20:44:39,975] [ INFO] - Response time 0.104360 s.
-
-
```
- Python API
@@ -239,14 +242,19 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
```
Output:
- ```bash
+ ```text
{'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'topk': 1, 'results': [{'class_name': 'Speech', 'prob': 0.9027184844017029}]}}
-
```
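Given the response dict shown above, the top prediction can be pulled out by indexing into `result.results`. A sketch assuming the same response shape (the dict here is copied from the example output):

```python
# Response shape as shown in the CLS output above.
res = {'success': True, 'code': 200,
       'message': {'description': 'success'},
       'result': {'topk': 1,
                  'results': [{'class_name': 'Speech', 'prob': 0.9027184844017029}]}}

# Entries in `results` are ordered by score, so the first one is the top class.
top = res['result']['results'][0]
label, prob = top['class_name'], top['prob']
print(f"{label}: {prob:.4f}")  # Speech: 0.9027
```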
### 7. Speaker Verification Client Usage
+Sample audio files for this speaker verification client demo can be downloaded:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/vector/audio/85236145389.wav
+wget -c https://paddlespeech.bj.bcebos.com/vector/audio/123456789.wav
+```
+
#### 7.1 Extract speaker embedding
**Note:** The response time will be slightly longer when using the client for the first time
- Command Line (Recommended)
@@ -273,19 +281,19 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
Output:
- ```bash
- [2022-05-25 12:25:36,165] [ INFO] - vector http client start
- [2022-05-25 12:25:36,165] [ INFO] - the input audio: 85236145389.wav
- [2022-05-25 12:25:36,165] [ INFO] - endpoint: http://127.0.0.1:8790/paddlespeech/vector
- [2022-05-25 12:25:36,166] [ INFO] - http://127.0.0.1:8790/paddlespeech/vector
- [2022-05-25 12:25:36,324] [ INFO] - The vector: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [-1.3251205682754517, 7.860682487487793, -4.620625972747803, 0.3000721037387848, 2.2648534774780273, -1.1931440830230713, 3.064713716506958, 7.673594951629639, -6.004472732543945, -12.024259567260742, -1.9496068954467773, 3.126953601837158, 1.6188379526138306, -7.638310432434082, -1.2299772500991821, -12.33833122253418, 2.1373026371002197, -5.395712375640869, 9.717328071594238, 5.675230503082275, 3.7805123329162598, 3.0597171783447266, 3.429692029953003, 8.9760103225708, 13.174124717712402, -0.5313228368759155, 8.942471504211426, 4.465109825134277, -4.426247596740723, -9.726503372192383, 8.399328231811523, 7.223917484283447, -7.435853958129883, 2.9441683292388916, -4.343039512634277, -13.886964797973633, -1.6346734762191772, -10.902740478515625, -5.311244964599609, 3.800722122192383, 3.897603750228882, -2.123077392578125, -2.3521194458007812, 4.151031017303467, -7.404866695404053, 0.13911646604537964, 2.4626107215881348, 4.96645450592041, 0.9897574186325073, 5.483975410461426, -3.3574001789093018, 10.13400650024414, -0.6120170950889587, -10.403095245361328, 4.600754261016846, 16.009349822998047, -7.78369140625, -4.194530487060547, -6.93686056137085, 1.1789555549621582, 11.490800857543945, 4.23802375793457, 9.550930976867676, 8.375045776367188, 7.508914470672607, -0.6570729613304138, -0.3005157709121704, 2.8406054973602295, 3.0828027725219727, 0.7308170199394226, 6.1483540534973145, 0.1376611888408661, -13.424735069274902, -7.746140480041504, -2.322798252105713, -8.305252075195312, 2.98791241645813, -10.99522876739502, 0.15211068093776703, -2.3820347785949707, -1.7984174489974976, 8.49562931060791, -5.852236747741699, -3.755497932434082, 0.6989710927009583, -5.270299434661865, -2.6188621520996094, -1.8828465938568115, -4.6466498374938965, 14.078543663024902, -0.5495333075523376, 10.579157829284668, -3.216050148010254, 
9.349003791809082, -4.381077766418457, -11.675816535949707, -2.863020658493042, 4.5721755027771, 2.246612071990967, -4.574341773986816, 1.8610187768936157, 2.3767874240875244, 5.625787734985352, -9.784077644348145, 0.6496725678443909, -1.457950472831726, 0.4263263940811157, -4.921126365661621, -2.4547839164733887, 3.4869801998138428, -0.4265422224998474, 8.341268539428711, 1.356552004814148, 7.096688270568848, -13.102828979492188, 8.01673412322998, -7.115934371948242, 1.8699780702590942, 0.20872099697589874, 14.699383735656738, -1.0252779722213745, -2.6107232570648193, -2.5082311630249023, 8.427192687988281, 6.913852691650391, -6.29124641418457, 0.6157366037368774, 2.489687919616699, -3.4668266773223877, 9.92176342010498, 11.200815200805664, -0.19664029777050018, 7.491600513458252, -0.6231271624565125, -0.2584814429283142, -9.947997093200684, -0.9611040949821472, 1.1649218797683716, -2.1907122135162354, -1.502848744392395, -0.5192610621452332, 15.165953636169434, 2.4649462699890137, -0.998044490814209, 7.44166374206543, -2.0768048763275146, 3.5896823406219482, -7.305543422698975, -7.562084674835205, 4.32333517074585, 0.08044180274009705, -6.564010143280029, -2.314805269241333, -1.7642345428466797, -2.470881700515747, -7.6756181716918945, -9.548877716064453, -1.017755389213562, 0.1698644608259201, 2.5877134799957275, -1.8752295970916748, -0.36614322662353516, -6.049378395080566, -2.3965611457824707, -5.945338726043701, 0.9424033164978027, -13.155974388122559, -7.45780086517334, 0.14658108353614807, -3.7427968978881836, 5.841492652893066, -1.2872905731201172, 5.569431304931641, 12.570590019226074, 1.0939218997955322, 2.2142086029052734, 1.9181575775146484, 6.991420745849609, -5.888138771057129, 3.1409823894500732, -2.0036280155181885, 2.4434285163879395, 9.973138809204102, 5.036680221557617, 2.005120277404785, 2.861560344696045, 5.860223770141602, 2.917618751525879, -1.63111412525177, 2.0292205810546875, -4.070415019989014, -6.831437110900879]}}
- [2022-05-25 12:25:36,324] [ INFO] - Response time 0.159053 s.
+ ```text
+ [2022-08-01 09:01:22,151] [ INFO] - vector http client start
+ [2022-08-01 09:01:22,152] [ INFO] - the input audio: 85236145389.wav
+ [2022-08-01 09:01:22,152] [ INFO] - endpoint: http://127.0.0.1:8090/paddlespeech/vector
+ [2022-08-01 09:01:27,093] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [1.4217487573623657, 5.626248836517334, -5.342073440551758, 1.177390217781067, 3.308061122894287, 1.7565997838974, 5.1678876876831055, 10.806346893310547, -3.822679042816162, -5.614130973815918, 2.6238481998443604, -0.8072965741157532, 1.963512659072876, -7.312864780426025, 0.011034967377781868, -9.723127365112305, 0.661963164806366, -6.976816654205322, 10.213465690612793, 7.494767189025879, 2.9105641841888428, 3.894925117492676, 3.7999846935272217, 7.106173992156982, 16.905324935913086, -7.149376392364502, 8.733112335205078, 3.423002004623413, -4.831653118133545, -11.403371810913086, 11.232216835021973, 7.127464771270752, -4.282831192016602, 2.4523589611053467, -5.13075065612793, -18.17765998840332, -2.611666440963745, -11.00034236907959, -6.731431007385254, 1.6564655303955078, 0.7618184685707092, 1.1253058910369873, -2.0838277339935303, 4.725739002227783, -8.782590866088867, -3.5398736000061035, 3.8142387866973877, 5.142062664031982, 2.162053346633911, 4.09642219543457, -6.416221618652344, 12.747454643249512, 1.9429889917373657, -15.152948379516602, 6.417416572570801, 16.097013473510742, -9.716649055480957, -1.9920448064804077, -3.364956855773926, -1.8719490766525269, 11.567351341247559, 3.6978795528411865, 11.258269309997559, 7.442364692687988, 9.183405876159668, 4.528151512145996, -1.2417811155319214, 4.395910263061523, 6.672768592834473, 5.889888763427734, 7.627115249633789, -0.6692016124725342, -11.889703750610352, -9.208883285522461, -7.427401542663574, -3.777655601501465, 6.917237758636475, -9.848749160766602, -2.094479560852051, -5.1351189613342285, 0.49564215540885925, 9.317541122436523, -5.9141845703125, -1.809845209121704, -0.11738205701112747, -7.169270992279053, -1.0578246116638184, -5.721685886383057, -5.117387294769287, 16.137670516967773, -4.473618984222412, 7.66243314743042, -0.5538089871406555, 9.631582260131836, 
-6.470466613769531, -8.54850959777832, 4.371622085571289, -0.7970349192619324, 4.479003429412842, -2.9758646488189697, 3.2721707820892334, 2.8382749557495117, 5.1345953941345215, -9.19078254699707, -0.5657423138618469, -4.874573230743408, 2.316561460494995, -5.984307289123535, -2.1798791885375977, 0.35541653633117676, -0.3178458511829376, 9.493547439575195, 2.114448070526123, 4.358088493347168, -12.089820861816406, 8.451695442199707, -7.925461769104004, 4.624246120452881, 4.428938388824463, 18.691999435424805, -2.620460033416748, -5.149182319641113, -0.3582168221473694, 8.488557815551758, 4.98148250579834, -9.326834678649902, -2.2544236183166504, 6.64176607131958, 1.2119656801223755, 10.977132797241211, 16.55504035949707, 3.323848247528076, 9.55185317993164, -1.6677050590515137, -0.7953923940658569, -8.605660438537598, -0.4735637903213501, 2.6741855144500732, -5.359188079833984, -2.6673784255981445, 0.6660736799240112, 15.443212509155273, 4.740597724914551, -3.4725306034088135, 11.592561721801758, -2.05450701713562, 1.7361239194869995, -8.26533031463623, -9.304476737976074, 5.406835079193115, -1.5180232524871826, -7.746610641479492, -6.089605331420898, 0.07112561166286469, -0.34904858469963074, -8.649889945983887, -9.998958587646484, -2.5648481845855713, -0.5399898886680603, 2.6018145084381104, -0.31927648186683655, -1.8815231323242188, -2.0721378326416016, -3.4105639457702637, -8.299802780151367, 1.4836379289627075, -15.366002082824707, -8.288193702697754, 3.884773015975952, -3.4876506328582764, 7.362995624542236, 0.4657321572303772, 3.1326000690460205, 12.438883781433105, -1.8337029218673706, 4.532927513122559, 2.726433277130127, 10.145345687866211, -6.521956920623779, 2.8971481323242188, -3.3925881385803223, 5.079156398773193, 7.759725093841553, 4.677562236785889, 5.8457818031311035, 2.4023921489715576, 7.707108974456787, 3.9711389541625977, -6.390035152435303, 6.126871109008789, -3.776031017303467, -11.118141174316406]}}
+ [2022-08-01 09:01:27,094] [ INFO] - Response time 4.941739 s.
```
* Python API
``` python
from paddlespeech.server.bin.paddlespeech_client import VectorClientExecutor
+ import json
vectorclient_executor = VectorClientExecutor()
res = vectorclient_executor(
@@ -293,13 +301,13 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
server_ip="127.0.0.1",
port=8090,
task="spk")
- print(res)
+ print(res.json())
```
Output:
- ``` bash
- {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [-1.3251205682754517, 7.860682487487793, -4.620625972747803, 0.3000721037387848, 2.2648534774780273, -1.1931440830230713, 3.064713716506958, 7.673594951629639, -6.004472732543945, -12.024259567260742, -1.9496068954467773, 3.126953601837158, 1.6188379526138306, -7.638310432434082, -1.2299772500991821, -12.33833122253418, 2.1373026371002197, -5.395712375640869, 9.717328071594238, 5.675230503082275, 3.7805123329162598, 3.0597171783447266, 3.429692029953003, 8.9760103225708, 13.174124717712402, -0.5313228368759155, 8.942471504211426, 4.465109825134277, -4.426247596740723, -9.726503372192383, 8.399328231811523, 7.223917484283447, -7.435853958129883, 2.9441683292388916, -4.343039512634277, -13.886964797973633, -1.6346734762191772, -10.902740478515625, -5.311244964599609, 3.800722122192383, 3.897603750228882, -2.123077392578125, -2.3521194458007812, 4.151031017303467, -7.404866695404053, 0.13911646604537964, 2.4626107215881348, 4.96645450592041, 0.9897574186325073, 5.483975410461426, -3.3574001789093018, 10.13400650024414, -0.6120170950889587, -10.403095245361328, 4.600754261016846, 16.009349822998047, -7.78369140625, -4.194530487060547, -6.93686056137085, 1.1789555549621582, 11.490800857543945, 4.23802375793457, 9.550930976867676, 8.375045776367188, 7.508914470672607, -0.6570729613304138, -0.3005157709121704, 2.8406054973602295, 3.0828027725219727, 0.7308170199394226, 6.1483540534973145, 0.1376611888408661, -13.424735069274902, -7.746140480041504, -2.322798252105713, -8.305252075195312, 2.98791241645813, -10.99522876739502, 0.15211068093776703, -2.3820347785949707, -1.7984174489974976, 8.49562931060791, -5.852236747741699, -3.755497932434082, 0.6989710927009583, -5.270299434661865, -2.6188621520996094, -1.8828465938568115, -4.6466498374938965, 14.078543663024902, -0.5495333075523376, 10.579157829284668, -3.216050148010254, 9.349003791809082, -4.381077766418457, -11.675816535949707, 
-2.863020658493042, 4.5721755027771, 2.246612071990967, -4.574341773986816, 1.8610187768936157, 2.3767874240875244, 5.625787734985352, -9.784077644348145, 0.6496725678443909, -1.457950472831726, 0.4263263940811157, -4.921126365661621, -2.4547839164733887, 3.4869801998138428, -0.4265422224998474, 8.341268539428711, 1.356552004814148, 7.096688270568848, -13.102828979492188, 8.01673412322998, -7.115934371948242, 1.8699780702590942, 0.20872099697589874, 14.699383735656738, -1.0252779722213745, -2.6107232570648193, -2.5082311630249023, 8.427192687988281, 6.913852691650391, -6.29124641418457, 0.6157366037368774, 2.489687919616699, -3.4668266773223877, 9.92176342010498, 11.200815200805664, -0.19664029777050018, 7.491600513458252, -0.6231271624565125, -0.2584814429283142, -9.947997093200684, -0.9611040949821472, 1.1649218797683716, -2.1907122135162354, -1.502848744392395, -0.5192610621452332, 15.165953636169434, 2.4649462699890137, -0.998044490814209, 7.44166374206543, -2.0768048763275146, 3.5896823406219482, -7.305543422698975, -7.562084674835205, 4.32333517074585, 0.08044180274009705, -6.564010143280029, -2.314805269241333, -1.7642345428466797, -2.470881700515747, -7.6756181716918945, -9.548877716064453, -1.017755389213562, 0.1698644608259201, 2.5877134799957275, -1.8752295970916748, -0.36614322662353516, -6.049378395080566, -2.3965611457824707, -5.945338726043701, 0.9424033164978027, -13.155974388122559, -7.45780086517334, 0.14658108353614807, -3.7427968978881836, 5.841492652893066, -1.2872905731201172, 5.569431304931641, 12.570590019226074, 1.0939218997955322, 2.2142086029052734, 1.9181575775146484, 6.991420745849609, -5.888138771057129, 3.1409823894500732, -2.0036280155181885, 2.4434285163879395, 9.973138809204102, 5.036680221557617, 2.005120277404785, 2.861560344696045, 5.860223770141602, 2.917618751525879, -1.63111412525177, 2.0292205810546875, -4.070415019989014, -6.831437110900879]}}
+ ```text
+ {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [1.4217487573623657, 5.626248836517334, -5.342073440551758, 1.177390217781067, 3.308061122894287, 1.7565997838974, 5.1678876876831055, 10.806346893310547, -3.822679042816162, -5.614130973815918, 2.6238481998443604, -0.8072965741157532, 1.963512659072876, -7.312864780426025, 0.011034967377781868, -9.723127365112305, 0.661963164806366, -6.976816654205322, 10.213465690612793, 7.494767189025879, 2.9105641841888428, 3.894925117492676, 3.7999846935272217, 7.106173992156982, 16.905324935913086, -7.149376392364502, 8.733112335205078, 3.423002004623413, -4.831653118133545, -11.403371810913086, 11.232216835021973, 7.127464771270752, -4.282831192016602, 2.4523589611053467, -5.13075065612793, -18.17765998840332, -2.611666440963745, -11.00034236907959, -6.731431007385254, 1.6564655303955078, 0.7618184685707092, 1.1253058910369873, -2.0838277339935303, 4.725739002227783, -8.782590866088867, -3.5398736000061035, 3.8142387866973877, 5.142062664031982, 2.162053346633911, 4.09642219543457, -6.416221618652344, 12.747454643249512, 1.9429889917373657, -15.152948379516602, 6.417416572570801, 16.097013473510742, -9.716649055480957, -1.9920448064804077, -3.364956855773926, -1.8719490766525269, 11.567351341247559, 3.6978795528411865, 11.258269309997559, 7.442364692687988, 9.183405876159668, 4.528151512145996, -1.2417811155319214, 4.395910263061523, 6.672768592834473, 5.889888763427734, 7.627115249633789, -0.6692016124725342, -11.889703750610352, -9.208883285522461, -7.427401542663574, -3.777655601501465, 6.917237758636475, -9.848749160766602, -2.094479560852051, -5.1351189613342285, 0.49564215540885925, 9.317541122436523, -5.9141845703125, -1.809845209121704, -0.11738205701112747, -7.169270992279053, -1.0578246116638184, -5.721685886383057, -5.117387294769287, 16.137670516967773, -4.473618984222412, 7.66243314743042, -0.5538089871406555, 9.631582260131836, -6.470466613769531, -8.54850959777832, 
4.371622085571289, -0.7970349192619324, 4.479003429412842, -2.9758646488189697, 3.2721707820892334, 2.8382749557495117, 5.1345953941345215, -9.19078254699707, -0.5657423138618469, -4.874573230743408, 2.316561460494995, -5.984307289123535, -2.1798791885375977, 0.35541653633117676, -0.3178458511829376, 9.493547439575195, 2.114448070526123, 4.358088493347168, -12.089820861816406, 8.451695442199707, -7.925461769104004, 4.624246120452881, 4.428938388824463, 18.691999435424805, -2.620460033416748, -5.149182319641113, -0.3582168221473694, 8.488557815551758, 4.98148250579834, -9.326834678649902, -2.2544236183166504, 6.64176607131958, 1.2119656801223755, 10.977132797241211, 16.55504035949707, 3.323848247528076, 9.55185317993164, -1.6677050590515137, -0.7953923940658569, -8.605660438537598, -0.4735637903213501, 2.6741855144500732, -5.359188079833984, -2.6673784255981445, 0.6660736799240112, 15.443212509155273, 4.740597724914551, -3.4725306034088135, 11.592561721801758, -2.05450701713562, 1.7361239194869995, -8.26533031463623, -9.304476737976074, 5.406835079193115, -1.5180232524871826, -7.746610641479492, -6.089605331420898, 0.07112561166286469, -0.34904858469963074, -8.649889945983887, -9.998958587646484, -2.5648481845855713, -0.5399898886680603, 2.6018145084381104, -0.31927648186683655, -1.8815231323242188, -2.0721378326416016, -3.4105639457702637, -8.299802780151367, 1.4836379289627075, -15.366002082824707, -8.288193702697754, 3.884773015975952, -3.4876506328582764, 7.362995624542236, 0.4657321572303772, 3.1326000690460205, 12.438883781433105, -1.8337029218673706, 4.532927513122559, 2.726433277130127, 10.145345687866211, -6.521956920623779, 2.8971481323242188, -3.3925881385803223, 5.079156398773193, 7.759725093841553, 4.677562236785889, 5.8457818031311035, 2.4023921489715576, 7.707108974456787, 3.9711389541625977, -6.390035152435303, 6.126871109008789, -3.776031017303467, -11.118141174316406]}}
```
#### 7.2 Get the score between speaker audio embedding
@@ -330,19 +338,19 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
Output:
- ``` bash
- [2022-05-25 12:33:24,527] [ INFO] - vector score http client start
- [2022-05-25 12:33:24,527] [ INFO] - enroll audio: 85236145389.wav, test audio: 123456789.wav
- [2022-05-25 12:33:24,528] [ INFO] - endpoint: http://127.0.0.1:8790/paddlespeech/vector/score
- [2022-05-25 12:33:24,695] [ INFO] - The vector score is: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
- [2022-05-25 12:33:24,696] [ INFO] - The vector: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
- [2022-05-25 12:33:24,696] [ INFO] - Response time 0.168271 s.
+ ```text
+ [2022-08-01 09:04:42,275] [ INFO] - vector score http client start
+ [2022-08-01 09:04:42,275] [ INFO] - enroll audio: 85236145389.wav, test audio: 123456789.wav
+ [2022-08-01 09:04:42,275] [ INFO] - endpoint: http://127.0.0.1:8090/paddlespeech/vector/score
+ [2022-08-01 09:04:44,611] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.4292638897895813}}
+ [2022-08-01 09:04:44,611] [ INFO] - Response time 2.336258 s.
```
* Python API
``` python
from paddlespeech.server.bin.paddlespeech_client import VectorClientExecutor
+ import json
vectorclient_executor = VectorClientExecutor()
res = vectorclient_executor(
@@ -352,17 +360,13 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
server_ip="127.0.0.1",
port=8090,
task="score")
- print(res)
+ print(res.json())
```
Output:
- ``` bash
- [2022-05-25 12:30:14,143] [ INFO] - vector score http client start
- [2022-05-25 12:30:14,143] [ INFO] - enroll audio: 85236145389.wav, test audio: 123456789.wav
- [2022-05-25 12:30:14,143] [ INFO] - endpoint: http://127.0.0.1:8790/paddlespeech/vector/score
- [2022-05-25 12:30:14,363] [ INFO] - The vector score is: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
- {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
+ ```text
+ {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.4292638897895813}}
```
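The `score` task compares the two embeddings produced for the enroll and test audio. The server-side scoring function is not shown in this demo, but a cosine-similarity sketch over two `vec` lists (as returned by the `spk` task) illustrates the general idea, using only the standard library; whether the server uses exactly this measure is an assumption:

```python
import math

def cosine_score(vec1, vec2):
    """Cosine similarity between two speaker embeddings (lists of floats)."""
    dot = sum(a * b for a, b in zip(vec1, vec2))
    norm1 = math.sqrt(sum(a * a for a in vec1))
    norm2 = math.sqrt(sum(b * b for b in vec2))
    return dot / (norm1 * norm2)
```

Feeding it the `vec` lists from two `spk` responses yields a value in [-1, 1], where higher means the two utterances are more likely from the same speaker.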
### 8. Punctuation prediction
@@ -388,9 +392,9 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `input`(required): Input text to get punctuation.
Output:
- ```bash
- [2022-05-09 18:19:04,397] [ INFO] - The punc text: 我认为跑步最重要的就是给我带来了身体健康。
- [2022-05-09 18:19:04,397] [ INFO] - Response time 0.092407 s.
+ ```text
+ [2022-05-09 18:19:04,397] [ INFO] - The punc text: 我认为跑步最重要的就是给我带来了身体健康。
+ [2022-05-09 18:19:04,397] [ INFO] - Response time 0.092407 s.
```
- Python API
@@ -403,15 +407,13 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
server_ip="127.0.0.1",
port=8090,)
print(res)
-
```
Output:
- ```bash
+ ```text
我认为跑步最重要的就是给我带来了身体健康。
```
-
## Models supported by the service
### ASR model
Get all models supported by the ASR service via `paddlespeech_server stats --task asr`, where static models can be used for paddle inference inference.
diff --git a/demos/speech_server/README_cn.md b/demos/speech_server/README_cn.md
index 29629b7e8..628468c83 100644
--- a/demos/speech_server/README_cn.md
+++ b/demos/speech_server/README_cn.md
@@ -3,31 +3,31 @@
# 语音服务
## 介绍
-这个 demo 是一个启动离线语音服务和访问服务的实现。它可以通过使用`paddlespeech_server` 和 `paddlespeech_client`的单个命令或 python 的几行代码来实现。
+这个 demo 是一个启动离线语音服务和访问服务的实现。它可以通过使用 `paddlespeech_server` 和 `paddlespeech_client` 的单个命令或 python 的几行代码来实现。
+
+
+服务接口定义请参考:
+- [PaddleSpeech Server RESTful API](https://github.com/PaddlePaddle/PaddleSpeech/wiki/PaddleSpeech-Server-RESTful-API)
## 使用方法
### 1. 安装
请看 [安装文档](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/install.md).
-推荐使用 **paddlepaddle 2.2.2** 或以上版本。
-你可以从 medium,hard 两种方式中选择一种方式安装 PaddleSpeech。
+推荐使用 **paddlepaddle 2.3.1** 或以上版本。
+
+你可以从简单、中等、困难几种方式中选择一种安装 PaddleSpeech。
+**如果使用简单模式安装,需要自行准备 yaml 文件,可参考 conf 目录下的 yaml 文件。**
### 2. 准备配置文件
配置文件可参见 `conf/application.yaml` 。
-其中,`engine_list`表示即将启动的服务将会包含的语音引擎,格式为 <语音任务>_<引擎类型>。
-目前服务集成的语音任务有: asr(语音识别)、tts(语音合成)、cls(音频分类)、vector(声纹识别)以及text(文本处理)。
-目前引擎类型支持两种形式:python 及 inference (Paddle Inference)
-**注意:** 如果在容器里可正常启动服务,但客户端访问 ip 不可达,可尝试将配置文件中 `host` 地址换成本地 ip 地址。
+其中,`engine_list` 表示即将启动的服务将会包含的语音引擎,格式为 <语音任务>_<引擎类型>。
+目前服务集成的语音任务有: asr (语音识别)、tts (语音合成)、cls (音频分类)、vector (声纹识别)以及 text (文本处理)。
-ASR client 的输入是一个 WAV 文件(`.wav`),并且采样率必须与模型的采样率相同。
-
-可以下载此 ASR client 的示例音频:
-```bash
-wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
-```
+目前引擎类型支持两种形式:python 及 inference (Paddle Inference)
+**注意:** 如果在容器里可正常启动服务,但客户端访问 ip 不可达,可尝试将配置文件中 `host` 地址换成本地 ip 地址。
### 3. 服务端使用方法
- 命令行 (推荐使用)
@@ -47,7 +47,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `log_file`: log 文件. 默认:./log/paddlespeech.log
输出:
- ```bash
+ ```text
[2022-02-23 11:17:32] [INFO] [server.py:64] Started server process [6384]
INFO: Waiting for application startup.
[2022-02-23 11:17:32] [INFO] [on.py:26] Waiting for application startup.
@@ -55,7 +55,6 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
[2022-02-23 11:17:32] [INFO] [on.py:38] Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
[2022-02-23 11:17:32] [INFO] [server.py:204] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
-
```
- Python API
@@ -69,7 +68,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
```
输出:
- ```bash
+ ```text
INFO: Started server process [529]
[2022-02-23 14:57:56] [INFO] [server.py:64] Started server process [529]
INFO: Waiting for application startup.
@@ -78,10 +77,18 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
[2022-02-23 14:57:56] [INFO] [on.py:38] Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
[2022-02-23 14:57:56] [INFO] [server.py:204] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
-
```
### 4. ASR 客户端使用方法
+
+ASR 客户端的输入是一个 WAV 文件(`.wav`),并且采样率必须与模型的采样率相同。
+
+可以下载 ASR 客户端的示例音频:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav
+```
+
**注意:** 初次使用客户端时响应时间会略长
- 命令行 (推荐使用)
@@ -89,7 +96,6 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
```
paddlespeech_client asr --server_ip 127.0.0.1 --port 8090 --input ./zh.wav
-
```
使用帮助:
@@ -107,16 +113,14 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `audio_format`: 音频格式,默认值:wav。
输出:
-
- ```bash
- [2022-02-23 18:11:22,819] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'transcription': '我认为跑步最重要的就是给我带来了身体健康'}}
- [2022-02-23 18:11:22,820] [ INFO] - time cost 0.689145 s.
+ ```text
+ [2022-08-01 07:54:01,646] [ INFO] - ASR result: 我认为跑步最重要的就是给我带来了身体健康
+ [2022-08-01 07:54:01,646] [ INFO] - Response time 4.898965 s.
```
- Python API
```python
from paddlespeech.server.bin.paddlespeech_client import ASRClientExecutor
- import json
asrclient_executor = ASRClientExecutor()
res = asrclient_executor(
@@ -126,13 +130,12 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
sample_rate=16000,
lang="zh_cn",
audio_format="wav")
- print(res.json())
+ print(res)
```
输出:
- ```bash
- {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'transcription': '我认为跑步最重要的就是给我带来了身体健康'}}
-
+ ```text
+ 我认为跑步最重要的就是给我带来了身体健康
```
### 5. TTS 客户端使用方法
@@ -161,8 +164,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `output`: 输出音频的路径, 默认值:None,表示不保存音频到本地。
输出:
- ```bash
- [2022-02-23 15:20:37,875] [ INFO] - {'description': 'success.'}
+ ```text
[2022-02-23 15:20:37,875] [ INFO] - Save synthesized audio successfully on output.wav.
[2022-02-23 15:20:37,875] [ INFO] - Audio duration: 3.612500 s.
[2022-02-23 15:20:37,875] [ INFO] - Response time: 0.348050 s.
@@ -191,22 +193,26 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
```
输出:
- ```bash
+ ```text
{'description': 'success.'}
Save synthesized audio successfully on ./output.wav.
Audio duration: 3.612500 s.
-
```
### 6. CLS 客户端使用方法
+可以下载 CLS 客户端的示例音频:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
+```
+
**注意:** 初次使用客户端时响应时间会略长
- 命令行 (推荐使用)
若 `127.0.0.1` 不能访问,则需要使用实际服务 IP 地址
- ```
+ ```bash
paddlespeech_client cls --server_ip 127.0.0.1 --port 8090 --input ./zh.wav
```
@@ -222,11 +228,9 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
- `topk`: 分类结果的topk。
输出:
- ```bash
+ ```text
[2022-03-09 20:44:39,974] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'topk': 1, 'results': [{'class_name': 'Speech', 'prob': 0.9027184844017029}]}}
[2022-03-09 20:44:39,975] [ INFO] - Response time 0.104360 s.
-
-
```
- Python API
@@ -241,24 +245,28 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
port=8090,
topk=1)
print(res.json())
-
```
输出:
- ```bash
+ ```text
{'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'topk': 1, 'results': [{'class_name': 'Speech', 'prob': 0.9027184844017029}]}}
-
```
### 7. 声纹客户端使用方法
+可以下载声纹客户端的示例音频:
+```bash
+wget -c https://paddlespeech.bj.bcebos.com/vector/audio/85236145389.wav
+wget -c https://paddlespeech.bj.bcebos.com/vector/audio/123456789.wav
+```
+
#### 7.1 提取声纹特征
-注意: 初次使用客户端时响应时间会略长
+**注意:** 初次使用客户端时响应时间会略长
* 命令行 (推荐使用)
若 `127.0.0.1` 不能访问,则需要使用实际服务 IP 地址
- ``` bash
+ ```bash
paddlespeech_client vector --task spk --server_ip 127.0.0.1 --port 8090 --input 85236145389.wav
```
@@ -274,21 +282,21 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
* task: vector 的任务,可选spk或者score。默认是 spk。
* enroll: 注册音频;。
* test: 测试音频。
- 输出:
- ``` bash
- [2022-05-25 12:25:36,165] [ INFO] - vector http client start
- [2022-05-25 12:25:36,165] [ INFO] - the input audio: 85236145389.wav
- [2022-05-25 12:25:36,165] [ INFO] - endpoint: http://127.0.0.1:8790/paddlespeech/vector
- [2022-05-25 12:25:36,166] [ INFO] - http://127.0.0.1:8790/paddlespeech/vector
- [2022-05-25 12:25:36,324] [ INFO] - The vector: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [-1.3251205682754517, 7.860682487487793, -4.620625972747803, 0.3000721037387848, 2.2648534774780273, -1.1931440830230713, 3.064713716506958, 7.673594951629639, -6.004472732543945, -12.024259567260742, -1.9496068954467773, 3.126953601837158, 1.6188379526138306, -7.638310432434082, -1.2299772500991821, -12.33833122253418, 2.1373026371002197, -5.395712375640869, 9.717328071594238, 5.675230503082275, 3.7805123329162598, 3.0597171783447266, 3.429692029953003, 8.9760103225708, 13.174124717712402, -0.5313228368759155, 8.942471504211426, 4.465109825134277, -4.426247596740723, -9.726503372192383, 8.399328231811523, 7.223917484283447, -7.435853958129883, 2.9441683292388916, -4.343039512634277, -13.886964797973633, -1.6346734762191772, -10.902740478515625, -5.311244964599609, 3.800722122192383, 3.897603750228882, -2.123077392578125, -2.3521194458007812, 4.151031017303467, -7.404866695404053, 0.13911646604537964, 2.4626107215881348, 4.96645450592041, 0.9897574186325073, 5.483975410461426, -3.3574001789093018, 10.13400650024414, -0.6120170950889587, -10.403095245361328, 4.600754261016846, 16.009349822998047, -7.78369140625, -4.194530487060547, -6.93686056137085, 1.1789555549621582, 11.490800857543945, 4.23802375793457, 9.550930976867676, 8.375045776367188, 7.508914470672607, -0.6570729613304138, -0.3005157709121704, 2.8406054973602295, 3.0828027725219727, 0.7308170199394226, 6.1483540534973145, 0.1376611888408661, -13.424735069274902, -7.746140480041504, -2.322798252105713, -8.305252075195312, 2.98791241645813, -10.99522876739502, 0.15211068093776703, -2.3820347785949707, -1.7984174489974976, 8.49562931060791, -5.852236747741699, -3.755497932434082, 0.6989710927009583, -5.270299434661865, -2.6188621520996094, -1.8828465938568115, -4.6466498374938965, 14.078543663024902, -0.5495333075523376, 10.579157829284668, -3.216050148010254, 
9.349003791809082, -4.381077766418457, -11.675816535949707, -2.863020658493042, 4.5721755027771, 2.246612071990967, -4.574341773986816, 1.8610187768936157, 2.3767874240875244, 5.625787734985352, -9.784077644348145, 0.6496725678443909, -1.457950472831726, 0.4263263940811157, -4.921126365661621, -2.4547839164733887, 3.4869801998138428, -0.4265422224998474, 8.341268539428711, 1.356552004814148, 7.096688270568848, -13.102828979492188, 8.01673412322998, -7.115934371948242, 1.8699780702590942, 0.20872099697589874, 14.699383735656738, -1.0252779722213745, -2.6107232570648193, -2.5082311630249023, 8.427192687988281, 6.913852691650391, -6.29124641418457, 0.6157366037368774, 2.489687919616699, -3.4668266773223877, 9.92176342010498, 11.200815200805664, -0.19664029777050018, 7.491600513458252, -0.6231271624565125, -0.2584814429283142, -9.947997093200684, -0.9611040949821472, 1.1649218797683716, -2.1907122135162354, -1.502848744392395, -0.5192610621452332, 15.165953636169434, 2.4649462699890137, -0.998044490814209, 7.44166374206543, -2.0768048763275146, 3.5896823406219482, -7.305543422698975, -7.562084674835205, 4.32333517074585, 0.08044180274009705, -6.564010143280029, -2.314805269241333, -1.7642345428466797, -2.470881700515747, -7.6756181716918945, -9.548877716064453, -1.017755389213562, 0.1698644608259201, 2.5877134799957275, -1.8752295970916748, -0.36614322662353516, -6.049378395080566, -2.3965611457824707, -5.945338726043701, 0.9424033164978027, -13.155974388122559, -7.45780086517334, 0.14658108353614807, -3.7427968978881836, 5.841492652893066, -1.2872905731201172, 5.569431304931641, 12.570590019226074, 1.0939218997955322, 2.2142086029052734, 1.9181575775146484, 6.991420745849609, -5.888138771057129, 3.1409823894500732, -2.0036280155181885, 2.4434285163879395, 9.973138809204102, 5.036680221557617, 2.005120277404785, 2.861560344696045, 5.860223770141602, 2.917618751525879, -1.63111412525177, 2.0292205810546875, -4.070415019989014, -6.831437110900879]}}
- [2022-05-25 12:25:36,324] [ INFO] - Response time 0.159053 s.
+ Output:
+ ```text
+ [2022-08-01 09:01:22,151] [ INFO] - vector http client start
+ [2022-08-01 09:01:22,152] [ INFO] - the input audio: 85236145389.wav
+ [2022-08-01 09:01:22,152] [ INFO] - endpoint: http://127.0.0.1:8090/paddlespeech/vector
+ [2022-08-01 09:01:27,093] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [1.4217487573623657, 5.626248836517334, -5.342073440551758, 1.177390217781067, 3.308061122894287, 1.7565997838974, 5.1678876876831055, 10.806346893310547, -3.822679042816162, -5.614130973815918, 2.6238481998443604, -0.8072965741157532, 1.963512659072876, -7.312864780426025, 0.011034967377781868, -9.723127365112305, 0.661963164806366, -6.976816654205322, 10.213465690612793, 7.494767189025879, 2.9105641841888428, 3.894925117492676, 3.7999846935272217, 7.106173992156982, 16.905324935913086, -7.149376392364502, 8.733112335205078, 3.423002004623413, -4.831653118133545, -11.403371810913086, 11.232216835021973, 7.127464771270752, -4.282831192016602, 2.4523589611053467, -5.13075065612793, -18.17765998840332, -2.611666440963745, -11.00034236907959, -6.731431007385254, 1.6564655303955078, 0.7618184685707092, 1.1253058910369873, -2.0838277339935303, 4.725739002227783, -8.782590866088867, -3.5398736000061035, 3.8142387866973877, 5.142062664031982, 2.162053346633911, 4.09642219543457, -6.416221618652344, 12.747454643249512, 1.9429889917373657, -15.152948379516602, 6.417416572570801, 16.097013473510742, -9.716649055480957, -1.9920448064804077, -3.364956855773926, -1.8719490766525269, 11.567351341247559, 3.6978795528411865, 11.258269309997559, 7.442364692687988, 9.183405876159668, 4.528151512145996, -1.2417811155319214, 4.395910263061523, 6.672768592834473, 5.889888763427734, 7.627115249633789, -0.6692016124725342, -11.889703750610352, -9.208883285522461, -7.427401542663574, -3.777655601501465, 6.917237758636475, -9.848749160766602, -2.094479560852051, -5.1351189613342285, 0.49564215540885925, 9.317541122436523, -5.9141845703125, -1.809845209121704, -0.11738205701112747, -7.169270992279053, -1.0578246116638184, -5.721685886383057, -5.117387294769287, 16.137670516967773, -4.473618984222412, 7.66243314743042, -0.5538089871406555, 9.631582260131836, 
-6.470466613769531, -8.54850959777832, 4.371622085571289, -0.7970349192619324, 4.479003429412842, -2.9758646488189697, 3.2721707820892334, 2.8382749557495117, 5.1345953941345215, -9.19078254699707, -0.5657423138618469, -4.874573230743408, 2.316561460494995, -5.984307289123535, -2.1798791885375977, 0.35541653633117676, -0.3178458511829376, 9.493547439575195, 2.114448070526123, 4.358088493347168, -12.089820861816406, 8.451695442199707, -7.925461769104004, 4.624246120452881, 4.428938388824463, 18.691999435424805, -2.620460033416748, -5.149182319641113, -0.3582168221473694, 8.488557815551758, 4.98148250579834, -9.326834678649902, -2.2544236183166504, 6.64176607131958, 1.2119656801223755, 10.977132797241211, 16.55504035949707, 3.323848247528076, 9.55185317993164, -1.6677050590515137, -0.7953923940658569, -8.605660438537598, -0.4735637903213501, 2.6741855144500732, -5.359188079833984, -2.6673784255981445, 0.6660736799240112, 15.443212509155273, 4.740597724914551, -3.4725306034088135, 11.592561721801758, -2.05450701713562, 1.7361239194869995, -8.26533031463623, -9.304476737976074, 5.406835079193115, -1.5180232524871826, -7.746610641479492, -6.089605331420898, 0.07112561166286469, -0.34904858469963074, -8.649889945983887, -9.998958587646484, -2.5648481845855713, -0.5399898886680603, 2.6018145084381104, -0.31927648186683655, -1.8815231323242188, -2.0721378326416016, -3.4105639457702637, -8.299802780151367, 1.4836379289627075, -15.366002082824707, -8.288193702697754, 3.884773015975952, -3.4876506328582764, 7.362995624542236, 0.4657321572303772, 3.1326000690460205, 12.438883781433105, -1.8337029218673706, 4.532927513122559, 2.726433277130127, 10.145345687866211, -6.521956920623779, 2.8971481323242188, -3.3925881385803223, 5.079156398773193, 7.759725093841553, 4.677562236785889, 5.8457818031311035, 2.4023921489715576, 7.707108974456787, 3.9711389541625977, -6.390035152435303, 6.126871109008789, -3.776031017303467, -11.118141174316406]}}
+ [2022-08-01 09:01:27,094] [ INFO] - Response time 4.941739 s.
```
* Python API
  ```python
from paddlespeech.server.bin.paddlespeech_client import VectorClientExecutor
+ import json
vectorclient_executor = VectorClientExecutor()
res = vectorclient_executor(
@@ -296,18 +304,17 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
server_ip="127.0.0.1",
port=8090,
task="spk")
- print(res)
+ print(res.json())
```
  Output:
-
- ``` bash
- {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [-1.3251205682754517, 7.860682487487793, -4.620625972747803, 0.3000721037387848, 2.2648534774780273, -1.1931440830230713, 3.064713716506958, 7.673594951629639, -6.004472732543945, -12.024259567260742, -1.9496068954467773, 3.126953601837158, 1.6188379526138306, -7.638310432434082, -1.2299772500991821, -12.33833122253418, 2.1373026371002197, -5.395712375640869, 9.717328071594238, 5.675230503082275, 3.7805123329162598, 3.0597171783447266, 3.429692029953003, 8.9760103225708, 13.174124717712402, -0.5313228368759155, 8.942471504211426, 4.465109825134277, -4.426247596740723, -9.726503372192383, 8.399328231811523, 7.223917484283447, -7.435853958129883, 2.9441683292388916, -4.343039512634277, -13.886964797973633, -1.6346734762191772, -10.902740478515625, -5.311244964599609, 3.800722122192383, 3.897603750228882, -2.123077392578125, -2.3521194458007812, 4.151031017303467, -7.404866695404053, 0.13911646604537964, 2.4626107215881348, 4.96645450592041, 0.9897574186325073, 5.483975410461426, -3.3574001789093018, 10.13400650024414, -0.6120170950889587, -10.403095245361328, 4.600754261016846, 16.009349822998047, -7.78369140625, -4.194530487060547, -6.93686056137085, 1.1789555549621582, 11.490800857543945, 4.23802375793457, 9.550930976867676, 8.375045776367188, 7.508914470672607, -0.6570729613304138, -0.3005157709121704, 2.8406054973602295, 3.0828027725219727, 0.7308170199394226, 6.1483540534973145, 0.1376611888408661, -13.424735069274902, -7.746140480041504, -2.322798252105713, -8.305252075195312, 2.98791241645813, -10.99522876739502, 0.15211068093776703, -2.3820347785949707, -1.7984174489974976, 8.49562931060791, -5.852236747741699, -3.755497932434082, 0.6989710927009583, -5.270299434661865, -2.6188621520996094, -1.8828465938568115, -4.6466498374938965, 14.078543663024902, -0.5495333075523376, 10.579157829284668, -3.216050148010254, 9.349003791809082, -4.381077766418457, -11.675816535949707, 
-2.863020658493042, 4.5721755027771, 2.246612071990967, -4.574341773986816, 1.8610187768936157, 2.3767874240875244, 5.625787734985352, -9.784077644348145, 0.6496725678443909, -1.457950472831726, 0.4263263940811157, -4.921126365661621, -2.4547839164733887, 3.4869801998138428, -0.4265422224998474, 8.341268539428711, 1.356552004814148, 7.096688270568848, -13.102828979492188, 8.01673412322998, -7.115934371948242, 1.8699780702590942, 0.20872099697589874, 14.699383735656738, -1.0252779722213745, -2.6107232570648193, -2.5082311630249023, 8.427192687988281, 6.913852691650391, -6.29124641418457, 0.6157366037368774, 2.489687919616699, -3.4668266773223877, 9.92176342010498, 11.200815200805664, -0.19664029777050018, 7.491600513458252, -0.6231271624565125, -0.2584814429283142, -9.947997093200684, -0.9611040949821472, 1.1649218797683716, -2.1907122135162354, -1.502848744392395, -0.5192610621452332, 15.165953636169434, 2.4649462699890137, -0.998044490814209, 7.44166374206543, -2.0768048763275146, 3.5896823406219482, -7.305543422698975, -7.562084674835205, 4.32333517074585, 0.08044180274009705, -6.564010143280029, -2.314805269241333, -1.7642345428466797, -2.470881700515747, -7.6756181716918945, -9.548877716064453, -1.017755389213562, 0.1698644608259201, 2.5877134799957275, -1.8752295970916748, -0.36614322662353516, -6.049378395080566, -2.3965611457824707, -5.945338726043701, 0.9424033164978027, -13.155974388122559, -7.45780086517334, 0.14658108353614807, -3.7427968978881836, 5.841492652893066, -1.2872905731201172, 5.569431304931641, 12.570590019226074, 1.0939218997955322, 2.2142086029052734, 1.9181575775146484, 6.991420745849609, -5.888138771057129, 3.1409823894500732, -2.0036280155181885, 2.4434285163879395, 9.973138809204102, 5.036680221557617, 2.005120277404785, 2.861560344696045, 5.860223770141602, 2.917618751525879, -1.63111412525177, 2.0292205810546875, -4.070415019989014, -6.831437110900879]}}
+ ```text
+ {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'vec': [1.4217487573623657, 5.626248836517334, -5.342073440551758, 1.177390217781067, 3.308061122894287, 1.7565997838974, 5.1678876876831055, 10.806346893310547, -3.822679042816162, -5.614130973815918, 2.6238481998443604, -0.8072965741157532, 1.963512659072876, -7.312864780426025, 0.011034967377781868, -9.723127365112305, 0.661963164806366, -6.976816654205322, 10.213465690612793, 7.494767189025879, 2.9105641841888428, 3.894925117492676, 3.7999846935272217, 7.106173992156982, 16.905324935913086, -7.149376392364502, 8.733112335205078, 3.423002004623413, -4.831653118133545, -11.403371810913086, 11.232216835021973, 7.127464771270752, -4.282831192016602, 2.4523589611053467, -5.13075065612793, -18.17765998840332, -2.611666440963745, -11.00034236907959, -6.731431007385254, 1.6564655303955078, 0.7618184685707092, 1.1253058910369873, -2.0838277339935303, 4.725739002227783, -8.782590866088867, -3.5398736000061035, 3.8142387866973877, 5.142062664031982, 2.162053346633911, 4.09642219543457, -6.416221618652344, 12.747454643249512, 1.9429889917373657, -15.152948379516602, 6.417416572570801, 16.097013473510742, -9.716649055480957, -1.9920448064804077, -3.364956855773926, -1.8719490766525269, 11.567351341247559, 3.6978795528411865, 11.258269309997559, 7.442364692687988, 9.183405876159668, 4.528151512145996, -1.2417811155319214, 4.395910263061523, 6.672768592834473, 5.889888763427734, 7.627115249633789, -0.6692016124725342, -11.889703750610352, -9.208883285522461, -7.427401542663574, -3.777655601501465, 6.917237758636475, -9.848749160766602, -2.094479560852051, -5.1351189613342285, 0.49564215540885925, 9.317541122436523, -5.9141845703125, -1.809845209121704, -0.11738205701112747, -7.169270992279053, -1.0578246116638184, -5.721685886383057, -5.117387294769287, 16.137670516967773, -4.473618984222412, 7.66243314743042, -0.5538089871406555, 9.631582260131836, -6.470466613769531, -8.54850959777832, 
4.371622085571289, -0.7970349192619324, 4.479003429412842, -2.9758646488189697, 3.2721707820892334, 2.8382749557495117, 5.1345953941345215, -9.19078254699707, -0.5657423138618469, -4.874573230743408, 2.316561460494995, -5.984307289123535, -2.1798791885375977, 0.35541653633117676, -0.3178458511829376, 9.493547439575195, 2.114448070526123, 4.358088493347168, -12.089820861816406, 8.451695442199707, -7.925461769104004, 4.624246120452881, 4.428938388824463, 18.691999435424805, -2.620460033416748, -5.149182319641113, -0.3582168221473694, 8.488557815551758, 4.98148250579834, -9.326834678649902, -2.2544236183166504, 6.64176607131958, 1.2119656801223755, 10.977132797241211, 16.55504035949707, 3.323848247528076, 9.55185317993164, -1.6677050590515137, -0.7953923940658569, -8.605660438537598, -0.4735637903213501, 2.6741855144500732, -5.359188079833984, -2.6673784255981445, 0.6660736799240112, 15.443212509155273, 4.740597724914551, -3.4725306034088135, 11.592561721801758, -2.05450701713562, 1.7361239194869995, -8.26533031463623, -9.304476737976074, 5.406835079193115, -1.5180232524871826, -7.746610641479492, -6.089605331420898, 0.07112561166286469, -0.34904858469963074, -8.649889945983887, -9.998958587646484, -2.5648481845855713, -0.5399898886680603, 2.6018145084381104, -0.31927648186683655, -1.8815231323242188, -2.0721378326416016, -3.4105639457702637, -8.299802780151367, 1.4836379289627075, -15.366002082824707, -8.288193702697754, 3.884773015975952, -3.4876506328582764, 7.362995624542236, 0.4657321572303772, 3.1326000690460205, 12.438883781433105, -1.8337029218673706, 4.532927513122559, 2.726433277130127, 10.145345687866211, -6.521956920623779, 2.8971481323242188, -3.3925881385803223, 5.079156398773193, 7.759725093841553, 4.677562236785889, 5.8457818031311035, 2.4023921489715576, 7.707108974456787, 3.9711389541625977, -6.390035152435303, 6.126871109008789, -3.776031017303467, -11.118141174316406]}}
```
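The `vec` field above is a plain list of floats, so the returned embedding can be post-processed directly on the client. As an illustrative sketch (not part of the PaddleSpeech API, and the service's own `score` task may compute its score differently), a similarity between two such embeddings can be computed like this:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two speaker embeddings (plain float lists).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice vec_a / vec_b would come from res.json()["result"]["vec"]
# of two different requests; the values here are placeholders.
vec_a = [1.0, 2.0, 3.0]
vec_b = [2.0, 4.0, 6.0]
print(cosine_similarity(vec_a, vec_b))  # parallel vectors -> ~1.0
```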
#### 7.2 Voiceprint Scoring
-注意: 初次使用客户端时响应时间会略长
+**Note:** The response time will be slightly longer the first time the client is used.
* Command line (recommended)
   If `127.0.0.1` is not accessible, the actual service IP address must be used.
@@ -331,20 +338,19 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
    * test: the test audio file.
  Output:
-
- ``` bash
- [2022-05-25 12:33:24,527] [ INFO] - vector score http client start
- [2022-05-25 12:33:24,527] [ INFO] - enroll audio: 85236145389.wav, test audio: 123456789.wav
- [2022-05-25 12:33:24,528] [ INFO] - endpoint: http://127.0.0.1:8790/paddlespeech/vector/score
- [2022-05-25 12:33:24,695] [ INFO] - The vector score is: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
- [2022-05-25 12:33:24,696] [ INFO] - The vector: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
- [2022-05-25 12:33:24,696] [ INFO] - Response time 0.168271 s.
+ ```text
+ [2022-08-01 09:04:42,275] [ INFO] - vector score http client start
+ [2022-08-01 09:04:42,275] [ INFO] - enroll audio: 85236145389.wav, test audio: 123456789.wav
+ [2022-08-01 09:04:42,275] [ INFO] - endpoint: http://127.0.0.1:8090/paddlespeech/vector/score
+ [2022-08-01 09:04:44,611] [ INFO] - {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.4292638897895813}}
+ [2022-08-01 09:04:44,611] [ INFO] - Response time 2.336258 s.
```
* Python API
- ``` python
+ ```python
from paddlespeech.server.bin.paddlespeech_client import VectorClientExecutor
+ import json
vectorclient_executor = VectorClientExecutor()
res = vectorclient_executor(
@@ -354,20 +360,14 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
server_ip="127.0.0.1",
port=8090,
task="score")
- print(res)
+ print(res.json())
```
  Output:
-
- ``` bash
- [2022-05-25 12:30:14,143] [ INFO] - vector score http client start
- [2022-05-25 12:30:14,143] [ INFO] - enroll audio: 85236145389.wav, test audio: 123456789.wav
- [2022-05-25 12:30:14,143] [ INFO] - endpoint: http://127.0.0.1:8790/paddlespeech/vector/score
- [2022-05-25 12:30:14,363] [ INFO] - The vector score is: {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
- {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.45332613587379456}}
+ ```text
+ {'success': True, 'code': 200, 'message': {'description': 'success'}, 'result': {'score': 0.4292638897895813}}
```
-
### 8. Punctuation Prediction
**Note:** The response time will be slightly longer the first time the client is used.
@@ -390,9 +390,9 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
    - `input` (required): the text for punctuation prediction.
  Output:
- ```bash
- [2022-05-09 18:19:04,397] [ INFO] - The punc text: 我认为跑步最重要的就是给我带来了身体健康。
- [2022-05-09 18:19:04,397] [ INFO] - Response time 0.092407 s.
+ ```text
+ [2022-05-09 18:19:04,397] [ INFO] - The punc text: 我认为跑步最重要的就是给我带来了身体健康。
+ [2022-05-09 18:19:04,397] [ INFO] - Response time 0.092407 s.
```
- Python API
@@ -405,11 +405,10 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
server_ip="127.0.0.1",
port=8090,)
print(res)
-
```
  Output:
- ```bash
+ ```text
我认为跑步最重要的就是给我带来了身体健康。
```
@@ -418,10 +417,10 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav https://paddlespee
Use `paddlespeech_server stats --task asr` to list all models supported by the ASR service; the static models among them can be used for paddle inference.
### Models supported by TTS
-通过 `paddlespeech_server stats --task tts` 获取 TTS 服务支持的所有模型,其中静态模型可用于 paddle inference 推理。
+Use `paddlespeech_server stats --task tts` to list all models supported by the TTS service; the static models among them can be used for paddle inference.
### Models supported by CLS
-通过 `paddlespeech_server stats --task cls` 获取 CLS 服务支持的所有模型,其中静态模型可用于 paddle inference 推理。
+Use `paddlespeech_server stats --task cls` to list all models supported by the CLS service; the static models among them can be used for paddle inference.
### Models supported by Vector
Use `paddlespeech_server stats --task vector` to list all models supported by the Vector service.
diff --git a/demos/speech_server/asr_client.sh b/demos/speech_server/asr_client.sh
old mode 100644
new mode 100755
diff --git a/demos/speech_server/cls_client.sh b/demos/speech_server/cls_client.sh
old mode 100644
new mode 100755
diff --git a/demos/speech_server/conf/application.yaml b/demos/speech_server/conf/application.yaml
index c6588ce80..9c171c470 100644
--- a/demos/speech_server/conf/application.yaml
+++ b/demos/speech_server/conf/application.yaml
@@ -7,7 +7,7 @@ host: 0.0.0.0
port: 8090
# The task format in the engine_list is: <speech task>_<engine type>
-# task choices = ['asr_python', 'asr_inference', 'tts_python', 'tts_inference', 'cls_python', 'cls_inference']
+# task choices = ['asr_python', 'asr_inference', 'tts_python', 'tts_inference', 'cls_python', 'cls_inference', 'text_python', 'vector_python']
protocol: 'http'
engine_list: ['asr_python', 'tts_python', 'cls_python', 'text_python', 'vector_python']
@@ -28,7 +28,6 @@ asr_python:
force_yes: True
device: # set 'gpu:id' or 'cpu'
-
################### speech task: asr; engine_type: inference #######################
asr_inference:
# model_type choices=['deepspeech2offline_aishell']
@@ -50,10 +49,11 @@ asr_inference:
################################### TTS #########################################
################### speech task: tts; engine_type: python #######################
-tts_python:
- # am (acoustic model) choices=['speedyspeech_csmsc', 'fastspeech2_csmsc',
- # 'fastspeech2_ljspeech', 'fastspeech2_aishell3',
- # 'fastspeech2_vctk']
+tts_python:
+ # am (acoustic model) choices=['speedyspeech_csmsc', 'fastspeech2_csmsc',
+ # 'fastspeech2_ljspeech', 'fastspeech2_aishell3',
+ # 'fastspeech2_vctk', 'fastspeech2_mix',
+ # 'tacotron2_csmsc', 'tacotron2_ljspeech']
am: 'fastspeech2_csmsc'
am_config:
am_ckpt:
@@ -64,8 +64,10 @@ tts_python:
spk_id: 0
# voc (vocoder) choices=['pwgan_csmsc', 'pwgan_ljspeech', 'pwgan_aishell3',
- # 'pwgan_vctk', 'mb_melgan_csmsc']
- voc: 'pwgan_csmsc'
+ # 'pwgan_vctk', 'mb_melgan_csmsc', 'style_melgan_csmsc',
+ # 'hifigan_csmsc', 'hifigan_ljspeech', 'hifigan_aishell3',
+ # 'hifigan_vctk', 'wavernn_csmsc']
+ voc: 'mb_melgan_csmsc'
voc_config:
voc_ckpt:
voc_stat:
@@ -94,7 +96,7 @@ tts_inference:
summary: True # False -> do not show predictor config
# voc (vocoder) choices=['pwgan_csmsc', 'mb_melgan_csmsc','hifigan_csmsc']
- voc: 'pwgan_csmsc'
+ voc: 'mb_melgan_csmsc'
voc_model: # the pdmodel file of your vocoder static model (XX.pdmodel)
voc_params: # the pdiparams file of your vocoder static model (XX.pdiparams)
voc_sample_rate: 24000
diff --git a/demos/speech_server/server.sh b/demos/speech_server/server.sh
old mode 100644
new mode 100755
index e5961286b..fd719ffc1
--- a/demos/speech_server/server.sh
+++ b/demos/speech_server/server.sh
@@ -1,3 +1,3 @@
#!/bin/bash
-paddlespeech_server start --config_file ./conf/application.yaml
+paddlespeech_server start --config_file ./conf/application.yaml &> server.log &
diff --git a/demos/speech_server/sid_client.sh b/demos/speech_server/sid_client.sh
new file mode 100755
index 000000000..99bab21ae
--- /dev/null
+++ b/demos/speech_server/sid_client.sh
@@ -0,0 +1,10 @@
+#!/bin/bash
+
+wget -c https://paddlespeech.bj.bcebos.com/vector/audio/85236145389.wav
+wget -c https://paddlespeech.bj.bcebos.com/vector/audio/123456789.wav
+
+# sid extract
+paddlespeech_client vector --server_ip 127.0.0.1 --port 8090 --task spk --input ./85236145389.wav
+
+# sid score
+paddlespeech_client vector --server_ip 127.0.0.1 --port 8090 --task score --enroll ./85236145389.wav --test ./123456789.wav
diff --git a/demos/speech_server/text_client.sh b/demos/speech_server/text_client.sh
new file mode 100755
index 000000000..098f159fb
--- /dev/null
+++ b/demos/speech_server/text_client.sh
@@ -0,0 +1,4 @@
+#!/bin/bash
+
+
+paddlespeech_client text --server_ip 127.0.0.1 --port 8090 --input 今天的天气真好啊你下午有空吗我想约你一起去吃饭
diff --git a/demos/speech_server/tts_client.sh b/demos/speech_server/tts_client.sh
old mode 100644
new mode 100755
diff --git a/demos/speech_web/.gitignore b/demos/speech_web/.gitignore
new file mode 100644
index 000000000..54418e605
--- /dev/null
+++ b/demos/speech_web/.gitignore
@@ -0,0 +1,16 @@
+*/.vscode/*
+*.wav
+*/resource/*
+.Ds*
+*.pyc
+*.pcm
+*.npy
+*.diff
+*.sqlite
+*/static/*
+*.pdparams
+*.pdiparams*
+*.pdmodel
+*/source/*
+*/PaddleSpeech/*
+
diff --git a/demos/speech_web/API.md b/demos/speech_web/API.md
new file mode 100644
index 000000000..c51446749
--- /dev/null
+++ b/demos/speech_web/API.md
@@ -0,0 +1,404 @@
+# API Reference
+
+After the service is started, see:
+
+http://0.0.0.0:8010/docs
+
+## ASR
+
+### 【POST】/asr/offline
+
+Description: upload a 16k, 16-bit wav file; returns the recognition result of the offline speech recognition model.
+
+Returns: JSON
+
+Used by frontend features: ASR - end-to-end recognition, audio file recognition; voice command - recording upload
+
+Example:
+
+```json
+{
+ "code": 0,
+ "result": "你也喜欢这个天气吗",
+ "message": "ok"
+}
+```
+
+### 【POST】/asr/offlinefile
+
+Description: upload a 16k, 16-bit wav file; returns the offline speech recognition result plus the wav data as base64.
+
+Returns: JSON
+
+Used by frontend features: audio file recognition (to play this base64 back, remember to prepend a wav header for 16k, int16 audio; it will not play without one)
+
+Example:
+
+```json
+{
+ "code": 0,
+ "result": {
+ "asr_result": "今天天气真好",
+ "wav_base64": "///+//3//f/8/////v/////////////////+/wAA//8AAAEAAQACAAIAAQABAP"
+ },
+ "message": "ok"
+}
+```
+
+
+### 【POST】/asr/collectEnv
+
+Description: upload a 16k, int16 wav file of sampled ambient noise to generate the energy threshold for the backend VAD; returns the resulting threshold.
+
+Used by frontend features: ASR - environment sampling
+
+Returns: JSON
+
+```json
+{
+ "code": 0,
+ "result": 3624.93505859375,
+ "message": "采集环境噪音成功"
+}
+```
+
+### 【GET】/asr/stopRecord
+
+Description: a GET request to /asr/stopRecord makes the backend stop receiving the data uploaded to offlineStream over the WS protocol.
+
+Used by frontend features: voice chat - pause recording (pause while fetching NLP results and playing TTS)
+
+Returns: JSON
+
+```JSON
+{
+ "code": 0,
+ "result": null,
+ "message": "停止成功"
+}
+```
+
+### 【GET】/asr/resumeRecord
+
+Description: a GET request to /asr/resumeRecord makes the backend resume receiving the data uploaded to offlineStream over the WS protocol.
+
+Used by frontend features: voice chat - resume recording (tells the backend to resume recording once TTS playback has finished)
+
+Returns: JSON
+
+```JSON
+{
+ "code": 0,
+ "result": null,
+ "message": "Online录音恢复"
+}
+```
+
+### 【Websocket】/ws/asr/offlineStream
+
+Description: continuously streams frontend audio to the backend over the WS protocol; the frontend captures 16k, Int16 PCM chunks and keeps uploading them.
+
+Used by frontend features: voice chat - start recording; the microphone audio is streamed to the backend, which pushes back recognition results
+
+Returns: recognition results from the offline model, pushed over WS
+
+
+### 【Websocket】/ws/asr/onlineStream
+
+Description: continuously streams frontend audio to the backend over the WS protocol; the frontend captures 16k, Int16 PCM chunks and keeps uploading them.
+
+Used by frontend features: ASR - streaming recognition; start recording and stream the microphone audio to the backend, which pushes back recognition results
+
+Returns: recognition results from the online model, pushed over WS
+
+## NLP
+
+### 【POST】/nlp/chat
+
+Description: returns the chitchat dialogue response.
+
+Used by frontend features: voice chat - after receiving the ASR result, fetch the chitchat reply from the backend
+
+Upload example:
+
+```json
+{
+ "chat": "天气非常棒"
+}
+```
+
+Response example:
+
+```json
+{
+ "code": 0,
+ "result": "是的,我也挺喜欢的",
+ "message": "ok"
+}
+```
+
+
+### 【POST】/nlp/ie
+
+Description: returns the information extraction result.
+
+Used by frontend features: voice command - fetch the information extraction result from the backend
+
+Upload example:
+
+```json
+{
+ "chat": "今天我从马来西亚出发去香港花了五十万元"
+}
+```
+
+Response example:
+
+```json
+{
+ "code": 0,
+ "result": [
+ {
+ "时间": [
+ {
+ "text": "今天",
+ "start": 0,
+ "end": 2,
+ "probability": 0.9817976247505698
+ }
+ ],
+ "出发地": [
+ {
+ "text": "马来西亚",
+ "start": 4,
+ "end": 8,
+ "probability": 0.974892389414169
+ }
+ ],
+ "目的地": [
+ {
+ "text": "马来西亚",
+ "start": 4,
+ "end": 8,
+ "probability": 0.7347504438136951
+ }
+ ],
+ "费用": [
+ {
+ "text": "五十万元",
+ "start": 15,
+ "end": 19,
+ "probability": 0.9679076530644402
+ }
+ ]
+ }
+ ],
+ "message": "ok"
+}
+```
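The nested structure above (one dict per record, mapping each label to a list of spans) can be flattened for downstream use; a small sketch (the helper name is ours, not part of the API):

```python
def flatten_ie_result(response):
    # Flatten the /nlp/ie response into (label, text, probability) tuples.
    triples = []
    for record in response.get("result", []):
        for label, spans in record.items():
            for span in spans:
                triples.append((label, span["text"], span["probability"]))
    return triples

resp = {
    "code": 0,
    "result": [{"时间": [{"text": "今天", "start": 0, "end": 2, "probability": 0.98}]}],
    "message": "ok",
}
print(flatten_ie_result(resp))  # [('时间', '今天', 0.98)]
```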
+
+
+## TTS
+
+### 【POST】/tts/offline
+
+Description: returns audio synthesized by the offline TTS model.
+
+Used by frontend features: TTS - end-to-end synthesis
+
+Upload example:
+
+```json
+{
+ "text": "天气非常棒"
+}
+```
+
+Response example: the base64 encoding of the synthesized audio
+
+```json
+{
+ "code": 0,
+ "result": "UklGRrzQAABXQVZFZm10IBAAAAABAAEAwF0AAIC7AAACABAAZGF0YZjQAAADAP7/BAADAAAA...",
+ "message": "ok"
+}
+```
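The `result` string is a complete WAV file encoded as base64 (the `UklGR` prefix is the base64 form of the RIFF magic bytes), so no extra header is needed when saving it; a sketch (the function name is ours, not part of the API):

```python
import base64

def save_tts_result(b64_wav, path):
    # /tts/offline returns a full WAV file as base64: decode and write as-is.
    data = base64.b64decode(b64_wav)
    assert data[:4] == b"RIFF", "expected a WAV (RIFF) payload"
    with open(path, "wb") as f:
        f.write(data)
```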
+
+### 【POST】/tts/online
+
+Description: fetches synthesized audio in streaming mode.
+
+Used by frontend features: streaming synthesis
+
+Upload example:
+```json
+{
+ "text": "天气非常棒"
+}
+
+```
+
+Response example:
+
+Binary PCM chunks, 16k, Int16
+
+## VPR
+
+### 【POST】/vpr/enroll
+
+Description: speaker enrollment; upload spk_id (a non-empty string) and audio (a file) via a form.
+
+Used by frontend features: speaker verification - enrollment
+
+Upload example:
+
+```shell
+curl -X 'POST' \
+ 'http://0.0.0.0:8010/vpr/enroll' \
+ -H 'accept: application/json' \
+ -H 'Content-Type: multipart/form-data' \
+ -F 'spk_id=啦啦啦啦' \
+ -F 'audio=@demo_16k.wav;type=audio/wav'
+```
+
+Response example:
+
+```json
+{
+ "status": true,
+ "msg": "Successfully enroll data!"
+}
+```
+
+### 【POST】/vpr/recog
+
+Description: speaker recognition; extracts the voiceprint of the uploaded file and compares it against enrolled speakers. Audio must be 16k, int16 wav.
+
+Used by frontend features: speaker verification - upload audio and get the recognition result
+
+Upload example:
+
+```shell
+curl -X 'POST' \
+ 'http://0.0.0.0:8010/vpr/recog' \
+ -H 'accept: application/json' \
+ -H 'Content-Type: multipart/form-data' \
+ -F 'audio=@demo_16k.wav;type=audio/wav'
+```
+
+Response example:
+
+```json
+[
+ [
+ "啦啦啦啦",
+ [
+ "",
+ 100
+ ]
+ ],
+ [
+ "test1",
+ [
+ "",
+ 11.64
+ ]
+ ],
+ [
+ "test2",
+ [
+ "",
+ 6.09
+ ]
+ ]
+]
+
+```
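Each entry in the response above pairs a spk_id with `["", score]`; a sketch for picking the best match client-side (the helper name is ours, not part of the API):

```python
def best_speaker(recog_result):
    # Each entry looks like [spk_id, ["", score]]; return the top-scoring speaker.
    best_id, best_score = None, float("-inf")
    for spk_id, (_, score) in recog_result:
        if score > best_score:
            best_id, best_score = spk_id, score
    return best_id, best_score

result = [["啦啦啦啦", ["", 100]], ["test1", ["", 11.64]], ["test2", ["", 6.09]]]
print(best_speaker(result))  # ('啦啦啦啦', 100)
```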
+
+
+### 【POST】/vpr/del
+
+Description: deletes a user's data by spk_id.
+
+Used by frontend features: speaker verification - delete user data
+
+Upload example:
+```json
+{
+ "spk_id":"啦啦啦啦"
+}
+```
+
+Response example:
+
+```json
+{
+ "status": true,
+ "msg": "Successfully delete data!"
+}
+
+```
+
+
+### 【GET】/vpr/list
+
+Description: queries the enrolled user list; takes no parameters and returns the spk_id and vpr_id lists.
+
+Used by frontend features: speaker verification - fetch the voiceprint data list
+
+Response example:
+
+```json
+[
+ [
+ "test1",
+ "test2"
+ ],
+ [
+ 9,
+ 10
+ ]
+]
+
+```
+
+
+### 【GET】/vpr/data
+
+Description: fetches the audio used for a user's enrollment, by vpr_id.
+
+Used by frontend features: speaker verification - fetch the audio for a vpr_id
+
+Request example:
+
+```shell
+curl -X 'GET' \
+ 'http://0.0.0.0:8010/vpr/data?vprId=9' \
+ -H 'accept: application/json'
+```
+
+Response example:
+
+The corresponding audio file
+
+### 【GET】/vpr/database64
+
+Description: fetches the enrollment audio for a vpr_id converted to a 16k, int16 array, returned as base64.
+
+Used by frontend features: speaker verification - fetch the audio for a vpr_id (note: prepend a 16k, int16 wav header before playback; this can follow the approach used for TTS playback, adjusting the sample rate)
+
+Request example:
+
+```shell
+curl -X 'GET' \
+ 'http://localhost:8010/vpr/database64?vprId=12' \
+ -H 'accept: application/json'
+```
+
+Response example:
+```json
+{
+  "code": 0,
+  "result": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA",
+  "message": "ok"
+}
+```
\ No newline at end of file
diff --git a/demos/speech_web/README.md b/demos/speech_web/README.md
new file mode 100644
index 000000000..3b2da6e9a
--- /dev/null
+++ b/demos/speech_web/README.md
@@ -0,0 +1,116 @@
+# Paddle Speech Demo
+
+PaddleSpeechDemo is a demo project built around the speech interaction features of PaddleSpeech; it helps users get started with PaddleSpeech and build their own applications on top of it.
+
+The speech interaction part uses PaddleSpeech, dialogue and information extraction use PaddleNLP, and the web frontend is built with Vue3.
+
+Main features:
+
++ Voice chat: PaddleSpeech speech recognition and synthesis, with chitchat powered by PaddleNLP
++ Speaker verification: a showcase of PaddleSpeech speaker verification
++ Speech recognition: supports three modes: streaming recognition, end-to-end recognition, and audio file recognition
++ Speech synthesis: supports both streaming and end-to-end synthesis
++ Voice command: combines PaddleSpeech speech recognition with PaddleNLP information extraction to implement smart reimbursement of travel expenses
+
+Demo:
+
+ 
+
+## Installation
+
+### Backend setup
+
+```bash
+# install dependencies
+cd speech_server
+pip install -r requirements.txt
+
+# download the ie model, fine-tuned for locations; without it a generic
+# version is used, which does not work as well
+cd source
+mkdir model
+cd model
+wget https://bj.bcebos.com/paddlenlp/applications/speech-cmd-analysis/finetune/model_state.pdparams
+```
+
+### Frontend setup
+
+The frontend depends on `node.js`, which must be installed beforehand; make sure `npm` is available (tested with `npm` `8.3.1`). The stable `node.js` release from the [official site](https://nodejs.org/en/) is recommended.
+
+```bash
+# enter the frontend directory
+cd web_client
+
+# install `yarn` (skip if already installed)
+npm install -g yarn
+
+# install frontend dependencies with yarn
+yarn install
+```
+
+## Starting the services
+
+### Start the backend
+
+```bash
+cd speech_server
+# port 8010 by default
+python main.py --port 8010
+```
+
+### Start the frontend
+
+```bash
+cd web_client
+yarn dev --port 8011
+```
+
+By default, the backend address configured in the frontend is localhost, so the backend server and the browser opening the page must be on the same machine. For other setups, see the FAQ entry below: "How do I configure a backend deployed on another machine or port?"
+## FAQ
+
+#### Q: How do I install node.js?
+
+A: See this [tutorial](https://www.runoob.com/nodejs/nodejs-install-setup.html) (Chinese) and make sure npm is available.
+
+#### Q: How do I configure a backend deployed on another machine or port?
+
+A: The backend address is configured in two files.
+
+First, edit `PaddleSpeechWebClient/vite.config.js`:
+
+```js
+server: {
+  host: "0.0.0.0",
+  proxy: {
+    "/api": {
+      target: "http://localhost:8010", // change this to the backend address
+      changeOrigin: true,
+      rewrite: (path) => path.replace(/^\/api/, ""),
+    },
+  },
+}
+```
+
+Second, edit `PaddleSpeechWebClient/src/api/API.js` (the Websocket proxy configuration does not work, so these must be changed here as well):
+
+```js
+// websocket (change these to the backend address)
+CHAT_SOCKET_RECORD: 'ws://localhost:8010/ws/asr/offlineStream', // ChatBot websocket endpoint
+ASR_SOCKET_RECORD: 'ws://localhost:8010/ws/asr/onlineStream', // streaming ASR endpoint
+TTS_SOCKET_RECORD: 'ws://localhost:8010/ws/tts/online', // streaming TTS endpoint
+```
+
+#### Q: The frontend cannot record when the backend is addressed by IP
+
+A: This is a browser security restriction; the browser must be reconfigured and restarted. See [使用js-audio-recorder报浏览器不支持getUserMedia](https://blog.csdn.net/YRY_LIKE_YOU/article/details/113745273) for details.
+
+Chrome setting: chrome://flags/#unsafely-treat-insecure-origin-as-secure
+
+## References
+
+Vue recording implementation: https://blog.csdn.net/qq_41619796/article/details/107865602#t1
+
+Frontend streaming audio playback references:
+
+https://github.com/AnthumChris/fetch-stream-audio
+
+https://bm.enthuses.me/buffered.php?bref=6677
diff --git a/demos/speech_web/docs/效果展示.png b/demos/speech_web/docs/效果展示.png
new file mode 100644
index 000000000..5f7997c17
Binary files /dev/null and b/demos/speech_web/docs/效果展示.png differ
diff --git a/demos/speech_web/speech_server/conf/tts_online_application.yaml b/demos/speech_web/speech_server/conf/tts_online_application.yaml
new file mode 100644
index 000000000..0460a5e16
--- /dev/null
+++ b/demos/speech_web/speech_server/conf/tts_online_application.yaml
@@ -0,0 +1,103 @@
+# This is the parameter configuration file for streaming tts server.
+
+#################################################################################
+# SERVER SETTING #
+#################################################################################
+host: 0.0.0.0
+port: 8092
+
+# The task format in the engine_list is: <speech task>_<engine type>
+# engine_list choices = ['tts_online', 'tts_online-onnx'], the inference speed of tts_online-onnx is faster than tts_online.
+# protocol choices = ['websocket', 'http']
+protocol: 'http'
+engine_list: ['tts_online-onnx']
+
+
+#################################################################################
+# ENGINE CONFIG #
+#################################################################################
+
+################################### TTS #########################################
+################### speech task: tts; engine_type: online #######################
+tts_online:
+ # am (acoustic model) choices=['fastspeech2_csmsc', 'fastspeech2_cnndecoder_csmsc']
+ # fastspeech2_cnndecoder_csmsc support streaming am infer.
+ am: 'fastspeech2_csmsc'
+ am_config:
+ am_ckpt:
+ am_stat:
+ phones_dict:
+ tones_dict:
+ speaker_dict:
+ spk_id: 0
+
+ # voc (vocoder) choices=['mb_melgan_csmsc', 'hifigan_csmsc']
+ # Both mb_melgan_csmsc and hifigan_csmsc support streaming voc inference
+ voc: 'mb_melgan_csmsc'
+ voc_config:
+ voc_ckpt:
+ voc_stat:
+
+ # others
+ lang: 'zh'
+ device: 'cpu' # set 'gpu:id' or 'cpu'
+    # am_block and am_pad are only used for streaming am inference with the fastspeech2_cnndecoder model;
+    # when am_pad is set to 12, the streaming synthetic audio is identical to the non-streaming synthetic audio.
+ am_block: 72
+ am_pad: 12
+    # voc_block and voc_pad are used for streaming voc inference;
+    # when the voc model is mb_melgan_csmsc, a voc_pad of 14 makes the streaming synthetic audio identical to the non-streaming audio; a pad as low as 7 still sounds normal.
+    # when the voc model is hifigan_csmsc, a voc_pad of 19 makes the streaming synthetic audio identical to the non-streaming audio; a pad of 14 still sounds normal.
+ voc_block: 36
+ voc_pad: 14
+
+
+
+#################################################################################
+# ENGINE CONFIG #
+#################################################################################
+
+################################### TTS #########################################
+################### speech task: tts; engine_type: online-onnx #######################
+tts_online-onnx:
+ # am (acoustic model) choices=['fastspeech2_csmsc_onnx', 'fastspeech2_cnndecoder_csmsc_onnx']
+ # fastspeech2_cnndecoder_csmsc_onnx support streaming am infer.
+ am: 'fastspeech2_cnndecoder_csmsc_onnx'
+ # am_ckpt is a list, if am is fastspeech2_cnndecoder_csmsc_onnx, am_ckpt = [encoder model, decoder model, postnet model];
+ # if am is fastspeech2_csmsc_onnx, am_ckpt = [ckpt model];
+ am_ckpt: # list
+ am_stat:
+ phones_dict:
+ tones_dict:
+ speaker_dict:
+ spk_id: 0
+ am_sample_rate: 24000
+ am_sess_conf:
+ device: "cpu" # set 'gpu:id' or 'cpu'
+ use_trt: False
+ cpu_threads: 4
+
+ # voc (vocoder) choices=['mb_melgan_csmsc_onnx, hifigan_csmsc_onnx']
+ # Both mb_melgan_csmsc_onnx and hifigan_csmsc_onnx support streaming voc inference
+ voc: 'hifigan_csmsc_onnx'
+ voc_ckpt:
+ voc_sample_rate: 24000
+ voc_sess_conf:
+ device: "cpu" # set 'gpu:id' or 'cpu'
+ use_trt: False
+ cpu_threads: 4
+
+ # others
+ lang: 'zh'
+    # am_block and am_pad are only used for streaming am inference with the fastspeech2_cnndecoder_onnx model;
+    # when am_pad is set to 12, the streaming synthetic audio is identical to the non-streaming synthetic audio.
+    am_block: 72
+    am_pad: 12
+    # voc_block and voc_pad are used for streaming voc inference;
+    # when the voc model is mb_melgan_csmsc_onnx, a voc_pad of 14 makes the streaming synthetic audio identical to the non-streaming audio; a pad as low as 7 still sounds normal.
+    # when the voc model is hifigan_csmsc_onnx, a voc_pad of 19 makes the streaming synthetic audio identical to the non-streaming audio; a pad of 14 still sounds normal.
+ voc_block: 36
+ voc_pad: 14
+ # voc_upsample should be same as n_shift on voc config.
+ voc_upsample: 300
+
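The block/pad scheme described in the config comments above can be sketched in a few lines. This is a minimal illustration, assuming a vocoder that upsamples each mel frame by a fixed factor; the helper names (`split_into_chunks`, `depad`) are hypothetical, not PaddleSpeech APIs (the server uses its own `get_chunks` and depadding logic):

```python
import numpy as np

def split_into_chunks(frames: np.ndarray, block: int, pad: int):
    """Split mel frames into chunks of `block` frames, each extended
    by up to `pad` frames of context on both sides."""
    chunks = []
    start = 0
    while start < len(frames):
        left = max(0, start - pad)
        right = min(len(frames), start + block + pad)
        chunks.append(frames[left:right])
        start += block
    return chunks

def depad(chunk, chunk_id, num_chunks, block, pad, upsample):
    # drop the audio synthesized from the pad context so that
    # consecutive chunks concatenate seamlessly
    front = min(chunk_id * block, pad)
    if chunk_id == 0:                   # first chunk: no left context kept
        return chunk[:block * upsample]
    if chunk_id == num_chunks - 1:      # last chunk: keep the whole tail
        return chunk[front * upsample:]
    return chunk[front * upsample:(front + block) * upsample]
```

Each chunk carries `pad` frames of extra context so the vocoder's receptive field sees real data at the chunk edges; `depad` then removes the audio generated from that context, which is why the concatenated streaming output can match the non-streaming output when the pad is large enough.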
diff --git a/demos/speech_web/speech_server/conf/ws_conformer_wenetspeech_application_faster.yaml b/demos/speech_web/speech_server/conf/ws_conformer_wenetspeech_application_faster.yaml
new file mode 100644
index 000000000..ba413c802
--- /dev/null
+++ b/demos/speech_web/speech_server/conf/ws_conformer_wenetspeech_application_faster.yaml
@@ -0,0 +1,48 @@
+# This is the parameter configuration file for PaddleSpeech Serving.
+
+#################################################################################
+# SERVER SETTING #
+#################################################################################
+host: 0.0.0.0
+port: 8090
+
+# The task format in the engine_list is: <speech task>_<engine type>
+# task choices = ['asr_online']
+# protocol = ['websocket'] (only one can be selected).
+# websocket only support online engine type.
+protocol: 'websocket'
+engine_list: ['asr_online']
+
+
+#################################################################################
+# ENGINE CONFIG #
+#################################################################################
+
+################################### ASR #########################################
+################### speech task: asr; engine_type: online #######################
+asr_online:
+ model_type: 'conformer_online_wenetspeech'
+ am_model: # the pdmodel file of am static model [optional]
+ am_params: # the pdiparams file of am static model [optional]
+ lang: 'zh'
+ sample_rate: 16000
+    cfg_path:
+    decode_method: "attention_rescoring"
+    force_yes: True
+    device: 'cpu' # cpu or gpu:id
+    continuous_decoding: True  # enable continuous decoding when an endpoint is detected
+ num_decoding_left_chunks: 16
+ am_predictor_conf:
+ device: # set 'gpu:id' or 'cpu'
+ switch_ir_optim: True
+ glog_info: False # True -> print glog
+ summary: True # False -> do not show predictor config
+
+ chunk_buffer_conf:
+ window_n: 7 # frame
+ shift_n: 4 # frame
+ window_ms: 25 # ms
+ shift_ms: 10 # ms
+ sample_rate: 16000
+ sample_width: 2
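As a quick sanity check on the `chunk_buffer_conf` values above, the millisecond-based window and shift translate into samples and bytes as follows (a sketch; the constants mirror the config values):

```python
SAMPLE_RATE = 16000   # sample_rate from the config
SAMPLE_WIDTH = 2      # sample_width: int16 -> 2 bytes per sample

def ms_to_samples(ms: int, sample_rate: int = SAMPLE_RATE) -> int:
    """Convert a duration in milliseconds to a sample count."""
    return sample_rate * ms // 1000

window_samples = ms_to_samples(25)              # window_ms: 25 -> 400 samples
shift_samples = ms_to_samples(10)               # shift_ms: 10 -> 160 samples
window_bytes = window_samples * SAMPLE_WIDTH    # bytes of pcm per analysis window
```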
diff --git a/demos/speech_web/speech_server/main.py b/demos/speech_web/speech_server/main.py
new file mode 100644
index 000000000..b10176670
--- /dev/null
+++ b/demos/speech_web/speech_server/main.py
@@ -0,0 +1,492 @@
+# TODO:
+# 1. start the server
+# 2. receive recorded audio and return the recognition result
+# 3. receive the ASR result and return the NLP dialogue response
+# 4. receive the NLP dialogue response and return the TTS audio
+
+import base64
+import yaml
+import os
+import json
+import datetime
+import librosa
+import soundfile as sf
+import numpy as np
+import argparse
+import uvicorn
+import aiofiles
+from typing import Optional, List
+from pydantic import BaseModel
+from fastapi import FastAPI, Header, File, UploadFile, Form, Cookie, WebSocket, WebSocketDisconnect
+from fastapi.responses import StreamingResponse
+from starlette.responses import FileResponse
+from starlette.middleware.cors import CORSMiddleware
+from starlette.requests import Request
+from starlette.websockets import WebSocketState as WebSocketState
+
+from src.AudioManeger import AudioMannger
+from src.util import *
+from src.robot import Robot
+from src.WebsocketManeger import ConnectionManager
+from src.SpeechBase.vpr import VPR
+
+from paddlespeech.server.engine.asr.online.python.asr_engine import PaddleASRConnectionHanddler
+from paddlespeech.server.utils.audio_process import float2pcm
+
+
+# parse command-line arguments
+parser = argparse.ArgumentParser(
+ prog='PaddleSpeechDemo', add_help=True)
+
+parser.add_argument(
+ "--port",
+ action="store",
+ type=int,
+ help="port of the app",
+ default=8010,
+ required=False)
+
+args = parser.parse_args()
+port = args.port
+
+# config files
+tts_config = "conf/tts_online_application.yaml"
+asr_config = "conf/ws_conformer_wenetspeech_application_faster.yaml"
+asr_init_path = "source/demo/demo.wav"
+db_path = "source/db/vpr.sqlite"
+ie_model_path = "source/model"
+
+# path config
+UPLOAD_PATH = "source/vpr"
+WAV_PATH = "source/wav"
+
+
+base_sources = [
+ UPLOAD_PATH, WAV_PATH
+]
+for path in base_sources:
+ os.makedirs(path, exist_ok=True)
+
+
+# initialization
+app = FastAPI()
+chatbot = Robot(asr_config, tts_config, asr_init_path, ie_model_path=ie_model_path)
+manager = ConnectionManager()
+aumanager = AudioMannger(chatbot)
+aumanager.init()
+vpr = VPR(db_path, dim = 192, top_k = 5)
+
+# request body models
+class NlpBase(BaseModel):
+ chat: str
+
+class TtsBase(BaseModel):
+ text: str
+
+class Audios:
+ def __init__(self) -> None:
+ self.audios = b""
+
+audios = Audios()
+
+######################################################################
+########################### ASR Service ##############################
+#####################################################################
+
+# receive an uploaded audio file and return the ASR result
+@app.post("/asr/offline")
+async def speech2textOffline(files: List[UploadFile]):
+    # only the first file is used
+ asr_res = ""
+ for file in files[:1]:
+        # build a timestamped filename
+ now_name = "asr_offline_" + datetime.datetime.strftime(datetime.datetime.now(), '%Y%m%d%H%M%S') + randName() + ".wav"
+ out_file_path = os.path.join(WAV_PATH, now_name)
+ async with aiofiles.open(out_file_path, 'wb') as out_file:
+ content = await file.read() # async read
+ await out_file.write(content) # async write
+
+        # return the ASR result
+ asr_res = chatbot.speech2text(out_file_path)
+ return SuccessRequest(result=asr_res)
+ return ErrorRequest(message="上传文件为空")
+
+# receive a file and force-convert the wav to 16 kHz, int16
+@app.post("/asr/offlinefile")
+async def speech2textOfflineFile(files: List[UploadFile]):
+    # only the first file is used
+ asr_res = ""
+ for file in files[:1]:
+        # build a timestamped filename
+ now_name = "asr_offline_" + datetime.datetime.strftime(datetime.datetime.now(), '%Y%m%d%H%M%S') + randName() + ".wav"
+ out_file_path = os.path.join(WAV_PATH, now_name)
+ async with aiofiles.open(out_file_path, 'wb') as out_file:
+ content = await file.read() # async read
+ await out_file.write(content) # async write
+
+        # convert the file to a 16 kHz, 16-bit wav
+ wav, sr = librosa.load(out_file_path, sr=16000)
+ wav = float2pcm(wav) # float32 to int16
+ wav_bytes = wav.tobytes() # to bytes
+ wav_base64 = base64.b64encode(wav_bytes).decode('utf8')
+
+        # write the converted file back to disk
+ now_name = now_name[:-4] + "_16k" + ".wav"
+ out_file_path = os.path.join(WAV_PATH, now_name)
+ sf.write(out_file_path,wav,16000)
+
+        # return the ASR result
+ asr_res = chatbot.speech2text(out_file_path)
+ response_res = {
+ "asr_result": asr_res,
+ "wav_base64": wav_base64
+ }
+ return SuccessRequest(result=response_res)
+
+ return ErrorRequest(message="上传文件为空")
+
+
+
+# streaming receive test
+@app.post("/asr/online1")
+async def speech2textOnlineRecive(files: List[UploadFile]):
+ audio_bin = b''
+ for file in files:
+ content = await file.read()
+ audio_bin += content
+ audios.audios += audio_bin
+    print(f"audios length: {len(audios.audios)}")
+ return SuccessRequest(message="接收成功")
+
+# measure the ambient noise level
+@app.post("/asr/collectEnv")
+async def collectEnv(files: List[UploadFile]):
+ for file in files[:1]:
+ content = await file.read() # async read
+        # init; the first 44 bytes of a wav file are the header
+ aumanager.compute_env_volume(content[44:])
+ vad_ = aumanager.vad_threshold
+ return SuccessRequest(result=vad_,message="采集环境噪音成功")
+
+# stop recording
+@app.get("/asr/stopRecord")
+async def stopRecord():
+ audios.audios = b""
+ aumanager.stop()
+    print("online recording paused")
+ return SuccessRequest(message="停止成功")
+
+# resume recording
+@app.get("/asr/resumeRecord")
+async def resumeRecord():
+ aumanager.resume()
+    print("online recording resumed")
+ return SuccessRequest(message="Online录音恢复")
+
+
+# ASR used by the chat page
+@app.websocket("/ws/asr/offlineStream")
+async def websocket_endpoint(websocket: WebSocket):
+ await manager.connect(websocket)
+ try:
+ while True:
+ asr_res = None
+            # receive audio bytes; recognition results are pushed back to the client
+ data = await websocket.receive_bytes()
+ if not aumanager.is_pause:
+ asr_res = aumanager.stream_asr(data)
+ else:
+                print("recording paused")
+ if asr_res:
+ await manager.send_personal_message(asr_res, websocket)
+ aumanager.clear_asr()
+
+ except WebSocketDisconnect:
+ manager.disconnect(websocket)
+
+
+# streaming (online) ASR
+@app.websocket('/ws/asr/onlineStream')
+async def websocket_endpoint(websocket: WebSocket):
+ """PaddleSpeech Online ASR Server api
+
+ Args:
+ websocket (WebSocket): the websocket instance
+ """
+
+    #1. wait to accept the websocket protocol header;
+    #   only after we receive the header is the connection established with a specific thread
+ await websocket.accept()
+
+ #2. if we accept the websocket headers, we will get the online asr engine instance
+ engine = chatbot.asr.engine
+
+    #3. for each websocket connection we create a PaddleASRConnectionHanddler to process its audio;
+    #   each connection has its own handler instance to process requests,
+    #   and the handler is only created once the client sends the start signal
+ connection_handler = None
+
+ try:
+        #4. loop to process the audio package by package according to the protocol;
+        #   we only break the loop once the client sends the finished signal
+ while True:
+            # careful here: this deviates from the starlette.websockets source code
+ # 4.1 we wait for the client signal for the specific action
+ assert websocket.application_state == WebSocketState.CONNECTED
+ message = await websocket.receive()
+ websocket._raise_on_disconnect(message)
+
+ #4.2 text for the action command and bytes for pcm data
+ if "text" in message:
+ # we first parse the specific command
+ message = json.loads(message["text"])
+ if 'signal' not in message:
+ resp = {"status": "ok", "message": "no valid json data"}
+ await websocket.send_json(resp)
+
+                # start command: create the PaddleASRConnectionHanddler instance to process the audio data
+                # end command: process all remaining audio pcm, return the final result,
+                # and break the loop
+ if message['signal'] == 'start':
+ resp = {"status": "ok", "signal": "server_ready"}
+                    # do initialization work here:
+                    # create the handler instance that processes the audio
+ connection_handler = PaddleASRConnectionHanddler(engine)
+ await websocket.send_json(resp)
+ elif message['signal'] == 'end':
+                    # the client has finished sending audio: finalize decoding
+                    # and then destroy the connection
+ connection_handler.decode(is_finished=True)
+ connection_handler.rescoring()
+ asr_results = connection_handler.get_result()
+ connection_handler.reset()
+
+ resp = {
+ "status": "ok",
+ "signal": "finished",
+ 'result': asr_results
+ }
+ await websocket.send_json(resp)
+ break
+ else:
+ resp = {"status": "ok", "message": "no valid json data"}
+ await websocket.send_json(resp)
+ elif "bytes" in message:
+ # bytes for the pcm data
+ message = message["bytes"]
+ print("###############")
+ print("len message: ", len(message))
+ print("###############")
+
+                # extract features from the remaining audio pcm
+                # and decode the result for this package
+ connection_handler.extract_feat(message)
+ connection_handler.decode(is_finished=False)
+ asr_results = connection_handler.get_result()
+
+                # return the result for the current period;
+                # if the engine creates a vad instance, one connection can produce many period results
+ resp = {'result': asr_results}
+ print(resp)
+ await websocket.send_json(resp)
+ except WebSocketDisconnect:
+ pass
+
+######################################################################
+########################### NLP Service ##############################
+#####################################################################
+
+@app.post("/nlp/chat")
+async def chatOffline(nlp_base:NlpBase):
+ chat = nlp_base.chat
+ if not chat:
+ return ErrorRequest(message="传入文本为空")
+ else:
+ res = chatbot.chat(chat)
+ return SuccessRequest(result=res)
+
+@app.post("/nlp/ie")
+async def ieOffline(nlp_base:NlpBase):
+ nlp_text = nlp_base.chat
+ if not nlp_text:
+ return ErrorRequest(message="传入文本为空")
+ else:
+ res = chatbot.ie(nlp_text)
+ return SuccessRequest(result=res)
+
+######################################################################
+########################### TTS Service ##############################
+#####################################################################
+
+@app.post("/tts/offline")
+async def text2speechOffline(tts_base:TtsBase):
+ text = tts_base.text
+ if not text:
+ return ErrorRequest(message="文本为空")
+ else:
+ now_name = "tts_"+ datetime.datetime.strftime(datetime.datetime.now(), '%Y%m%d%H%M%S') + randName() + ".wav"
+ out_file_path = os.path.join(WAV_PATH, now_name)
+        # save to a file, then base64-encode it for transfer
+ chatbot.text2speech(text, outpath=out_file_path)
+ with open(out_file_path, "rb") as f:
+ data_bin = f.read()
+ base_str = base64.b64encode(data_bin)
+ return SuccessRequest(result=base_str)
+
+# HTTP streaming TTS
+@app.post("/tts/online")
+async def stream_tts(request_body: TtsBase):
+ text = request_body.text
+ return StreamingResponse(chatbot.text2speechStreamBytes(text=text))
+
+# websocket streaming TTS
+@app.websocket("/ws/tts/online")
+async def stream_ttsWS(websocket: WebSocket):
+ await manager.connect(websocket)
+ try:
+ while True:
+ text = await websocket.receive_text()
+            # stream the synthesized audio back chunk by chunk over the websocket
+ if text:
+ for sub_wav in chatbot.text2speechStream(text=text):
+ res = {
+ "wav": sub_wav,
+ "done": False
+ }
+ await websocket.send_json(res)
+
+            # signal the end of the stream
+ res = {
+ "wav": sub_wav,
+ "done": True
+ }
+ await websocket.send_json(res)
+ # manager.disconnect(websocket)
+
+ except WebSocketDisconnect:
+ manager.disconnect(websocket)
+
+
+######################################################################
+########################### VPR Service ##############################
+#####################################################################
+
+app.add_middleware(
+ CORSMiddleware,
+ allow_origins=["*"],
+ allow_credentials=True,
+ allow_methods=["*"],
+ allow_headers=["*"])
+
+
+@app.post('/vpr/enroll')
+async def vpr_enroll(table_name: str=None,
+ spk_id: str=Form(...),
+ audio: UploadFile=File(...)):
+    # enroll the uploaded audio with spk_id into the sqlite database
+ try:
+ if not spk_id:
+ return {'status': False, 'msg': "spk_id can not be None"}
+ # Save the upload data to server.
+ content = await audio.read()
+ now_name = "vpr_enroll_" + datetime.datetime.strftime(datetime.datetime.now(), '%Y%m%d%H%M%S') + randName() + ".wav"
+ audio_path = os.path.join(UPLOAD_PATH, now_name)
+
+ with open(audio_path, "wb+") as f:
+ f.write(content)
+ vpr.vpr_enroll(username=spk_id, wav_path=audio_path)
+ return {'status': True, 'msg': "Successfully enroll data!"}
+ except Exception as e:
+ return {'status': False, 'msg': e}
+
+
+@app.post('/vpr/recog')
+async def vpr_recog(request: Request,
+ table_name: str=None,
+ audio: UploadFile=File(...)):
+ # Voice print recognition online
+ # Save the upload data to server.
+ content = await audio.read()
+ now_name = "vpr_query_" + datetime.datetime.strftime(datetime.datetime.now(), '%Y%m%d%H%M%S') + randName() + ".wav"
+ query_audio_path = os.path.join(UPLOAD_PATH, now_name)
+ with open(query_audio_path, "wb+") as f:
+ f.write(content)
+ spk_ids, paths, scores = vpr.do_search_vpr(query_audio_path)
+
+ res = dict(zip(spk_ids, zip(paths, scores)))
+ # Sort results by distance metric, closest distances first
+    # sort results by similarity score, best match first
+ return res
+
+
+@app.post('/vpr/del')
+async def vpr_del(spk_id: dict=None):
+    # delete the record with the given spk_id from the sqlite database
+ try:
+ spk_id = spk_id['spk_id']
+ if not spk_id:
+ return {'status': False, 'msg': "spk_id can not be None"}
+ vpr.vpr_del(username=spk_id)
+ return {'status': True, 'msg': "Successfully delete data!"}
+ except Exception as e:
+ return {'status': False, 'msg': e}, 400
+
+
+@app.get('/vpr/list')
+async def vpr_list():
+    # get all records from the sqlite database
+ try:
+ spk_ids, vpr_ids = vpr.do_list()
+ return spk_ids, vpr_ids
+ except Exception as e:
+ return {'status': False, 'msg': e}, 400
+
+
+@app.get('/vpr/database64')
+async def vpr_database64(vprId: int):
+    # look up the audio file path by vpr_id in the sqlite database
+ try:
+ if not vprId:
+ return {'status': False, 'msg': "vpr_id can not be None"}
+ audio_path = vpr.do_get_wav(vprId)
+        # return the audio as base64
+
+        # convert the file to a 16 kHz, 16-bit wav
+ wav, sr = librosa.load(audio_path, sr=16000)
+ wav = float2pcm(wav) # float32 to int16
+ wav_bytes = wav.tobytes() # to bytes
+ wav_base64 = base64.b64encode(wav_bytes).decode('utf8')
+
+ return SuccessRequest(result=wav_base64)
+ except Exception as e:
+ return {'status': False, 'msg': e}, 400
+
+@app.get('/vpr/data')
+async def vpr_data(vprId: int):
+    # look up the audio file path by vpr_id in the sqlite database
+ try:
+ if not vprId:
+ return {'status': False, 'msg': "vpr_id can not be None"}
+ audio_path = vpr.do_get_wav(vprId)
+ return FileResponse(audio_path)
+ except Exception as e:
+ return {'status': False, 'msg': e}, 400
+
+if __name__ == '__main__':
+ uvicorn.run(app=app, host='0.0.0.0', port=port)
+
+
+
+
+
+
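The client side of the `/ws/asr/onlineStream` protocol above (a text `start` signal, binary pcm frames, then a text `end` signal) can be sketched without any networking. `build_signal` and `frame_pcm` are hypothetical helper names, and the 1280-byte frame size (40 ms at 16 kHz / int16) is an assumption, not a server requirement:

```python
import json

def build_signal(signal: str) -> str:
    """Text frame telling the server to start or end a streaming session."""
    return json.dumps({"signal": signal})

def frame_pcm(pcm: bytes, frame_bytes: int = 1280):
    """Split raw 16 kHz / int16 pcm into ~40 ms binary frames
    suitable for sending as websocket bytes messages."""
    for i in range(0, len(pcm), frame_bytes):
        yield pcm[i:i + frame_bytes]
```

A client would send `build_signal("start")` as a text message, each chunk from `frame_pcm(...)` as a bytes message (receiving a partial `result` after each), and finally `build_signal("end")` to get the rescored final result.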
diff --git a/demos/speech_web/speech_server/requirements.txt b/demos/speech_web/speech_server/requirements.txt
new file mode 100644
index 000000000..7e7bd1680
--- /dev/null
+++ b/demos/speech_web/speech_server/requirements.txt
@@ -0,0 +1,14 @@
+aiofiles
+fastapi
+librosa
+numpy
+pydantic
+scikit_learn
+SoundFile
+starlette
+uvicorn
+paddlepaddle
+paddlespeech
+paddlenlp
+faiss-cpu
+python-multipart
\ No newline at end of file
diff --git a/demos/speech_web/speech_server/src/AudioManeger.py b/demos/speech_web/speech_server/src/AudioManeger.py
new file mode 100644
index 000000000..0deb03699
--- /dev/null
+++ b/demos/speech_web/speech_server/src/AudioManeger.py
@@ -0,0 +1,150 @@
+import numpy as np
+import os
+import wave
+import datetime
+from .util import randName
+
+
+class AudioMannger:
+ def __init__(self, robot, frame_length=160, frame=10, data_width=2, vad_default = 300):
+        # raw pcm byte stream
+ self.audios = b''
+ self.asr_result = ""
+        # core speech robot
+ self.robot = robot
+
+ self.file_dir = "source"
+ os.makedirs(self.file_dir, exist_ok=True)
+ self.vad_deafult = vad_default
+ self.vad_threshold = vad_default
+ self.vad_threshold_path = os.path.join(self.file_dir, "vad_threshold.npy")
+
+        # one frame = 10 ms (160 samples at 16 kHz)
+        self.frame_length = frame_length
+        # run vad once every `frame` frames
+        self.frame = frame
+        # int16: two bytes per sample
+        self.data_width = data_width
+        # window size in bytes
+        self.window_length = frame_length * frame * data_width
+
+        # whether recording has started
+        self.on_asr = False
+        self.silence_cnt = 0
+        self.max_silence_cnt = 4
+        self.is_pause = False  # recording paused / resumed
+
+
+
+ def init(self):
+ if os.path.exists(self.vad_threshold_path):
+            # a saved average-loudness file exists; load it
+ self.vad_threshold = np.load(self.vad_threshold_path)
+
+
+ def clear_audio(self):
+        # clear the accumulated pcm fragments
+ self.audios = b''
+
+ def clear_asr(self):
+ self.asr_result = ""
+
+
+ def compute_chunk_volume(self, start_index, pcm_bins):
+        # compute the mean energy over one window starting at start_index
+        pcm_bin = pcm_bins[start_index: start_index + self.window_length]
+        # convert to numpy
+        pcm_np = np.frombuffer(pcm_bin, np.int16)
+        # convert to float and compute the mean absolute amplitude
+        x = pcm_np.astype(np.float32)
+        x = np.abs(x)
+ return np.mean(x)
+
+
+ def is_speech(self, start_index, pcm_bins):
+        # out-of-range check
+ if start_index > len(pcm_bins):
+ return False
+        # check whether the window starting at start_index is speech or silence
+ energy = self.compute_chunk_volume(start_index=start_index, pcm_bins=pcm_bins)
+ # print(energy)
+ if energy > self.vad_threshold:
+ return True
+ else:
+ return False
+
+ def compute_env_volume(self, pcm_bins):
+ max_energy = 0
+ start = 0
+ while start < len(pcm_bins):
+ energy = self.compute_chunk_volume(start_index=start, pcm_bins=pcm_bins)
+ if energy > max_energy:
+ max_energy = energy
+ start += self.window_length
+ self.vad_threshold = max_energy + 100 if max_energy > self.vad_deafult else self.vad_deafult
+
+        # persist the threshold
+        np.save(self.vad_threshold_path, self.vad_threshold)
+        print(f"vad threshold: {self.vad_threshold}")
+        print(f"environment sample saved to: {os.path.realpath(self.vad_threshold_path)}")
+
+ def stream_asr(self, pcm_bin):
+        # run endpoint detection over pcm_bin first
+ start = 0
+ while start < len(pcm_bin):
+ if self.is_speech(start_index=start, pcm_bins=pcm_bin):
+ self.on_asr = True
+ self.silence_cnt = 0
+                print("recording")
+ self.audios += pcm_bin[ start : start + self.window_length]
+ else:
+ if self.on_asr:
+ self.silence_cnt += 1
+ if self.silence_cnt > self.max_silence_cnt:
+ self.on_asr = False
+ self.silence_cnt = 0
+                        # recording stopped
+                        print("recording stopped")
+                        # save audios as wav and feed it to ASR
+ if len(self.audios) > 2 * 16000:
+ file_path = os.path.join(self.file_dir, "asr_" + datetime.datetime.strftime(datetime.datetime.now(), '%Y%m%d%H%M%S') + randName() + ".wav")
+ self.save_audio(file_path=file_path)
+ self.asr_result = self.robot.speech2text(file_path)
+ self.clear_audio()
+ return self.asr_result
+ else:
+                    # still receiving silence while recording
+                    print("recording: silence")
+ self.audios += pcm_bin[ start : start + self.window_length]
+ start += self.window_length
+ return ""
+
+    def save_audio(self, file_path):
+        print("saving audio")
+        wf = wave.open(file_path, 'wb')  # create the output wav file
+        wf.setnchannels(1)  # mono
+        wf.setsampwidth(2)  # 16-bit samples (2 bytes)
+        wf.setframerate(16000)  # 16 kHz sample rate
+        # write the pcm data into the file
+        wf.writeframes(self.audios)
+        # close the file when done
+        wf.close()
+
+    def end(self):
+        # save audios as wav and feed it to ASR
+ file_path = os.path.join(self.file_dir, "asr.wav")
+ self.save_audio(file_path=file_path)
+ return self.robot.speech2text(file_path)
+
+ def stop(self):
+ self.is_pause = True
+ self.audios = b''
+
+ def resume(self):
+ self.is_pause = False
+
+
+
\ No newline at end of file
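The loudness measure that `AudioMannger.compute_chunk_volume` thresholds against is simply the mean absolute amplitude of an int16 pcm buffer. A standalone sketch (the function name `frame_energy` is hypothetical):

```python
import numpy as np

def frame_energy(pcm_bytes: bytes) -> float:
    """Mean absolute amplitude of an int16 pcm buffer, the same
    loudness measure the VAD threshold is compared against."""
    samples = np.frombuffer(pcm_bytes, dtype=np.int16).astype(np.float32)
    return float(np.mean(np.abs(samples))) if len(samples) else 0.0
```

Windows whose energy exceeds `vad_threshold` count as speech; after `max_silence_cnt` consecutive sub-threshold windows the recording is considered finished and handed to ASR.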
diff --git a/demos/speech_web/speech_server/src/SpeechBase/asr.py b/demos/speech_web/speech_server/src/SpeechBase/asr.py
new file mode 100644
index 000000000..8d4c0cffc
--- /dev/null
+++ b/demos/speech_web/speech_server/src/SpeechBase/asr.py
@@ -0,0 +1,62 @@
+import numpy as np
+
+from paddlespeech.server.engine.asr.online.python.asr_engine import ASREngine
+from paddlespeech.server.engine.asr.online.python.asr_engine import PaddleASRConnectionHanddler
+from paddlespeech.server.utils.config import get_config
+
+def readWave(samples):
+ x_len = len(samples)
+
+    chunk_size = 85 * 16  # 85 ms at 16 kHz (1360 samples)
+ if x_len % chunk_size != 0:
+ padding_len_x = chunk_size - x_len % chunk_size
+ else:
+ padding_len_x = 0
+
+ padding = np.zeros((padding_len_x), dtype=samples.dtype)
+ padded_x = np.concatenate([samples, padding], axis=0)
+
+ assert (x_len + padding_len_x) % chunk_size == 0
+ num_chunk = (x_len + padding_len_x) / chunk_size
+ num_chunk = int(num_chunk)
+ for i in range(0, num_chunk):
+ start = i * chunk_size
+ end = start + chunk_size
+ x_chunk = padded_x[start:end]
+ yield x_chunk
+
+
+class ASR:
+ def __init__(self, config_path, ) -> None:
+ self.config = get_config(config_path)['asr_online']
+ self.engine = ASREngine()
+ self.engine.init(self.config)
+ self.connection_handler = PaddleASRConnectionHanddler(self.engine)
+
+ def offlineASR(self, samples, sample_rate=16000):
+ x_chunk, x_chunk_lens = self.engine.preprocess(samples=samples, sample_rate=sample_rate)
+ self.engine.run(x_chunk, x_chunk_lens)
+ result = self.engine.postprocess()
+ self.engine.reset()
+ return result
+
+ def onlineASR(self, samples:bytes=None, is_finished=False):
+ if not is_finished:
+            # mid-stream: decode the partial result
+ self.connection_handler.extract_feat(samples)
+ self.connection_handler.decode(is_finished)
+ asr_results = self.connection_handler.get_result()
+ return asr_results
+ else:
+            # end of stream: finalize and rescore
+ self.connection_handler.decode(is_finished=True)
+ self.connection_handler.rescoring()
+ asr_results = self.connection_handler.get_result()
+ self.connection_handler.reset()
+ return asr_results
+
+
\ No newline at end of file
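`readWave` above zero-pads the signal so it splits into fixed 1360-sample (85 ms) chunks, which makes the number of chunks a plain ceiling division. A small illustrative helper (`chunk_count` is a hypothetical name, not part of the module):

```python
def chunk_count(n_samples: int, chunk_size: int = 85 * 16) -> int:
    """Number of fixed-size chunks readWave yields for a signal of
    n_samples samples; the last chunk is zero-padded to full size."""
    return -(-n_samples // chunk_size)  # ceiling division
```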
diff --git a/demos/speech_web/speech_server/src/SpeechBase/nlp.py b/demos/speech_web/speech_server/src/SpeechBase/nlp.py
new file mode 100644
index 000000000..4ece63256
--- /dev/null
+++ b/demos/speech_web/speech_server/src/SpeechBase/nlp.py
@@ -0,0 +1,23 @@
+from paddlenlp import Taskflow
+
+class NLP:
+ def __init__(self, ie_model_path=None):
+ schema = ["时间", "出发地", "目的地", "费用"]
+ if ie_model_path:
+ self.ie_model = Taskflow("information_extraction",
+ schema=schema, task_path=ie_model_path)
+ else:
+ self.ie_model = Taskflow("information_extraction",
+ schema=schema)
+
+ self.dialogue_model = Taskflow("dialogue")
+
+ def chat(self, text):
+ result = self.dialogue_model([text])
+ return result[0]
+
+ def ie(self, text):
+ result = self.ie_model(text)
+ return result
+
+
\ No newline at end of file
diff --git a/demos/speech_web/speech_server/src/SpeechBase/sql_helper.py b/demos/speech_web/speech_server/src/SpeechBase/sql_helper.py
new file mode 100644
index 000000000..6937def58
--- /dev/null
+++ b/demos/speech_web/speech_server/src/SpeechBase/sql_helper.py
@@ -0,0 +1,116 @@
+import base64
+import sqlite3
+import os
+import numpy as np
+from pkg_resources import resource_stream
+
+
+def dict_factory(cursor, row):
+ d = {}
+ for idx, col in enumerate(cursor.description):
+ d[col[0]] = row[idx]
+ return d
+
+class DataBase(object):
+ def __init__(self, db_path:str):
+ db_path = os.path.realpath(db_path)
+
+ if os.path.exists(db_path):
+ self.db_path = db_path
+ else:
+ db_path_dir = os.path.dirname(db_path)
+ os.makedirs(db_path_dir, exist_ok=True)
+ self.db_path = db_path
+
+ self.conn = sqlite3.connect(self.db_path)
+ self.conn.row_factory = dict_factory
+ self.cursor = self.conn.cursor()
+ self.init_database()
+
+ def init_database(self):
+        """
+        Initialize the database; create the table if it does not exist.
+        """
+ sql = """
+ CREATE TABLE IF NOT EXISTS vprtable (
+ `id` INTEGER PRIMARY KEY AUTOINCREMENT,
+ `username` TEXT NOT NULL,
+ `vector` TEXT NOT NULL,
+ `wavpath` TEXT NOT NULL
+ );
+ """
+ self.cursor.execute(sql)
+ self.conn.commit()
+
+ def execute_base(self, sql, data_dict):
+ self.cursor.execute(sql, data_dict)
+ self.conn.commit()
+
+ def insert_one(self, username, vector_base64:str, wav_path):
+ if not os.path.exists(wav_path):
+ return None, "wav not exists"
+ else:
+ sql = f"""
+ insert into
+ vprtable (username, vector, wavpath)
+ values (?, ?, ?)
+ """
+ try:
+ self.cursor.execute(sql, (username, vector_base64, wav_path))
+ self.conn.commit()
+ lastidx = self.cursor.lastrowid
+ return lastidx, "data insert success"
+ except Exception as e:
+ print(e)
+ return None, e
+
+ def select_all(self):
+ sql = """
+ SELECT * from vprtable
+ """
+ result = self.cursor.execute(sql).fetchall()
+ return result
+
+    def select_by_id(self, vpr_id):
+        # use a parameterized query to avoid SQL injection
+        sql = """
+        SELECT * from vprtable WHERE `id` = ?
+        """
+        result = self.cursor.execute(sql, (vpr_id,)).fetchall()
+        return result
+
+    def select_by_username(self, username):
+        # use a parameterized query to avoid SQL injection
+        sql = """
+        SELECT * from vprtable WHERE `username` = ?
+        """
+        result = self.cursor.execute(sql, (username,)).fetchall()
+        return result
+
+    def drop_by_username(self, username):
+        # use a parameterized query to avoid SQL injection
+        sql = """
+        DELETE from vprtable WHERE `username` = ?
+        """
+        self.cursor.execute(sql, (username,))
+        self.conn.commit()
+
+ def drop_all(self):
+ sql = f"""
+ DELETE from vprtable
+ """
+ self.cursor.execute(sql)
+ self.conn.commit()
+
+ def drop_table(self):
+ sql = f"""
+ DROP TABLE vprtable
+ """
+ self.cursor.execute(sql)
+ self.conn.commit()
+
+ def encode_vector(self, vector:np.ndarray):
+ return base64.b64encode(vector).decode('utf8')
+
+ def decode_vector(self, vector_base64, dtype=np.float32):
+ b = base64.b64decode(vector_base64)
+ vc = np.frombuffer(b, dtype=dtype)
+ return vc
+
\ No newline at end of file
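The vector storage scheme used by `DataBase` above, raw float32 bytes base64-encoded into a TEXT column, round-trips like this (a standalone sketch mirroring `encode_vector`/`decode_vector`):

```python
import base64
import numpy as np

def encode_vector(vec: np.ndarray) -> str:
    # store the raw float32 bytes as base64 text in the TEXT column
    return base64.b64encode(vec.tobytes()).decode("utf8")

def decode_vector(s: str, dtype=np.float32) -> np.ndarray:
    # invert the encoding: base64 text -> bytes -> numpy array
    return np.frombuffer(base64.b64decode(s), dtype=dtype)
```

Since the dtype is not stored alongside the bytes, the reader must decode with the same dtype (float32 here) that was used to enroll the embedding.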
diff --git a/demos/speech_web/speech_server/src/SpeechBase/tts.py b/demos/speech_web/speech_server/src/SpeechBase/tts.py
new file mode 100644
index 000000000..d5ba0c802
--- /dev/null
+++ b/demos/speech_web/speech_server/src/SpeechBase/tts.py
@@ -0,0 +1,209 @@
+# tts inference engine, supporting streaming and non-streaming modes
+# simplified for this demo
+# inference runs on onnxruntime
+# 1. download the model
+# 2. load the model
+# 3. end-to-end inference
+# 4. streaming inference
+
+import base64
+import math
+import logging
+import numpy as np
+from paddlespeech.server.utils.onnx_infer import get_sess
+from paddlespeech.t2s.frontend.zh_frontend import Frontend
+from paddlespeech.server.utils.util import denorm, get_chunks
+from paddlespeech.server.utils.audio_process import float2pcm
+from paddlespeech.server.utils.config import get_config
+
+from paddlespeech.server.engine.tts.online.onnx.tts_engine import TTSEngine
+
+class TTS:
+ def __init__(self, config_path):
+ self.config = get_config(config_path)['tts_online-onnx']
+ self.config['voc_block'] = 36
+ self.engine = TTSEngine()
+ self.engine.init(self.config)
+ self.executor = self.engine.executor
+ #self.engine.warm_up()
+
+        # initialize the text frontend
+ self.frontend = Frontend(
+ phone_vocab_path=self.engine.executor.phones_dict,
+ tone_vocab_path=None)
+
+ def depadding(self, data, chunk_num, chunk_id, block, pad, upsample):
+ """
+        Remove the audio synthesized from the pad context during streaming inference
+ """
+ front_pad = min(chunk_id * block, pad)
+ # first chunk
+ if chunk_id == 0:
+ data = data[:block * upsample]
+ # last chunk
+ elif chunk_id == chunk_num - 1:
+ data = data[front_pad * upsample:]
+ # middle chunk
+ else:
+ data = data[front_pad * upsample:(front_pad + block) * upsample]
+
+ return data
+
+ def offlineTTS(self, text):
+ get_tone_ids = False
+ merge_sentences = False
+
+ input_ids = self.frontend.get_input_ids(
+ text,
+ merge_sentences=merge_sentences,
+ get_tone_ids=get_tone_ids)
+ phone_ids = input_ids["phone_ids"]
+ wav_list = []
+ for i in range(len(phone_ids)):
+ orig_hs = self.engine.executor.am_encoder_infer_sess.run(
+ None, input_feed={'text': phone_ids[i].numpy()}
+ )
+ hs = orig_hs[0]
+ am_decoder_output = self.engine.executor.am_decoder_sess.run(
+ None, input_feed={'xs': hs})
+ am_postnet_output = self.engine.executor.am_postnet_sess.run(
+ None,
+ input_feed={
+ 'xs': np.transpose(am_decoder_output[0], (0, 2, 1))
+ })
+ am_output_data = am_decoder_output + np.transpose(
+ am_postnet_output[0], (0, 2, 1))
+ normalized_mel = am_output_data[0][0]
+ mel = denorm(normalized_mel, self.engine.executor.am_mu, self.engine.executor.am_std)
+ wav = self.engine.executor.voc_sess.run(
+ output_names=None, input_feed={'logmel': mel})[0]
+ wav_list.append(wav)
+ wavs = np.concatenate(wav_list)
+ return wavs
+
+ def streamTTS(self, text):
+
+ get_tone_ids = False
+ merge_sentences = False
+
+ # front
+ input_ids = self.frontend.get_input_ids(
+ text,
+ merge_sentences=merge_sentences,
+ get_tone_ids=get_tone_ids)
+ phone_ids = input_ids["phone_ids"]
+
+ for i in range(len(phone_ids)):
+ part_phone_ids = phone_ids[i].numpy()
+ voc_chunk_id = 0
+
+ # fastspeech2_csmsc
+ if self.config.am == "fastspeech2_csmsc_onnx":
+ # am
+ mel = self.executor.am_sess.run(
+ output_names=None, input_feed={'text': part_phone_ids})
+ mel = mel[0]
+
+ # voc streaming
+ mel_chunks = get_chunks(mel, self.config.voc_block, self.config.voc_pad, "voc")
+ voc_chunk_num = len(mel_chunks)
+ for i, mel_chunk in enumerate(mel_chunks):
+ sub_wav = self.executor.voc_sess.run(
+ output_names=None, input_feed={'logmel': mel_chunk})
+ sub_wav = self.depadding(sub_wav[0], voc_chunk_num, i,
+ self.config.voc_block, self.config.voc_pad,
+ self.config.voc_upsample)
+
+ yield self.after_process(sub_wav)
+
+ # fastspeech2_cnndecoder_csmsc
+ elif self.config.am == "fastspeech2_cnndecoder_csmsc_onnx":
+ # am
+ orig_hs = self.executor.am_encoder_infer_sess.run(
+ None, input_feed={'text': part_phone_ids})
+ orig_hs = orig_hs[0]
+
+ # streaming voc chunk info
+ mel_len = orig_hs.shape[1]
+ voc_chunk_num = math.ceil(mel_len / self.config.voc_block)
+ start = 0
+ end = min(self.config.voc_block + self.config.voc_pad, mel_len)
+
+ # streaming am
+ hss = get_chunks(orig_hs, self.config.am_block, self.config.am_pad, "am")
+ am_chunk_num = len(hss)
+ for i, hs in enumerate(hss):
+ am_decoder_output = self.executor.am_decoder_sess.run(
+ None, input_feed={'xs': hs})
+ am_postnet_output = self.executor.am_postnet_sess.run(
+ None,
+ input_feed={
+ 'xs': np.transpose(am_decoder_output[0], (0, 2, 1))
+ })
+ am_output_data = am_decoder_output + np.transpose(
+ am_postnet_output[0], (0, 2, 1))
+ normalized_mel = am_output_data[0][0]
+
+ sub_mel = denorm(normalized_mel, self.executor.am_mu,
+ self.executor.am_std)
+ sub_mel = self.depadding(sub_mel, am_chunk_num, i,
+ self.config.am_block, self.config.am_pad, 1)
+
+ if i == 0:
+ mel_streaming = sub_mel
+ else:
+ mel_streaming = np.concatenate(
+ (mel_streaming, sub_mel), axis=0)
+
+ # streaming voc
+ # once streaming AM inference has produced more mel frames than the voc chunk size, start streaming voc inference
+ while (mel_streaming.shape[0] >= end and
+ voc_chunk_id < voc_chunk_num):
+ voc_chunk = mel_streaming[start:end, :]
+
+ sub_wav = self.executor.voc_sess.run(
+ output_names=None, input_feed={'logmel': voc_chunk})
+ sub_wav = self.depadding(
+ sub_wav[0], voc_chunk_num, voc_chunk_id,
+ self.config.voc_block, self.config.voc_pad, self.config.voc_upsample)
+
+ yield self.after_process(sub_wav)
+
+ voc_chunk_id += 1
+ start = max(
+ 0, voc_chunk_id * self.config.voc_block - self.config.voc_pad)
+ end = min(
+ (voc_chunk_id + 1) * self.config.voc_block + self.config.voc_pad,
+ mel_len)
+
+ else:
+ logging.error(
+ "Only support fastspeech2_csmsc or fastspeech2_cnndecoder_csmsc on streaming tts."
+ )
+
+
+ def streamTTSBytes(self, text):
+ for wav in self.engine.executor.infer(
+ text=text,
+ lang=self.engine.config.lang,
+ am=self.engine.config.am,
+ spk_id=0):
+ wav = float2pcm(wav) # float32 to int16
+ wav_bytes = wav.tobytes() # to bytes
+ yield wav_bytes
+
+
+ def after_process(self, wav):
+ # float32 -> int16 PCM -> bytes -> base64 text for transport
+ wav = float2pcm(wav) # float32 to int16
+ wav_bytes = wav.tobytes() # to bytes
+ wav_base64 = base64.b64encode(wav_bytes).decode('utf8') # to base64
+ return wav_base64
+
+ def streamTTS_TVM(self, text):
+ # TODO: optimize with TVM
+ pass
+
+
+
+
\ No newline at end of file
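The `depadding` method above trims the pad context that `get_chunks` adds, so that concatenated chunk outputs line up exactly. The arithmetic can be checked in isolation with toy frame indices; this is a sketch independent of the ONNX sessions, using made-up chunk sizes:

```python
import numpy as np

def depadding(data, chunk_num, chunk_id, block, pad, upsample):
    """Trim the padded context from one chunk of streaming output."""
    front_pad = min(chunk_id * block, pad)
    if chunk_id == 0:                      # first chunk: keep only one block
        return data[:block * upsample]
    if chunk_id == chunk_num - 1:          # last chunk: drop the front pad
        return data[front_pad * upsample:]
    # middle chunk: drop the front pad, keep exactly one block
    return data[front_pad * upsample:(front_pad + block) * upsample]

# Overlapping chunks as a chunker would cut a 12-frame signal with
# block=4 and pad=2 (pad frames on each side where available).
block, pad, upsample = 4, 2, 1
chunks = [np.arange(0, 6), np.arange(2, 10), np.arange(6, 12)]
trimmed = [depadding(c, len(chunks), i, block, pad, upsample)
           for i, c in enumerate(chunks)]
merged = np.concatenate(trimmed)
print(merged.tolist())  # each frame 0..11 appears exactly once
```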
diff --git a/demos/speech_web/speech_server/src/SpeechBase/vpr.py b/demos/speech_web/speech_server/src/SpeechBase/vpr.py
new file mode 100644
index 000000000..29ee986e3
--- /dev/null
+++ b/demos/speech_web/speech_server/src/SpeechBase/vpr.py
@@ -0,0 +1,117 @@
+# The VPR demo uses neither MySQL nor Milvus; it is intended only for the Docker demo.
+import logging
+import faiss
+import numpy as np
+import numpy as np
+from .sql_helper import DataBase
+from .vpr_encode import get_audio_embedding
+
+class VPR:
+ def __init__(self, db_path, dim, top_k) -> None:
+ # initialization
+ self.db_path = db_path
+ self.dim = dim
+ self.top_k = top_k
+ self.dtype = np.float32
+ self.vpr_idx = 0
+
+ # database initialization
+ self.db = DataBase(db_path)
+
+ # faiss initialization: inner-product index with explicit ids
+ index_ip = faiss.IndexFlatIP(dim)
+ self.index_ip = faiss.IndexIDMap(index_ip)
+ self.init()
+
+ def init(self):
+ # demo initialization: register the vectors stored in the database with faiss
+ sql_dbs = self.db.select_all()
+ if sql_dbs:
+ for sql_db in sql_dbs:
+ idx = sql_db['id']
+ vc_bs64 = sql_db['vector']
+ vc = self.db.decode_vector(vc_bs64)
+ if len(vc.shape) == 1:
+ vc = np.expand_dims(vc, axis=0)
+ # add to the faiss index
+ self.index_ip.add_with_ids(vc, np.array((idx,)).astype('int64'))
+ logging.info("faiss 构建完毕")
+
+ def faiss_enroll(self, idx, vc):
+ self.index_ip.add_with_ids(vc, np.array((idx,)).astype('int64'))
+
+ def vpr_enroll(self, username, wav_path):
+ # enroll a voiceprint
+ emb = get_audio_embedding(wav_path)
+ if emb is not None:
+ emb = np.expand_dims(emb, axis=0)
+ emb_bs64 = self.db.encode_vector(emb)
+ last_idx, mess = self.db.insert_one(username, emb_bs64, wav_path)
+ if last_idx:
+ # register with faiss
+ self.faiss_enroll(last_idx, emb)
+ else:
+ last_idx, mess = None, "embedding extraction failed"
+ return last_idx
+
+ def vpr_recog(self, wav_path):
+ # recognize a voiceprint
+ emb_search = get_audio_embedding(wav_path)
+
+ if emb_search is not None:
+ emb_search = np.expand_dims(emb_search, axis=0)
+ D, I = self.index_ip.search(emb_search, self.top_k)
+ D = D.tolist()[0]
+ I = I.tolist()[0]
+ return [(round(D[i] * 100, 2), I[i]) for i in range(len(D)) if I[i] != -1]
+ else:
+ logging.error("识别失败")
+ return None
+
+ def do_search_vpr(self, wav_path):
+ spk_ids, paths, scores = [], [], []
+ recog_result = self.vpr_recog(wav_path)
+ for score, idx in recog_result:
+ username = self.db.select_by_id(idx)[0]['username']
+ if username not in spk_ids:
+ spk_ids.append(username)
+ scores.append(score)
+ paths.append("")
+ return spk_ids, paths, scores
+
+ def vpr_del(self, username):
+ # delete the voiceprints for the given username:
+ # look up the user's ids and remove the corresponding vectors
+ res = self.db.select_by_username(username)
+ for r in res:
+ idx = r['id']
+ self.index_ip.remove_ids(np.array((idx,)).astype('int64'))
+
+ self.db.drop_by_username(username)
+
+ def vpr_list(self):
+ # list all enrolled records
+ return self.db.select_all()
+
+ def do_list(self):
+ spk_ids, vpr_ids = [], []
+ for res in self.db.select_all():
+ spk_ids.append(res['username'])
+ vpr_ids.append(res['id'])
+ return spk_ids, vpr_ids
+
+ def do_get_wav(self, vpr_idx):
+ res = self.db.select_by_id(vpr_idx)
+ return res[0]['wavpath']
+
+
+ def vpr_data(self, idx):
+ # fetch the record with the given id
+ res = self.db.select_by_id(idx)
+ return res
+
+ def vpr_droptable(self):
+ # drop the table
+ self.db.drop_table()
+ # clear the faiss index
+ self.index_ip.reset()
+
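The recognition path above is an inner-product search over L2-normalized embeddings (faiss `IndexFlatIP` wrapped in an `IndexIDMap`). The same ranking computed in plain NumPy, with random stand-in embeddings, illustrates the score/id pairs that `vpr_recog` returns; the embeddings and ids here are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for enrolled, L2-normalized speaker embeddings (5 users, dim 8).
enrolled = rng.standard_normal((5, 8)).astype(np.float32)
enrolled /= np.linalg.norm(enrolled, axis=1, keepdims=True)
ids = [10, 11, 12, 13, 14]  # database row ids, as tracked by IndexIDMap

# Query: a slightly noisy copy of the embedding enrolled under id 12.
query = enrolled[2] + 0.05 * rng.standard_normal(8)
query /= np.linalg.norm(query)

# IndexFlatIP.search amounts to a descending sort of inner products.
scores = enrolled @ query
top_k = 3
order = np.argsort(-scores)[:top_k]
results = [(round(float(scores[i]) * 100, 2), ids[i]) for i in order]
print(results)  # the top hit should be id 12
```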
diff --git a/demos/speech_web/speech_server/src/SpeechBase/vpr_encode.py b/demos/speech_web/speech_server/src/SpeechBase/vpr_encode.py
new file mode 100644
index 000000000..a6a00e4d0
--- /dev/null
+++ b/demos/speech_web/speech_server/src/SpeechBase/vpr_encode.py
@@ -0,0 +1,20 @@
+from paddlespeech.cli.vector import VectorExecutor
+import numpy as np
+import logging
+
+vector_executor = VectorExecutor()
+
+def get_audio_embedding(path):
+ """
+ Use vpr_inference to generate embedding of audio
+ """
+ try:
+ embedding = vector_executor(
+ audio_file=path, model='ecapatdnn_voxceleb12')
+ embedding = embedding / np.linalg.norm(embedding)
+ return embedding
+ except Exception as e:
+ logging.error(f"Error with embedding:{e}")
+ return None
+
+
\ No newline at end of file
diff --git a/demos/speech_web/speech_server/src/WebsocketManeger.py b/demos/speech_web/speech_server/src/WebsocketManeger.py
new file mode 100644
index 000000000..5edde8430
--- /dev/null
+++ b/demos/speech_web/speech_server/src/WebsocketManeger.py
@@ -0,0 +1,31 @@
+from typing import List
+
+from fastapi import WebSocket
+
+class ConnectionManager:
+ def __init__(self):
+ # active WebSocket connections
+ self.active_connections: List[WebSocket] = []
+
+ async def connect(self, ws: WebSocket):
+ # accept the incoming connection
+ await ws.accept()
+ # keep the connection object
+ self.active_connections.append(ws)
+
+ def disconnect(self, ws: WebSocket):
+ # remove the connection on close
+ self.active_connections.remove(ws)
+
+ @staticmethod
+ async def send_personal_message(message: str, ws: WebSocket):
+ # send a message to a single client
+ await ws.send_text(message)
+
+ async def broadcast(self, message: str):
+ # broadcast a message to all clients
+ for connection in self.active_connections:
+ await connection.send_text(message)
+
+
+manager = ConnectionManager()
\ No newline at end of file
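The `ConnectionManager` above relies only on accept/send_text/remove semantics, so its connect, broadcast, and disconnect flow can be exercised with a stub in place of `fastapi.WebSocket`. This is a sketch; the stub class is not part of the demo code:

```python
import asyncio
from typing import List

class StubWebSocket:
    """Minimal stand-in for fastapi.WebSocket: records sent messages."""
    def __init__(self):
        self.sent: List[str] = []
    async def accept(self):
        pass
    async def send_text(self, message: str):
        self.sent.append(message)

class ConnectionManager:
    def __init__(self):
        self.active_connections: List[StubWebSocket] = []
    async def connect(self, ws):
        await ws.accept()
        self.active_connections.append(ws)
    def disconnect(self, ws):
        self.active_connections.remove(ws)
    async def broadcast(self, message: str):
        for connection in self.active_connections:
            await connection.send_text(message)

async def main():
    manager = ConnectionManager()
    a, b = StubWebSocket(), StubWebSocket()
    await manager.connect(a)
    await manager.connect(b)
    await manager.broadcast("hello")   # both clients receive it
    manager.disconnect(b)
    await manager.broadcast("world")   # only the remaining client receives it
    return a.sent, b.sent

a_sent, b_sent = asyncio.run(main())
print(a_sent, b_sent)  # ['hello', 'world'] ['hello']
```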
diff --git a/demos/speech_web/speech_server/src/robot.py b/demos/speech_web/speech_server/src/robot.py
new file mode 100644
index 000000000..b971c57b5
--- /dev/null
+++ b/demos/speech_web/speech_server/src/robot.py
@@ -0,0 +1,69 @@
+from paddlespeech.cli.asr.infer import ASRExecutor
+import soundfile as sf
+import os
+
+from src.SpeechBase.asr import ASR
+from src.SpeechBase.tts import TTS
+from src.SpeechBase.nlp import NLP
+
+
+class Robot:
+ def __init__(self, asr_config, tts_config, asr_init_path,
+ ie_model_path=None) -> None:
+ self.nlp = NLP(ie_model_path=ie_model_path)
+ self.asr = ASR(config_path=asr_config)
+ self.tts = TTS(config_path=tts_config)
+ self.tts_sample_rate = 24000
+ self.asr_sample_rate = 16000
+
+ # streaming recognition is less accurate than the end-to-end model, so the two models are kept separate here
+ self.asr_model = ASRExecutor()
+ self.asr_name = "conformer_wenetspeech"
+ self.warm_up_asrmodel(asr_init_path)
+
+
+ def warm_up_asrmodel(self, asr_init_path):
+ if not os.path.exists(asr_init_path):
+ path_dir = os.path.dirname(asr_init_path)
+ if not os.path.exists(path_dir):
+ os.makedirs(path_dir, exist_ok=True)
+
+ # generate initial audio with TTS at a 24000 Hz sample rate
+ text = "生成初始音频"
+ self.text2speech(text, asr_init_path)
+
+ # warm up the ASR model
+ self.asr_model(asr_init_path, model=self.asr_name, lang='zh',
+ sample_rate=16000, force_yes=True)
+
+
+ def speech2text(self, audio_file):
+ self.asr_model.preprocess(self.asr_name, audio_file)
+ self.asr_model.infer(self.asr_name)
+ res = self.asr_model.postprocess()
+ return res
+
+ def text2speech(self, text, outpath):
+ wav = self.tts.offlineTTS(text)
+ sf.write(
+ outpath, wav, samplerate=self.tts_sample_rate)
+ res = wav
+ return res
+
+ def text2speechStream(self, text):
+ for sub_wav_base64 in self.tts.streamTTS(text=text):
+ yield sub_wav_base64
+
+ def text2speechStreamBytes(self, text):
+ for wav_bytes in self.tts.streamTTSBytes(text=text):
+ yield wav_bytes
+
+ def chat(self, text):
+ result = self.nlp.chat(text)
+ return result
+
+ def ie(self, text):
+ result = self.nlp.ie(text)
+ return result
+
+
\ No newline at end of file
diff --git a/demos/speech_web/speech_server/src/util.py b/demos/speech_web/speech_server/src/util.py
new file mode 100644
index 000000000..34005d919
--- /dev/null
+++ b/demos/speech_web/speech_server/src/util.py
@@ -0,0 +1,18 @@
+import random
+
+def randName(n=5):
+ return "".join(random.sample('zyxwvutsrqponmlkjihgfedcba',n))
+
+def SuccessRequest(result=None, message="ok"):
+ return {
+ "code": 0,
+ "result": result,
+ "message": message
+ }
+
+def ErrorRequest(result=None, message="error"):
+ return {
+ "code": -1,
+ "result": result,
+ "message": message
+ }
\ No newline at end of file
diff --git a/demos/speech_web/web_client/.gitignore b/demos/speech_web/web_client/.gitignore
new file mode 100644
index 000000000..e33435dce
--- /dev/null
+++ b/demos/speech_web/web_client/.gitignore
@@ -0,0 +1,25 @@
+# Logs
+logs
+*.log
+npm-debug.log*
+yarn-debug.log*
+yarn-error.log*
+pnpm-debug.log*
+lerna-debug.log*
+
+node_modules
+dist
+dist-ssr
+*.local
+
+# Editor directories and files
+.vscode/*
+!.vscode/extensions.json
+.idea
+.DS_Store
+*.suo
+*.ntvs*
+*.njsproj
+*.sln
+*.sw?
+.vscode/*
diff --git a/demos/speech_web/web_client/index.html b/demos/speech_web/web_client/index.html
new file mode 100644
index 000000000..6b20e7b7b
--- /dev/null
+++ b/demos/speech_web/web_client/index.html
@@ -0,0 +1,13 @@
+<!DOCTYPE html>
+<html lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <link rel="icon" href="./favicon.ico" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>飞桨PaddleSpeech</title>
+  </head>
+  <body>
+    <div id="app"></div>
+    <script type="module" src="/src/main.js"></script>
+  </body>
+</html>
diff --git a/demos/speech_web/web_client/package-lock.json b/demos/speech_web/web_client/package-lock.json
new file mode 100644
index 000000000..509be385c
--- /dev/null
+++ b/demos/speech_web/web_client/package-lock.json
@@ -0,0 +1,1869 @@
+{
+ "name": "paddlespeechwebclient",
+ "version": "0.0.0",
+ "lockfileVersion": 2,
+ "requires": true,
+ "packages": {
+ "": {
+ "name": "paddlespeechwebclient",
+ "version": "0.0.0",
+ "dependencies": {
+ "ant-design-vue": "^2.2.8",
+ "axios": "^0.26.1",
+ "element-plus": "^2.1.9",
+ "js-audio-recorder": "0.5.7",
+ "lamejs": "^1.2.1",
+ "less": "^4.1.2",
+ "vue": "^3.2.25"
+ },
+ "devDependencies": {
+ "@vitejs/plugin-vue": "^2.3.0",
+ "vite": "^2.9.0"
+ }
+ },
+ "node_modules/@ant-design/colors": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmmirror.com/@ant-design/colors/-/colors-6.0.0.tgz",
+ "integrity": "sha512-qAZRvPzfdWHtfameEGP2Qvuf838NhergR35o+EuVyB5XvSA98xod5r4utvi4TJ3ywmevm290g9nsCG5MryrdWQ==",
+ "dependencies": {
+ "@ctrl/tinycolor": "^3.4.0"
+ }
+ },
+ "node_modules/@ant-design/icons-svg": {
+ "version": "4.2.1",
+ "resolved": "https://registry.npmmirror.com/@ant-design/icons-svg/-/icons-svg-4.2.1.tgz",
+ "integrity": "sha512-EB0iwlKDGpG93hW8f85CTJTs4SvMX7tt5ceupvhALp1IF44SeUFOMhKUOYqpsoYWQKAOuTRDMqn75rEaKDp0Xw=="
+ },
+ "node_modules/@ant-design/icons-vue": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmmirror.com/@ant-design/icons-vue/-/icons-vue-6.1.0.tgz",
+ "integrity": "sha512-EX6bYm56V+ZrKN7+3MT/ubDkvJ5rK/O2t380WFRflDcVFgsvl3NLH7Wxeau6R8DbrO5jWR6DSTC3B6gYFp77AA==",
+ "dependencies": {
+ "@ant-design/colors": "^6.0.0",
+ "@ant-design/icons-svg": "^4.2.1"
+ },
+ "peerDependencies": {
+ "vue": ">=3.0.3"
+ }
+ },
+ "node_modules/@babel/parser": {
+ "version": "7.17.9",
+ "resolved": "https://registry.npmmirror.com/@babel/parser/-/parser-7.17.9.tgz",
+ "integrity": "sha512-vqUSBLP8dQHFPdPi9bc5GK9vRkYHJ49fsZdtoJ8EQ8ibpwk5rPKfvNIwChB0KVXcIjcepEBBd2VHC5r9Gy8ueg==",
+ "license": "MIT",
+ "bin": {
+ "parser": "bin/babel-parser.js"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@babel/runtime": {
+ "version": "7.17.9",
+ "resolved": "https://registry.npmmirror.com/@babel/runtime/-/runtime-7.17.9.tgz",
+ "integrity": "sha512-lSiBBvodq29uShpWGNbgFdKYNiFDo5/HIYsaCEY9ff4sb10x9jizo2+pRrSyF4jKZCXqgzuqBOQKbUm90gQwJg==",
+ "dependencies": {
+ "regenerator-runtime": "^0.13.4"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@ctrl/tinycolor": {
+ "version": "3.4.1",
+ "resolved": "https://registry.npmmirror.com/@ctrl/tinycolor/-/tinycolor-3.4.1.tgz",
+ "integrity": "sha512-ej5oVy6lykXsvieQtqZxCOaLT+xD4+QNarq78cIYISHmZXshCvROLudpQN3lfL8G0NL7plMSSK+zlyvCaIJ4Iw==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/@element-plus/icons-vue": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmmirror.com/@element-plus/icons-vue/-/icons-vue-1.1.4.tgz",
+ "integrity": "sha512-Iz/nHqdp1sFPmdzRwHkEQQA3lKvoObk8azgABZ81QUOpW9s/lUyQVUSh0tNtEPZXQlKwlSh7SPgoVxzrE0uuVQ==",
+ "license": "MIT",
+ "peerDependencies": {
+ "vue": "^3.2.0"
+ }
+ },
+ "node_modules/@floating-ui/core": {
+ "version": "0.6.1",
+ "resolved": "https://registry.npmmirror.com/@floating-ui/core/-/core-0.6.1.tgz",
+ "integrity": "sha512-Y30eVMcZva8o84c0HcXAtDO4BEzPJMvF6+B7x7urL2xbAqVsGJhojOyHLaoQHQYjb6OkqRq5kO+zeySycQwKqg==",
+ "license": "MIT"
+ },
+ "node_modules/@floating-ui/dom": {
+ "version": "0.4.4",
+ "resolved": "https://registry.npmmirror.com/@floating-ui/dom/-/dom-0.4.4.tgz",
+ "integrity": "sha512-0Ulu3B/dqQplUUSqnTx0foSrlYuMN+GTtlJWvNJwt6Fr7/PqmlR/Y08o6/+bxDWr6p3roBJRaQ51MDZsNmEhhw==",
+ "license": "MIT",
+ "dependencies": {
+ "@floating-ui/core": "^0.6.1"
+ }
+ },
+ "node_modules/@popperjs/core": {
+ "version": "2.11.5",
+ "resolved": "https://registry.npmmirror.com/@popperjs/core/-/core-2.11.5.tgz",
+ "integrity": "sha512-9X2obfABZuDVLCgPK9aX0a/x4jaOEweTTWE2+9sr0Qqqevj2Uv5XorvusThmc9XGYpS9yI+fhh8RTafBtGposw==",
+ "license": "MIT",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/popperjs"
+ }
+ },
+ "node_modules/@simonwep/pickr": {
+ "version": "1.8.2",
+ "resolved": "https://registry.npmmirror.com/@simonwep/pickr/-/pickr-1.8.2.tgz",
+ "integrity": "sha512-/l5w8BIkrpP6n1xsetx9MWPWlU6OblN5YgZZphxan0Tq4BByTCETL6lyIeY8lagalS2Nbt4F2W034KHLIiunKA==",
+ "dependencies": {
+ "core-js": "^3.15.1",
+ "nanopop": "^2.1.0"
+ }
+ },
+ "node_modules/@types/lodash": {
+ "version": "4.14.181",
+ "resolved": "https://registry.npmmirror.com/@types/lodash/-/lodash-4.14.181.tgz",
+ "integrity": "sha512-n3tyKthHJbkiWhDZs3DkhkCzt2MexYHXlX0td5iMplyfwketaOeKboEVBqzceH7juqvEg3q5oUoBFxSLu7zFag==",
+ "license": "MIT"
+ },
+ "node_modules/@types/lodash-es": {
+ "version": "4.17.6",
+ "resolved": "https://registry.npmmirror.com/@types/lodash-es/-/lodash-es-4.17.6.tgz",
+ "integrity": "sha512-R+zTeVUKDdfoRxpAryaQNRKk3105Rrgx2CFRClIgRGaqDTdjsm8h6IYA8ir584W3ePzkZfst5xIgDwYrlh9HLg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/lodash": "*"
+ }
+ },
+ "node_modules/@vitejs/plugin-vue": {
+ "version": "2.3.1",
+ "resolved": "https://registry.npmmirror.com/@vitejs/plugin-vue/-/plugin-vue-2.3.1.tgz",
+ "integrity": "sha512-YNzBt8+jt6bSwpt7LP890U1UcTOIZZxfpE5WOJ638PNxSEKOqAi0+FSKS0nVeukfdZ0Ai/H7AFd6k3hayfGZqQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=12.0.0"
+ },
+ "peerDependencies": {
+ "vite": "^2.5.10",
+ "vue": "^3.2.25"
+ }
+ },
+ "node_modules/@vue/compiler-core": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-core/-/compiler-core-3.2.32.tgz",
+ "integrity": "sha512-bRQ8Rkpm/aYFElDWtKkTPHeLnX5pEkNxhPUcqu5crEJIilZH0yeFu/qUAcV4VfSE2AudNPkQSOwMZofhnuutmA==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/parser": "^7.16.4",
+ "@vue/shared": "3.2.32",
+ "estree-walker": "^2.0.2",
+ "source-map": "^0.6.1"
+ }
+ },
+ "node_modules/@vue/compiler-dom": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-dom/-/compiler-dom-3.2.32.tgz",
+ "integrity": "sha512-maa3PNB/NxR17h2hDQfcmS02o1f9r9QIpN1y6fe8tWPrS1E4+q8MqrvDDQNhYVPd84rc3ybtyumrgm9D5Rf/kg==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/compiler-core": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "node_modules/@vue/compiler-sfc": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-sfc/-/compiler-sfc-3.2.32.tgz",
+ "integrity": "sha512-uO6+Gh3AVdWm72lRRCjMr8nMOEqc6ezT9lWs5dPzh1E9TNaJkMYPaRtdY9flUv/fyVQotkfjY/ponjfR+trPSg==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/parser": "^7.16.4",
+ "@vue/compiler-core": "3.2.32",
+ "@vue/compiler-dom": "3.2.32",
+ "@vue/compiler-ssr": "3.2.32",
+ "@vue/reactivity-transform": "3.2.32",
+ "@vue/shared": "3.2.32",
+ "estree-walker": "^2.0.2",
+ "magic-string": "^0.25.7",
+ "postcss": "^8.1.10",
+ "source-map": "^0.6.1"
+ }
+ },
+ "node_modules/@vue/compiler-ssr": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-ssr/-/compiler-ssr-3.2.32.tgz",
+ "integrity": "sha512-ZklVUF/SgTx6yrDUkaTaBL/JMVOtSocP+z5Xz/qIqqLdW/hWL90P+ob/jOQ0Xc/om57892Q7sRSrex0wujOL2Q==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/compiler-dom": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "node_modules/@vue/reactivity": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/reactivity/-/reactivity-3.2.32.tgz",
+ "integrity": "sha512-4zaDumuyDqkuhbb63hRd+YHFGopW7srFIWesLUQ2su/rJfWrSq3YUvoKAJE8Eu1EhZ2Q4c1NuwnEreKj1FkDxA==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "node_modules/@vue/reactivity-transform": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/reactivity-transform/-/reactivity-transform-3.2.32.tgz",
+ "integrity": "sha512-CW1W9zaJtE275tZSWIfQKiPG0iHpdtSlmTqYBu7Y62qvtMgKG5yOxtvBs4RlrZHlaqFSE26avLAgQiTp4YHozw==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/parser": "^7.16.4",
+ "@vue/compiler-core": "3.2.32",
+ "@vue/shared": "3.2.32",
+ "estree-walker": "^2.0.2",
+ "magic-string": "^0.25.7"
+ }
+ },
+ "node_modules/@vue/runtime-core": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/runtime-core/-/runtime-core-3.2.32.tgz",
+ "integrity": "sha512-uKKzK6LaCnbCJ7rcHvsK0azHLGpqs+Vi9B28CV1mfWVq1F3Bj8Okk3cX+5DtD06aUh4V2bYhS2UjjWiUUKUF0w==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/reactivity": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "node_modules/@vue/runtime-dom": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/runtime-dom/-/runtime-dom-3.2.32.tgz",
+ "integrity": "sha512-AmlIg+GPqjkNoADLjHojEX5RGcAg+TsgXOOcUrtDHwKvA8mO26EnLQLB8nylDjU6AMJh2CIYn8NEgyOV5ZIScQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/runtime-core": "3.2.32",
+ "@vue/shared": "3.2.32",
+ "csstype": "^2.6.8"
+ }
+ },
+ "node_modules/@vue/server-renderer": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/server-renderer/-/server-renderer-3.2.32.tgz",
+ "integrity": "sha512-TYKpZZfRJpGTTiy/s6bVYwQJpAUx3G03z4G7/3O18M11oacrMTVHaHjiPuPqf3xQtY8R4LKmQ3EOT/DRCA/7Wg==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/compiler-ssr": "3.2.32",
+ "@vue/shared": "3.2.32"
+ },
+ "peerDependencies": {
+ "vue": "3.2.32"
+ }
+ },
+ "node_modules/@vue/shared": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/shared/-/shared-3.2.32.tgz",
+ "integrity": "sha512-bjcixPErUsAnTQRQX4Z5IQnICYjIfNCyCl8p29v1M6kfVzvwOICPw+dz48nNuWlTOOx2RHhzHdazJibE8GSnsw==",
+ "license": "MIT"
+ },
+ "node_modules/@vueuse/core": {
+ "version": "8.2.5",
+ "resolved": "https://registry.npmmirror.com/@vueuse/core/-/core-8.2.5.tgz",
+ "integrity": "sha512-5prZAA1Ji2ltwNUnzreu6WIXYqHYP/9U2BiY5mD/650VYLpVcwVlYznJDFcLCmEWI3o3Vd34oS1FUf+6Mh68GQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@vueuse/metadata": "8.2.5",
+ "@vueuse/shared": "8.2.5",
+ "vue-demi": "*"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/antfu"
+ },
+ "peerDependencies": {
+ "@vue/composition-api": "^1.1.0",
+ "vue": "^2.6.0 || ^3.2.0"
+ },
+ "peerDependenciesMeta": {
+ "@vue/composition-api": {
+ "optional": true
+ },
+ "vue": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/@vueuse/metadata": {
+ "version": "8.2.5",
+ "resolved": "https://registry.npmmirror.com/@vueuse/metadata/-/metadata-8.2.5.tgz",
+ "integrity": "sha512-Lk9plJjh9cIdiRdcj16dau+2LANxIdFCiTgdfzwYXbflxq0QnMBeOD2qHgKDE7fuVrtPcVWj8VSuZEx1HRfNQA==",
+ "license": "MIT",
+ "funding": {
+ "url": "https://github.com/sponsors/antfu"
+ }
+ },
+ "node_modules/@vueuse/shared": {
+ "version": "8.2.5",
+ "resolved": "https://registry.npmmirror.com/@vueuse/shared/-/shared-8.2.5.tgz",
+ "integrity": "sha512-lNWo+7sk6JCuOj4AiYM+6HZ6fq4xAuVq1sVckMQKgfCJZpZRe4i8es+ZULO5bYTKP+VrOCtqrLR2GzEfrbr3YQ==",
+ "license": "MIT",
+ "dependencies": {
+ "vue-demi": "*"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/antfu"
+ },
+ "peerDependencies": {
+ "@vue/composition-api": "^1.1.0",
+ "vue": "^2.6.0 || ^3.2.0"
+ },
+ "peerDependenciesMeta": {
+ "@vue/composition-api": {
+ "optional": true
+ },
+ "vue": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/ant-design-vue": {
+ "version": "2.2.8",
+ "resolved": "https://registry.npmmirror.com/ant-design-vue/-/ant-design-vue-2.2.8.tgz",
+ "integrity": "sha512-3graq9/gCfJQs6hznrHV6sa9oDmk/D1H3Oo0vLdVpPS/I61fZPk8NEyNKCHpNA6fT2cx6xx9U3QS63uuyikg/Q==",
+ "dependencies": {
+ "@ant-design/icons-vue": "^6.0.0",
+ "@babel/runtime": "^7.10.5",
+ "@simonwep/pickr": "~1.8.0",
+ "array-tree-filter": "^2.1.0",
+ "async-validator": "^3.3.0",
+ "dom-align": "^1.12.1",
+ "dom-scroll-into-view": "^2.0.0",
+ "lodash": "^4.17.21",
+ "lodash-es": "^4.17.15",
+ "moment": "^2.27.0",
+ "omit.js": "^2.0.0",
+ "resize-observer-polyfill": "^1.5.1",
+ "scroll-into-view-if-needed": "^2.2.25",
+ "shallow-equal": "^1.0.0",
+ "vue-types": "^3.0.0",
+ "warning": "^4.0.0"
+ },
+ "peerDependencies": {
+ "@vue/compiler-sfc": ">=3.1.0",
+ "vue": ">=3.1.0"
+ }
+ },
+ "node_modules/ant-design-vue/node_modules/async-validator": {
+ "version": "3.5.2",
+ "resolved": "https://registry.npmmirror.com/async-validator/-/async-validator-3.5.2.tgz",
+ "integrity": "sha512-8eLCg00W9pIRZSB781UUX/H6Oskmm8xloZfr09lz5bikRpBVDlJ3hRVuxxP1SxcwsEYfJ4IU8Q19Y8/893r3rQ=="
+ },
+ "node_modules/array-tree-filter": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmmirror.com/array-tree-filter/-/array-tree-filter-2.1.0.tgz",
+ "integrity": "sha512-4ROwICNlNw/Hqa9v+rk5h22KjmzB1JGTMVKP2AKJBOCgb0yL0ASf0+YvCcLNNwquOHNX48jkeZIJ3a+oOQqKcw=="
+ },
+ "node_modules/async-validator": {
+ "version": "4.0.7",
+ "resolved": "https://registry.npmmirror.com/async-validator/-/async-validator-4.0.7.tgz",
+ "integrity": "sha512-Pj2IR7u8hmUEDOwB++su6baaRi+QvsgajuFB9j95foM1N2gy5HM4z60hfusIO0fBPG5uLAEl6yCJr1jNSVugEQ==",
+ "license": "MIT"
+ },
+ "node_modules/axios": {
+ "version": "0.26.1",
+ "resolved": "https://registry.npmmirror.com/axios/-/axios-0.26.1.tgz",
+ "integrity": "sha512-fPwcX4EvnSHuInCMItEhAGnaSEXRBjtzh9fOtsE6E1G6p7vl7edEeZe11QHf18+6+9gR5PbKV/sGKNaD8YaMeA==",
+ "license": "MIT",
+ "dependencies": {
+ "follow-redirects": "^1.14.8"
+ }
+ },
+ "node_modules/axios/node_modules/follow-redirects": {
+ "version": "1.14.9",
+ "resolved": "https://registry.npmmirror.com/follow-redirects/-/follow-redirects-1.14.9.tgz",
+ "integrity": "sha512-MQDfihBQYMcyy5dhRDJUHcw7lb2Pv/TuE6xP1vyraLukNDHKbDxDNaOE3NbCAdKQApno+GPRyo1YAp89yCjK4w==",
+ "funding": [
+ {
+ "type": "individual",
+ "url": "https://github.com/sponsors/RubenVerborgh"
+ }
+ ],
+ "license": "MIT",
+ "engines": {
+ "node": ">=4.0"
+ },
+ "peerDependenciesMeta": {
+ "debug": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/compute-scroll-into-view": {
+ "version": "1.0.17",
+ "resolved": "https://registry.npmmirror.com/compute-scroll-into-view/-/compute-scroll-into-view-1.0.17.tgz",
+ "integrity": "sha512-j4dx+Fb0URmzbwwMUrhqWM2BEWHdFGx+qZ9qqASHRPqvTYdqvWnHg0H1hIbcyLnvgnoNAVMlwkepyqM3DaIFUg=="
+ },
+ "node_modules/copy-anything": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmmirror.com/copy-anything/-/copy-anything-2.0.6.tgz",
+ "integrity": "sha512-1j20GZTsvKNkc4BY3NpMOM8tt///wY3FpIzozTOFO2ffuZcV61nojHXVKIy3WM+7ADCy5FVhdZYHYDdgTU0yJw==",
+ "dependencies": {
+ "is-what": "^3.14.1"
+ }
+ },
+ "node_modules/core-js": {
+ "version": "3.22.5",
+ "resolved": "https://registry.npmmirror.com/core-js/-/core-js-3.22.5.tgz",
+ "integrity": "sha512-VP/xYuvJ0MJWRAobcmQ8F2H6Bsn+s7zqAAjFaHGBMc5AQm7zaelhD1LGduFn2EehEcQcU+br6t+fwbpQ5d1ZWA==",
+ "hasInstallScript": true
+ },
+ "node_modules/csstype": {
+ "version": "2.6.20",
+ "resolved": "https://registry.npmmirror.com/csstype/-/csstype-2.6.20.tgz",
+ "integrity": "sha512-/WwNkdXfckNgw6S5R125rrW8ez139lBHWouiBvX8dfMFtcn6V81REDqnH7+CRpRipfYlyU1CmOnOxrmGcFOjeA==",
+ "license": "MIT"
+ },
+ "node_modules/dayjs": {
+ "version": "1.11.0",
+ "resolved": "https://registry.npmmirror.com/dayjs/-/dayjs-1.11.0.tgz",
+ "integrity": "sha512-JLC809s6Y948/FuCZPm5IX8rRhQwOiyMb2TfVVQEixG7P8Lm/gt5S7yoQZmC8x1UehI9Pb7sksEt4xx14m+7Ug==",
+ "license": "MIT"
+ },
+ "node_modules/dom-align": {
+ "version": "1.12.3",
+ "resolved": "https://registry.npmmirror.com/dom-align/-/dom-align-1.12.3.tgz",
+ "integrity": "sha512-Gj9hZN3a07cbR6zviMUBOMPdWxYhbMI+x+WS0NAIu2zFZmbK8ys9R79g+iG9qLnlCwpFoaB+fKy8Pdv470GsPA=="
+ },
+ "node_modules/dom-scroll-into-view": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmmirror.com/dom-scroll-into-view/-/dom-scroll-into-view-2.0.1.tgz",
+ "integrity": "sha512-bvVTQe1lfaUr1oFzZX80ce9KLDlZ3iU+XGNE/bz9HnGdklTieqsbmsLHe+rT2XWqopvL0PckkYqN7ksmm5pe3w=="
+ },
+ "node_modules/element-plus": {
+ "version": "2.1.9",
+ "resolved": "https://registry.npmmirror.com/element-plus/-/element-plus-2.1.9.tgz",
+ "integrity": "sha512-6mWqS3YrmJPnouWP4otzL8+MehfOnDFqDbcIdnmC07p+Z0JkWe/CVKc4Wky8AYC8nyDMUQyiZYvooCbqGuM7pg==",
+ "license": "MIT",
+ "dependencies": {
+ "@ctrl/tinycolor": "^3.4.0",
+ "@element-plus/icons-vue": "^1.1.4",
+ "@floating-ui/dom": "^0.4.2",
+ "@popperjs/core": "^2.11.4",
+ "@types/lodash": "^4.14.181",
+ "@types/lodash-es": "^4.17.6",
+ "@vueuse/core": "^8.2.4",
+ "async-validator": "^4.0.7",
+ "dayjs": "^1.11.0",
+ "escape-html": "^1.0.3",
+ "lodash": "^4.17.21",
+ "lodash-es": "^4.17.21",
+ "lodash-unified": "^1.0.2",
+ "memoize-one": "^6.0.0",
+ "normalize-wheel-es": "^1.1.2"
+ },
+ "peerDependencies": {
+ "vue": "^3.2.0"
+ }
+ },
+ "node_modules/errno": {
+ "version": "0.1.8",
+ "resolved": "https://registry.npmmirror.com/errno/-/errno-0.1.8.tgz",
+ "integrity": "sha512-dJ6oBr5SQ1VSd9qkk7ByRgb/1SH4JZjCHSW/mr63/QcXO9zLVxvJ6Oy13nio03rxpSnVDDjFor75SjVeZWPW/A==",
+ "optional": true,
+ "dependencies": {
+ "prr": "~1.0.1"
+ },
+ "bin": {
+ "errno": "cli.js"
+ }
+ },
+ "node_modules/esbuild": {
+ "version": "0.14.36",
+ "resolved": "https://registry.npmmirror.com/esbuild/-/esbuild-0.14.36.tgz",
+ "integrity": "sha512-HhFHPiRXGYOCRlrhpiVDYKcFJRdO0sBElZ668M4lh2ER0YgnkLxECuFe7uWCf23FrcLc59Pqr7dHkTqmRPDHmw==",
+ "dev": true,
+ "hasInstallScript": true,
+ "license": "MIT",
+ "bin": {
+ "esbuild": "bin/esbuild"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "optionalDependencies": {
+ "esbuild-android-64": "0.14.36",
+ "esbuild-android-arm64": "0.14.36",
+ "esbuild-darwin-64": "0.14.36",
+ "esbuild-darwin-arm64": "0.14.36",
+ "esbuild-freebsd-64": "0.14.36",
+ "esbuild-freebsd-arm64": "0.14.36",
+ "esbuild-linux-32": "0.14.36",
+ "esbuild-linux-64": "0.14.36",
+ "esbuild-linux-arm": "0.14.36",
+ "esbuild-linux-arm64": "0.14.36",
+ "esbuild-linux-mips64le": "0.14.36",
+ "esbuild-linux-ppc64le": "0.14.36",
+ "esbuild-linux-riscv64": "0.14.36",
+ "esbuild-linux-s390x": "0.14.36",
+ "esbuild-netbsd-64": "0.14.36",
+ "esbuild-openbsd-64": "0.14.36",
+ "esbuild-sunos-64": "0.14.36",
+ "esbuild-windows-32": "0.14.36",
+ "esbuild-windows-64": "0.14.36",
+ "esbuild-windows-arm64": "0.14.36"
+ }
+ },
+ "node_modules/esbuild-darwin-64": {
+ "version": "0.14.36",
+ "resolved": "https://registry.npmmirror.com/esbuild-darwin-64/-/esbuild-darwin-64-0.14.36.tgz",
+ "integrity": "sha512-kkl6qmV0dTpyIMKagluzYqlc1vO0ecgpviK/7jwPbRDEv5fejRTaBBEE2KxEQbTHcLhiiDbhG7d5UybZWo/1zQ==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/escape-html": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmmirror.com/escape-html/-/escape-html-1.0.3.tgz",
+ "integrity": "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==",
+ "license": "MIT"
+ },
+ "node_modules/estree-walker": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmmirror.com/estree-walker/-/estree-walker-2.0.2.tgz",
+ "integrity": "sha512-Rfkk/Mp/DL7JVje3u18FxFujQlTNR2q6QfMSMB7AvCBx91NGj/ba3kCfza0f6dVDbw7YlRf/nDrn7pQrCCyQ/w==",
+ "license": "MIT"
+ },
+ "node_modules/fsevents": {
+ "version": "2.3.2",
+ "resolved": "https://registry.npmmirror.com/fsevents/-/fsevents-2.3.2.tgz",
+ "integrity": "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==",
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
+ }
+ },
+ "node_modules/function-bind": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmmirror.com/function-bind/-/function-bind-1.1.1.tgz",
+ "integrity": "sha512-yIovAzMX49sF8Yl58fSCWJ5svSLuaibPxXQJFLmBObTuCr0Mf1KiPopGM9NiFjiYBCbfaa2Fh6breQ6ANVTI0A==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/graceful-fs": {
+ "version": "4.2.10",
+ "resolved": "https://registry.npmmirror.com/graceful-fs/-/graceful-fs-4.2.10.tgz",
+ "integrity": "sha512-9ByhssR2fPVsNZj478qUUbKfmL0+t5BDVyjShtyZZLiK7ZDAArFFfopyOTj0M05wE2tJPisA4iTnnXl2YoPvOA==",
+ "optional": true
+ },
+ "node_modules/has": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmmirror.com/has/-/has-1.0.3.tgz",
+ "integrity": "sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "function-bind": "^1.1.1"
+ },
+ "engines": {
+ "node": ">= 0.4.0"
+ }
+ },
+ "node_modules/iconv-lite": {
+ "version": "0.4.24",
+ "resolved": "https://registry.npmmirror.com/iconv-lite/-/iconv-lite-0.4.24.tgz",
+ "integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==",
+ "optional": true,
+ "dependencies": {
+ "safer-buffer": ">= 2.1.2 < 3"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/image-size": {
+ "version": "0.5.5",
+ "resolved": "https://registry.npmmirror.com/image-size/-/image-size-0.5.5.tgz",
+ "integrity": "sha512-6TDAlDPZxUFCv+fuOkIoXT/V/f3Qbq8e37p+YOiYrUv3v9cc3/6x78VdfPgFVaB9dZYeLUfKgHRebpkm/oP2VQ==",
+ "optional": true,
+ "bin": {
+ "image-size": "bin/image-size.js"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-core-module": {
+ "version": "2.8.1",
+ "resolved": "https://registry.npmmirror.com/is-core-module/-/is-core-module-2.8.1.tgz",
+ "integrity": "sha512-SdNCUs284hr40hFTFP6l0IfZ/RSrMXF3qgoRHd3/79unUTvrFO/JoXwkGm+5J/Oe3E/b5GsnG330uUNgRpu1PA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "has": "^1.0.3"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/is-plain-object": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmmirror.com/is-plain-object/-/is-plain-object-3.0.1.tgz",
+ "integrity": "sha512-Xnpx182SBMrr/aBik8y+GuR4U1L9FqMSojwDQwPMmxyC6bvEqly9UBCxhauBF5vNh2gwWJNX6oDV7O+OM4z34g==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-what": {
+ "version": "3.14.1",
+ "resolved": "https://registry.npmmirror.com/is-what/-/is-what-3.14.1.tgz",
+ "integrity": "sha512-sNxgpk9793nzSs7bA6JQJGeIuRBQhAaNGG77kzYQgMkrID+lS6SlK07K5LaptscDlSaIgH+GPFzf+d75FVxozA=="
+ },
+ "node_modules/js-audio-recorder": {
+ "version": "0.5.7",
+ "resolved": "https://registry.npmmirror.com/js-audio-recorder/-/js-audio-recorder-0.5.7.tgz",
+ "integrity": "sha512-DIlv30N86AYHr7zGHN0O7V/3Rd8Q6SIJ/MBzVJaT9STWTdhF4E/8fxCX6ZMgRSv8xmx6fEqcFFNPoofmxJD4+A==",
+ "license": "MIT"
+ },
+ "node_modules/js-tokens": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmmirror.com/js-tokens/-/js-tokens-4.0.0.tgz",
+ "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ=="
+ },
+ "node_modules/lamejs": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmmirror.com/lamejs/-/lamejs-1.2.1.tgz",
+ "integrity": "sha512-s7bxvjvYthw6oPLCm5pFxvA84wUROODB8jEO2+CE1adhKgrIvVOlmMgY8zyugxGrvRaDHNJanOiS21/emty6dQ==",
+ "license": "LGPL-3.0",
+ "dependencies": {
+ "use-strict": "1.0.1"
+ }
+ },
+ "node_modules/less": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmmirror.com/less/-/less-4.1.2.tgz",
+ "integrity": "sha512-EoQp/Et7OSOVu0aJknJOtlXZsnr8XE8KwuzTHOLeVSEx8pVWUICc8Q0VYRHgzyjX78nMEyC/oztWFbgyhtNfDA==",
+ "dependencies": {
+ "copy-anything": "^2.0.1",
+ "parse-node-version": "^1.0.1",
+ "tslib": "^2.3.0"
+ },
+ "bin": {
+ "lessc": "bin/lessc"
+ },
+ "engines": {
+ "node": ">=6"
+ },
+ "optionalDependencies": {
+ "errno": "^0.1.1",
+ "graceful-fs": "^4.1.2",
+ "image-size": "~0.5.0",
+ "make-dir": "^2.1.0",
+ "mime": "^1.4.1",
+ "needle": "^2.5.2",
+ "source-map": "~0.6.0"
+ }
+ },
+ "node_modules/lodash": {
+ "version": "4.17.21",
+ "resolved": "https://registry.npmmirror.com/lodash/-/lodash-4.17.21.tgz",
+ "integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==",
+ "license": "MIT"
+ },
+ "node_modules/lodash-es": {
+ "version": "4.17.21",
+ "resolved": "https://registry.npmmirror.com/lodash-es/-/lodash-es-4.17.21.tgz",
+ "integrity": "sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw==",
+ "license": "MIT"
+ },
+ "node_modules/lodash-unified": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmmirror.com/lodash-unified/-/lodash-unified-1.0.2.tgz",
+ "integrity": "sha512-OGbEy+1P+UT26CYi4opY4gebD8cWRDxAT6MAObIVQMiqYdxZr1g3QHWCToVsm31x2NkLS4K3+MC2qInaRMa39g==",
+ "license": "MIT",
+ "peerDependencies": {
+ "@types/lodash-es": "*",
+ "lodash": "*",
+ "lodash-es": "*"
+ }
+ },
+ "node_modules/loose-envify": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmmirror.com/loose-envify/-/loose-envify-1.4.0.tgz",
+ "integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
+ "dependencies": {
+ "js-tokens": "^3.0.0 || ^4.0.0"
+ },
+ "bin": {
+ "loose-envify": "cli.js"
+ }
+ },
+ "node_modules/magic-string": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmmirror.com/magic-string/-/magic-string-0.25.9.tgz",
+ "integrity": "sha512-RmF0AsMzgt25qzqqLc1+MbHmhdx0ojF2Fvs4XnOqz2ZOBXzzkEwc/dJQZCYHAn7v1jbVOjAZfK8msRn4BxO4VQ==",
+ "license": "MIT",
+ "dependencies": {
+ "sourcemap-codec": "^1.4.8"
+ }
+ },
+ "node_modules/make-dir": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmmirror.com/make-dir/-/make-dir-2.1.0.tgz",
+ "integrity": "sha512-LS9X+dc8KLxXCb8dni79fLIIUA5VyZoyjSMCwTluaXA0o27cCK0bhXkpgw+sTXVpPy/lSO57ilRixqk0vDmtRA==",
+ "optional": true,
+ "dependencies": {
+ "pify": "^4.0.1",
+ "semver": "^5.6.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/memoize-one": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmmirror.com/memoize-one/-/memoize-one-6.0.0.tgz",
+ "integrity": "sha512-rkpe71W0N0c0Xz6QD0eJETuWAJGnJ9afsl1srmwPrI+yBCkge5EycXXbYRyvL29zZVUWQCY7InPRCv3GDXuZNw==",
+ "license": "MIT"
+ },
+ "node_modules/mime": {
+ "version": "1.6.0",
+ "resolved": "https://registry.npmmirror.com/mime/-/mime-1.6.0.tgz",
+ "integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==",
+ "optional": true,
+ "bin": {
+ "mime": "cli.js"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/moment": {
+ "version": "2.29.4",
+ "resolved": "https://registry.npmjs.org/moment/-/moment-2.29.4.tgz",
+ "integrity": "sha512-5LC9SOxjSc2HF6vO2CyuTDNivEdoz2IvyJJGj6X8DJ0eFyfszE0QiEd+iXmBvUP3WHxSjFH/vIsA0EN00cgr8w==",
+ "engines": {
+ "node": "*"
+ }
+ },
+ "node_modules/nanoid": {
+ "version": "3.3.2",
+ "resolved": "https://registry.npmmirror.com/nanoid/-/nanoid-3.3.2.tgz",
+ "integrity": "sha512-CuHBogktKwpm5g2sRgv83jEy2ijFzBwMoYA60orPDR7ynsLijJDqgsi4RDGj3OJpy3Ieb+LYwiRmIOGyytgITA==",
+ "license": "MIT",
+ "bin": {
+ "nanoid": "bin/nanoid.cjs"
+ },
+ "engines": {
+ "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
+ }
+ },
+ "node_modules/nanopop": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmmirror.com/nanopop/-/nanopop-2.1.0.tgz",
+ "integrity": "sha512-jGTwpFRexSH+fxappnGQtN9dspgE2ipa1aOjtR24igG0pv6JCxImIAmrLRHX+zUF5+1wtsFVbKyfP51kIGAVNw=="
+ },
+ "node_modules/needle": {
+ "version": "2.9.1",
+ "resolved": "https://registry.npmmirror.com/needle/-/needle-2.9.1.tgz",
+ "integrity": "sha512-6R9fqJ5Zcmf+uYaFgdIHmLwNldn5HbK8L5ybn7Uz+ylX/rnOsSp1AHcvQSrCaFN+qNM1wpymHqD7mVasEOlHGQ==",
+ "optional": true,
+ "dependencies": {
+ "debug": "^3.2.6",
+ "iconv-lite": "^0.4.4",
+ "sax": "^1.2.4"
+ },
+ "bin": {
+ "needle": "bin/needle"
+ },
+ "engines": {
+ "node": ">= 4.4.x"
+ }
+ },
+ "node_modules/needle/node_modules/debug": {
+ "version": "3.2.7",
+ "resolved": "https://registry.npmmirror.com/debug/-/debug-3.2.7.tgz",
+ "integrity": "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==",
+ "optional": true,
+ "dependencies": {
+ "ms": "^2.1.1"
+ }
+ },
+ "node_modules/needle/node_modules/ms": {
+ "version": "2.1.3",
+ "resolved": "https://registry.npmmirror.com/ms/-/ms-2.1.3.tgz",
+ "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
+ "optional": true
+ },
+ "node_modules/normalize-wheel-es": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmmirror.com/normalize-wheel-es/-/normalize-wheel-es-1.1.2.tgz",
+ "integrity": "sha512-scX83plWJXYH1J4+BhAuIHadROzxX0UBF3+HuZNY2Ks8BciE7tSTQ+5JhTsvzjaO0/EJdm4JBGrfObKxFf3Png==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/omit.js": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmmirror.com/omit.js/-/omit.js-2.0.2.tgz",
+ "integrity": "sha512-hJmu9D+bNB40YpL9jYebQl4lsTW6yEHRTroJzNLqQJYHm7c+NQnJGfZmIWh8S3q3KoaxV1aLhV6B3+0N0/kyJg=="
+ },
+ "node_modules/parse-node-version": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmmirror.com/parse-node-version/-/parse-node-version-1.0.1.tgz",
+ "integrity": "sha512-3YHlOa/JgH6Mnpr05jP9eDG254US9ek25LyIxZlDItp2iJtwyaXQb57lBYLdT3MowkUFYEV2XXNAYIPlESvJlA==",
+ "engines": {
+ "node": ">= 0.10"
+ }
+ },
+ "node_modules/path-parse": {
+ "version": "1.0.7",
+ "resolved": "https://registry.npmmirror.com/path-parse/-/path-parse-1.0.7.tgz",
+ "integrity": "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/picocolors": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmmirror.com/picocolors/-/picocolors-1.0.0.tgz",
+ "integrity": "sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ==",
+ "license": "ISC"
+ },
+ "node_modules/pify": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmmirror.com/pify/-/pify-4.0.1.tgz",
+ "integrity": "sha512-uB80kBFb/tfd68bVleG9T5GGsGPjJrLAUpR5PZIrhBnIaRTQRjqdJSsIKkOP6OAIFbj7GOrcudc5pNjZ+geV2g==",
+ "optional": true,
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/postcss": {
+ "version": "8.4.12",
+ "resolved": "https://registry.npmmirror.com/postcss/-/postcss-8.4.12.tgz",
+ "integrity": "sha512-lg6eITwYe9v6Hr5CncVbK70SoioNQIq81nsaG86ev5hAidQvmOeETBqs7jm43K2F5/Ley3ytDtriImV6TpNiSg==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/postcss/"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/postcss"
+ }
+ ],
+ "license": "MIT",
+ "dependencies": {
+ "nanoid": "^3.3.1",
+ "picocolors": "^1.0.0",
+ "source-map-js": "^1.0.2"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14"
+ }
+ },
+ "node_modules/prr": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmmirror.com/prr/-/prr-1.0.1.tgz",
+ "integrity": "sha512-yPw4Sng1gWghHQWj0B3ZggWUm4qVbPwPFcRG8KyxiU7J2OHFSoEHKS+EZ3fv5l1t9CyCiop6l/ZYeWbrgoQejw==",
+ "optional": true
+ },
+ "node_modules/regenerator-runtime": {
+ "version": "0.13.9",
+ "resolved": "https://registry.npmmirror.com/regenerator-runtime/-/regenerator-runtime-0.13.9.tgz",
+ "integrity": "sha512-p3VT+cOEgxFsRRA9X4lkI1E+k2/CtnKtU4gcxyaCUreilL/vqI6CdZ3wxVUx3UOUg+gnUOQQcRI7BmSI656MYA=="
+ },
+ "node_modules/resize-observer-polyfill": {
+ "version": "1.5.1",
+ "resolved": "https://registry.npmmirror.com/resize-observer-polyfill/-/resize-observer-polyfill-1.5.1.tgz",
+ "integrity": "sha512-LwZrotdHOo12nQuZlHEmtuXdqGoOD0OhaxopaNFxWzInpEgaLWoVuAMbTzixuosCx2nEG58ngzW3vxdWoxIgdg=="
+ },
+ "node_modules/resolve": {
+ "version": "1.22.0",
+ "resolved": "https://registry.npmmirror.com/resolve/-/resolve-1.22.0.tgz",
+ "integrity": "sha512-Hhtrw0nLeSrFQ7phPp4OOcVjLPIeMnRlr5mcnVuMe7M/7eBn98A3hmFRLoFo3DLZkivSYwhRUJTyPyWAk56WLw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "is-core-module": "^2.8.1",
+ "path-parse": "^1.0.7",
+ "supports-preserve-symlinks-flag": "^1.0.0"
+ },
+ "bin": {
+ "resolve": "bin/resolve"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/rollup": {
+ "version": "2.70.1",
+ "resolved": "https://registry.npmmirror.com/rollup/-/rollup-2.70.1.tgz",
+ "integrity": "sha512-CRYsI5EuzLbXdxC6RnYhOuRdtz4bhejPMSWjsFLfVM/7w/85n2szZv6yExqUXsBdz5KT8eoubeyDUDjhLHEslA==",
+ "dev": true,
+ "license": "MIT",
+ "bin": {
+ "rollup": "dist/bin/rollup"
+ },
+ "engines": {
+ "node": ">=10.0.0"
+ },
+ "optionalDependencies": {
+ "fsevents": "~2.3.2"
+ }
+ },
+ "node_modules/safer-buffer": {
+ "version": "2.1.2",
+ "resolved": "https://registry.npmmirror.com/safer-buffer/-/safer-buffer-2.1.2.tgz",
+ "integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
+ "optional": true
+ },
+ "node_modules/sax": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmmirror.com/sax/-/sax-1.2.4.tgz",
+ "integrity": "sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw==",
+ "optional": true
+ },
+ "node_modules/scroll-into-view-if-needed": {
+ "version": "2.2.29",
+ "resolved": "https://registry.npmmirror.com/scroll-into-view-if-needed/-/scroll-into-view-if-needed-2.2.29.tgz",
+ "integrity": "sha512-hxpAR6AN+Gh53AdAimHM6C8oTN1ppwVZITihix+WqalywBeFcQ6LdQP5ABNl26nX8GTEL7VT+b8lKpdqq65wXg==",
+ "dependencies": {
+ "compute-scroll-into-view": "^1.0.17"
+ }
+ },
+ "node_modules/semver": {
+ "version": "5.7.1",
+ "resolved": "https://registry.npmmirror.com/semver/-/semver-5.7.1.tgz",
+ "integrity": "sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ==",
+ "optional": true,
+ "bin": {
+ "semver": "bin/semver"
+ }
+ },
+ "node_modules/shallow-equal": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmmirror.com/shallow-equal/-/shallow-equal-1.2.1.tgz",
+ "integrity": "sha512-S4vJDjHHMBaiZuT9NPb616CSmLf618jawtv3sufLl6ivK8WocjAo58cXwbRV1cgqxH0Qbv+iUt6m05eqEa2IRA=="
+ },
+ "node_modules/source-map": {
+ "version": "0.6.1",
+ "resolved": "https://registry.npmmirror.com/source-map/-/source-map-0.6.1.tgz",
+ "integrity": "sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==",
+ "license": "BSD-3-Clause",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/source-map-js": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmmirror.com/source-map-js/-/source-map-js-1.0.2.tgz",
+ "integrity": "sha512-R0XvVJ9WusLiqTCEiGCmICCMplcCkIwwR11mOSD9CR5u+IXYdiseeEuXCVAjS54zqwkLcPNnmU4OeJ6tUrWhDw==",
+ "license": "BSD-3-Clause",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/sourcemap-codec": {
+ "version": "1.4.8",
+ "resolved": "https://registry.npmmirror.com/sourcemap-codec/-/sourcemap-codec-1.4.8.tgz",
+ "integrity": "sha512-9NykojV5Uih4lgo5So5dtw+f0JgJX30KCNI8gwhz2J9A15wD0Ml6tjHKwf6fTSa6fAdVBdZeNOs9eJ71qCk8vA==",
+ "license": "MIT"
+ },
+ "node_modules/supports-preserve-symlinks-flag": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmmirror.com/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz",
+ "integrity": "sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/tslib": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmmirror.com/tslib/-/tslib-2.4.0.tgz",
+ "integrity": "sha512-d6xOpEDfsi2CZVlPQzGeux8XMwLT9hssAsaPYExaQMuYskwb+x1x7J371tWlbBdWHroy99KnVB6qIkUbs5X3UQ=="
+ },
+ "node_modules/use-strict": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmmirror.com/use-strict/-/use-strict-1.0.1.tgz",
+ "integrity": "sha512-IeiWvvEXfW5ltKVMkxq6FvNf2LojMKvB2OCeja6+ct24S1XOmQw2dGr2JyndwACWAGJva9B7yPHwAmeA9QCqAQ==",
+ "license": "ISC"
+ },
+ "node_modules/vite": {
+ "version": "2.9.1",
+ "resolved": "https://registry.npmmirror.com/vite/-/vite-2.9.1.tgz",
+ "integrity": "sha512-vSlsSdOYGcYEJfkQ/NeLXgnRv5zZfpAsdztkIrs7AZHV8RCMZQkwjo4DS5BnrYTqoWqLoUe1Cah4aVO4oNNqCQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "esbuild": "^0.14.27",
+ "postcss": "^8.4.12",
+ "resolve": "^1.22.0",
+ "rollup": "^2.59.0"
+ },
+ "bin": {
+ "vite": "bin/vite.js"
+ },
+ "engines": {
+ "node": ">=12.2.0"
+ },
+ "optionalDependencies": {
+ "fsevents": "~2.3.2"
+ },
+ "peerDependencies": {
+ "less": "*",
+ "sass": "*",
+ "stylus": "*"
+ },
+ "peerDependenciesMeta": {
+ "less": {
+ "optional": true
+ },
+ "sass": {
+ "optional": true
+ },
+ "stylus": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/vue": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/vue/-/vue-3.2.32.tgz",
+ "integrity": "sha512-6L3jKZApF042OgbCkh+HcFeAkiYi3Lovi8wNhWqIK98Pi5efAMLZzRHgi91v+60oIRxdJsGS9sTMsb+yDpY8Eg==",
+ "license": "MIT",
+ "dependencies": {
+ "@vue/compiler-dom": "3.2.32",
+ "@vue/compiler-sfc": "3.2.32",
+ "@vue/runtime-dom": "3.2.32",
+ "@vue/server-renderer": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "node_modules/vue-demi": {
+ "version": "0.12.5",
+ "resolved": "https://registry.npmmirror.com/vue-demi/-/vue-demi-0.12.5.tgz",
+ "integrity": "sha512-BREuTgTYlUr0zw0EZn3hnhC3I6gPWv+Kwh4MCih6QcAeaTlaIX0DwOVN0wHej7hSvDPecz4jygy/idsgKfW58Q==",
+ "hasInstallScript": true,
+ "license": "MIT",
+ "bin": {
+ "vue-demi-fix": "bin/vue-demi-fix.js",
+ "vue-demi-switch": "bin/vue-demi-switch.js"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/antfu"
+ },
+ "peerDependencies": {
+ "@vue/composition-api": "^1.0.0-rc.1",
+ "vue": "^3.0.0-0 || ^2.6.0"
+ },
+ "peerDependenciesMeta": {
+ "@vue/composition-api": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/vue-types": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmmirror.com/vue-types/-/vue-types-3.0.2.tgz",
+ "integrity": "sha512-IwUC0Aq2zwaXqy74h4WCvFCUtoV0iSWr0snWnE9TnU18S66GAQyqQbRf2qfJtUuiFsBf6qp0MEwdonlwznlcrw==",
+ "dependencies": {
+ "is-plain-object": "3.0.1"
+ },
+ "engines": {
+ "node": ">=10.15.0"
+ },
+ "peerDependencies": {
+ "vue": "^3.0.0"
+ }
+ },
+ "node_modules/warning": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmmirror.com/warning/-/warning-4.0.3.tgz",
+ "integrity": "sha512-rpJyN222KWIvHJ/F53XSZv0Zl/accqHR8et1kpaMTD/fLCRxtV8iX8czMzY7sVZupTI3zcUTg8eycS2kNF9l6w==",
+ "dependencies": {
+ "loose-envify": "^1.0.0"
+ }
+ }
+ },
+ "dependencies": {
+ "@ant-design/colors": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmmirror.com/@ant-design/colors/-/colors-6.0.0.tgz",
+ "integrity": "sha512-qAZRvPzfdWHtfameEGP2Qvuf838NhergR35o+EuVyB5XvSA98xod5r4utvi4TJ3ywmevm290g9nsCG5MryrdWQ==",
+ "requires": {
+ "@ctrl/tinycolor": "^3.4.0"
+ }
+ },
+ "@ant-design/icons-svg": {
+ "version": "4.2.1",
+ "resolved": "https://registry.npmmirror.com/@ant-design/icons-svg/-/icons-svg-4.2.1.tgz",
+ "integrity": "sha512-EB0iwlKDGpG93hW8f85CTJTs4SvMX7tt5ceupvhALp1IF44SeUFOMhKUOYqpsoYWQKAOuTRDMqn75rEaKDp0Xw=="
+ },
+ "@ant-design/icons-vue": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmmirror.com/@ant-design/icons-vue/-/icons-vue-6.1.0.tgz",
+ "integrity": "sha512-EX6bYm56V+ZrKN7+3MT/ubDkvJ5rK/O2t380WFRflDcVFgsvl3NLH7Wxeau6R8DbrO5jWR6DSTC3B6gYFp77AA==",
+ "requires": {
+ "@ant-design/colors": "^6.0.0",
+ "@ant-design/icons-svg": "^4.2.1"
+ }
+ },
+ "@babel/parser": {
+ "version": "7.17.9",
+ "resolved": "https://registry.npmmirror.com/@babel/parser/-/parser-7.17.9.tgz",
+ "integrity": "sha512-vqUSBLP8dQHFPdPi9bc5GK9vRkYHJ49fsZdtoJ8EQ8ibpwk5rPKfvNIwChB0KVXcIjcepEBBd2VHC5r9Gy8ueg=="
+ },
+ "@babel/runtime": {
+ "version": "7.17.9",
+ "resolved": "https://registry.npmmirror.com/@babel/runtime/-/runtime-7.17.9.tgz",
+ "integrity": "sha512-lSiBBvodq29uShpWGNbgFdKYNiFDo5/HIYsaCEY9ff4sb10x9jizo2+pRrSyF4jKZCXqgzuqBOQKbUm90gQwJg==",
+ "requires": {
+ "regenerator-runtime": "^0.13.4"
+ }
+ },
+ "@ctrl/tinycolor": {
+ "version": "3.4.1",
+ "resolved": "https://registry.npmmirror.com/@ctrl/tinycolor/-/tinycolor-3.4.1.tgz",
+ "integrity": "sha512-ej5oVy6lykXsvieQtqZxCOaLT+xD4+QNarq78cIYISHmZXshCvROLudpQN3lfL8G0NL7plMSSK+zlyvCaIJ4Iw=="
+ },
+ "@element-plus/icons-vue": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmmirror.com/@element-plus/icons-vue/-/icons-vue-1.1.4.tgz",
+ "integrity": "sha512-Iz/nHqdp1sFPmdzRwHkEQQA3lKvoObk8azgABZ81QUOpW9s/lUyQVUSh0tNtEPZXQlKwlSh7SPgoVxzrE0uuVQ==",
+ "requires": {}
+ },
+ "@floating-ui/core": {
+ "version": "0.6.1",
+ "resolved": "https://registry.npmmirror.com/@floating-ui/core/-/core-0.6.1.tgz",
+ "integrity": "sha512-Y30eVMcZva8o84c0HcXAtDO4BEzPJMvF6+B7x7urL2xbAqVsGJhojOyHLaoQHQYjb6OkqRq5kO+zeySycQwKqg=="
+ },
+ "@floating-ui/dom": {
+ "version": "0.4.4",
+ "resolved": "https://registry.npmmirror.com/@floating-ui/dom/-/dom-0.4.4.tgz",
+ "integrity": "sha512-0Ulu3B/dqQplUUSqnTx0foSrlYuMN+GTtlJWvNJwt6Fr7/PqmlR/Y08o6/+bxDWr6p3roBJRaQ51MDZsNmEhhw==",
+ "requires": {
+ "@floating-ui/core": "^0.6.1"
+ }
+ },
+ "@popperjs/core": {
+ "version": "2.11.5",
+ "resolved": "https://registry.npmmirror.com/@popperjs/core/-/core-2.11.5.tgz",
+ "integrity": "sha512-9X2obfABZuDVLCgPK9aX0a/x4jaOEweTTWE2+9sr0Qqqevj2Uv5XorvusThmc9XGYpS9yI+fhh8RTafBtGposw=="
+ },
+ "@simonwep/pickr": {
+ "version": "1.8.2",
+ "resolved": "https://registry.npmmirror.com/@simonwep/pickr/-/pickr-1.8.2.tgz",
+ "integrity": "sha512-/l5w8BIkrpP6n1xsetx9MWPWlU6OblN5YgZZphxan0Tq4BByTCETL6lyIeY8lagalS2Nbt4F2W034KHLIiunKA==",
+ "requires": {
+ "core-js": "^3.15.1",
+ "nanopop": "^2.1.0"
+ }
+ },
+ "@types/lodash": {
+ "version": "4.14.181",
+ "resolved": "https://registry.npmmirror.com/@types/lodash/-/lodash-4.14.181.tgz",
+ "integrity": "sha512-n3tyKthHJbkiWhDZs3DkhkCzt2MexYHXlX0td5iMplyfwketaOeKboEVBqzceH7juqvEg3q5oUoBFxSLu7zFag=="
+ },
+ "@types/lodash-es": {
+ "version": "4.17.6",
+ "resolved": "https://registry.npmmirror.com/@types/lodash-es/-/lodash-es-4.17.6.tgz",
+ "integrity": "sha512-R+zTeVUKDdfoRxpAryaQNRKk3105Rrgx2CFRClIgRGaqDTdjsm8h6IYA8ir584W3ePzkZfst5xIgDwYrlh9HLg==",
+ "requires": {
+ "@types/lodash": "*"
+ }
+ },
+ "@vitejs/plugin-vue": {
+ "version": "2.3.1",
+ "resolved": "https://registry.npmmirror.com/@vitejs/plugin-vue/-/plugin-vue-2.3.1.tgz",
+ "integrity": "sha512-YNzBt8+jt6bSwpt7LP890U1UcTOIZZxfpE5WOJ638PNxSEKOqAi0+FSKS0nVeukfdZ0Ai/H7AFd6k3hayfGZqQ==",
+ "dev": true,
+ "requires": {}
+ },
+ "@vue/compiler-core": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-core/-/compiler-core-3.2.32.tgz",
+ "integrity": "sha512-bRQ8Rkpm/aYFElDWtKkTPHeLnX5pEkNxhPUcqu5crEJIilZH0yeFu/qUAcV4VfSE2AudNPkQSOwMZofhnuutmA==",
+ "requires": {
+ "@babel/parser": "^7.16.4",
+ "@vue/shared": "3.2.32",
+ "estree-walker": "^2.0.2",
+ "source-map": "^0.6.1"
+ }
+ },
+ "@vue/compiler-dom": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-dom/-/compiler-dom-3.2.32.tgz",
+ "integrity": "sha512-maa3PNB/NxR17h2hDQfcmS02o1f9r9QIpN1y6fe8tWPrS1E4+q8MqrvDDQNhYVPd84rc3ybtyumrgm9D5Rf/kg==",
+ "requires": {
+ "@vue/compiler-core": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "@vue/compiler-sfc": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-sfc/-/compiler-sfc-3.2.32.tgz",
+ "integrity": "sha512-uO6+Gh3AVdWm72lRRCjMr8nMOEqc6ezT9lWs5dPzh1E9TNaJkMYPaRtdY9flUv/fyVQotkfjY/ponjfR+trPSg==",
+ "requires": {
+ "@babel/parser": "^7.16.4",
+ "@vue/compiler-core": "3.2.32",
+ "@vue/compiler-dom": "3.2.32",
+ "@vue/compiler-ssr": "3.2.32",
+ "@vue/reactivity-transform": "3.2.32",
+ "@vue/shared": "3.2.32",
+ "estree-walker": "^2.0.2",
+ "magic-string": "^0.25.7",
+ "postcss": "^8.1.10",
+ "source-map": "^0.6.1"
+ }
+ },
+ "@vue/compiler-ssr": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/compiler-ssr/-/compiler-ssr-3.2.32.tgz",
+ "integrity": "sha512-ZklVUF/SgTx6yrDUkaTaBL/JMVOtSocP+z5Xz/qIqqLdW/hWL90P+ob/jOQ0Xc/om57892Q7sRSrex0wujOL2Q==",
+ "requires": {
+ "@vue/compiler-dom": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "@vue/reactivity": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/reactivity/-/reactivity-3.2.32.tgz",
+ "integrity": "sha512-4zaDumuyDqkuhbb63hRd+YHFGopW7srFIWesLUQ2su/rJfWrSq3YUvoKAJE8Eu1EhZ2Q4c1NuwnEreKj1FkDxA==",
+ "requires": {
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "@vue/reactivity-transform": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/reactivity-transform/-/reactivity-transform-3.2.32.tgz",
+ "integrity": "sha512-CW1W9zaJtE275tZSWIfQKiPG0iHpdtSlmTqYBu7Y62qvtMgKG5yOxtvBs4RlrZHlaqFSE26avLAgQiTp4YHozw==",
+ "requires": {
+ "@babel/parser": "^7.16.4",
+ "@vue/compiler-core": "3.2.32",
+ "@vue/shared": "3.2.32",
+ "estree-walker": "^2.0.2",
+ "magic-string": "^0.25.7"
+ }
+ },
+ "@vue/runtime-core": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/runtime-core/-/runtime-core-3.2.32.tgz",
+ "integrity": "sha512-uKKzK6LaCnbCJ7rcHvsK0azHLGpqs+Vi9B28CV1mfWVq1F3Bj8Okk3cX+5DtD06aUh4V2bYhS2UjjWiUUKUF0w==",
+ "requires": {
+ "@vue/reactivity": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "@vue/runtime-dom": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/runtime-dom/-/runtime-dom-3.2.32.tgz",
+ "integrity": "sha512-AmlIg+GPqjkNoADLjHojEX5RGcAg+TsgXOOcUrtDHwKvA8mO26EnLQLB8nylDjU6AMJh2CIYn8NEgyOV5ZIScQ==",
+ "requires": {
+ "@vue/runtime-core": "3.2.32",
+ "@vue/shared": "3.2.32",
+ "csstype": "^2.6.8"
+ }
+ },
+ "@vue/server-renderer": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/server-renderer/-/server-renderer-3.2.32.tgz",
+ "integrity": "sha512-TYKpZZfRJpGTTiy/s6bVYwQJpAUx3G03z4G7/3O18M11oacrMTVHaHjiPuPqf3xQtY8R4LKmQ3EOT/DRCA/7Wg==",
+ "requires": {
+ "@vue/compiler-ssr": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "@vue/shared": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/@vue/shared/-/shared-3.2.32.tgz",
+ "integrity": "sha512-bjcixPErUsAnTQRQX4Z5IQnICYjIfNCyCl8p29v1M6kfVzvwOICPw+dz48nNuWlTOOx2RHhzHdazJibE8GSnsw=="
+ },
+ "@vueuse/core": {
+ "version": "8.2.5",
+ "resolved": "https://registry.npmmirror.com/@vueuse/core/-/core-8.2.5.tgz",
+ "integrity": "sha512-5prZAA1Ji2ltwNUnzreu6WIXYqHYP/9U2BiY5mD/650VYLpVcwVlYznJDFcLCmEWI3o3Vd34oS1FUf+6Mh68GQ==",
+ "requires": {
+ "@vueuse/metadata": "8.2.5",
+ "@vueuse/shared": "8.2.5",
+ "vue-demi": "*"
+ }
+ },
+ "@vueuse/metadata": {
+ "version": "8.2.5",
+ "resolved": "https://registry.npmmirror.com/@vueuse/metadata/-/metadata-8.2.5.tgz",
+ "integrity": "sha512-Lk9plJjh9cIdiRdcj16dau+2LANxIdFCiTgdfzwYXbflxq0QnMBeOD2qHgKDE7fuVrtPcVWj8VSuZEx1HRfNQA=="
+ },
+ "@vueuse/shared": {
+ "version": "8.2.5",
+ "resolved": "https://registry.npmmirror.com/@vueuse/shared/-/shared-8.2.5.tgz",
+ "integrity": "sha512-lNWo+7sk6JCuOj4AiYM+6HZ6fq4xAuVq1sVckMQKgfCJZpZRe4i8es+ZULO5bYTKP+VrOCtqrLR2GzEfrbr3YQ==",
+ "requires": {
+ "vue-demi": "*"
+ }
+ },
+ "ant-design-vue": {
+ "version": "2.2.8",
+ "resolved": "https://registry.npmmirror.com/ant-design-vue/-/ant-design-vue-2.2.8.tgz",
+ "integrity": "sha512-3graq9/gCfJQs6hznrHV6sa9oDmk/D1H3Oo0vLdVpPS/I61fZPk8NEyNKCHpNA6fT2cx6xx9U3QS63uuyikg/Q==",
+ "requires": {
+ "@ant-design/icons-vue": "^6.0.0",
+ "@babel/runtime": "^7.10.5",
+ "@simonwep/pickr": "~1.8.0",
+ "array-tree-filter": "^2.1.0",
+ "async-validator": "^3.3.0",
+ "dom-align": "^1.12.1",
+ "dom-scroll-into-view": "^2.0.0",
+ "lodash": "^4.17.21",
+ "lodash-es": "^4.17.15",
+ "moment": "^2.27.0",
+ "omit.js": "^2.0.0",
+ "resize-observer-polyfill": "^1.5.1",
+ "scroll-into-view-if-needed": "^2.2.25",
+ "shallow-equal": "^1.0.0",
+ "vue-types": "^3.0.0",
+ "warning": "^4.0.0"
+ },
+ "dependencies": {
+ "async-validator": {
+ "version": "3.5.2",
+ "resolved": "https://registry.npmmirror.com/async-validator/-/async-validator-3.5.2.tgz",
+ "integrity": "sha512-8eLCg00W9pIRZSB781UUX/H6Oskmm8xloZfr09lz5bikRpBVDlJ3hRVuxxP1SxcwsEYfJ4IU8Q19Y8/893r3rQ=="
+ }
+ }
+ },
+ "array-tree-filter": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmmirror.com/array-tree-filter/-/array-tree-filter-2.1.0.tgz",
+ "integrity": "sha512-4ROwICNlNw/Hqa9v+rk5h22KjmzB1JGTMVKP2AKJBOCgb0yL0ASf0+YvCcLNNwquOHNX48jkeZIJ3a+oOQqKcw=="
+ },
+ "async-validator": {
+ "version": "4.0.7",
+ "resolved": "https://registry.npmmirror.com/async-validator/-/async-validator-4.0.7.tgz",
+ "integrity": "sha512-Pj2IR7u8hmUEDOwB++su6baaRi+QvsgajuFB9j95foM1N2gy5HM4z60hfusIO0fBPG5uLAEl6yCJr1jNSVugEQ=="
+ },
+ "axios": {
+ "version": "0.26.1",
+ "resolved": "https://registry.npmmirror.com/axios/-/axios-0.26.1.tgz",
+ "integrity": "sha512-fPwcX4EvnSHuInCMItEhAGnaSEXRBjtzh9fOtsE6E1G6p7vl7edEeZe11QHf18+6+9gR5PbKV/sGKNaD8YaMeA==",
+ "requires": {
+ "follow-redirects": "^1.14.8"
+ },
+ "dependencies": {
+ "follow-redirects": {
+ "version": "1.14.9",
+ "resolved": "https://registry.npmmirror.com/follow-redirects/-/follow-redirects-1.14.9.tgz",
+ "integrity": "sha512-MQDfihBQYMcyy5dhRDJUHcw7lb2Pv/TuE6xP1vyraLukNDHKbDxDNaOE3NbCAdKQApno+GPRyo1YAp89yCjK4w=="
+ }
+ }
+ },
+ "compute-scroll-into-view": {
+ "version": "1.0.17",
+ "resolved": "https://registry.npmmirror.com/compute-scroll-into-view/-/compute-scroll-into-view-1.0.17.tgz",
+ "integrity": "sha512-j4dx+Fb0URmzbwwMUrhqWM2BEWHdFGx+qZ9qqASHRPqvTYdqvWnHg0H1hIbcyLnvgnoNAVMlwkepyqM3DaIFUg=="
+ },
+ "copy-anything": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmmirror.com/copy-anything/-/copy-anything-2.0.6.tgz",
+ "integrity": "sha512-1j20GZTsvKNkc4BY3NpMOM8tt///wY3FpIzozTOFO2ffuZcV61nojHXVKIy3WM+7ADCy5FVhdZYHYDdgTU0yJw==",
+ "requires": {
+ "is-what": "^3.14.1"
+ }
+ },
+ "core-js": {
+ "version": "3.22.5",
+ "resolved": "https://registry.npmmirror.com/core-js/-/core-js-3.22.5.tgz",
+ "integrity": "sha512-VP/xYuvJ0MJWRAobcmQ8F2H6Bsn+s7zqAAjFaHGBMc5AQm7zaelhD1LGduFn2EehEcQcU+br6t+fwbpQ5d1ZWA=="
+ },
+ "csstype": {
+ "version": "2.6.20",
+ "resolved": "https://registry.npmmirror.com/csstype/-/csstype-2.6.20.tgz",
+ "integrity": "sha512-/WwNkdXfckNgw6S5R125rrW8ez139lBHWouiBvX8dfMFtcn6V81REDqnH7+CRpRipfYlyU1CmOnOxrmGcFOjeA=="
+ },
+ "dayjs": {
+ "version": "1.11.0",
+ "resolved": "https://registry.npmmirror.com/dayjs/-/dayjs-1.11.0.tgz",
+ "integrity": "sha512-JLC809s6Y948/FuCZPm5IX8rRhQwOiyMb2TfVVQEixG7P8Lm/gt5S7yoQZmC8x1UehI9Pb7sksEt4xx14m+7Ug=="
+ },
+ "dom-align": {
+ "version": "1.12.3",
+ "resolved": "https://registry.npmmirror.com/dom-align/-/dom-align-1.12.3.tgz",
+ "integrity": "sha512-Gj9hZN3a07cbR6zviMUBOMPdWxYhbMI+x+WS0NAIu2zFZmbK8ys9R79g+iG9qLnlCwpFoaB+fKy8Pdv470GsPA=="
+ },
+ "dom-scroll-into-view": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmmirror.com/dom-scroll-into-view/-/dom-scroll-into-view-2.0.1.tgz",
+ "integrity": "sha512-bvVTQe1lfaUr1oFzZX80ce9KLDlZ3iU+XGNE/bz9HnGdklTieqsbmsLHe+rT2XWqopvL0PckkYqN7ksmm5pe3w=="
+ },
+ "element-plus": {
+ "version": "2.1.9",
+ "resolved": "https://registry.npmmirror.com/element-plus/-/element-plus-2.1.9.tgz",
+ "integrity": "sha512-6mWqS3YrmJPnouWP4otzL8+MehfOnDFqDbcIdnmC07p+Z0JkWe/CVKc4Wky8AYC8nyDMUQyiZYvooCbqGuM7pg==",
+ "requires": {
+ "@ctrl/tinycolor": "^3.4.0",
+ "@element-plus/icons-vue": "^1.1.4",
+ "@floating-ui/dom": "^0.4.2",
+ "@popperjs/core": "^2.11.4",
+ "@types/lodash": "^4.14.181",
+ "@types/lodash-es": "^4.17.6",
+ "@vueuse/core": "^8.2.4",
+ "async-validator": "^4.0.7",
+ "dayjs": "^1.11.0",
+ "escape-html": "^1.0.3",
+ "lodash": "^4.17.21",
+ "lodash-es": "^4.17.21",
+ "lodash-unified": "^1.0.2",
+ "memoize-one": "^6.0.0",
+ "normalize-wheel-es": "^1.1.2"
+ }
+ },
+ "errno": {
+ "version": "0.1.8",
+ "resolved": "https://registry.npmmirror.com/errno/-/errno-0.1.8.tgz",
+ "integrity": "sha512-dJ6oBr5SQ1VSd9qkk7ByRgb/1SH4JZjCHSW/mr63/QcXO9zLVxvJ6Oy13nio03rxpSnVDDjFor75SjVeZWPW/A==",
+ "optional": true,
+ "requires": {
+ "prr": "~1.0.1"
+ }
+ },
+ "esbuild": {
+ "version": "0.14.36",
+ "resolved": "https://registry.npmmirror.com/esbuild/-/esbuild-0.14.36.tgz",
+ "integrity": "sha512-HhFHPiRXGYOCRlrhpiVDYKcFJRdO0sBElZ668M4lh2ER0YgnkLxECuFe7uWCf23FrcLc59Pqr7dHkTqmRPDHmw==",
+ "dev": true,
+ "requires": {
+ "esbuild-android-64": "0.14.36",
+ "esbuild-android-arm64": "0.14.36",
+ "esbuild-darwin-64": "0.14.36",
+ "esbuild-darwin-arm64": "0.14.36",
+ "esbuild-freebsd-64": "0.14.36",
+ "esbuild-freebsd-arm64": "0.14.36",
+ "esbuild-linux-32": "0.14.36",
+ "esbuild-linux-64": "0.14.36",
+ "esbuild-linux-arm": "0.14.36",
+ "esbuild-linux-arm64": "0.14.36",
+ "esbuild-linux-mips64le": "0.14.36",
+ "esbuild-linux-ppc64le": "0.14.36",
+ "esbuild-linux-riscv64": "0.14.36",
+ "esbuild-linux-s390x": "0.14.36",
+ "esbuild-netbsd-64": "0.14.36",
+ "esbuild-openbsd-64": "0.14.36",
+ "esbuild-sunos-64": "0.14.36",
+ "esbuild-windows-32": "0.14.36",
+ "esbuild-windows-64": "0.14.36",
+ "esbuild-windows-arm64": "0.14.36"
+ }
+ },
+ "esbuild-darwin-64": {
+ "version": "0.14.36",
+ "resolved": "https://registry.npmmirror.com/esbuild-darwin-64/-/esbuild-darwin-64-0.14.36.tgz",
+ "integrity": "sha512-kkl6qmV0dTpyIMKagluzYqlc1vO0ecgpviK/7jwPbRDEv5fejRTaBBEE2KxEQbTHcLhiiDbhG7d5UybZWo/1zQ==",
+ "dev": true,
+ "optional": true
+ },
+ "escape-html": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmmirror.com/escape-html/-/escape-html-1.0.3.tgz",
+ "integrity": "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow=="
+ },
+ "estree-walker": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmmirror.com/estree-walker/-/estree-walker-2.0.2.tgz",
+ "integrity": "sha512-Rfkk/Mp/DL7JVje3u18FxFujQlTNR2q6QfMSMB7AvCBx91NGj/ba3kCfza0f6dVDbw7YlRf/nDrn7pQrCCyQ/w=="
+ },
+ "fsevents": {
+ "version": "2.3.2",
+ "resolved": "https://registry.npmmirror.com/fsevents/-/fsevents-2.3.2.tgz",
+ "integrity": "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==",
+ "dev": true,
+ "optional": true
+ },
+ "function-bind": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmmirror.com/function-bind/-/function-bind-1.1.1.tgz",
+ "integrity": "sha512-yIovAzMX49sF8Yl58fSCWJ5svSLuaibPxXQJFLmBObTuCr0Mf1KiPopGM9NiFjiYBCbfaa2Fh6breQ6ANVTI0A==",
+ "dev": true
+ },
+ "graceful-fs": {
+ "version": "4.2.10",
+ "resolved": "https://registry.npmmirror.com/graceful-fs/-/graceful-fs-4.2.10.tgz",
+ "integrity": "sha512-9ByhssR2fPVsNZj478qUUbKfmL0+t5BDVyjShtyZZLiK7ZDAArFFfopyOTj0M05wE2tJPisA4iTnnXl2YoPvOA==",
+ "optional": true
+ },
+ "has": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmmirror.com/has/-/has-1.0.3.tgz",
+ "integrity": "sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==",
+ "dev": true,
+ "requires": {
+ "function-bind": "^1.1.1"
+ }
+ },
+ "iconv-lite": {
+ "version": "0.4.24",
+ "resolved": "https://registry.npmmirror.com/iconv-lite/-/iconv-lite-0.4.24.tgz",
+ "integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==",
+ "optional": true,
+ "requires": {
+ "safer-buffer": ">= 2.1.2 < 3"
+ }
+ },
+ "image-size": {
+ "version": "0.5.5",
+ "resolved": "https://registry.npmmirror.com/image-size/-/image-size-0.5.5.tgz",
+ "integrity": "sha512-6TDAlDPZxUFCv+fuOkIoXT/V/f3Qbq8e37p+YOiYrUv3v9cc3/6x78VdfPgFVaB9dZYeLUfKgHRebpkm/oP2VQ==",
+ "optional": true
+ },
+ "is-core-module": {
+ "version": "2.8.1",
+ "resolved": "https://registry.npmmirror.com/is-core-module/-/is-core-module-2.8.1.tgz",
+ "integrity": "sha512-SdNCUs284hr40hFTFP6l0IfZ/RSrMXF3qgoRHd3/79unUTvrFO/JoXwkGm+5J/Oe3E/b5GsnG330uUNgRpu1PA==",
+ "dev": true,
+ "requires": {
+ "has": "^1.0.3"
+ }
+ },
+ "is-plain-object": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmmirror.com/is-plain-object/-/is-plain-object-3.0.1.tgz",
+ "integrity": "sha512-Xnpx182SBMrr/aBik8y+GuR4U1L9FqMSojwDQwPMmxyC6bvEqly9UBCxhauBF5vNh2gwWJNX6oDV7O+OM4z34g=="
+ },
+ "is-what": {
+ "version": "3.14.1",
+ "resolved": "https://registry.npmmirror.com/is-what/-/is-what-3.14.1.tgz",
+ "integrity": "sha512-sNxgpk9793nzSs7bA6JQJGeIuRBQhAaNGG77kzYQgMkrID+lS6SlK07K5LaptscDlSaIgH+GPFzf+d75FVxozA=="
+ },
+ "js-audio-recorder": {
+ "version": "0.5.7",
+ "resolved": "https://registry.npmmirror.com/js-audio-recorder/-/js-audio-recorder-0.5.7.tgz",
+ "integrity": "sha512-DIlv30N86AYHr7zGHN0O7V/3Rd8Q6SIJ/MBzVJaT9STWTdhF4E/8fxCX6ZMgRSv8xmx6fEqcFFNPoofmxJD4+A=="
+ },
+ "js-tokens": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmmirror.com/js-tokens/-/js-tokens-4.0.0.tgz",
+ "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ=="
+ },
+ "lamejs": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmmirror.com/lamejs/-/lamejs-1.2.1.tgz",
+ "integrity": "sha512-s7bxvjvYthw6oPLCm5pFxvA84wUROODB8jEO2+CE1adhKgrIvVOlmMgY8zyugxGrvRaDHNJanOiS21/emty6dQ==",
+ "requires": {
+ "use-strict": "1.0.1"
+ }
+ },
+ "less": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmmirror.com/less/-/less-4.1.2.tgz",
+ "integrity": "sha512-EoQp/Et7OSOVu0aJknJOtlXZsnr8XE8KwuzTHOLeVSEx8pVWUICc8Q0VYRHgzyjX78nMEyC/oztWFbgyhtNfDA==",
+ "requires": {
+ "copy-anything": "^2.0.1",
+ "errno": "^0.1.1",
+ "graceful-fs": "^4.1.2",
+ "image-size": "~0.5.0",
+ "make-dir": "^2.1.0",
+ "mime": "^1.4.1",
+ "needle": "^2.5.2",
+ "parse-node-version": "^1.0.1",
+ "source-map": "~0.6.0",
+ "tslib": "^2.3.0"
+ }
+ },
+ "lodash": {
+ "version": "4.17.21",
+ "resolved": "https://registry.npmmirror.com/lodash/-/lodash-4.17.21.tgz",
+ "integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
+ },
+ "lodash-es": {
+ "version": "4.17.21",
+ "resolved": "https://registry.npmmirror.com/lodash-es/-/lodash-es-4.17.21.tgz",
+ "integrity": "sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw=="
+ },
+ "lodash-unified": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmmirror.com/lodash-unified/-/lodash-unified-1.0.2.tgz",
+ "integrity": "sha512-OGbEy+1P+UT26CYi4opY4gebD8cWRDxAT6MAObIVQMiqYdxZr1g3QHWCToVsm31x2NkLS4K3+MC2qInaRMa39g==",
+ "requires": {}
+ },
+ "loose-envify": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmmirror.com/loose-envify/-/loose-envify-1.4.0.tgz",
+ "integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
+ "requires": {
+ "js-tokens": "^3.0.0 || ^4.0.0"
+ }
+ },
+ "magic-string": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmmirror.com/magic-string/-/magic-string-0.25.9.tgz",
+ "integrity": "sha512-RmF0AsMzgt25qzqqLc1+MbHmhdx0ojF2Fvs4XnOqz2ZOBXzzkEwc/dJQZCYHAn7v1jbVOjAZfK8msRn4BxO4VQ==",
+ "requires": {
+ "sourcemap-codec": "^1.4.8"
+ }
+ },
+ "make-dir": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmmirror.com/make-dir/-/make-dir-2.1.0.tgz",
+ "integrity": "sha512-LS9X+dc8KLxXCb8dni79fLIIUA5VyZoyjSMCwTluaXA0o27cCK0bhXkpgw+sTXVpPy/lSO57ilRixqk0vDmtRA==",
+ "optional": true,
+ "requires": {
+ "pify": "^4.0.1",
+ "semver": "^5.6.0"
+ }
+ },
+ "memoize-one": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmmirror.com/memoize-one/-/memoize-one-6.0.0.tgz",
+ "integrity": "sha512-rkpe71W0N0c0Xz6QD0eJETuWAJGnJ9afsl1srmwPrI+yBCkge5EycXXbYRyvL29zZVUWQCY7InPRCv3GDXuZNw=="
+ },
+ "mime": {
+ "version": "1.6.0",
+ "resolved": "https://registry.npmmirror.com/mime/-/mime-1.6.0.tgz",
+ "integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==",
+ "optional": true
+ },
+ "moment": {
+ "version": "2.29.4",
+ "resolved": "https://registry.npmjs.org/moment/-/moment-2.29.4.tgz",
+ "integrity": "sha512-5LC9SOxjSc2HF6vO2CyuTDNivEdoz2IvyJJGj6X8DJ0eFyfszE0QiEd+iXmBvUP3WHxSjFH/vIsA0EN00cgr8w=="
+ },
+ "nanoid": {
+ "version": "3.3.2",
+ "resolved": "https://registry.npmmirror.com/nanoid/-/nanoid-3.3.2.tgz",
+ "integrity": "sha512-CuHBogktKwpm5g2sRgv83jEy2ijFzBwMoYA60orPDR7ynsLijJDqgsi4RDGj3OJpy3Ieb+LYwiRmIOGyytgITA=="
+ },
+ "nanopop": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmmirror.com/nanopop/-/nanopop-2.1.0.tgz",
+ "integrity": "sha512-jGTwpFRexSH+fxappnGQtN9dspgE2ipa1aOjtR24igG0pv6JCxImIAmrLRHX+zUF5+1wtsFVbKyfP51kIGAVNw=="
+ },
+ "needle": {
+ "version": "2.9.1",
+ "resolved": "https://registry.npmmirror.com/needle/-/needle-2.9.1.tgz",
+ "integrity": "sha512-6R9fqJ5Zcmf+uYaFgdIHmLwNldn5HbK8L5ybn7Uz+ylX/rnOsSp1AHcvQSrCaFN+qNM1wpymHqD7mVasEOlHGQ==",
+ "optional": true,
+ "requires": {
+ "debug": "^3.2.6",
+ "iconv-lite": "^0.4.4",
+ "sax": "^1.2.4"
+ },
+ "dependencies": {
+ "debug": {
+ "version": "3.2.7",
+ "resolved": "https://registry.npmmirror.com/debug/-/debug-3.2.7.tgz",
+ "integrity": "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==",
+ "optional": true,
+ "requires": {
+ "ms": "^2.1.1"
+ }
+ },
+ "ms": {
+ "version": "2.1.3",
+ "resolved": "https://registry.npmmirror.com/ms/-/ms-2.1.3.tgz",
+ "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
+ "optional": true
+ }
+ }
+ },
+ "normalize-wheel-es": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmmirror.com/normalize-wheel-es/-/normalize-wheel-es-1.1.2.tgz",
+ "integrity": "sha512-scX83plWJXYH1J4+BhAuIHadROzxX0UBF3+HuZNY2Ks8BciE7tSTQ+5JhTsvzjaO0/EJdm4JBGrfObKxFf3Png=="
+ },
+ "omit.js": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmmirror.com/omit.js/-/omit.js-2.0.2.tgz",
+ "integrity": "sha512-hJmu9D+bNB40YpL9jYebQl4lsTW6yEHRTroJzNLqQJYHm7c+NQnJGfZmIWh8S3q3KoaxV1aLhV6B3+0N0/kyJg=="
+ },
+ "parse-node-version": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmmirror.com/parse-node-version/-/parse-node-version-1.0.1.tgz",
+ "integrity": "sha512-3YHlOa/JgH6Mnpr05jP9eDG254US9ek25LyIxZlDItp2iJtwyaXQb57lBYLdT3MowkUFYEV2XXNAYIPlESvJlA=="
+ },
+ "path-parse": {
+ "version": "1.0.7",
+ "resolved": "https://registry.npmmirror.com/path-parse/-/path-parse-1.0.7.tgz",
+ "integrity": "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==",
+ "dev": true
+ },
+ "picocolors": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmmirror.com/picocolors/-/picocolors-1.0.0.tgz",
+ "integrity": "sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ=="
+ },
+ "pify": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmmirror.com/pify/-/pify-4.0.1.tgz",
+ "integrity": "sha512-uB80kBFb/tfd68bVleG9T5GGsGPjJrLAUpR5PZIrhBnIaRTQRjqdJSsIKkOP6OAIFbj7GOrcudc5pNjZ+geV2g==",
+ "optional": true
+ },
+ "postcss": {
+ "version": "8.4.12",
+ "resolved": "https://registry.npmmirror.com/postcss/-/postcss-8.4.12.tgz",
+ "integrity": "sha512-lg6eITwYe9v6Hr5CncVbK70SoioNQIq81nsaG86ev5hAidQvmOeETBqs7jm43K2F5/Ley3ytDtriImV6TpNiSg==",
+ "requires": {
+ "nanoid": "^3.3.1",
+ "picocolors": "^1.0.0",
+ "source-map-js": "^1.0.2"
+ }
+ },
+ "prr": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmmirror.com/prr/-/prr-1.0.1.tgz",
+ "integrity": "sha512-yPw4Sng1gWghHQWj0B3ZggWUm4qVbPwPFcRG8KyxiU7J2OHFSoEHKS+EZ3fv5l1t9CyCiop6l/ZYeWbrgoQejw==",
+ "optional": true
+ },
+ "regenerator-runtime": {
+ "version": "0.13.9",
+ "resolved": "https://registry.npmmirror.com/regenerator-runtime/-/regenerator-runtime-0.13.9.tgz",
+ "integrity": "sha512-p3VT+cOEgxFsRRA9X4lkI1E+k2/CtnKtU4gcxyaCUreilL/vqI6CdZ3wxVUx3UOUg+gnUOQQcRI7BmSI656MYA=="
+ },
+ "resize-observer-polyfill": {
+ "version": "1.5.1",
+ "resolved": "https://registry.npmmirror.com/resize-observer-polyfill/-/resize-observer-polyfill-1.5.1.tgz",
+ "integrity": "sha512-LwZrotdHOo12nQuZlHEmtuXdqGoOD0OhaxopaNFxWzInpEgaLWoVuAMbTzixuosCx2nEG58ngzW3vxdWoxIgdg=="
+ },
+ "resolve": {
+ "version": "1.22.0",
+ "resolved": "https://registry.npmmirror.com/resolve/-/resolve-1.22.0.tgz",
+ "integrity": "sha512-Hhtrw0nLeSrFQ7phPp4OOcVjLPIeMnRlr5mcnVuMe7M/7eBn98A3hmFRLoFo3DLZkivSYwhRUJTyPyWAk56WLw==",
+ "dev": true,
+ "requires": {
+ "is-core-module": "^2.8.1",
+ "path-parse": "^1.0.7",
+ "supports-preserve-symlinks-flag": "^1.0.0"
+ }
+ },
+ "rollup": {
+ "version": "2.70.1",
+ "resolved": "https://registry.npmmirror.com/rollup/-/rollup-2.70.1.tgz",
+ "integrity": "sha512-CRYsI5EuzLbXdxC6RnYhOuRdtz4bhejPMSWjsFLfVM/7w/85n2szZv6yExqUXsBdz5KT8eoubeyDUDjhLHEslA==",
+ "dev": true,
+ "requires": {
+ "fsevents": "~2.3.2"
+ }
+ },
+ "safer-buffer": {
+ "version": "2.1.2",
+ "resolved": "https://registry.npmmirror.com/safer-buffer/-/safer-buffer-2.1.2.tgz",
+ "integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
+ "optional": true
+ },
+ "sax": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmmirror.com/sax/-/sax-1.2.4.tgz",
+ "integrity": "sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw==",
+ "optional": true
+ },
+ "scroll-into-view-if-needed": {
+ "version": "2.2.29",
+ "resolved": "https://registry.npmmirror.com/scroll-into-view-if-needed/-/scroll-into-view-if-needed-2.2.29.tgz",
+ "integrity": "sha512-hxpAR6AN+Gh53AdAimHM6C8oTN1ppwVZITihix+WqalywBeFcQ6LdQP5ABNl26nX8GTEL7VT+b8lKpdqq65wXg==",
+ "requires": {
+ "compute-scroll-into-view": "^1.0.17"
+ }
+ },
+ "semver": {
+ "version": "5.7.1",
+ "resolved": "https://registry.npmmirror.com/semver/-/semver-5.7.1.tgz",
+ "integrity": "sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ==",
+ "optional": true
+ },
+ "shallow-equal": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmmirror.com/shallow-equal/-/shallow-equal-1.2.1.tgz",
+ "integrity": "sha512-S4vJDjHHMBaiZuT9NPb616CSmLf618jawtv3sufLl6ivK8WocjAo58cXwbRV1cgqxH0Qbv+iUt6m05eqEa2IRA=="
+ },
+ "source-map": {
+ "version": "0.6.1",
+ "resolved": "https://registry.npmmirror.com/source-map/-/source-map-0.6.1.tgz",
+ "integrity": "sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g=="
+ },
+ "source-map-js": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmmirror.com/source-map-js/-/source-map-js-1.0.2.tgz",
+ "integrity": "sha512-R0XvVJ9WusLiqTCEiGCmICCMplcCkIwwR11mOSD9CR5u+IXYdiseeEuXCVAjS54zqwkLcPNnmU4OeJ6tUrWhDw=="
+ },
+ "sourcemap-codec": {
+ "version": "1.4.8",
+ "resolved": "https://registry.npmmirror.com/sourcemap-codec/-/sourcemap-codec-1.4.8.tgz",
+ "integrity": "sha512-9NykojV5Uih4lgo5So5dtw+f0JgJX30KCNI8gwhz2J9A15wD0Ml6tjHKwf6fTSa6fAdVBdZeNOs9eJ71qCk8vA=="
+ },
+ "supports-preserve-symlinks-flag": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmmirror.com/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz",
+ "integrity": "sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==",
+ "dev": true
+ },
+ "tslib": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmmirror.com/tslib/-/tslib-2.4.0.tgz",
+ "integrity": "sha512-d6xOpEDfsi2CZVlPQzGeux8XMwLT9hssAsaPYExaQMuYskwb+x1x7J371tWlbBdWHroy99KnVB6qIkUbs5X3UQ=="
+ },
+ "use-strict": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmmirror.com/use-strict/-/use-strict-1.0.1.tgz",
+ "integrity": "sha512-IeiWvvEXfW5ltKVMkxq6FvNf2LojMKvB2OCeja6+ct24S1XOmQw2dGr2JyndwACWAGJva9B7yPHwAmeA9QCqAQ=="
+ },
+ "vite": {
+ "version": "2.9.1",
+ "resolved": "https://registry.npmmirror.com/vite/-/vite-2.9.1.tgz",
+ "integrity": "sha512-vSlsSdOYGcYEJfkQ/NeLXgnRv5zZfpAsdztkIrs7AZHV8RCMZQkwjo4DS5BnrYTqoWqLoUe1Cah4aVO4oNNqCQ==",
+ "dev": true,
+ "requires": {
+ "esbuild": "^0.14.27",
+ "fsevents": "~2.3.2",
+ "postcss": "^8.4.12",
+ "resolve": "^1.22.0",
+ "rollup": "^2.59.0"
+ }
+ },
+ "vue": {
+ "version": "3.2.32",
+ "resolved": "https://registry.npmmirror.com/vue/-/vue-3.2.32.tgz",
+ "integrity": "sha512-6L3jKZApF042OgbCkh+HcFeAkiYi3Lovi8wNhWqIK98Pi5efAMLZzRHgi91v+60oIRxdJsGS9sTMsb+yDpY8Eg==",
+ "requires": {
+ "@vue/compiler-dom": "3.2.32",
+ "@vue/compiler-sfc": "3.2.32",
+ "@vue/runtime-dom": "3.2.32",
+ "@vue/server-renderer": "3.2.32",
+ "@vue/shared": "3.2.32"
+ }
+ },
+ "vue-demi": {
+ "version": "0.12.5",
+ "resolved": "https://registry.npmmirror.com/vue-demi/-/vue-demi-0.12.5.tgz",
+ "integrity": "sha512-BREuTgTYlUr0zw0EZn3hnhC3I6gPWv+Kwh4MCih6QcAeaTlaIX0DwOVN0wHej7hSvDPecz4jygy/idsgKfW58Q==",
+ "requires": {}
+ },
+ "vue-types": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmmirror.com/vue-types/-/vue-types-3.0.2.tgz",
+ "integrity": "sha512-IwUC0Aq2zwaXqy74h4WCvFCUtoV0iSWr0snWnE9TnU18S66GAQyqQbRf2qfJtUuiFsBf6qp0MEwdonlwznlcrw==",
+ "requires": {
+ "is-plain-object": "3.0.1"
+ }
+ },
+ "warning": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmmirror.com/warning/-/warning-4.0.3.tgz",
+ "integrity": "sha512-rpJyN222KWIvHJ/F53XSZv0Zl/accqHR8et1kpaMTD/fLCRxtV8iX8czMzY7sVZupTI3zcUTg8eycS2kNF9l6w==",
+ "requires": {
+ "loose-envify": "^1.0.0"
+ }
+ }
+ }
+}
diff --git a/demos/speech_web/web_client/package.json b/demos/speech_web/web_client/package.json
new file mode 100644
index 000000000..7f28d4c97
--- /dev/null
+++ b/demos/speech_web/web_client/package.json
@@ -0,0 +1,23 @@
+{
+ "name": "paddlespeechwebclient",
+ "private": true,
+ "version": "0.0.0",
+ "scripts": {
+ "dev": "vite",
+ "build": "vite build",
+ "preview": "vite preview"
+ },
+ "dependencies": {
+ "ant-design-vue": "^2.2.8",
+ "axios": "^0.26.1",
+ "element-plus": "^2.1.9",
+ "js-audio-recorder": "0.5.7",
+ "lamejs": "^1.2.1",
+ "less": "^4.1.2",
+ "vue": "^3.2.25"
+ },
+ "devDependencies": {
+ "@vitejs/plugin-vue": "^2.3.0",
+ "vite": "^2.9.0"
+ }
+}
diff --git a/demos/speech_web/web_client/public/favicon.ico b/demos/speech_web/web_client/public/favicon.ico
new file mode 100644
index 000000000..342038720
Binary files /dev/null and b/demos/speech_web/web_client/public/favicon.ico differ
diff --git a/demos/speech_web/web_client/src/App.vue b/demos/speech_web/web_client/src/App.vue
new file mode 100644
index 000000000..a70dbf9c4
--- /dev/null
+++ b/demos/speech_web/web_client/src/App.vue
@@ -0,0 +1,19 @@
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/api/API.js b/demos/speech_web/web_client/src/api/API.js
new file mode 100644
index 000000000..0feaa63f1
--- /dev/null
+++ b/demos/speech_web/web_client/src/api/API.js
@@ -0,0 +1,29 @@
+export const apiURL = {
+ ASR_OFFLINE : '/api/asr/offline', // get offline speech recognition result
+ ASR_COLLECT_ENV : '/api/asr/collectEnv', // collect environment noise
+ ASR_STOP_RECORD : '/api/asr/stopRecord', // pause recording on the backend
+ ASR_RESUME_RECORD : '/api/asr/resumeRecord',// resume recording on the backend
+
+ NLP_CHAT : '/api/nlp/chat', // NLP chitchat endpoint
+ NLP_IE : '/api/nlp/ie', // information extraction endpoint
+
+ TTS_OFFLINE : '/api/tts/offline', // fetch TTS audio
+
+ VPR_RECOG : '/api/vpr/recog', // voiceprint recognition endpoint, returns similarity score
+ VPR_ENROLL : '/api/vpr/enroll', // voiceprint enrollment endpoint
+ VPR_LIST : '/api/vpr/list', // list enrolled voiceprint entries
+ VPR_DEL : '/api/vpr/del', // delete a user's voiceprint
+ VPR_DATA : '/api/vpr/database64?vprId=', // fetch enrolled voiceprint audio in base64
+
+ // websocket
+ CHAT_SOCKET_RECORD: 'ws://localhost:8010/ws/asr/offlineStream', // ChatBot websocket endpoint
+ ASR_SOCKET_RECORD: 'ws://localhost:8010/ws/asr/onlineStream', // streaming ASR endpoint
+ TTS_SOCKET_RECORD: 'ws://localhost:8010/ws/tts/online', // streaming TTS endpoint
+}
+
+
+
+
+
+
+
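The REST paths in `apiURL` above are host-relative, while the websocket entries hard-code `ws://localhost:8010`; when the client is served from a different host, the REST paths need a base prefix at request time. A minimal sketch of that prefixing, using a hypothetical `resolveURL` helper that is not part of this diff:

```javascript
// Hypothetical helper (not in the diff): join a configurable base host
// with the host-relative paths from apiURL.
const apiURL = {
  ASR_OFFLINE: '/api/asr/offline',
  TTS_OFFLINE: '/api/tts/offline',
}

function resolveURL(base, path) {
  // Drop trailing slashes on the base so we never emit "//" in the path.
  return base.replace(/\/+$/, '') + path
}

console.log(resolveURL('http://localhost:8010/', apiURL.ASR_OFFLINE))
```

In practice the same effect is usually achieved with an axios `baseURL`, but the helper makes the joining rule explicit.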
diff --git a/demos/speech_web/web_client/src/api/ApiASR.js b/demos/speech_web/web_client/src/api/ApiASR.js
new file mode 100644
index 000000000..342c56164
--- /dev/null
+++ b/demos/speech_web/web_client/src/api/ApiASR.js
@@ -0,0 +1,30 @@
+import axios from 'axios'
+import {apiURL} from "./API.js"
+
+// Upload an audio file and get the recognition result
+export async function asrOffline(params){
+ const result = await axios.post(
+ apiURL.ASR_OFFLINE, params
+ )
+ return result
+}
+
+// Upload collected environment-noise audio
+export async function asrCollentEnv(params){
+ const result = await axios.post(
+ apiURL.ASR_COLLECT_ENV, params
+ )
+ return result
+}
+
+// Pause recording
+export async function asrStopRecord(){
+ const result = await axios.get(apiURL.ASR_STOP_RECORD);
+ return result
+}
+
+// Resume recording
+export async function asrResumeRecord(){
+ const result = await axios.get(apiURL.ASR_RESUME_RECORD);
+ return result
+}
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/api/ApiNLP.js b/demos/speech_web/web_client/src/api/ApiNLP.js
new file mode 100644
index 000000000..92259054a
--- /dev/null
+++ b/demos/speech_web/web_client/src/api/ApiNLP.js
@@ -0,0 +1,17 @@
+import axios from 'axios'
+import {apiURL} from "./API.js"
+
+// Get the chitchat reply
+export async function nlpChat(text){
+ const result = await axios.post(apiURL.NLP_CHAT, { chat : text});
+ return result
+}
+
+// Get the information-extraction result
+export async function nlpIE(text){
+ const result = await axios.post(apiURL.NLP_IE, { chat : text});
+ return result
+}
+
+
+
diff --git a/demos/speech_web/web_client/src/api/ApiTTS.js b/demos/speech_web/web_client/src/api/ApiTTS.js
new file mode 100644
index 000000000..1d23a4bd1
--- /dev/null
+++ b/demos/speech_web/web_client/src/api/ApiTTS.js
@@ -0,0 +1,8 @@
+import axios from 'axios'
+import {apiURL} from "./API.js"
+
+export async function ttsOffline(text){
+ const result = await axios.post(apiURL.TTS_OFFLINE, { text : text});
+ return result
+}
+
diff --git a/demos/speech_web/web_client/src/api/ApiVPR.js b/demos/speech_web/web_client/src/api/ApiVPR.js
new file mode 100644
index 000000000..e3ae2f5ec
--- /dev/null
+++ b/demos/speech_web/web_client/src/api/ApiVPR.js
@@ -0,0 +1,32 @@
+import axios from 'axios'
+import {apiURL} from "./API.js"
+
+// Enroll a voiceprint
+export async function vprEnroll(params){
+ const result = await axios.post(apiURL.VPR_ENROLL, params);
+ return result
+}
+
+// Voiceprint recognition
+export async function vprRecog(params){
+ const result = await axios.post(apiURL.VPR_RECOG, params);
+ return result
+}
+
+// Delete a voiceprint
+export async function vprDel(params){
+ const result = await axios.post(apiURL.VPR_DEL, params);
+ return result
+}
+
+// List enrolled voiceprints
+export async function vprList(){
+ const result = await axios.get(apiURL.VPR_LIST);
+ return result
+}
+
+// Fetch voiceprint audio
+export async function vprData(params){
+ const result = await axios.get(apiURL.VPR_DATA+params);
+ return result
+}
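`vprData` builds its request URL by appending the id directly to `VPR_DATA`, so reserved characters in the id would corrupt the query string. A hedged sketch of that concatenation with escaping added (`vprDataURL` is an illustrative helper, not part of the diff):

```javascript
// Sketch only: mirror the VPR_DATA concatenation, but escape the id.
const VPR_DATA = '/api/vpr/database64?vprId='

function vprDataURL(vprId) {
  // encodeURIComponent guards ids containing '&', '=', spaces, etc.
  return VPR_DATA + encodeURIComponent(vprId)
}
```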
diff --git a/demos/speech_web/web_client/src/assets/image/ic_大-上传文件.svg b/demos/speech_web/web_client/src/assets/image/ic_大-上传文件.svg
new file mode 100644
index 000000000..4c3c86403
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_大-上传文件.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_大-声音波浪.svg b/demos/speech_web/web_client/src/assets/image/ic_大-声音波浪.svg
new file mode 100644
index 000000000..dfbdc0e85
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_大-声音波浪.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_大-语音.svg b/demos/speech_web/web_client/src/assets/image/ic_大-语音.svg
new file mode 100644
index 000000000..54571a3e3
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_大-语音.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_小-录制语音.svg b/demos/speech_web/web_client/src/assets/image/ic_小-录制语音.svg
new file mode 100644
index 000000000..b61f7ac03
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_小-录制语音.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_小-结束.svg b/demos/speech_web/web_client/src/assets/image/ic_小-结束.svg
new file mode 100644
index 000000000..01a8dc65e
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_小-结束.svg
@@ -0,0 +1,3 @@
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_开始聊天.svg b/demos/speech_web/web_client/src/assets/image/ic_开始聊天.svg
new file mode 100644
index 000000000..073efd5e0
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_开始聊天.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_开始聊天_hover.svg b/demos/speech_web/web_client/src/assets/image/ic_开始聊天_hover.svg
new file mode 100644
index 000000000..824f974ab
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_开始聊天_hover.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_播放(按钮).svg b/demos/speech_web/web_client/src/assets/image/ic_播放(按钮).svg
new file mode 100644
index 000000000..4dc1461fd
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_播放(按钮).svg
@@ -0,0 +1,3 @@
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_暂停(按钮).svg b/demos/speech_web/web_client/src/assets/image/ic_暂停(按钮).svg
new file mode 100644
index 000000000..6ede8ea62
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_暂停(按钮).svg
@@ -0,0 +1,3 @@
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/ic_更换示例.svg b/demos/speech_web/web_client/src/assets/image/ic_更换示例.svg
new file mode 100644
index 000000000..d126775d3
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/ic_更换示例.svg
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/icon_小-声音波浪.svg b/demos/speech_web/web_client/src/assets/image/icon_小-声音波浪.svg
new file mode 100644
index 000000000..3dfed9be5
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/icon_小-声音波浪.svg
@@ -0,0 +1,6 @@
+
+
+
+
+
+
diff --git a/demos/speech_web/web_client/src/assets/image/icon_录制声音小语音1.svg b/demos/speech_web/web_client/src/assets/image/icon_录制声音小语音1.svg
new file mode 100644
index 000000000..4fe4f0f7d
--- /dev/null
+++ b/demos/speech_web/web_client/src/assets/image/icon_录制声音小语音1.svg
@@ -0,0 +1,14 @@
+
+
+ icon_录制声音(小语音)
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/assets/image/在线体验-背景@2x.png b/demos/speech_web/web_client/src/assets/image/在线体验-背景@2x.png
new file mode 100644
index 000000000..66627e1e6
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/在线体验-背景@2x.png differ
diff --git a/demos/speech_web/web_client/src/assets/image/场景齐全@3x.png b/demos/speech_web/web_client/src/assets/image/场景齐全@3x.png
new file mode 100644
index 000000000..b85427a1a
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/场景齐全@3x.png differ
diff --git a/demos/speech_web/web_client/src/assets/image/教程丰富@3x.png b/demos/speech_web/web_client/src/assets/image/教程丰富@3x.png
new file mode 100644
index 000000000..6edd64316
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/教程丰富@3x.png differ
diff --git a/demos/speech_web/web_client/src/assets/image/模型全面@3x.png b/demos/speech_web/web_client/src/assets/image/模型全面@3x.png
new file mode 100644
index 000000000..4d54eac05
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/模型全面@3x.png differ
diff --git a/demos/speech_web/web_client/src/assets/image/步骤-箭头切图@2x.png b/demos/speech_web/web_client/src/assets/image/步骤-箭头切图@2x.png
new file mode 100644
index 000000000..d0cedecce
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/步骤-箭头切图@2x.png differ
diff --git a/demos/speech_web/web_client/src/assets/image/用户头像@2x.png b/demos/speech_web/web_client/src/assets/image/用户头像@2x.png
new file mode 100644
index 000000000..2970d0070
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/用户头像@2x.png differ
diff --git a/demos/speech_web/web_client/src/assets/image/飞桨头像@2x.png b/demos/speech_web/web_client/src/assets/image/飞桨头像@2x.png
new file mode 100644
index 000000000..1712170ed
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/image/飞桨头像@2x.png differ
diff --git a/demos/speech_web/web_client/src/assets/logo.png b/demos/speech_web/web_client/src/assets/logo.png
new file mode 100644
index 000000000..f3d2503fc
Binary files /dev/null and b/demos/speech_web/web_client/src/assets/logo.png differ
diff --git a/demos/speech_web/web_client/src/components/Content/Header/Header.vue b/demos/speech_web/web_client/src/components/Content/Header/Header.vue
new file mode 100644
index 000000000..8135a2bff
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/Content/Header/Header.vue
@@ -0,0 +1,26 @@
+
+
+
+ 飞桨-PaddleSpeech
+
+
+ PaddleSpeech is an open-source speech model library based on PaddlePaddle, for developing a variety of key tasks in speech and audio. Stars are welcome and appreciated!
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/Content/Header/style.less b/demos/speech_web/web_client/src/components/Content/Header/style.less
new file mode 100644
index 000000000..9d0261378
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/Content/Header/style.less
@@ -0,0 +1,148 @@
+.speech_header {
+ width: 1200px;
+ margin: 0 auto;
+ padding-top: 50px;
+ // background: url("../../../assets/image/在线体验-背景@2x.png") no-repeat;
+ box-sizing: border-box;
+ &::after {
+ content: "";
+ display: block;
+ clear: both;
+ visibility: hidden;
+ }
+
+ ;
+
+ // background: pink;
+ .speech_header_title {
+ height: 57px;
+ font-family: PingFangSC-Medium;
+ font-size: 38px;
+ color: #000000;
+ letter-spacing: 0;
+ line-height: 57px;
+ font-weight: 500;
+ margin-bottom: 15px;
+ }
+
+ ;
+
+ .speech_header_describe {
+ height: 26px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #575757;
+ line-height: 26px;
+ font-weight: 400;
+ margin-bottom: 24px;
+ }
+
+ ;
+ .speech_header_link_box {
+ height: 40px;
+ margin-bottom: 40px;
+ display: flex;
+ align-items: center;
+ };
+ .speech_header_link {
+ display: block;
+ background: #2932E1;
+ width: 120px;
+ height: 40px;
+ line-height: 40px;
+ border-radius: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ text-align: center;
+ font-weight: 500;
+ margin-right: 20px;
+ // margin-bottom: 40px;
+
+ &:hover {
+ opacity: 0.9;
+ }
+
+ ;
+ }
+
+ ;
+
+ .speech_header_divider {
+ width: 1200px;
+ height: 1px;
+ background: #D1D1D1;
+ margin-bottom: 40px;
+ }
+
+ ;
+
+ .speech_header_content_wrapper {
+ width: 1200px;
+ margin: 0 auto;
+ // background: pink;
+ margin-bottom: 20px;
+ display: flex;
+ justify-content: space-between;
+ flex-wrap: wrap;
+
+ .speech_header_module {
+ width: 384px;
+ background: #FFFFFF;
+ border: 1px solid rgba(224, 224, 224, 1);
+ box-shadow: 4px 8px 12px 0px rgba(0, 0, 0, 0.05);
+ border-radius: 16px;
+ padding: 30px 34px 0px 34px;
+ box-sizing: border-box;
+ display: flex;
+ margin-bottom: 40px;
+ .speech_header_background_img {
+ width: 46px;
+ height: 46px;
+ background-size: 46px 46px;
+ background-repeat: no-repeat;
+ background-position: center;
+ margin-right: 20px;
+ }
+
+ ;
+
+ .speech_header_content {
+ padding-top: 4px;
+ margin-bottom: 32px;
+
+ .speech_header_module_title {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 20px;
+ color: #000000;
+ letter-spacing: 0;
+ line-height: 26px;
+ font-weight: 500;
+ margin-bottom: 10px;
+ }
+
+ ;
+
+ .speech_header_module_introduce {
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #666666;
+ letter-spacing: 0;
+ font-weight: 400;
+ }
+
+ ;
+ }
+
+ ;
+ }
+
+ ;
+ }
+
+ ;
+}
+
+;
+
diff --git a/demos/speech_web/web_client/src/components/Content/Tail/Tail.vue b/demos/speech_web/web_client/src/components/Content/Tail/Tail.vue
new file mode 100644
index 000000000..e69de29bb
diff --git a/demos/speech_web/web_client/src/components/Content/Tail/style.less b/demos/speech_web/web_client/src/components/Content/Tail/style.less
new file mode 100644
index 000000000..e69de29bb
diff --git a/demos/speech_web/web_client/src/components/Experience.vue b/demos/speech_web/web_client/src/components/Experience.vue
new file mode 100644
index 000000000..5620d6af9
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/Experience.vue
@@ -0,0 +1,50 @@
+
+
+
+
+
+
+ 功能体验
+
+
+ 体验前,请允许浏览器获取麦克风权限
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/ASR.vue b/demos/speech_web/web_client/src/components/SubMenu/ASR/ASR.vue
new file mode 100644
index 000000000..edef6a787
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/ASR.vue
@@ -0,0 +1,154 @@
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/ASRT.vue b/demos/speech_web/web_client/src/components/SubMenu/ASR/ASRT.vue
new file mode 100644
index 000000000..245fddb2c
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/ASRT.vue
@@ -0,0 +1,38 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/AudioFile/AudioFileIdentification.vue b/demos/speech_web/web_client/src/components/SubMenu/ASR/AudioFile/AudioFileIdentification.vue
new file mode 100644
index 000000000..4d3cf3c31
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/AudioFile/AudioFileIdentification.vue
@@ -0,0 +1,241 @@
+
+
+
+
+
+
+
+
+
+
+ 上传文件
+
+
+ 支持50秒内的.wav文件
+
+
+
+
+
+
+
+
+
+
+ {{ filename }}
+
+
+ 重新上传
+
+ 播放
+
+
+ 开始识别
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ 识别结果
+
+ {{ asrResult }}
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/AudioFile/style.less b/demos/speech_web/web_client/src/components/SubMenu/ASR/AudioFile/style.less
new file mode 100644
index 000000000..46b33272d
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/AudioFile/style.less
@@ -0,0 +1,293 @@
+.audioFileIdentification {
+ width: 1106px;
+ height: 270px;
+ // background-color: pink;
+ padding-top: 40px;
+ box-sizing: border-box;
+ display: flex;
+  // start upload
+ .public_recognition_speech {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+    // upload button
+ .upload_img {
+ width: 116px;
+ height: 116px;
+ background: #2932E1;
+ border-radius: 50%;
+ margin-left: 98px;
+ cursor: pointer;
+ margin-bottom: 20px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ .upload_img_back {
+ width: 34.38px;
+ height: 30.82px;
+ background: #2932E1;
+ background: url("../../../../assets/image/ic_大-上传文件.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 34.38px 30.82px;
+ cursor: pointer;
+ }
+ &:hover {
+ opacity: 0.9;
+ };
+
+ };
+
+
+ .speech_text {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 124px;
+ margin-bottom: 10px;
+ };
+ .speech_text_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #999999;
+ font-weight: 400;
+ margin-left: 84px;
+ };
+ };
+  // uploading
+ .on_the_cross_speech {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+
+ .on_the_upload_img {
+ width: 116px;
+ height: 116px;
+ background: #7278F5;
+ border-radius: 50%;
+ margin-left: 98px;
+ cursor: pointer;
+ margin-bottom: 20px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+
+ .on_the_upload_img_back {
+ width: 34.38px;
+ height: 30.82px;
+ background: #7278F5;
+ background: url("../../../../assets/image/ic_大-上传文件.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 34.38px 30.82px;
+ cursor: pointer;
+
+ };
+ };
+
+
+ .on_the_speech_text {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 124px;
+ margin-bottom: 10px;
+ display: flex;
+ // justify-content: center;
+ align-items: center;
+ .on_the_speech_loading {
+ display: inline-block;
+ width: 16px;
+ height: 16px;
+ background: #7278F5;
+ // background: url("../../../../assets/image/ic_开始聊天.svg");
+ // background-repeat: no-repeat;
+ // background-position: center;
+ // background-size: 16px 16px;
+ margin-right: 8px;
+ };
+ };
+ };
+
+  // start recognition
+ .public_recognition_speech_start {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+ position: relative;
+ .public_recognition_speech_content {
+ width: 100%;
+ position: absolute;
+ top: 40px;
+ left: 50%;
+ transform: translateX(-50%);
+ display: flex;
+ justify-content: center;
+ align-items: center;
+
+ .public_recognition_speech_title {
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #000000;
+ font-weight: 400;
+ };
+ .public_recognition_speech_again {
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #2932E1;
+ font-weight: 400;
+ margin-left: 30px;
+ cursor: pointer;
+ };
+ .public_recognition_speech_play {
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #2932E1;
+ font-weight: 400;
+ margin-left: 20px;
+ cursor: pointer;
+ };
+ };
+ .speech_promp {
+ position: absolute;
+ top: 112px;
+ left: 50%;
+ transform: translateX(-50%);
+ width: 142px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ text-align: center;
+ line-height: 44px;
+ font-weight: 500;
+ cursor: pointer;
+ };
+
+
+ };
+  // recognizing
+ .public_recognition_speech_identify {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+ position: relative;
+ .public_recognition_speech_identify_box {
+ width: 143px;
+ height: 44px;
+ background: #7278F5;
+ border-radius: 22px;
+ position: absolute;
+ top: 50%;
+ left: 50%;
+ transform: translate(-50%,-50%);
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+ .public_recognition_speech_identify_back_img {
+ width: 16px;
+ height: 16px;
+ // background: #7278F5;
+ // background: url("../../../../assets/image/ic_开始聊天.svg");
+ // background-repeat: no-repeat;
+ // background-position: center;
+ // background-size: 16px 16px;
+ };
+ .public_recognition__identify_the_promp {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+ };
+
+
+
+ };
+  // recognize again
+ .public_recognition_speech_identify_ahain {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+ position: relative;
+ cursor: pointer;
+ .public_recognition_speech_identify_box_btn {
+ width: 143px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ position: absolute;
+ top: 50%;
+ left: 50%;
+ transform: translate(-50%,-50%);
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+ .public_recognition__identify_the_btn {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ };
+ };
+
+
+
+ };
+  // arrow pointer
+ .public_recognition_point_to {
+ width: 47px;
+ height: 67px;
+ background: url("../../../../assets/image/步骤-箭头切图@2x.png") no-repeat;
+ background-position: center;
+ background-size: 47px 67px;
+ margin-top: 91px;
+ margin-right: 67px;
+ };
+  // recognition result
+ .public_recognition_result {
+ width: 680px;
+ height: 230px;
+ background: #FAFAFA;
+ padding: 40px 50px 0px 50px;
+ div {
+ &:nth-of-type(1) {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #666666;
+ line-height: 26px;
+ font-weight: 500;
+ margin-bottom: 20px;
+ };
+ &:nth-of-type(2) {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #666666;
+ line-height: 26px;
+ font-weight: 500;
+ };
+ };
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/EndToEnd/EndToEndIdentification.vue b/demos/speech_web/web_client/src/components/SubMenu/ASR/EndToEnd/EndToEndIdentification.vue
new file mode 100644
index 000000000..651e8c725
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/EndToEnd/EndToEndIdentification.vue
@@ -0,0 +1,92 @@
+
+
+
+
+
+
+
+
+
+ 停止录音后得到识别结果
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/EndToEnd/style.less b/demos/speech_web/web_client/src/components/SubMenu/ASR/EndToEnd/style.less
new file mode 100644
index 000000000..1fc04b2c7
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/EndToEnd/style.less
@@ -0,0 +1,114 @@
+.endToEndIdentification {
+ width: 1106px;
+ height: 270px;
+ // background-color: pink;
+ padding-top: 40px;
+ box-sizing: border-box;
+ display: flex;
+  // start recognition
+ .public_recognition_speech {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+
+ .endToEndIdentification_start_recorder_img {
+ width: 116px;
+ height: 116px;
+ background: #2932E1;
+ background: url("../../../../assets/image/ic_开始聊天.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 116px 116px;
+ margin-left: 98px;
+ cursor: pointer;
+ margin-bottom: 20px;
+ &:hover {
+ background: url("../../../../assets/image/ic_开始聊天_hover.svg");
+
+ };
+
+ };
+
+ .endToEndIdentification_end_recorder_img {
+ width: 116px;
+ height: 116px;
+ background: #2932E1;
+ border-radius: 50%;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ margin-left: 98px;
+ margin-bottom: 20px;
+ cursor: pointer;
+ .endToEndIdentification_end_recorder_img_back {
+ width: 50px;
+ height: 50px;
+ background: url("../../../../assets/image/ic_大-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 50px 50px;
+
+ &:hover {
+ opacity: 0.9;
+
+ };
+ };
+
+ };
+ .endToEndIdentification_prompt {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 124px;
+ margin-bottom: 10px;
+ };
+ .speech_text_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #999999;
+ font-weight: 400;
+ margin-left: 90px;
+ };
+ };
+  // arrow pointer
+ .public_recognition_point_to {
+ width: 47px;
+ height: 67px;
+ background: url("../../../../assets/image/步骤-箭头切图@2x.png") no-repeat;
+ background-position: center;
+ background-size: 47px 67px;
+ margin-top: 91px;
+ margin-right: 67px;
+ };
+  // recognition result
+ .public_recognition_result {
+ width: 680px;
+ height: 230px;
+ background: #FAFAFA;
+ padding: 40px 50px 0px 50px;
+ div {
+ &:nth-of-type(1) {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #666666;
+ line-height: 26px;
+ font-weight: 500;
+ margin-bottom: 20px;
+ };
+ &:nth-of-type(2) {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #666666;
+ line-height: 26px;
+ font-weight: 500;
+ };
+ };
+ };
+
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/RealTime/RealTime.vue b/demos/speech_web/web_client/src/components/SubMenu/ASR/RealTime/RealTime.vue
new file mode 100644
index 000000000..761a5c11f
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/RealTime/RealTime.vue
@@ -0,0 +1,128 @@
+
+
+
+
+
+
+
+
+
+
+ 实时得到识别结果
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/RealTime/style.less b/demos/speech_web/web_client/src/components/SubMenu/ASR/RealTime/style.less
new file mode 100644
index 000000000..baa89c570
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/RealTime/style.less
@@ -0,0 +1,112 @@
+.realTime{
+ width: 1106px;
+ height: 270px;
+ // background-color: pink;
+ padding-top: 40px;
+ box-sizing: border-box;
+ display: flex;
+  // start recognition
+ .public_recognition_speech {
+ width: 295px;
+ height: 230px;
+ padding-top: 32px;
+ box-sizing: border-box;
+ .endToEndIdentification_start_recorder_img {
+ width: 116px;
+ height: 116px;
+ background: #2932E1;
+ background: url("../../../../assets/image/ic_开始聊天.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 116px 116px;
+ margin-left: 98px;
+ cursor: pointer;
+ margin-bottom: 20px;
+ &:hover {
+ background: url("../../../../assets/image/ic_开始聊天_hover.svg");
+
+ };
+
+ };
+
+ .endToEndIdentification_end_recorder_img {
+ width: 116px;
+ height: 116px;
+ background: #2932E1;
+ border-radius: 50%;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ margin-left: 98px;
+ margin-bottom: 20px;
+ cursor: pointer;
+ .endToEndIdentification_end_recorder_img_back {
+ width: 50px;
+ height: 50px;
+ background: url("../../../../assets/image/ic_大-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 50px 50px;
+
+ &:hover {
+ opacity: 0.9;
+
+ };
+ };
+
+ };
+ .endToEndIdentification_prompt {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 124px;
+ margin-bottom: 10px;
+ };
+ .speech_text_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #999999;
+ font-weight: 400;
+ margin-left: 105px;
+ };
+ };
+  // arrow pointer
+ .public_recognition_point_to {
+ width: 47px;
+ height: 67px;
+ background: url("../../../../assets/image/步骤-箭头切图@2x.png") no-repeat;
+ background-position: center;
+ background-size: 47px 67px;
+ margin-top: 91px;
+ margin-right: 67px;
+ };
+  // recognition result
+ .public_recognition_result {
+ width: 680px;
+ height: 230px;
+ background: #FAFAFA;
+ padding: 40px 50px 0px 50px;
+ div {
+ &:nth-of-type(1) {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #666666;
+ line-height: 26px;
+ font-weight: 500;
+ margin-bottom: 20px;
+ };
+ &:nth-of-type(2) {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #666666;
+ line-height: 26px;
+ font-weight: 500;
+ };
+ };
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ASR/style.less b/demos/speech_web/web_client/src/components/SubMenu/ASR/style.less
new file mode 100644
index 000000000..92ce9340b
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ASR/style.less
@@ -0,0 +1,76 @@
+.speech_recognition {
+ width: 1200px;
+ height: 410px;
+ background: #FFFFFF;
+ padding: 40px 50px 50px 44px;
+ position: relative;
+ .frame {
+ width: 605px;
+ height: 50px;
+ border: 1px solid rgba(238,238,238,1);
+ border-radius: 25px;
+ position: absolute;
+ }
+ .speech_recognition_mytabs {
+ .ant-tabs-tab {
+ position: relative;
+ display: inline-flex;
+ align-items: center;
+ // padding: 12px 0;
+ font-size: 14px;
+ background: transparent;
+ border: 0;
+ outline: none;
+ cursor: pointer;
+ padding: 12px 26px;
+ box-sizing: border-box;
+ }
+ .ant-tabs-tab-active {
+ height: 50px;
+ background: #EEEFFD;
+ border-radius: 25px;
+ padding: 12px 26px;
+ box-sizing: border-box;
+ };
+ .speech_recognition .speech_recognition_mytabs .ant-tabs-ink-bar {
+ position: absolute;
+ background: transparent !important;
+ pointer-events: none;
+ }
+ .ant-tabs-ink-bar {
+ position: absolute;
+ background: transparent !important;
+ pointer-events: none;
+ }
+ .experience .experience_wrapper .experience_content .experience_tabs .ant-tabs-nav::before {
+ position: absolute;
+ right: 0;
+ left: 0;
+ border-bottom: 1px solid transparent !important;
+ // border: none;
+ content: '';
+ }
+ .ant-tabs-top > .ant-tabs-nav::before, .ant-tabs-bottom > .ant-tabs-nav::before, .ant-tabs-top > div > .ant-tabs-nav::before, .ant-tabs-bottom > div > .ant-tabs-nav::before {
+ position: absolute;
+ right: 0;
+ left: 0;
+ border-bottom: 1px solid transparent !important;
+ // border: none;
+ content: '';
+ }
+ .ant-tabs-nav::before {
+ position: absolute;
+ right: 0;
+ left: 0;
+ border-bottom: 1px solid transparent !important;
+ content: '';
+ };
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ChatBot/Chat.vue b/demos/speech_web/web_client/src/components/SubMenu/ChatBot/Chat.vue
new file mode 100644
index 000000000..9d356fc80
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ChatBot/Chat.vue
@@ -0,0 +1,298 @@
+
+
+
+ 语音聊天
+
+ {{ recoText }}
+
+ {{ envText }}
+
+ 清空聊天
+
+
+
+
+
+ {{ Result }}
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ChatBot/ChatT.vue b/demos/speech_web/web_client/src/components/SubMenu/ChatBot/ChatT.vue
new file mode 100644
index 000000000..c37c083ff
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ChatBot/ChatT.vue
@@ -0,0 +1,255 @@
+
+
+
+
+
+
+ 点击开始聊天
+
+ 聊天前请允许浏览器获取麦克风权限
+
+
+
+
+
+
+
+
+ {{ nlpResult }}
+
+
+
+
+
+ {{ asrResult }}
+
+
+
+
+
+
+
+ 结束聊天
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/ChatBot/style.less b/demos/speech_web/web_client/src/components/SubMenu/ChatBot/style.less
new file mode 100644
index 000000000..d868fd470
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/ChatBot/style.less
@@ -0,0 +1,181 @@
+.voice_chat {
+ width: 1200px;
+ height: 410px;
+ background: #FFFFFF;
+ position: relative;
+  // start chat
+ .voice_chat_wrapper {
+ top: 50%;
+ left: 50%;
+ transform: translate(-50%,-50%);
+ position: absolute;
+ .voice_chat_btn {
+ width: 116px;
+ height: 116px;
+ margin-left: 54px;
+ // background: #2932E1;
+ border-radius: 50%;
+ cursor: pointer;
+ background: url("../../../assets/image/ic_开始聊天.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 116px 116px;
+ margin-bottom: 17px;
+ &:hover {
+ width: 116px;
+ height: 116px;
+ background: url("../../../assets/image/ic_开始聊天_hover.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 116px 116px;
+ };
+
+ };
+ .voice_chat_btn_title {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ letter-spacing: 0;
+ text-align: center;
+ line-height: 22px;
+ font-weight: 500;
+ margin-bottom: 10px;
+ };
+ .voice_chat_btn_prompt {
+ height: 24px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #999999;
+ letter-spacing: 0;
+ text-align: center;
+ line-height: 24px;
+ font-weight: 400;
+ };
+ };
+ .voice_chat_wrapper::after {
+ content: "";
+ display: block;
+ clear: both;
+ visibility: hidden;
+ };
+  // end chat
+ .voice_chat_dialog_wrapper {
+ width: 1200px;
+ height: 410px;
+ background: #FFFFFF;
+ position: relative;
+ .dialog_box {
+ width: 100%;
+ height: 410px;
+ padding: 50px 198px 82px 199px;
+ box-sizing: border-box;
+
+ .dialog_content {
+ width: 100%;
+ height: 268px;
+ // background: rgb(113, 144, 145);
+ padding: 0px;
+ overflow: auto;
+ li {
+ list-style-type: none;
+ margin-bottom: 33px;
+ display: flex;
+ align-items: center;
+        &:last-of-type {
+ margin-bottom: 0px;
+ };
+ .dialog_content_img_pp {
+ width: 60px;
+ height: 60px;
+ // transform: scaleX(-1);
+ background: url("../../../assets/image/飞桨头像@2x.png");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 60px 60px;
+ margin-right: 20px;
+ };
+ .dialog_content_img_user {
+ width: 60px;
+ height: 60px;
+ transform: scaleX(-1);
+ background: url("../../../assets/image/用户头像@2x.png");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 60px 60px;
+ margin-left: 20px;
+ };
+ .dialog_content_dialogue_pp {
+ height: 50px;
+ background: #F5F5F5;
+ border-radius: 25px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #000000;
+ line-height: 50px;
+ font-weight: 400;
+ padding: 0px 16px;
+ box-sizing: border-box;
+ };
+ .dialog_content_dialogue_user {
+ height: 50px;
+ background: rgba(41,50,225,0.90);
+ border-radius: 25px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #FFFFFF;
+ line-height: 50px;
+ font-weight: 400;
+ padding: 0px 16px;
+ box-sizing: border-box;
+ };
+ };
+ };
+ .move_dialogue {
+ justify-content: flex-end;
+ };
+
+ };
+
+ .btn_end_dialog {
+ width: 124px;
+ height: 42px;
+ line-height: 42px;
+ background: #FFFFFF;
+ box-shadow: 0px 4px 16px 0px rgba(0,0,0,0.09);
+ border-radius: 21px;
+ padding: 0px 24px;
+ box-sizing: border-box;
+ position: absolute;
+ left: 50%;
+ bottom: 40px;
+ transform: translateX(-50%);
+ display: flex;
+ justify-content: space-between;
+ align-items: center;
+ cursor: pointer;
+ span {
+ display: inline-block;
+ &:nth-of-type(1) {
+ width: 16px;
+ height: 16px;
+ background: url("../../../assets/image/ic_小-结束.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 16px 16px;
+
+ };
+ &:nth-of-type(2) {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #F33E3E;
+ text-align: center;
+ font-weight: 400;
+ line-height: 20px;
+ margin-left: 4px;
+ };
+ };
+ };
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/IE/IE.vue b/demos/speech_web/web_client/src/components/SubMenu/IE/IE.vue
new file mode 100644
index 000000000..c7dd04e9d
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/IE/IE.vue
@@ -0,0 +1,125 @@
+
+
+
+ 信息抽取体验
+ {{ recoText }}
+ 识别结果: {{ asrResultOffline }}
+ 时间:{{ time }}
+ 出发地:{{ outset }}
+ 目的地:{{ destination }}
+ 费用:{{ amount }}
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/IE/IET.vue b/demos/speech_web/web_client/src/components/SubMenu/IE/IET.vue
new file mode 100644
index 000000000..50eadec70
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/IE/IET.vue
@@ -0,0 +1,166 @@
+
+
+
+
+ 交通费报销
+
+
+
+
+
+ 试试说“早上八点,我从广州到北京花了四百二十六元”
+
+
+
+
+
+
+
+
+
+ 识别结果
+
+
+
+
+ {{ asrResult }}
+
+
+ 时间:{{voiceCommandsData.time}}
+
+
+ 费用:{{voiceCommandsData.amount}}
+
+
+ 出发地:{{voiceCommandsData.outset}}
+
+
+ 目的地:{{voiceCommandsData.destination}}
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/IE/style.less b/demos/speech_web/web_client/src/components/SubMenu/IE/style.less
new file mode 100644
index 000000000..988666a26
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/IE/style.less
@@ -0,0 +1,179 @@
+.voice_commands {
+ width: 1200px;
+ height: 410px;
+ background: #FFFFFF;
+ padding: 40px 50px 50px 50px;
+ box-sizing: border-box;
+ display: flex;
+  // travel expense reimbursement
+ .voice_commands_traffic {
+ width: 468px;
+ height: 320px;
+ .voice_commands_traffic_title {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ letter-spacing: 0;
+ line-height: 26px;
+ font-weight: 500;
+ margin-bottom: 30px;
+ // background: pink;
+ };
+ .voice_commands_traffic_wrapper {
+ width: 465px;
+ height: 264px;
+ // background: #FAFAFA;
+ position: relative;
+ .voice_commands_traffic_wrapper_move {
+ position: absolute;
+ top: 50%;
+ left: 50%;
+ transform: translate(-50%,-50%);
+ };
+ .traffic_btn_img_btn {
+ width: 116px;
+ height: 116px;
+ background: #2932E1;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ border-radius: 50%;
+ cursor: pointer;
+ margin-bottom: 20px;
+ margin-left: 84px;
+ &:hover {
+ width: 116px;
+ height: 116px;
+ background: #7278F5;
+
+ .start_recorder_img{
+ width: 50px;
+ height: 50px;
+ background: url("../../../assets/image/ic_开始聊天_hover.svg") no-repeat;
+ background-position: center;
+ background-size: 50px 50px;
+ };
+
+ };
+
+ .start_recorder_img{
+ width: 50px;
+ height: 50px;
+ background: url("../../../assets/image/ic_开始聊天.svg") no-repeat;
+ background-position: center;
+ background-size: 50px 50px;
+ };
+
+ };
+ .traffic_btn_prompt {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ font-weight: 500;
+ margin-bottom: 16px;
+ margin-left: 110px;
+ };
+ .traffic_btn_list {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #999999;
+ font-weight: 400;
+ width: 112%;
+ };
+ };
+ };
+  // arrow pointer
+ .voice_point_to {
+ width: 47px;
+ height: 63px;
+ background: url("../../../assets/image/步骤-箭头切图@2x.png") no-repeat;
+ background-position: center;
+ background-size: 47px 63px;
+ margin-top: 164px;
+ margin-right: 82px;
+ };
+  // recognition result
+ .voice_commands_IdentifyTheResults {
+ .voice_commands_IdentifyTheResults_title {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ line-height: 26px;
+ font-weight: 500;
+ margin-bottom: 30px;
+ };
+    // result display box
+ .voice_commands_IdentifyTheResults_show {
+ width: 503px;
+ height: 264px;
+ background: #FAFAFA;
+ padding: 40px 0px 0px 50px;
+ box-sizing: border-box;
+ .voice_commands_IdentifyTheResults_show_title {
+ height: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ // text-align: center;
+ font-weight: 500;
+ margin-bottom: 30px;
+ };
+ .oice_commands_IdentifyTheResults_show_time {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #666666;
+ font-weight: 500;
+ margin-bottom: 12px;
+ };
+ .oice_commands_IdentifyTheResults_show_money {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #666666;
+ font-weight: 500;
+ margin-bottom: 12px;
+ };
+ .oice_commands_IdentifyTheResults_show_origin {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #666666;
+ font-weight: 500;
+ margin-bottom: 12px;
+ };
+ .oice_commands_IdentifyTheResults_show_destination {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #666666;
+ font-weight: 500;
+ };
+ };
+    // loading state
+ .voice_commands_IdentifyTheResults_show_loading {
+ width: 503px;
+ height: 264px;
+ background: #FAFAFA;
+ padding: 40px 0px 0px 50px;
+ box-sizing: border-box;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ };
+ };
+ .end_recorder_img {
+ width: 50px;
+ height: 50px;
+ background: url("../../../assets/image/ic_大-声音波浪.svg") no-repeat;
+ background-position: center;
+ background-size: 50px 50px;
+ };
+ .end_recorder_img:hover {
+ opacity: 0.9;
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/TTS/TTST.vue b/demos/speech_web/web_client/src/components/SubMenu/TTS/TTST.vue
new file mode 100644
index 000000000..353221f7b
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/TTS/TTST.vue
@@ -0,0 +1,359 @@
+
+
+
+
+
+
+
+
+
+ 语音合成
+
+
+
+ 流式合成
+
+
+
+
+
+
+
+
+ 响应时间:{{ Number(streamingAcceptStamp) - Number(streamingSendStamp) }}ms
+
+
+
+
+
+
+
+
+
+
+
+ 端到端合成
+
+
+
+
+
+
+
+
+ 响应时间:{{ Number(endToEndAcceptStamp) - Number(endToEndSendStamp) }}ms
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/TTS/style.less b/demos/speech_web/web_client/src/components/SubMenu/TTS/style.less
new file mode 100644
index 000000000..b5d189650
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/TTS/style.less
@@ -0,0 +1,369 @@
+.speech_recognition {
+ width: 1200px;
+ height: 410px;
+ background: #FFFFFF;
+ padding: 40px 0px 50px 50px;
+ box-sizing: border-box;
+ display: flex;
+ .recognition_text {
+ width: 589px;
+ height: 320px;
+ // background: pink;
+ .recognition_text_header {
+ margin-bottom: 30px;
+ display: flex;
+ justify-content: space-between;
+ align-items: center;
+ .recognition_text_title {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ letter-spacing: 0;
+ line-height: 26px;
+ font-weight: 500;
+ };
+ .recognition_text_random {
+ display: flex;
+ align-items: center;
+ cursor: pointer;
+ span {
+ display: inline-block;
+ &:nth-of-type(1) {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/ic_更换示例.svg") no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 5px;
+
+ };
+ &:nth-of-type(2) {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #2932E1;
+ letter-spacing: 0;
+ font-weight: 400;
+ };
+ };
+ };
+ };
+ .recognition_text_field {
+ width: 589px;
+ height: 264px;
+ background: #FAFAFA;
+ .textToSpeech_content_show_text{
+ width: 100%;
+ height: 264px;
+ padding: 0px 30px 30px 0px;
+ box-sizing: border-box;
+ .ant-input {
+ height: 208px;
+ resize: none;
+ // margin-bottom: 230px;
+ padding: 21px 20px;
+ };
+ };
+ };
+ };
+  // arrow pointer
+ .recognition_point_to {
+ width: 47px;
+ height: 63px;
+ background: url("../../../assets/image/步骤-箭头切图@2x.png") no-repeat;
+ background-position: center;
+ background-size: 47px 63px;
+    margin-top: 164px;
+    margin-right: 101px;
+    margin-left: 100px;
+ };
+  // speech synthesis
+ .speech_recognition_new {
+ .speech_recognition_title {
+ height: 26px;
+ font-family: PingFangSC-Medium;
+ font-size: 16px;
+ color: #000000;
+ line-height: 26px;
+ font-weight: 500;
+ margin-left: 32px;
+ margin-bottom: 96px;
+ };
+    // streaming synthesis
+ .speech_recognition_streaming {
+ width: 136px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ text-align: center;
+ line-height: 44px;
+ margin-bottom: 40px;
+ cursor: pointer;
+ &:hover {
+ opacity: .9;
+ };
+ };
+    // synthesizing
+ .streaming_ing_box {
+ display: flex;
+ align-items: center;
+ height: 44px;
+ margin-bottom: 40px;
+ .streaming_ing {
+ width: 136px;
+ height: 44px;
+ background: #7278F5;
+ border-radius: 22px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+
+ .streaming_ing_img {
+ width: 16px;
+ height: 16px;
+ // background: url("../../../assets/image/ic_小-录制语音.svg");
+ // background-repeat: no-repeat;
+ // background-position: center;
+ // background-size: 16px 16px;
+ // margin-right: 12px;
+ };
+ .streaming_ing_text {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+ };
+      // synthesis time label
+ .streaming_time {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+ };
+
+
+    // pause playback
+ .streaming_suspended_box {
+ display: flex;
+ align-items: center;
+ height: 44px;
+ margin-bottom: 40px;
+ .streaming_suspended {
+ width: 136px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+
+ .streaming_suspended_img {
+ width: 16px;
+ height: 16px;
+ background: url("../../../assets/image/ic_暂停(按钮).svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 16px 16px;
+ margin-right: 12px;
+ };
+ .streaming_suspended_text {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+
+ };
+      // pause time label
+ .suspended_time {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 12px;
+ }
+ };
+
+    // resume playback
+ .streaming_continue {
+ width: 136px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+ margin-bottom: 40px;
+ .streaming_continue_img {
+ width: 16px;
+ height: 16px;
+ background: url("../../../assets/image/ic_播放(按钮).svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 16px 16px;
+ margin-right: 12px;
+ };
+ .streaming_continue_text {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ };
+ };
+
+
+
+
+
+
+    // end-to-end synthesis
+ .speech_recognition_end_to_end {
+ width: 136px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ text-align: center;
+ line-height: 44px;
+ cursor: pointer;
+ &:hover {
+ opacity: .9;
+ };
+ };
+    // synthesizing
+ .end_to_end_ing_box {
+ display: flex;
+ align-items: center;
+ height: 44px;
+ .end_to_end_ing {
+ width: 136px;
+ height: 44px;
+ background: #7278F5;
+ border-radius: 22px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+ .end_to_end_ing_img {
+ width: 16px;
+ height: 16px;
+ // background: url("../../../assets/image/ic_小-录制语音.svg");
+ // background-repeat: no-repeat;
+ // background-position: center;
+ // background-size: 16px 16px;
+
+ };
+ .end_to_end_ing_text {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+ };
+ // synthesis-time text
+ .end_to_end_ing_time {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+ };
+
+
+ // pause playback
+ .end_to_end_suspended_box {
+ display: flex;
+ align-items: center;
+ height: 44px;
+ .end_to_end_suspended {
+ width: 136px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+ .end_to_end_suspended_img {
+ width: 16px;
+ height: 16px;
+ background: url("../../../assets/image/ic_暂停(按钮).svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 16px 16px;
+ margin-right: 12px;
+ };
+ .end_to_end_suspended_text {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ };
+ };
+ // paused playback time
+ .end_to_end_ing_suspended_time {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #000000;
+ font-weight: 500;
+ margin-left: 12px;
+ };
+ };
+
+ // resume playback
+ .end_to_end_continue {
+ width: 136px;
+ height: 44px;
+ background: #2932E1;
+ border-radius: 22px;
+ display: flex;
+ justify-content: center;
+ align-items: center;
+ cursor: pointer;
+ .end_to_end_continue_img {
+ width: 16px;
+ height: 16px;
+ background: url("../../../assets/image/ic_播放(按钮).svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 16px 16px;
+ margin-right: 12px;
+ };
+ .end_to_end_continue_text {
+ height: 20px;
+ font-family: PingFangSC-Medium;
+ font-size: 14px;
+ color: #FFFFFF;
+ font-weight: 500;
+ };
+ };
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/VPR/VPR.vue b/demos/speech_web/web_client/src/components/SubMenu/VPR/VPR.vue
new file mode 100644
index 000000000..1fe71e4d8
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/VPR/VPR.vue
@@ -0,0 +1,178 @@
+ <!-- Vue template markup lost in extraction; recoverable UI strings: "声纹识别展示" (voiceprint recognition demo), "注册" (Enroll), "识别" (Identify), "声纹得分结果" (voiceprint score result), "声纹数据列表" (voiceprint data list), "Delete" -->
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/VPR/VPRT.vue b/demos/speech_web/web_client/src/components/SubMenu/VPR/VPRT.vue
new file mode 100644
index 000000000..e398da00c
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/VPR/VPRT.vue
@@ -0,0 +1,335 @@
+ <!-- Vue template markup lost in extraction; recoverable UI strings: "试试对我说:欢迎使用飞桨声纹识别系统" (Try saying to me: "Welcome to the PaddleSpeech voiceprint recognition system"), "播放" (Play), "删除" (Delete), "试试对我说:请识别一下我的声音" (Try saying to me: "Please identify my voice"), "识别结果" (recognition result), {{scoreResult}} -->
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/SubMenu/VPR/style.less b/demos/speech_web/web_client/src/components/SubMenu/VPR/style.less
new file mode 100644
index 000000000..cb3df49ef
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/SubMenu/VPR/style.less
@@ -0,0 +1,419 @@
+.voiceprint {
+ width: 1200px;
+ height: 410px;
+ background: #FFFFFF;
+ padding: 41px 80px 56px 80px;
+ box-sizing: border-box;
+ display: flex;
+ // record voiceprint
+ .voiceprint_recording {
+ width: 423px;
+ height: 354px;
+ margin-right: 66px;
+ .recording_title {
+ display: flex;
+ align-items: center;
+ margin-bottom: 20px;
+ div {
+ &:nth-of-type(1) {
+ width: 24px;
+ height: 24px;
+ background: rgba(41,50,225,0.70);
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #FFFFFF;
+ letter-spacing: 0;
+ text-align: center;
+ line-height: 24px;
+ font-weight: 400;
+ margin-right: 16px;
+ border-radius: 50%;
+ };
+ &:nth-of-type(2) {
+ height: 26px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #000000;
+ line-height: 26px;
+ font-weight: 400;
+ };
+ };
+ };
+ // start recording
+ .recording_btn {
+ width: 143px;
+ height: 44px;
+ cursor: pointer;
+ background: #2932E1;
+ padding: 0px 24px 0px 21px;
+ box-sizing: border-box;
+ border-radius: 22px;
+ display: flex;
+ align-items: center;
+ margin-bottom: 20px;
+ margin-top: 10px;
+
+ &:hover {
+ background: #7278F5;
+ .recording_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_录制声音小语音1.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ }
+ .recording_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_录制声音小语音1.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ .recording_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #FFFFFF;
+ font-weight: 400;
+ };
+
+ };
+ // recording in progress
+ .recording_btn_the_recording {
+ width: 143px;
+ height: 44px;
+ cursor: pointer;
+ background: #7278F5;
+ padding: 0px 24px 0px 21px;
+ box-sizing: border-box;
+ border-radius: 22px;
+ display: flex;
+ align-items: center;
+ justify-content: center;
+ margin-bottom: 40px;
+ .recording_img_the_recording {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_小-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+ };
+ .recording_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #FFFFFF;
+ font-weight: 400;
+ };
+ };
+ // finish recording
+ .complete_the_recording_btn {
+ width: 143px;
+ height: 44px;
+ cursor: pointer;
+ background: #2932E1;
+ padding: 0px 24px 0px 21px;
+ box-sizing: border-box;
+ border-radius: 22px;
+ display: flex;
+ align-items: center;
+ margin-bottom: 40px;
+ &:hover {
+ background: #7278F5;
+ .complete_the_recording_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_小-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ }
+ .complete_the_recording_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_小-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ .complete_the_recording_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #FFFFFF;
+ font-weight: 400;
+ };
+
+ };
+ // table
+ .recording_table {
+ width: 322px;
+ .recording_table_box {
+ .ant-table-thead > tr > th {
+ color: rgba(0, 0, 0, 0.85);
+ font-weight: 500;
+ text-align: left;
+ background: rgba(40,50,225,0.08);
+ border-bottom: none;
+ transition: background 0.3s ease;
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #333333;
+ // text-align: center;
+ font-weight: 400;
+ &:nth-of-type(2) {
+ border-left: 2px solid white;
+ };
+ };
+ .ant-table-tbody > tr > td {
+ border-bottom: 1px solid #f0f0f0;
+ transition: background 0.3s;
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #333333;
+ // text-align: center;
+ font-weight: 400;
+ };
+ };
+ };
+ // input
+ .recording_input {
+ width: 322px;
+ margin-bottom: 20px;
+ };
+ };
+ // pointer arrow
+ .recording_point_to {
+ width: 63px;
+ height: 47px;
+ background: url("../../../assets/image/步骤-箭头切图@2x.png");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 63px 47px;
+ margin-right: 66px;
+ margin-top: 198px;
+ };
+ // identify voiceprint
+ .voiceprint_identify {
+ width: 423px;
+ height: 354px;
+ .identify_title {
+ display: flex;
+ align-items: center;
+ margin-bottom: 20px;
+ div {
+ &:nth-of-type(1) {
+ width: 24px;
+ height: 24px;
+ background: rgba(41,50,225,0.70);
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #FFFFFF;
+ letter-spacing: 0;
+ text-align: center;
+ line-height: 24px;
+ font-weight: 400;
+ margin-right: 16px;
+ border-radius: 50%;
+ };
+ &:nth-of-type(2) {
+ height: 26px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #000000;
+ line-height: 26px;
+ font-weight: 400;
+ };
+ };
+ };
+ // start identification
+ .identify_btn {
+ width: 143px;
+ height: 44px;
+ cursor: pointer;
+ background: #2932E1;
+ padding: 0px 24px 0px 21px;
+ box-sizing: border-box;
+ border-radius: 22px;
+ display: flex;
+ align-items: center;
+ margin-bottom: 40px;
+ margin-top: 10px;
+ &:hover {
+ background: #7278F5;
+ .identify_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_录制声音小语音1.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ }
+ .identify_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_录制声音小语音1.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ .identify_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #FFFFFF;
+ font-weight: 400;
+ };
+
+ };
+ // identifying
+ .identify_btn_the_recording {
+ width: 143px;
+ height: 44px;
+ cursor: pointer;
+ background: #7278F5;
+ padding: 0px 24px 0px 21px;
+ box-sizing: border-box;
+ border-radius: 22px;
+ display: flex;
+ align-items: center;
+ justify-content: center;
+ margin-bottom: 40px;
+ .identify_img_the_recording {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_录制声音小语音1.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+ };
+ .recording_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #FFFFFF;
+ font-weight: 400;
+ };
+ };
+ // finish identification
+ .identify_complete_the_recording_btn {
+ width: 143px;
+ height: 44px;
+ cursor: pointer;
+ background: #2932E1;
+ padding: 0px 24px 0px 21px;
+ box-sizing: border-box;
+ border-radius: 22px;
+ display: flex;
+ align-items: center;
+ margin-bottom: 40px;
+ &:hover {
+ background: #7278F5;
+ .identify_complete_the_recording_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_小-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ }
+ .identify_complete_the_recording_img {
+ width: 20px;
+ height: 20px;
+ background: url("../../../assets/image/icon_小-声音波浪.svg");
+ background-repeat: no-repeat;
+ background-position: center;
+ background-size: 20px 20px;
+ margin-right: 8.26px;
+
+ };
+ .identify_complete_the_recording_prompt {
+ height: 20px;
+ font-family: PingFangSC-Regular;
+ font-size: 12px;
+ color: #FFFFFF;
+ font-weight: 400;
+ };
+
+ };
+
+
+
+
+ // result
+ .identify_result {
+ width: 422px;
+ height: 184px;
+ text-align: center;
+ line-height: 184px;
+ background: #FAFAFA;
+ position: relative;
+ .identify_result_default {
+
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #999999;
+ font-weight: 400;
+ };
+ .identify_result_content {
+ // text-align: center;
+ // position: absolute;
+ // top: 50%;
+ // left: 50%;
+ // transform: translate(-50%,-50%);
+ div {
+ &:nth-of-type(1) {
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #666666;
+ font-weight: 400;
+ margin-bottom: 10px;
+ };
+ &:nth-of-type(2) {
+ height: 33px;
+ font-family: PingFangSC-Medium;
+ font-size: 24px;
+ color: #000000;
+ font-weight: 500;
+ };
+ };
+ };
+ };
+ };
+ .action_btn {
+ display: inline-block;
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 16px;
+ color: #2932E1;
+ text-align: center;
+ font-weight: 400;
+ cursor: pointer;
+ };
+};
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/components/style.less b/demos/speech_web/web_client/src/components/style.less
new file mode 100644
index 000000000..98f414f1c
--- /dev/null
+++ b/demos/speech_web/web_client/src/components/style.less
@@ -0,0 +1,83 @@
+.experience {
+ width: 100%;
+ height: 709px;
+ // background: url("../assets/image/在线体验-背景@2x.png") no-repeat;
+ background-size: 100% 709px;
+ background-position: initial;
+ //
+ .experience_wrapper {
+ width: 1200px;
+ height: 709px;
+ margin: 0 auto;
+ padding: 0px 0px 0px 0px;
+ box-sizing: border-box;
+ // background: red;
+ .experience_title {
+ height: 42px;
+ font-family: PingFangSC-Semibold;
+ font-size: 30px;
+ color: #000000;
+ font-weight: 600;
+ line-height: 42px;
+ text-align: center;
+ margin-bottom: 10px;
+ };
+ .experience_describe {
+ height: 22px;
+ font-family: PingFangSC-Regular;
+ font-size: 14px;
+ color: #666666;
+ letter-spacing: 0;
+ text-align: center;
+ line-height: 22px;
+ font-weight: 400;
+ margin-bottom: 30px;
+ };
+ .experience_content {
+ width: 1200px;
+ margin: 0 auto;
+ display: flex;
+ justify-content: center;
+ .experience_tabs {
+
+ margin-top: 15px;
+
+ & > .ant-tabs-nav {
+ margin-bottom: 20px;
+
+ &::before {
+ content: none;
+ }
+
+ .ant-tabs-nav-wrap {
+ justify-content: center;
+ }
+
+ .ant-tabs-tab {
+ font-size: 20px;
+ }
+
+ .ant-tabs-nav-list {
+ margin-right: -32px;
+ flex: none;
+ }
+ };
+
+ .ant-tabs-nav::before {
+ position: absolute;
+ right: 0;
+ left: 0;
+ border-bottom: 1px solid #f6f7fe;
+ content: '';
+ };
+
+ };
+ };
+ };
+};
+.experience::after {
+ content: "";
+ display: block;
+ clear: both;
+ visibility: hidden;
+}
\ No newline at end of file
diff --git a/demos/speech_web/web_client/src/main.js b/demos/speech_web/web_client/src/main.js
new file mode 100644
index 000000000..3fbf87c85
--- /dev/null
+++ b/demos/speech_web/web_client/src/main.js
@@ -0,0 +1,13 @@
+import { createApp } from 'vue'
+import ElementPlus from 'element-plus'
+import 'element-plus/dist/index.css'
+import Antd from 'ant-design-vue';
+import 'ant-design-vue/dist/antd.css';
+import App from './App.vue'
+import axios from 'axios'
+
+const app = createApp(App)
+app.config.globalProperties.$http = axios
+
+app.use(ElementPlus).use(Antd)
+app.mount('#app')
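The `main.js` above registers axios on `app.config.globalProperties.$http`, which makes the client reachable as `this.$http` in every Options-API component. A minimal sketch of that lookup pattern in isolation (the stand-in objects and the `/api/vpr/list` endpoint below are illustrative, not from the repo):

```javascript
// Stand-in for createApp(App): only the config shape matters here.
const app = { config: { globalProperties: {} } }
// Stand-in for axios: records the URL it was asked to GET.
const fakeHttp = { get: (url) => ({ url }) }
app.config.globalProperties.$http = fakeHttp

// Vue resolves unknown instance properties against globalProperties,
// which behaves like prototype lookup:
const component = Object.create(app.config.globalProperties)
console.log(component.$http.get('/api/vpr/list').url)  // "/api/vpr/list"
```

In the real app the same call would go through the dev-server proxy to the Python backend.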
diff --git a/demos/speech_web/web_client/vite.config.js b/demos/speech_web/web_client/vite.config.js
new file mode 100644
index 000000000..dc7e6978c
--- /dev/null
+++ b/demos/speech_web/web_client/vite.config.js
@@ -0,0 +1,28 @@
+import { defineConfig } from 'vite'
+import vue from '@vitejs/plugin-vue'
+
+// https://vitejs.dev/config/
+export default defineConfig({
+ plugins: [vue()],
+ css: {
+ preprocessorOptions: {
+ css: {
+ charset: false
+ }
+ }
+ },
+ build: {
+ assetsInlineLimit: 2048 // 2 KB; Vite expects a byte count as a number
+ },
+ server: {
+ host: "0.0.0.0",
+ proxy: {
+ "/api": {
+ target: "http://localhost:8010",
+ changeOrigin: true,
+ rewrite: (path) => path.replace(/^\/api/, ""),
+ },
+ },
+ },
+})
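The proxy entry above forwards `/api/*` requests to the backend on port 8010 after stripping the `/api` prefix. A quick check of the rewrite rule in isolation (the example paths are illustrative):

```javascript
// Same rewrite used in the vite.config.js proxy above: drop a leading /api.
const rewrite = (path) => path.replace(/^\/api/, "")

console.log(rewrite("/api/vpr/enroll"))  // "/vpr/enroll"
console.log(rewrite("/asr/offline"))     // no /api prefix, unchanged: "/asr/offline"
```

The `^` anchor matters: without it, an `/api` segment appearing later in the path would also be removed.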
diff --git a/demos/speech_web/web_client/yarn.lock b/demos/speech_web/web_client/yarn.lock
new file mode 100644
index 000000000..6777cf4ce
--- /dev/null
+++ b/demos/speech_web/web_client/yarn.lock
@@ -0,0 +1,785 @@
+# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
+# yarn lockfile v1
+
+
+"@ant-design/colors@^6.0.0":
+ version "6.0.0"
+ resolved "https://registry.npmmirror.com/@ant-design/colors/-/colors-6.0.0.tgz"
+ integrity sha512-qAZRvPzfdWHtfameEGP2Qvuf838NhergR35o+EuVyB5XvSA98xod5r4utvi4TJ3ywmevm290g9nsCG5MryrdWQ==
+ dependencies:
+ "@ctrl/tinycolor" "^3.4.0"
+
+"@ant-design/icons-svg@^4.2.1":
+ version "4.2.1"
+ resolved "https://registry.npmmirror.com/@ant-design/icons-svg/-/icons-svg-4.2.1.tgz"
+ integrity sha512-EB0iwlKDGpG93hW8f85CTJTs4SvMX7tt5ceupvhALp1IF44SeUFOMhKUOYqpsoYWQKAOuTRDMqn75rEaKDp0Xw==
+
+"@ant-design/icons-vue@^6.0.0":
+ version "6.1.0"
+ resolved "https://registry.npmmirror.com/@ant-design/icons-vue/-/icons-vue-6.1.0.tgz"
+ integrity sha512-EX6bYm56V+ZrKN7+3MT/ubDkvJ5rK/O2t380WFRflDcVFgsvl3NLH7Wxeau6R8DbrO5jWR6DSTC3B6gYFp77AA==
+ dependencies:
+ "@ant-design/colors" "^6.0.0"
+ "@ant-design/icons-svg" "^4.2.1"
+
+"@babel/parser@^7.16.4":
+ version "7.17.9"
+ resolved "https://registry.npmmirror.com/@babel/parser/-/parser-7.17.9.tgz"
+ integrity sha512-vqUSBLP8dQHFPdPi9bc5GK9vRkYHJ49fsZdtoJ8EQ8ibpwk5rPKfvNIwChB0KVXcIjcepEBBd2VHC5r9Gy8ueg==
+
+"@babel/runtime@^7.10.5":
+ version "7.17.9"
+ resolved "https://registry.npmmirror.com/@babel/runtime/-/runtime-7.17.9.tgz"
+ integrity sha512-lSiBBvodq29uShpWGNbgFdKYNiFDo5/HIYsaCEY9ff4sb10x9jizo2+pRrSyF4jKZCXqgzuqBOQKbUm90gQwJg==
+ dependencies:
+ regenerator-runtime "^0.13.4"
+
+"@ctrl/tinycolor@^3.4.0":
+ version "3.4.1"
+ resolved "https://registry.npmmirror.com/@ctrl/tinycolor/-/tinycolor-3.4.1.tgz"
+ integrity sha512-ej5oVy6lykXsvieQtqZxCOaLT+xD4+QNarq78cIYISHmZXshCvROLudpQN3lfL8G0NL7plMSSK+zlyvCaIJ4Iw==
+
+"@element-plus/icons-vue@^1.1.4":
+ version "1.1.4"
+ resolved "https://registry.npmmirror.com/@element-plus/icons-vue/-/icons-vue-1.1.4.tgz"
+ integrity sha512-Iz/nHqdp1sFPmdzRwHkEQQA3lKvoObk8azgABZ81QUOpW9s/lUyQVUSh0tNtEPZXQlKwlSh7SPgoVxzrE0uuVQ==
+
+"@floating-ui/core@^0.6.1":
+ version "0.6.1"
+ resolved "https://registry.npmmirror.com/@floating-ui/core/-/core-0.6.1.tgz"
+ integrity sha512-Y30eVMcZva8o84c0HcXAtDO4BEzPJMvF6+B7x7urL2xbAqVsGJhojOyHLaoQHQYjb6OkqRq5kO+zeySycQwKqg==
+
+"@floating-ui/dom@^0.4.2":
+ version "0.4.4"
+ resolved "https://registry.npmmirror.com/@floating-ui/dom/-/dom-0.4.4.tgz"
+ integrity sha512-0Ulu3B/dqQplUUSqnTx0foSrlYuMN+GTtlJWvNJwt6Fr7/PqmlR/Y08o6/+bxDWr6p3roBJRaQ51MDZsNmEhhw==
+ dependencies:
+ "@floating-ui/core" "^0.6.1"
+
+"@popperjs/core@^2.11.4":
+ version "2.11.5"
+ resolved "https://registry.npmmirror.com/@popperjs/core/-/core-2.11.5.tgz"
+ integrity sha512-9X2obfABZuDVLCgPK9aX0a/x4jaOEweTTWE2+9sr0Qqqevj2Uv5XorvusThmc9XGYpS9yI+fhh8RTafBtGposw==
+
+"@simonwep/pickr@~1.8.0":
+ version "1.8.2"
+ resolved "https://registry.npmmirror.com/@simonwep/pickr/-/pickr-1.8.2.tgz"
+ integrity sha512-/l5w8BIkrpP6n1xsetx9MWPWlU6OblN5YgZZphxan0Tq4BByTCETL6lyIeY8lagalS2Nbt4F2W034KHLIiunKA==
+ dependencies:
+ core-js "^3.15.1"
+ nanopop "^2.1.0"
+
+"@types/lodash-es@^4.17.6":
+ version "4.17.6"
+ resolved "https://registry.npmmirror.com/@types/lodash-es/-/lodash-es-4.17.6.tgz"
+ integrity sha512-R+zTeVUKDdfoRxpAryaQNRKk3105Rrgx2CFRClIgRGaqDTdjsm8h6IYA8ir584W3ePzkZfst5xIgDwYrlh9HLg==
+ dependencies:
+ "@types/lodash" "*"
+
+"@types/lodash@*", "@types/lodash@^4.14.181":
+ version "4.14.181"
+ resolved "https://registry.npmmirror.com/@types/lodash/-/lodash-4.14.181.tgz"
+ integrity sha512-n3tyKthHJbkiWhDZs3DkhkCzt2MexYHXlX0td5iMplyfwketaOeKboEVBqzceH7juqvEg3q5oUoBFxSLu7zFag==
+
+"@vitejs/plugin-vue@^2.3.0":
+ version "2.3.1"
+ resolved "https://registry.npmmirror.com/@vitejs/plugin-vue/-/plugin-vue-2.3.1.tgz"
+ integrity sha512-YNzBt8+jt6bSwpt7LP890U1UcTOIZZxfpE5WOJ638PNxSEKOqAi0+FSKS0nVeukfdZ0Ai/H7AFd6k3hayfGZqQ==
+
+"@vue/compiler-core@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/compiler-core/-/compiler-core-3.2.32.tgz"
+ integrity sha512-bRQ8Rkpm/aYFElDWtKkTPHeLnX5pEkNxhPUcqu5crEJIilZH0yeFu/qUAcV4VfSE2AudNPkQSOwMZofhnuutmA==
+ dependencies:
+ "@babel/parser" "^7.16.4"
+ "@vue/shared" "3.2.32"
+ estree-walker "^2.0.2"
+ source-map "^0.6.1"
+
+"@vue/compiler-dom@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/compiler-dom/-/compiler-dom-3.2.32.tgz"
+ integrity sha512-maa3PNB/NxR17h2hDQfcmS02o1f9r9QIpN1y6fe8tWPrS1E4+q8MqrvDDQNhYVPd84rc3ybtyumrgm9D5Rf/kg==
+ dependencies:
+ "@vue/compiler-core" "3.2.32"
+ "@vue/shared" "3.2.32"
+
+"@vue/compiler-sfc@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/compiler-sfc/-/compiler-sfc-3.2.32.tgz"
+ integrity sha512-uO6+Gh3AVdWm72lRRCjMr8nMOEqc6ezT9lWs5dPzh1E9TNaJkMYPaRtdY9flUv/fyVQotkfjY/ponjfR+trPSg==
+ dependencies:
+ "@babel/parser" "^7.16.4"
+ "@vue/compiler-core" "3.2.32"
+ "@vue/compiler-dom" "3.2.32"
+ "@vue/compiler-ssr" "3.2.32"
+ "@vue/reactivity-transform" "3.2.32"
+ "@vue/shared" "3.2.32"
+ estree-walker "^2.0.2"
+ magic-string "^0.25.7"
+ postcss "^8.1.10"
+ source-map "^0.6.1"
+
+"@vue/compiler-ssr@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/compiler-ssr/-/compiler-ssr-3.2.32.tgz"
+ integrity sha512-ZklVUF/SgTx6yrDUkaTaBL/JMVOtSocP+z5Xz/qIqqLdW/hWL90P+ob/jOQ0Xc/om57892Q7sRSrex0wujOL2Q==
+ dependencies:
+ "@vue/compiler-dom" "3.2.32"
+ "@vue/shared" "3.2.32"
+
+"@vue/reactivity-transform@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/reactivity-transform/-/reactivity-transform-3.2.32.tgz"
+ integrity sha512-CW1W9zaJtE275tZSWIfQKiPG0iHpdtSlmTqYBu7Y62qvtMgKG5yOxtvBs4RlrZHlaqFSE26avLAgQiTp4YHozw==
+ dependencies:
+ "@babel/parser" "^7.16.4"
+ "@vue/compiler-core" "3.2.32"
+ "@vue/shared" "3.2.32"
+ estree-walker "^2.0.2"
+ magic-string "^0.25.7"
+
+"@vue/reactivity@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/reactivity/-/reactivity-3.2.32.tgz"
+ integrity sha512-4zaDumuyDqkuhbb63hRd+YHFGopW7srFIWesLUQ2su/rJfWrSq3YUvoKAJE8Eu1EhZ2Q4c1NuwnEreKj1FkDxA==
+ dependencies:
+ "@vue/shared" "3.2.32"
+
+"@vue/runtime-core@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/runtime-core/-/runtime-core-3.2.32.tgz"
+ integrity sha512-uKKzK6LaCnbCJ7rcHvsK0azHLGpqs+Vi9B28CV1mfWVq1F3Bj8Okk3cX+5DtD06aUh4V2bYhS2UjjWiUUKUF0w==
+ dependencies:
+ "@vue/reactivity" "3.2.32"
+ "@vue/shared" "3.2.32"
+
+"@vue/runtime-dom@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/runtime-dom/-/runtime-dom-3.2.32.tgz"
+ integrity sha512-AmlIg+GPqjkNoADLjHojEX5RGcAg+TsgXOOcUrtDHwKvA8mO26EnLQLB8nylDjU6AMJh2CIYn8NEgyOV5ZIScQ==
+ dependencies:
+ "@vue/runtime-core" "3.2.32"
+ "@vue/shared" "3.2.32"
+ csstype "^2.6.8"
+
+"@vue/server-renderer@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/server-renderer/-/server-renderer-3.2.32.tgz"
+ integrity sha512-TYKpZZfRJpGTTiy/s6bVYwQJpAUx3G03z4G7/3O18M11oacrMTVHaHjiPuPqf3xQtY8R4LKmQ3EOT/DRCA/7Wg==
+ dependencies:
+ "@vue/compiler-ssr" "3.2.32"
+ "@vue/shared" "3.2.32"
+
+"@vue/shared@3.2.32":
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/@vue/shared/-/shared-3.2.32.tgz"
+ integrity sha512-bjcixPErUsAnTQRQX4Z5IQnICYjIfNCyCl8p29v1M6kfVzvwOICPw+dz48nNuWlTOOx2RHhzHdazJibE8GSnsw==
+
+"@vueuse/core@^8.2.4":
+ version "8.2.5"
+ resolved "https://registry.npmmirror.com/@vueuse/core/-/core-8.2.5.tgz"
+ integrity sha512-5prZAA1Ji2ltwNUnzreu6WIXYqHYP/9U2BiY5mD/650VYLpVcwVlYznJDFcLCmEWI3o3Vd34oS1FUf+6Mh68GQ==
+ dependencies:
+ "@vueuse/metadata" "8.2.5"
+ "@vueuse/shared" "8.2.5"
+ vue-demi "*"
+
+"@vueuse/metadata@8.2.5":
+ version "8.2.5"
+ resolved "https://registry.npmmirror.com/@vueuse/metadata/-/metadata-8.2.5.tgz"
+ integrity sha512-Lk9plJjh9cIdiRdcj16dau+2LANxIdFCiTgdfzwYXbflxq0QnMBeOD2qHgKDE7fuVrtPcVWj8VSuZEx1HRfNQA==
+
+"@vueuse/shared@8.2.5":
+ version "8.2.5"
+ resolved "https://registry.npmmirror.com/@vueuse/shared/-/shared-8.2.5.tgz"
+ integrity sha512-lNWo+7sk6JCuOj4AiYM+6HZ6fq4xAuVq1sVckMQKgfCJZpZRe4i8es+ZULO5bYTKP+VrOCtqrLR2GzEfrbr3YQ==
+ dependencies:
+ vue-demi "*"
+
+ant-design-vue@^2.2.8:
+ version "2.2.8"
+ resolved "https://registry.npmmirror.com/ant-design-vue/-/ant-design-vue-2.2.8.tgz"
+ integrity sha512-3graq9/gCfJQs6hznrHV6sa9oDmk/D1H3Oo0vLdVpPS/I61fZPk8NEyNKCHpNA6fT2cx6xx9U3QS63uuyikg/Q==
+ dependencies:
+ "@ant-design/icons-vue" "^6.0.0"
+ "@babel/runtime" "^7.10.5"
+ "@simonwep/pickr" "~1.8.0"
+ array-tree-filter "^2.1.0"
+ async-validator "^3.3.0"
+ dom-align "^1.12.1"
+ dom-scroll-into-view "^2.0.0"
+ lodash "^4.17.21"
+ lodash-es "^4.17.15"
+ moment "^2.27.0"
+ omit.js "^2.0.0"
+ resize-observer-polyfill "^1.5.1"
+ scroll-into-view-if-needed "^2.2.25"
+ shallow-equal "^1.0.0"
+ vue-types "^3.0.0"
+ warning "^4.0.0"
+
+array-tree-filter@^2.1.0:
+ version "2.1.0"
+ resolved "https://registry.npmmirror.com/array-tree-filter/-/array-tree-filter-2.1.0.tgz"
+ integrity sha512-4ROwICNlNw/Hqa9v+rk5h22KjmzB1JGTMVKP2AKJBOCgb0yL0ASf0+YvCcLNNwquOHNX48jkeZIJ3a+oOQqKcw==
+
+async-validator@^3.3.0:
+ version "3.5.2"
+ resolved "https://registry.npmmirror.com/async-validator/-/async-validator-3.5.2.tgz"
+ integrity sha512-8eLCg00W9pIRZSB781UUX/H6Oskmm8xloZfr09lz5bikRpBVDlJ3hRVuxxP1SxcwsEYfJ4IU8Q19Y8/893r3rQ==
+
+async-validator@^4.0.7:
+ version "4.0.7"
+ resolved "https://registry.npmmirror.com/async-validator/-/async-validator-4.0.7.tgz"
+ integrity sha512-Pj2IR7u8hmUEDOwB++su6baaRi+QvsgajuFB9j95foM1N2gy5HM4z60hfusIO0fBPG5uLAEl6yCJr1jNSVugEQ==
+
+axios@^0.26.1:
+ version "0.26.1"
+ resolved "https://registry.npmmirror.com/axios/-/axios-0.26.1.tgz"
+ integrity sha512-fPwcX4EvnSHuInCMItEhAGnaSEXRBjtzh9fOtsE6E1G6p7vl7edEeZe11QHf18+6+9gR5PbKV/sGKNaD8YaMeA==
+ dependencies:
+ follow-redirects "^1.14.8"
+
+compute-scroll-into-view@^1.0.17:
+ version "1.0.17"
+ resolved "https://registry.npmmirror.com/compute-scroll-into-view/-/compute-scroll-into-view-1.0.17.tgz"
+ integrity sha512-j4dx+Fb0URmzbwwMUrhqWM2BEWHdFGx+qZ9qqASHRPqvTYdqvWnHg0H1hIbcyLnvgnoNAVMlwkepyqM3DaIFUg==
+
+copy-anything@^2.0.1:
+ version "2.0.6"
+ resolved "https://registry.npmmirror.com/copy-anything/-/copy-anything-2.0.6.tgz"
+ integrity sha512-1j20GZTsvKNkc4BY3NpMOM8tt///wY3FpIzozTOFO2ffuZcV61nojHXVKIy3WM+7ADCy5FVhdZYHYDdgTU0yJw==
+ dependencies:
+ is-what "^3.14.1"
+
+core-js@^3.15.1:
+ version "3.22.5"
+ resolved "https://registry.npmmirror.com/core-js/-/core-js-3.22.5.tgz"
+ integrity sha512-VP/xYuvJ0MJWRAobcmQ8F2H6Bsn+s7zqAAjFaHGBMc5AQm7zaelhD1LGduFn2EehEcQcU+br6t+fwbpQ5d1ZWA==
+
+csstype@^2.6.8:
+ version "2.6.20"
+ resolved "https://registry.npmmirror.com/csstype/-/csstype-2.6.20.tgz"
+ integrity sha512-/WwNkdXfckNgw6S5R125rrW8ez139lBHWouiBvX8dfMFtcn6V81REDqnH7+CRpRipfYlyU1CmOnOxrmGcFOjeA==
+
+dayjs@^1.11.0:
+ version "1.11.0"
+ resolved "https://registry.npmmirror.com/dayjs/-/dayjs-1.11.0.tgz"
+ integrity sha512-JLC809s6Y948/FuCZPm5IX8rRhQwOiyMb2TfVVQEixG7P8Lm/gt5S7yoQZmC8x1UehI9Pb7sksEt4xx14m+7Ug==
+
+debug@^3.2.6:
+ version "3.2.7"
+ resolved "https://registry.npmmirror.com/debug/-/debug-3.2.7.tgz"
+ integrity sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==
+ dependencies:
+ ms "^2.1.1"
+
+dom-align@^1.12.1:
+ version "1.12.3"
+ resolved "https://registry.npmmirror.com/dom-align/-/dom-align-1.12.3.tgz"
+ integrity sha512-Gj9hZN3a07cbR6zviMUBOMPdWxYhbMI+x+WS0NAIu2zFZmbK8ys9R79g+iG9qLnlCwpFoaB+fKy8Pdv470GsPA==
+
+dom-scroll-into-view@^2.0.0:
+ version "2.0.1"
+ resolved "https://registry.npmmirror.com/dom-scroll-into-view/-/dom-scroll-into-view-2.0.1.tgz"
+ integrity sha512-bvVTQe1lfaUr1oFzZX80ce9KLDlZ3iU+XGNE/bz9HnGdklTieqsbmsLHe+rT2XWqopvL0PckkYqN7ksmm5pe3w==
+
+element-plus@^2.1.9:
+ version "2.1.9"
+ resolved "https://registry.npmmirror.com/element-plus/-/element-plus-2.1.9.tgz"
+ integrity sha512-6mWqS3YrmJPnouWP4otzL8+MehfOnDFqDbcIdnmC07p+Z0JkWe/CVKc4Wky8AYC8nyDMUQyiZYvooCbqGuM7pg==
+ dependencies:
+ "@ctrl/tinycolor" "^3.4.0"
+ "@element-plus/icons-vue" "^1.1.4"
+ "@floating-ui/dom" "^0.4.2"
+ "@popperjs/core" "^2.11.4"
+ "@types/lodash" "^4.14.181"
+ "@types/lodash-es" "^4.17.6"
+ "@vueuse/core" "^8.2.4"
+ async-validator "^4.0.7"
+ dayjs "^1.11.0"
+ escape-html "^1.0.3"
+ lodash "^4.17.21"
+ lodash-es "^4.17.21"
+ lodash-unified "^1.0.2"
+ memoize-one "^6.0.0"
+ normalize-wheel-es "^1.1.2"
+
+errno@^0.1.1:
+ version "0.1.8"
+ resolved "https://registry.npmmirror.com/errno/-/errno-0.1.8.tgz"
+ integrity sha512-dJ6oBr5SQ1VSd9qkk7ByRgb/1SH4JZjCHSW/mr63/QcXO9zLVxvJ6Oy13nio03rxpSnVDDjFor75SjVeZWPW/A==
+ dependencies:
+ prr "~1.0.1"
+
+esbuild-android-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-android-64/-/esbuild-android-64-0.14.36.tgz#fc5f95ce78c8c3d790fa16bc71bd904f2bb42aa1"
+ integrity sha512-jwpBhF1jmo0tVCYC/ORzVN+hyVcNZUWuozGcLHfod0RJCedTDTvR4nwlTXdx1gtncDqjk33itjO+27OZHbiavw==
+
+esbuild-android-arm64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-android-arm64/-/esbuild-android-arm64-0.14.36.tgz#44356fbb9f8de82a5cdf11849e011dfb3ad0a8a8"
+ integrity sha512-/hYkyFe7x7Yapmfv4X/tBmyKnggUmdQmlvZ8ZlBnV4+PjisrEhAvC3yWpURuD9XoB8Wa1d5dGkTsF53pIvpjsg==
+
+esbuild-darwin-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.npmmirror.com/esbuild-darwin-64/-/esbuild-darwin-64-0.14.36.tgz"
+ integrity sha512-kkl6qmV0dTpyIMKagluzYqlc1vO0ecgpviK/7jwPbRDEv5fejRTaBBEE2KxEQbTHcLhiiDbhG7d5UybZWo/1zQ==
+
+esbuild-darwin-arm64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-darwin-arm64/-/esbuild-darwin-arm64-0.14.36.tgz#2a8040c2e465131e5281034f3c72405e643cb7b2"
+ integrity sha512-q8fY4r2Sx6P0Pr3VUm//eFYKVk07C5MHcEinU1BjyFnuYz4IxR/03uBbDwluR6ILIHnZTE7AkTUWIdidRi1Jjw==
+
+esbuild-freebsd-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-freebsd-64/-/esbuild-freebsd-64-0.14.36.tgz#d82c387b4d01fe9e8631f97d41eb54f2dbeb68a3"
+ integrity sha512-Hn8AYuxXXRptybPqoMkga4HRFE7/XmhtlQjXFHoAIhKUPPMeJH35GYEUWGbjteai9FLFvBAjEAlwEtSGxnqWww==
+
+esbuild-freebsd-arm64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-freebsd-arm64/-/esbuild-freebsd-arm64-0.14.36.tgz#e8ce2e6c697da6c7ecd0cc0ac821d47c5ab68529"
+ integrity sha512-S3C0attylLLRiCcHiJd036eDEMOY32+h8P+jJ3kTcfhJANNjP0TNBNL30TZmEdOSx/820HJFgRrqpNAvTbjnDA==
+
+esbuild-linux-32@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-32/-/esbuild-linux-32-0.14.36.tgz#a4a261e2af91986ea62451f2db712a556cb38a15"
+ integrity sha512-Eh9OkyTrEZn9WGO4xkI3OPPpUX7p/3QYvdG0lL4rfr73Ap2HAr6D9lP59VMF64Ex01LhHSXwIsFG/8AQjh6eNw==
+
+esbuild-linux-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-64/-/esbuild-linux-64-0.14.36.tgz#4a9500f9197e2c8fcb884a511d2c9d4c2debde72"
+ integrity sha512-vFVFS5ve7PuwlfgoWNyRccGDi2QTNkQo/2k5U5ttVD0jRFaMlc8UQee708fOZA6zTCDy5RWsT5MJw3sl2X6KDg==
+
+esbuild-linux-arm64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-arm64/-/esbuild-linux-arm64-0.14.36.tgz#c91c21e25b315464bd7da867365dd1dae14ca176"
+ integrity sha512-24Vq1M7FdpSmaTYuu1w0Hdhiqkbto1I5Pjyi+4Cdw5fJKGlwQuw+hWynTcRI/cOZxBcBpP21gND7W27gHAiftw==
+
+esbuild-linux-arm@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-arm/-/esbuild-linux-arm-0.14.36.tgz#90e23bca2e6e549affbbe994f80ba3bb6c4d934a"
+ integrity sha512-NhgU4n+NCsYgt7Hy61PCquEz5aevI6VjQvxwBxtxrooXsxt5b2xtOUXYZe04JxqQo+XZk3d1gcr7pbV9MAQ/Lg==
+
+esbuild-linux-mips64le@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-mips64le/-/esbuild-linux-mips64le-0.14.36.tgz#40e11afb08353ff24709fc89e4db0f866bc131d2"
+ integrity sha512-hZUeTXvppJN+5rEz2EjsOFM9F1bZt7/d2FUM1lmQo//rXh1RTFYzhC0txn7WV0/jCC7SvrGRaRz0NMsRPf8SIA==
+
+esbuild-linux-ppc64le@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-ppc64le/-/esbuild-linux-ppc64le-0.14.36.tgz#9e8a588c513d06cc3859f9dcc52e5fdfce8a1a5e"
+ integrity sha512-1Bg3QgzZjO+QtPhP9VeIBhAduHEc2kzU43MzBnMwpLSZ890azr4/A9Dganun8nsqD/1TBcqhId0z4mFDO8FAvg==
+
+esbuild-linux-riscv64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-riscv64/-/esbuild-linux-riscv64-0.14.36.tgz#e578c09b23b3b97652e60e3692bfda628b541f06"
+ integrity sha512-dOE5pt3cOdqEhaufDRzNCHf5BSwxgygVak9UR7PH7KPVHwSTDAZHDoEjblxLqjJYpc5XaU9+gKJ9F8mp9r5I4A==
+
+esbuild-linux-s390x@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-linux-s390x/-/esbuild-linux-s390x-0.14.36.tgz#3c9dab40d0d69932ffded0fd7317bb403626c9bc"
+ integrity sha512-g4FMdh//BBGTfVHjF6MO7Cz8gqRoDPzXWxRvWkJoGroKA18G9m0wddvPbEqcQf5Tbt2vSc1CIgag7cXwTmoTXg==
+
+esbuild-netbsd-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-netbsd-64/-/esbuild-netbsd-64-0.14.36.tgz#e27847f6d506218291619b8c1e121ecd97628494"
+ integrity sha512-UB2bVImxkWk4vjnP62ehFNZ73lQY1xcnL5ZNYF3x0AG+j8HgdkNF05v67YJdCIuUJpBuTyCK8LORCYo9onSW+A==
+
+esbuild-openbsd-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-openbsd-64/-/esbuild-openbsd-64-0.14.36.tgz#c94c04c557fae516872a586eae67423da6d2fabb"
+ integrity sha512-NvGB2Chf8GxuleXRGk8e9zD3aSdRO5kLt9coTQbCg7WMGXeX471sBgh4kSg8pjx0yTXRt0MlrUDnjVYnetyivg==
+
+esbuild-sunos-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-sunos-64/-/esbuild-sunos-64-0.14.36.tgz#9b79febc0df65a30f1c9bd63047d1675511bf99d"
+ integrity sha512-VkUZS5ftTSjhRjuRLp+v78auMO3PZBXu6xl4ajomGenEm2/rGuWlhFSjB7YbBNErOchj51Jb2OK8lKAo8qdmsQ==
+
+esbuild-windows-32@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-windows-32/-/esbuild-windows-32-0.14.36.tgz#910d11936c8d2122ffdd3275e5b28d8a4e1240ec"
+ integrity sha512-bIar+A6hdytJjZrDxfMBUSEHHLfx3ynoEZXx/39nxy86pX/w249WZm8Bm0dtOAByAf4Z6qV0LsnTIJHiIqbw0w==
+
+esbuild-windows-64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-windows-64/-/esbuild-windows-64-0.14.36.tgz#21b4ce8b42a4efc63f4b58ec617f1302448aad26"
+ integrity sha512-+p4MuRZekVChAeueT1Y9LGkxrT5x7YYJxYE8ZOTcEfeUUN43vktSn6hUNsvxzzATrSgq5QqRdllkVBxWZg7KqQ==
+
+esbuild-windows-arm64@0.14.36:
+ version "0.14.36"
+ resolved "https://registry.yarnpkg.com/esbuild-windows-arm64/-/esbuild-windows-arm64-0.14.36.tgz#ba21546fecb7297667d0052d00150de22c044b24"
+ integrity sha512-fBB4WlDqV1m18EF/aheGYQkQZHfPHiHJSBYzXIo8yKehek+0BtBwo/4PNwKGJ5T0YK0oc8pBKjgwPbzSrPLb+Q==
+
+esbuild@^0.14.27:
+ version "0.14.36"
+ resolved "https://registry.npmmirror.com/esbuild/-/esbuild-0.14.36.tgz"
+ integrity sha512-HhFHPiRXGYOCRlrhpiVDYKcFJRdO0sBElZ668M4lh2ER0YgnkLxECuFe7uWCf23FrcLc59Pqr7dHkTqmRPDHmw==
+ optionalDependencies:
+ esbuild-android-64 "0.14.36"
+ esbuild-android-arm64 "0.14.36"
+ esbuild-darwin-64 "0.14.36"
+ esbuild-darwin-arm64 "0.14.36"
+ esbuild-freebsd-64 "0.14.36"
+ esbuild-freebsd-arm64 "0.14.36"
+ esbuild-linux-32 "0.14.36"
+ esbuild-linux-64 "0.14.36"
+ esbuild-linux-arm "0.14.36"
+ esbuild-linux-arm64 "0.14.36"
+ esbuild-linux-mips64le "0.14.36"
+ esbuild-linux-ppc64le "0.14.36"
+ esbuild-linux-riscv64 "0.14.36"
+ esbuild-linux-s390x "0.14.36"
+ esbuild-netbsd-64 "0.14.36"
+ esbuild-openbsd-64 "0.14.36"
+ esbuild-sunos-64 "0.14.36"
+ esbuild-windows-32 "0.14.36"
+ esbuild-windows-64 "0.14.36"
+ esbuild-windows-arm64 "0.14.36"
+
+escape-html@^1.0.3:
+ version "1.0.3"
+ resolved "https://registry.npmmirror.com/escape-html/-/escape-html-1.0.3.tgz"
+ integrity sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==
+
+estree-walker@^2.0.2:
+ version "2.0.2"
+ resolved "https://registry.npmmirror.com/estree-walker/-/estree-walker-2.0.2.tgz"
+ integrity sha512-Rfkk/Mp/DL7JVje3u18FxFujQlTNR2q6QfMSMB7AvCBx91NGj/ba3kCfza0f6dVDbw7YlRf/nDrn7pQrCCyQ/w==
+
+follow-redirects@^1.14.8:
+ version "1.14.9"
+ resolved "https://registry.npmmirror.com/follow-redirects/-/follow-redirects-1.14.9.tgz"
+ integrity sha512-MQDfihBQYMcyy5dhRDJUHcw7lb2Pv/TuE6xP1vyraLukNDHKbDxDNaOE3NbCAdKQApno+GPRyo1YAp89yCjK4w==
+
+fsevents@~2.3.2:
+ version "2.3.2"
+ resolved "https://registry.npmmirror.com/fsevents/-/fsevents-2.3.2.tgz"
+ integrity sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==
+
+function-bind@^1.1.1:
+ version "1.1.1"
+ resolved "https://registry.npmmirror.com/function-bind/-/function-bind-1.1.1.tgz"
+ integrity sha512-yIovAzMX49sF8Yl58fSCWJ5svSLuaibPxXQJFLmBObTuCr0Mf1KiPopGM9NiFjiYBCbfaa2Fh6breQ6ANVTI0A==
+
+graceful-fs@^4.1.2:
+ version "4.2.10"
+ resolved "https://registry.npmmirror.com/graceful-fs/-/graceful-fs-4.2.10.tgz"
+ integrity sha512-9ByhssR2fPVsNZj478qUUbKfmL0+t5BDVyjShtyZZLiK7ZDAArFFfopyOTj0M05wE2tJPisA4iTnnXl2YoPvOA==
+
+has@^1.0.3:
+ version "1.0.3"
+ resolved "https://registry.npmmirror.com/has/-/has-1.0.3.tgz"
+ integrity sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==
+ dependencies:
+ function-bind "^1.1.1"
+
+iconv-lite@^0.4.4:
+ version "0.4.24"
+ resolved "https://registry.npmmirror.com/iconv-lite/-/iconv-lite-0.4.24.tgz"
+ integrity sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==
+ dependencies:
+ safer-buffer ">= 2.1.2 < 3"
+
+image-size@~0.5.0:
+ version "0.5.5"
+ resolved "https://registry.npmmirror.com/image-size/-/image-size-0.5.5.tgz"
+ integrity sha512-6TDAlDPZxUFCv+fuOkIoXT/V/f3Qbq8e37p+YOiYrUv3v9cc3/6x78VdfPgFVaB9dZYeLUfKgHRebpkm/oP2VQ==
+
+is-core-module@^2.8.1:
+ version "2.8.1"
+ resolved "https://registry.npmmirror.com/is-core-module/-/is-core-module-2.8.1.tgz"
+ integrity sha512-SdNCUs284hr40hFTFP6l0IfZ/RSrMXF3qgoRHd3/79unUTvrFO/JoXwkGm+5J/Oe3E/b5GsnG330uUNgRpu1PA==
+ dependencies:
+ has "^1.0.3"
+
+is-plain-object@3.0.1:
+ version "3.0.1"
+ resolved "https://registry.npmmirror.com/is-plain-object/-/is-plain-object-3.0.1.tgz"
+ integrity sha512-Xnpx182SBMrr/aBik8y+GuR4U1L9FqMSojwDQwPMmxyC6bvEqly9UBCxhauBF5vNh2gwWJNX6oDV7O+OM4z34g==
+
+is-what@^3.14.1:
+ version "3.14.1"
+ resolved "https://registry.npmmirror.com/is-what/-/is-what-3.14.1.tgz"
+ integrity sha512-sNxgpk9793nzSs7bA6JQJGeIuRBQhAaNGG77kzYQgMkrID+lS6SlK07K5LaptscDlSaIgH+GPFzf+d75FVxozA==
+
+js-audio-recorder@0.5.7:
+ version "0.5.7"
+ resolved "https://registry.npmmirror.com/js-audio-recorder/-/js-audio-recorder-0.5.7.tgz"
+ integrity sha512-DIlv30N86AYHr7zGHN0O7V/3Rd8Q6SIJ/MBzVJaT9STWTdhF4E/8fxCX6ZMgRSv8xmx6fEqcFFNPoofmxJD4+A==
+
+"js-tokens@^3.0.0 || ^4.0.0":
+ version "4.0.0"
+ resolved "https://registry.npmmirror.com/js-tokens/-/js-tokens-4.0.0.tgz"
+ integrity sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==
+
+lamejs@^1.2.1:
+ version "1.2.1"
+ resolved "https://registry.npmmirror.com/lamejs/-/lamejs-1.2.1.tgz"
+ integrity sha512-s7bxvjvYthw6oPLCm5pFxvA84wUROODB8jEO2+CE1adhKgrIvVOlmMgY8zyugxGrvRaDHNJanOiS21/emty6dQ==
+ dependencies:
+ use-strict "1.0.1"
+
+less@^4.1.2:
+ version "4.1.2"
+ resolved "https://registry.npmmirror.com/less/-/less-4.1.2.tgz"
+ integrity sha512-EoQp/Et7OSOVu0aJknJOtlXZsnr8XE8KwuzTHOLeVSEx8pVWUICc8Q0VYRHgzyjX78nMEyC/oztWFbgyhtNfDA==
+ dependencies:
+ copy-anything "^2.0.1"
+ parse-node-version "^1.0.1"
+ tslib "^2.3.0"
+ optionalDependencies:
+ errno "^0.1.1"
+ graceful-fs "^4.1.2"
+ image-size "~0.5.0"
+ make-dir "^2.1.0"
+ mime "^1.4.1"
+ needle "^2.5.2"
+ source-map "~0.6.0"
+
+lodash-es@^4.17.15, lodash-es@^4.17.21:
+ version "4.17.21"
+ resolved "https://registry.npmmirror.com/lodash-es/-/lodash-es-4.17.21.tgz"
+ integrity sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw==
+
+lodash-unified@^1.0.2:
+ version "1.0.2"
+ resolved "https://registry.npmmirror.com/lodash-unified/-/lodash-unified-1.0.2.tgz"
+ integrity sha512-OGbEy+1P+UT26CYi4opY4gebD8cWRDxAT6MAObIVQMiqYdxZr1g3QHWCToVsm31x2NkLS4K3+MC2qInaRMa39g==
+
+lodash@^4.17.21:
+ version "4.17.21"
+ resolved "https://registry.npmmirror.com/lodash/-/lodash-4.17.21.tgz"
+ integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==
+
+loose-envify@^1.0.0:
+ version "1.4.0"
+ resolved "https://registry.npmmirror.com/loose-envify/-/loose-envify-1.4.0.tgz"
+ integrity sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==
+ dependencies:
+ js-tokens "^3.0.0 || ^4.0.0"
+
+magic-string@^0.25.7:
+ version "0.25.9"
+ resolved "https://registry.npmmirror.com/magic-string/-/magic-string-0.25.9.tgz"
+ integrity sha512-RmF0AsMzgt25qzqqLc1+MbHmhdx0ojF2Fvs4XnOqz2ZOBXzzkEwc/dJQZCYHAn7v1jbVOjAZfK8msRn4BxO4VQ==
+ dependencies:
+ sourcemap-codec "^1.4.8"
+
+make-dir@^2.1.0:
+ version "2.1.0"
+ resolved "https://registry.npmmirror.com/make-dir/-/make-dir-2.1.0.tgz"
+ integrity sha512-LS9X+dc8KLxXCb8dni79fLIIUA5VyZoyjSMCwTluaXA0o27cCK0bhXkpgw+sTXVpPy/lSO57ilRixqk0vDmtRA==
+ dependencies:
+ pify "^4.0.1"
+ semver "^5.6.0"
+
+memoize-one@^6.0.0:
+ version "6.0.0"
+ resolved "https://registry.npmmirror.com/memoize-one/-/memoize-one-6.0.0.tgz"
+ integrity sha512-rkpe71W0N0c0Xz6QD0eJETuWAJGnJ9afsl1srmwPrI+yBCkge5EycXXbYRyvL29zZVUWQCY7InPRCv3GDXuZNw==
+
+mime@^1.4.1:
+ version "1.6.0"
+ resolved "https://registry.npmmirror.com/mime/-/mime-1.6.0.tgz"
+ integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==
+
+moment@^2.27.0:
+ version "2.29.4"
+ resolved "https://registry.yarnpkg.com/moment/-/moment-2.29.4.tgz#3dbe052889fe7c1b2ed966fcb3a77328964ef108"
+ integrity sha512-5LC9SOxjSc2HF6vO2CyuTDNivEdoz2IvyJJGj6X8DJ0eFyfszE0QiEd+iXmBvUP3WHxSjFH/vIsA0EN00cgr8w==
+
+ms@^2.1.1:
+ version "2.1.3"
+ resolved "https://registry.npmmirror.com/ms/-/ms-2.1.3.tgz"
+ integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==
+
+nanoid@^3.3.1:
+ version "3.3.2"
+ resolved "https://registry.npmmirror.com/nanoid/-/nanoid-3.3.2.tgz"
+ integrity sha512-CuHBogktKwpm5g2sRgv83jEy2ijFzBwMoYA60orPDR7ynsLijJDqgsi4RDGj3OJpy3Ieb+LYwiRmIOGyytgITA==
+
+nanopop@^2.1.0:
+ version "2.1.0"
+ resolved "https://registry.npmmirror.com/nanopop/-/nanopop-2.1.0.tgz"
+ integrity sha512-jGTwpFRexSH+fxappnGQtN9dspgE2ipa1aOjtR24igG0pv6JCxImIAmrLRHX+zUF5+1wtsFVbKyfP51kIGAVNw==
+
+needle@^2.5.2:
+ version "2.9.1"
+ resolved "https://registry.npmmirror.com/needle/-/needle-2.9.1.tgz"
+ integrity sha512-6R9fqJ5Zcmf+uYaFgdIHmLwNldn5HbK8L5ybn7Uz+ylX/rnOsSp1AHcvQSrCaFN+qNM1wpymHqD7mVasEOlHGQ==
+ dependencies:
+ debug "^3.2.6"
+ iconv-lite "^0.4.4"
+ sax "^1.2.4"
+
+normalize-wheel-es@^1.1.2:
+ version "1.1.2"
+ resolved "https://registry.npmmirror.com/normalize-wheel-es/-/normalize-wheel-es-1.1.2.tgz"
+ integrity sha512-scX83plWJXYH1J4+BhAuIHadROzxX0UBF3+HuZNY2Ks8BciE7tSTQ+5JhTsvzjaO0/EJdm4JBGrfObKxFf3Png==
+
+omit.js@^2.0.0:
+ version "2.0.2"
+ resolved "https://registry.npmmirror.com/omit.js/-/omit.js-2.0.2.tgz"
+ integrity sha512-hJmu9D+bNB40YpL9jYebQl4lsTW6yEHRTroJzNLqQJYHm7c+NQnJGfZmIWh8S3q3KoaxV1aLhV6B3+0N0/kyJg==
+
+parse-node-version@^1.0.1:
+ version "1.0.1"
+ resolved "https://registry.npmmirror.com/parse-node-version/-/parse-node-version-1.0.1.tgz"
+ integrity sha512-3YHlOa/JgH6Mnpr05jP9eDG254US9ek25LyIxZlDItp2iJtwyaXQb57lBYLdT3MowkUFYEV2XXNAYIPlESvJlA==
+
+path-parse@^1.0.7:
+ version "1.0.7"
+ resolved "https://registry.npmmirror.com/path-parse/-/path-parse-1.0.7.tgz"
+ integrity sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==
+
+picocolors@^1.0.0:
+ version "1.0.0"
+ resolved "https://registry.npmmirror.com/picocolors/-/picocolors-1.0.0.tgz"
+ integrity sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ==
+
+pify@^4.0.1:
+ version "4.0.1"
+ resolved "https://registry.npmmirror.com/pify/-/pify-4.0.1.tgz"
+ integrity sha512-uB80kBFb/tfd68bVleG9T5GGsGPjJrLAUpR5PZIrhBnIaRTQRjqdJSsIKkOP6OAIFbj7GOrcudc5pNjZ+geV2g==
+
+postcss@^8.1.10, postcss@^8.4.12:
+ version "8.4.12"
+ resolved "https://registry.npmmirror.com/postcss/-/postcss-8.4.12.tgz"
+ integrity sha512-lg6eITwYe9v6Hr5CncVbK70SoioNQIq81nsaG86ev5hAidQvmOeETBqs7jm43K2F5/Ley3ytDtriImV6TpNiSg==
+ dependencies:
+ nanoid "^3.3.1"
+ picocolors "^1.0.0"
+ source-map-js "^1.0.2"
+
+prr@~1.0.1:
+ version "1.0.1"
+ resolved "https://registry.npmmirror.com/prr/-/prr-1.0.1.tgz"
+ integrity sha512-yPw4Sng1gWghHQWj0B3ZggWUm4qVbPwPFcRG8KyxiU7J2OHFSoEHKS+EZ3fv5l1t9CyCiop6l/ZYeWbrgoQejw==
+
+regenerator-runtime@^0.13.4:
+ version "0.13.9"
+ resolved "https://registry.npmmirror.com/regenerator-runtime/-/regenerator-runtime-0.13.9.tgz"
+ integrity sha512-p3VT+cOEgxFsRRA9X4lkI1E+k2/CtnKtU4gcxyaCUreilL/vqI6CdZ3wxVUx3UOUg+gnUOQQcRI7BmSI656MYA==
+
+resize-observer-polyfill@^1.5.1:
+ version "1.5.1"
+ resolved "https://registry.npmmirror.com/resize-observer-polyfill/-/resize-observer-polyfill-1.5.1.tgz"
+ integrity sha512-LwZrotdHOo12nQuZlHEmtuXdqGoOD0OhaxopaNFxWzInpEgaLWoVuAMbTzixuosCx2nEG58ngzW3vxdWoxIgdg==
+
+resolve@^1.22.0:
+ version "1.22.0"
+ resolved "https://registry.npmmirror.com/resolve/-/resolve-1.22.0.tgz"
+ integrity sha512-Hhtrw0nLeSrFQ7phPp4OOcVjLPIeMnRlr5mcnVuMe7M/7eBn98A3hmFRLoFo3DLZkivSYwhRUJTyPyWAk56WLw==
+ dependencies:
+ is-core-module "^2.8.1"
+ path-parse "^1.0.7"
+ supports-preserve-symlinks-flag "^1.0.0"
+
+rollup@^2.59.0:
+ version "2.70.1"
+ resolved "https://registry.npmmirror.com/rollup/-/rollup-2.70.1.tgz"
+ integrity sha512-CRYsI5EuzLbXdxC6RnYhOuRdtz4bhejPMSWjsFLfVM/7w/85n2szZv6yExqUXsBdz5KT8eoubeyDUDjhLHEslA==
+ optionalDependencies:
+ fsevents "~2.3.2"
+
+"safer-buffer@>= 2.1.2 < 3":
+ version "2.1.2"
+ resolved "https://registry.npmmirror.com/safer-buffer/-/safer-buffer-2.1.2.tgz"
+ integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==
+
+sax@^1.2.4:
+ version "1.2.4"
+ resolved "https://registry.npmmirror.com/sax/-/sax-1.2.4.tgz"
+ integrity sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw==
+
+scroll-into-view-if-needed@^2.2.25:
+ version "2.2.29"
+ resolved "https://registry.npmmirror.com/scroll-into-view-if-needed/-/scroll-into-view-if-needed-2.2.29.tgz"
+ integrity sha512-hxpAR6AN+Gh53AdAimHM6C8oTN1ppwVZITihix+WqalywBeFcQ6LdQP5ABNl26nX8GTEL7VT+b8lKpdqq65wXg==
+ dependencies:
+ compute-scroll-into-view "^1.0.17"
+
+semver@^5.6.0:
+ version "5.7.1"
+ resolved "https://registry.npmmirror.com/semver/-/semver-5.7.1.tgz"
+ integrity sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ==
+
+shallow-equal@^1.0.0:
+ version "1.2.1"
+ resolved "https://registry.npmmirror.com/shallow-equal/-/shallow-equal-1.2.1.tgz"
+ integrity sha512-S4vJDjHHMBaiZuT9NPb616CSmLf618jawtv3sufLl6ivK8WocjAo58cXwbRV1cgqxH0Qbv+iUt6m05eqEa2IRA==
+
+source-map-js@^1.0.2:
+ version "1.0.2"
+ resolved "https://registry.npmmirror.com/source-map-js/-/source-map-js-1.0.2.tgz"
+ integrity sha512-R0XvVJ9WusLiqTCEiGCmICCMplcCkIwwR11mOSD9CR5u+IXYdiseeEuXCVAjS54zqwkLcPNnmU4OeJ6tUrWhDw==
+
+source-map@^0.6.1, source-map@~0.6.0:
+ version "0.6.1"
+ resolved "https://registry.npmmirror.com/source-map/-/source-map-0.6.1.tgz"
+ integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==
+
+sourcemap-codec@^1.4.8:
+ version "1.4.8"
+ resolved "https://registry.npmmirror.com/sourcemap-codec/-/sourcemap-codec-1.4.8.tgz"
+ integrity sha512-9NykojV5Uih4lgo5So5dtw+f0JgJX30KCNI8gwhz2J9A15wD0Ml6tjHKwf6fTSa6fAdVBdZeNOs9eJ71qCk8vA==
+
+supports-preserve-symlinks-flag@^1.0.0:
+ version "1.0.0"
+ resolved "https://registry.npmmirror.com/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz"
+ integrity sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==
+
+tslib@^2.3.0:
+ version "2.4.0"
+ resolved "https://registry.npmmirror.com/tslib/-/tslib-2.4.0.tgz"
+ integrity sha512-d6xOpEDfsi2CZVlPQzGeux8XMwLT9hssAsaPYExaQMuYskwb+x1x7J371tWlbBdWHroy99KnVB6qIkUbs5X3UQ==
+
+use-strict@1.0.1:
+ version "1.0.1"
+ resolved "https://registry.npmmirror.com/use-strict/-/use-strict-1.0.1.tgz"
+ integrity sha512-IeiWvvEXfW5ltKVMkxq6FvNf2LojMKvB2OCeja6+ct24S1XOmQw2dGr2JyndwACWAGJva9B7yPHwAmeA9QCqAQ==
+
+vite@^2.9.0:
+ version "2.9.1"
+ resolved "https://registry.npmmirror.com/vite/-/vite-2.9.1.tgz"
+ integrity sha512-vSlsSdOYGcYEJfkQ/NeLXgnRv5zZfpAsdztkIrs7AZHV8RCMZQkwjo4DS5BnrYTqoWqLoUe1Cah4aVO4oNNqCQ==
+ dependencies:
+ esbuild "^0.14.27"
+ postcss "^8.4.12"
+ resolve "^1.22.0"
+ rollup "^2.59.0"
+ optionalDependencies:
+ fsevents "~2.3.2"
+
+vue-demi@*:
+ version "0.12.5"
+ resolved "https://registry.npmmirror.com/vue-demi/-/vue-demi-0.12.5.tgz"
+ integrity sha512-BREuTgTYlUr0zw0EZn3hnhC3I6gPWv+Kwh4MCih6QcAeaTlaIX0DwOVN0wHej7hSvDPecz4jygy/idsgKfW58Q==
+
+vue-types@^3.0.0:
+ version "3.0.2"
+ resolved "https://registry.npmmirror.com/vue-types/-/vue-types-3.0.2.tgz"
+ integrity sha512-IwUC0Aq2zwaXqy74h4WCvFCUtoV0iSWr0snWnE9TnU18S66GAQyqQbRf2qfJtUuiFsBf6qp0MEwdonlwznlcrw==
+ dependencies:
+ is-plain-object "3.0.1"
+
+vue@^3.2.25:
+ version "3.2.32"
+ resolved "https://registry.npmmirror.com/vue/-/vue-3.2.32.tgz"
+ integrity sha512-6L3jKZApF042OgbCkh+HcFeAkiYi3Lovi8wNhWqIK98Pi5efAMLZzRHgi91v+60oIRxdJsGS9sTMsb+yDpY8Eg==
+ dependencies:
+ "@vue/compiler-dom" "3.2.32"
+ "@vue/compiler-sfc" "3.2.32"
+ "@vue/runtime-dom" "3.2.32"
+ "@vue/server-renderer" "3.2.32"
+ "@vue/shared" "3.2.32"
+
+warning@^4.0.0:
+ version "4.0.3"
+ resolved "https://registry.npmmirror.com/warning/-/warning-4.0.3.tgz"
+ integrity sha512-rpJyN222KWIvHJ/F53XSZv0Zl/accqHR8et1kpaMTD/fLCRxtV8iX8czMzY7sVZupTI3zcUTg8eycS2kNF9l6w==
+ dependencies:
+ loose-envify "^1.0.0"
diff --git a/demos/streaming_asr_server/README.md b/demos/streaming_asr_server/README.md
index a770f58c3..a97486757 100644
--- a/demos/streaming_asr_server/README.md
+++ b/demos/streaming_asr_server/README.md
@@ -7,12 +7,18 @@ This demo is an implementation of starting the streaming speech service and acce
The streaming ASR server only supports the `websocket` protocol; it does not support the `http` protocol.
+For the service API definition, please refer to:
+- [PaddleSpeech Streaming Server WebSocket API](https://github.com/PaddlePaddle/PaddleSpeech/wiki/PaddleSpeech-Server-WebSocket-API)
+
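The messages exchanged over this websocket are JSON. As a minimal sketch of handling a final server message on the client side (field names are taken from the example client output later in this README; the authoritative schema is the API document linked above):

```python
import json

# Abbreviated final message, copied from the client log in this README.
raw = ('{"status": "ok", "signal": "finished", '
       '"result": "我认为跑步最重要的就是给我带来了身体健康", '
       '"times": [{"w": "我", "bg": 0.0, "ed": 0.7}, '
       '{"w": "认", "bg": 0.7, "ed": 0.84}]}')

msg = json.loads(raw)
if msg.get("signal") == "finished":
    transcript = msg["result"]
    # Per-character timings as (char, begin, end) in seconds.
    timings = [(t["w"], t["bg"], t["ed"]) for t in msg.get("times", [])]
    print(transcript)
    print(timings[0])
```

Intermediate messages carry only a partial `result` field, so a real client would typically overwrite its display on each message and commit the text once `signal` is `finished`.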
## Usage
### 1. Installation
see [installation](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/install.md).
-It is recommended to use **paddlepaddle 2.2.1** or above.
-You can choose one way from meduim and hard to install paddlespeech.
+It is recommended to use **paddlepaddle 2.3.1** or above.
+
+You can choose one way from easy, medium and hard to install paddlespeech.
+
+**If you install in easy mode, you need to prepare the yaml file by yourself; you can refer to
### 2. Prepare config File
The configuration file can be found in `conf/ws_application.yaml` and `conf/ws_conformer_wenetspeech_application.yaml`.
@@ -47,28 +53,28 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
- `log_file`: log file. Default: `./log/paddlespeech.log`
Output:
- ```bash
- [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
- [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
- [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
- [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
- [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1. 0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
- [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
- [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
- [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
- INFO: Started server process [4242]
- [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
- INFO: Waiting for application startup.
- [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
- INFO: Application startup complete.
- [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
- INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
- [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ ```text
+ [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
+ [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
+ [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
+ [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
+ [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1. 0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
+ [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
+ [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
+ [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
+ [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
+ INFO: Started server process [4242]
+ [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
+ INFO: Waiting for application startup.
+ [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
+ INFO: Application startup complete.
+ [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
+ INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
```
- Python API
@@ -84,28 +90,28 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
```
Output:
- ```bash
- [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
- [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
- [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
- [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
- [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1. 0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
- [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
- [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
- [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
- INFO: Started server process [4242]
- [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
- INFO: Waiting for application startup.
- [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
- INFO: Application startup complete.
- [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
- INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
- [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ ```text
+ [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
+ [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
+ [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
+ [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
+ [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1. 0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
+ [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
+ [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
+ [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
+ [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
+ INFO: Started server process [4242]
+ [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
+ INFO: Waiting for application startup.
+ [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
+ INFO: Application startup complete.
+ [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
+ INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
```
@@ -116,7 +122,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8090 --input ./zh.wav
```
@@ -125,6 +131,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
```bash
paddlespeech_client asr_online --help
```
+
Arguments:
- `server_ip`: server ip. Default: 127.0.0.1
- `port`: server port. Default: 8090
@@ -136,75 +143,74 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
- `punc.server_port`: punctuation server port. Default: None.
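The client output below ends with an RTF (real-time factor) line. RTF is simply elapsed wall-clock decoding time divided by audio duration, so values above 1.0 mean decoding is slower than real time; using the numbers from the log:

```python
# Numbers taken from the final client log line below.
audio_duration = 4.9968125       # seconds of audio in zh.wav
elapsed = 9.225094079971313      # seconds of wall-clock decoding time

rtf = elapsed / audio_duration   # real-time factor
print(f"RTF={rtf:.3f}")          # RTF=1.846
```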
Output:
- ```bash
- [2022-05-06 21:10:35,598] [ INFO] - Start to do streaming asr client
- [2022-05-06 21:10:35,600] [ INFO] - asr websocket client start
- [2022-05-06 21:10:35,600] [ INFO] - endpoint: ws://127.0.0.1:8390/paddlespeech/asr/streaming
- [2022-05-06 21:10:35,600] [ INFO] - start to process the wavscp: ./zh.wav
- [2022-05-06 21:10:35,670] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
- [2022-05-06 21:10:35,699] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,713] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,726] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,738] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,750] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,762] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,774] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,786] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,387] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,398] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,407] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,416] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,425] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,434] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,442] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,930] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,938] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,946] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,954] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,962] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,970] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,977] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,985] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:37,484] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,492] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,500] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,508] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,517] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,525] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,532] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:38,050] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,058] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,066] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,073] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,081] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,089] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,097] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,105] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,630] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,639] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,647] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,655] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,663] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,671] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,679] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:39,216] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,224] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,232] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,240] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,248] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,256] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,264] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,272] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,885] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,896] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,905] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,915] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,924] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,934] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:44,827] [ INFO] - client final receive msg={'status': 'ok', 'signal': 'finished', 'result': '我认为跑步最重要的就是给我带来了身体健康', 'times': [{'w': '我', 'bg': 0.0, 'ed': 0.7000000000000001}, {'w': '认', 'bg': 0.7000000000000001, 'ed': 0.84}, {'w': '为', 'bg': 0.84, 'ed': 1.0}, {'w': '跑', 'bg': 1.0, 'ed': 1.18}, {'w': '步', 'bg': 1.18, 'ed': 1.36}, {'w': '最', 'bg': 1.36, 'ed': 1.5}, {'w': '重', 'bg': 1.5, 'ed': 1.6400000000000001}, {'w': '要', 'bg': 1.6400000000000001, 'ed': 1.78}, {'w': '的', 'bg': 1.78, 'ed': 1.9000000000000001}, {'w': '就', 'bg': 1.9000000000000001, 'ed': 2.06}, {'w': '是', 'bg': 2.06, 'ed': 2.62}, {'w': '给', 'bg': 2.62, 'ed': 3.16}, {'w': '我', 'bg': 3.16, 'ed': 3.3200000000000003}, {'w': '带', 'bg': 3.3200000000000003, 'ed': 3.48}, {'w': '来', 'bg': 3.48, 'ed': 3.62}, {'w': '了', 'bg': 3.62, 'ed': 3.7600000000000002}, {'w': '身', 'bg': 3.7600000000000002, 'ed': 3.9}, {'w': '体', 'bg': 3.9, 'ed': 4.0600000000000005}, {'w': '健', 'bg': 4.0600000000000005, 'ed': 4.26}, {'w': '康', 'bg': 4.26, 'ed': 4.96}]}
- [2022-05-06 21:10:44,827] [ INFO] - audio duration: 4.9968125, elapsed time: 9.225094079971313, RTF=1.846195765794957
- [2022-05-06 21:10:44,828] [ INFO] - asr websocket client finished : 我认为跑步最重要的就是给我带来了身体健康
-
+ ```text
+ [2022-05-06 21:10:35,598] [ INFO] - Start to do streaming asr client
+ [2022-05-06 21:10:35,600] [ INFO] - asr websocket client start
+ [2022-05-06 21:10:35,600] [ INFO] - endpoint: ws://127.0.0.1:8390/paddlespeech/asr/streaming
+ [2022-05-06 21:10:35,600] [ INFO] - start to process the wavscp: ./zh.wav
+ [2022-05-06 21:10:35,670] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
+ [2022-05-06 21:10:35,699] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,713] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,726] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,738] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,750] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,762] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,774] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,786] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,387] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,398] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,407] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,416] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,425] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,434] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,442] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,930] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,938] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,946] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,954] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,962] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,970] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,977] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,985] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:37,484] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,492] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,500] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,508] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,517] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,525] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,532] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:38,050] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,058] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,066] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,073] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,081] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,089] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,097] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,105] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,630] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,639] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,647] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,655] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,663] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,671] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,679] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:39,216] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,224] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,232] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,240] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,248] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,256] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,264] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,272] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,885] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,896] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,905] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,915] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,924] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,934] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:44,827] [ INFO] - client final receive msg={'status': 'ok', 'signal': 'finished', 'result': '我认为跑步最重要的就是给我带来了身体健康', 'times': [{'w': '我', 'bg': 0.0, 'ed': 0.7000000000000001}, {'w': '认', 'bg': 0.7000000000000001, 'ed': 0.84}, {'w': '为', 'bg': 0.84, 'ed': 1.0}, {'w': '跑', 'bg': 1.0, 'ed': 1.18}, {'w': '步', 'bg': 1.18, 'ed': 1.36}, {'w': '最', 'bg': 1.36, 'ed': 1.5}, {'w': '重', 'bg': 1.5, 'ed': 1.6400000000000001}, {'w': '要', 'bg': 1.6400000000000001, 'ed': 1.78}, {'w': '的', 'bg': 1.78, 'ed': 1.9000000000000001}, {'w': '就', 'bg': 1.9000000000000001, 'ed': 2.06}, {'w': '是', 'bg': 2.06, 'ed': 2.62}, {'w': '给', 'bg': 2.62, 'ed': 3.16}, {'w': '我', 'bg': 3.16, 'ed': 3.3200000000000003}, {'w': '带', 'bg': 3.3200000000000003, 'ed': 3.48}, {'w': '来', 'bg': 3.48, 'ed': 3.62}, {'w': '了', 'bg': 3.62, 'ed': 3.7600000000000002}, {'w': '身', 'bg': 3.7600000000000002, 'ed': 3.9}, {'w': '体', 'bg': 3.9, 'ed': 4.0600000000000005}, {'w': '健', 'bg': 4.0600000000000005, 'ed': 4.26}, {'w': '康', 'bg': 4.26, 'ed': 4.96}]}
+ [2022-05-06 21:10:44,827] [ INFO] - audio duration: 4.9968125, elapsed time: 9.225094079971313, RTF=1.846195765794957
+ [2022-05-06 21:10:44,828] [ INFO] - asr websocket client finished : 我认为跑步最重要的就是给我带来了身体健康
```
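
The `RTF` reported in the final log line is simply the client's elapsed wall-clock time divided by the audio duration; a minimal sketch of the computation, with the two values copied from the log above:

```python
# Real-time factor (RTF): processing time relative to audio length.
# Both values are taken verbatim from the log output above.
audio_duration = 4.9968125        # seconds of audio in zh.wav
elapsed_time = 9.225094079971313  # wall-clock seconds spent by the client

rtf = elapsed_time / audio_duration
print(f"RTF={rtf}")  # RTF > 1 means decoding was slower than real time
```

An RTF above 1 here reflects CPU decoding of a streaming conformer model; deploying on GPU (see the `device` parameter) typically lowers it.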
- Python API
@@ -223,7 +229,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
```
Output:
- ```bash
+ ```text
[2022-05-06 21:14:03,137] [ INFO] - asr websocket client start
[2022-05-06 21:14:03,137] [ INFO] - endpoint: ws://127.0.0.1:8390/paddlespeech/asr/streaming
[2022-05-06 21:14:03,149] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
@@ -298,12 +304,11 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
- Command Line
    **Note:** By default the server is deployed on the 'CPU' device; it can be deployed on the 'GPU' by modifying the 'device' parameter in the service configuration file.
- ``` bash
+ ```bash
   # In the PaddleSpeech/demos/streaming_asr_server directory, launch the punctuation service
paddlespeech_server start --config_file conf/punc_application.yaml
```
-
Usage:
```bash
paddlespeech_server start --help
@@ -315,7 +320,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
Output:
- ``` bash
+ ```text
[2022-05-02 17:59:26,285] [ INFO] - Create the TextEngine Instance
[2022-05-02 17:59:26,285] [ INFO] - Init the text engine
[2022-05-02 17:59:26,285] [ INFO] - Text Engine set the device: gpu:0
@@ -347,26 +352,26 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
log_file="./log/paddlespeech.log")
```
- Output:
- ```
- [2022-05-02 18:09:02,542] [ INFO] - Create the TextEngine Instance
- [2022-05-02 18:09:02,543] [ INFO] - Init the text engine
- [2022-05-02 18:09:02,543] [ INFO] - Text Engine set the device: gpu:0
- [2022-05-02 18:09:02,545] [ INFO] - File /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar.gz md5 checking...
- [2022-05-02 18:09:06,919] [ INFO] - Use pretrained model stored in: /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar
- W0502 18:09:07.523002 22615 device_context.cc:447] Please NOTE: device: 0, GPU Compute Capability: 6.1, Driver API Version: 10.2, Runtime API Version: 10.2
- W0502 18:09:07.527882 22615 device_context.cc:465] device: 0, cuDNN Version: 7.6.
- [2022-05-02 18:09:10,900] [ INFO] - Already cached /home/users/xiongxinlei/.paddlenlp/models/ernie-1.0/vocab.txt
- [2022-05-02 18:09:10,913] [ INFO] - Init the text engine successfully
- INFO: Started server process [22615]
- [2022-05-02 18:09:10] [INFO] [server.py:75] Started server process [22615]
- INFO: Waiting for application startup.
- [2022-05-02 18:09:10] [INFO] [on.py:45] Waiting for application startup.
- INFO: Application startup complete.
- [2022-05-02 18:09:10] [INFO] [on.py:59] Application startup complete.
- INFO: Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
- [2022-05-02 18:09:10] [INFO] [server.py:206] Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
- ```
+ Output:
+ ```text
+ [2022-05-02 18:09:02,542] [ INFO] - Create the TextEngine Instance
+ [2022-05-02 18:09:02,543] [ INFO] - Init the text engine
+ [2022-05-02 18:09:02,543] [ INFO] - Text Engine set the device: gpu:0
+ [2022-05-02 18:09:02,545] [ INFO] - File /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar.gz md5 checking...
+ [2022-05-02 18:09:06,919] [ INFO] - Use pretrained model stored in: /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar
+ W0502 18:09:07.523002 22615 device_context.cc:447] Please NOTE: device: 0, GPU Compute Capability: 6.1, Driver API Version: 10.2, Runtime API Version: 10.2
+ W0502 18:09:07.527882 22615 device_context.cc:465] device: 0, cuDNN Version: 7.6.
+ [2022-05-02 18:09:10,900] [ INFO] - Already cached /home/users/xiongxinlei/.paddlenlp/models/ernie-1.0/vocab.txt
+ [2022-05-02 18:09:10,913] [ INFO] - Init the text engine successfully
+ INFO: Started server process [22615]
+ [2022-05-02 18:09:10] [INFO] [server.py:75] Started server process [22615]
+ INFO: Waiting for application startup.
+ [2022-05-02 18:09:10] [INFO] [on.py:45] Waiting for application startup.
+ INFO: Application startup complete.
+ [2022-05-02 18:09:10] [INFO] [on.py:59] Application startup complete.
+ INFO: Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
+ [2022-05-02 18:09:10] [INFO] [server.py:206] Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
+ ```
### 2. Client usage
**Note:** The response time will be slightly longer the first time the client is used.
@@ -375,17 +380,17 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
paddlespeech_client text --server_ip 127.0.0.1 --port 8190 --input "我认为跑步最重要的就是给我带来了身体健康"
```
Output
- ```
+ ```text
[2022-05-02 18:12:29,767] [ INFO] - The punc text: 我认为跑步最重要的就是给我带来了身体健康。
[2022-05-02 18:12:29,767] [ INFO] - Response time 0.096548 s.
```
-- Python3 API
+- Python API
```python
from paddlespeech.server.bin.paddlespeech_client import TextClientExecutor
@@ -399,11 +404,10 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
```
Output:
- ``` bash
+ ```text
我认为跑步最重要的就是给我带来了身体健康。
```
-
## Join streaming asr and punctuation server
By default, each server is deployed on the 'CPU' device; speech recognition and punctuation prediction can each be deployed on a different 'GPU' by modifying the 'device' parameter in the corresponding service configuration file.
@@ -412,7 +416,7 @@ We use `streaming_ asr_server.py` and `punc_server.py` two services to lanuch st
### 1. Start two servers
-``` bash
+```bash
# Note: streaming speech recognition and punctuation prediction are configured on different graphics cards through the configuration files
bash server.sh
```
@@ -422,11 +426,11 @@ bash server.sh
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8290 --punc.server_ip 127.0.0.1 --punc.port 8190 --input ./zh.wav
```
Output:
- ```
+ ```text
[2022-05-07 11:21:47,060] [ INFO] - asr websocket client start
[2022-05-07 11:21:47,060] [ INFO] - endpoint: ws://127.0.0.1:8490/paddlespeech/asr/streaming
[2022-05-07 11:21:47,080] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
@@ -500,11 +504,11 @@ bash server.sh
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
python3 websocket_client.py --server_ip 127.0.0.1 --port 8290 --punc.server_ip 127.0.0.1 --punc.port 8190 --wavfile ./zh.wav
```
Output:
- ```
+ ```text
[2022-05-07 11:11:02,984] [ INFO] - Start to do streaming asr client
[2022-05-07 11:11:02,985] [ INFO] - asr websocket client start
[2022-05-07 11:11:02,985] [ INFO] - endpoint: ws://127.0.0.1:8490/paddlespeech/asr/streaming
@@ -573,5 +577,3 @@ bash server.sh
[2022-05-07 11:11:18,915] [ INFO] - audio duration: 4.9968125, elapsed time: 15.928460597991943, RTF=3.187724293835709
[2022-05-07 11:11:18,916] [ INFO] - asr websocket client finished : 我认为跑步最重要的就是给我带来了身体健康
```
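
The final websocket message carries per-character timestamps in its `times` field (`bg`/`ed` are begin/end offsets in seconds). A small sketch, using a trimmed sample of that payload, of converting it into `(word, begin, end)` tuples and checking that adjacent segments are contiguous:

```python
# 'times' entries as sent in the final message: 'w' is the character,
# 'bg'/'ed' are begin/end offsets in seconds (trimmed sample from the log).
times = [
    {'w': '我', 'bg': 0.0, 'ed': 0.7},
    {'w': '认', 'bg': 0.7, 'ed': 0.84},
    {'w': '为', 'bg': 0.84, 'ed': 1.0},
]

segments = [(t['w'], t['bg'], t['ed']) for t in times]

# Adjacent segments are contiguous: each begins where the previous ended.
for (_, _, prev_ed), (_, bg, _) in zip(segments, segments[1:]):
    assert bg == prev_ed

transcript = ''.join(w for w, _, _ in segments)
print(transcript, segments[-1][2])  # text so far and its end time in seconds
```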
-
-
diff --git a/demos/streaming_asr_server/README_cn.md b/demos/streaming_asr_server/README_cn.md
index c771869e9..267367729 100644
--- a/demos/streaming_asr_server/README_cn.md
+++ b/demos/streaming_asr_server/README_cn.md
@@ -3,17 +3,22 @@
# 流式语音识别服务
## 介绍
-这个demo是一个启动流式语音服务和访问服务的实现。 它可以通过使用`paddlespeech_server` 和 `paddlespeech_client`的单个命令或 python 的几行代码来实现。
+这个 demo 是一个启动流式语音服务和访问服务的实现。 它可以通过使用 `paddlespeech_server` 和 `paddlespeech_client` 的单个命令或 python 的几行代码来实现。
**流式语音识别服务只支持 `weboscket` 协议,不支持 `http` 协议。**
+服务接口定义请参考:
+- [PaddleSpeech Streaming Server WebSocket API](https://github.com/PaddlePaddle/PaddleSpeech/wiki/PaddleSpeech-Server-WebSocket-API)
+
## 使用方法
### 1. 安装
安装 PaddleSpeech 的详细过程请看 [安装文档](https://github.com/PaddlePaddle/PaddleSpeech/blob/develop/docs/source/install.md)。
-推荐使用 **paddlepaddle 2.2.1** 或以上版本。
-你可以从medium,hard 两种方式中选择一种方式安装 PaddleSpeech。
+推荐使用 **paddlepaddle 2.3.1** 或以上版本。
+
+你可以从简单、中等、困难几种方式中选择一种方式安装 PaddleSpeech。
+**如果使用简单模式安装,需要自行准备 yaml 文件,可参考 conf 目录下的 yaml 文件。**
### 2. 准备配置文件
@@ -26,7 +31,6 @@
* conformer: `conf/ws_conformer_wenetspeech_application.yaml`
-
这个 ASR client 的输入应该是一个 WAV 文件(`.wav`),并且采样率必须与模型的采样率相同。
可以下载此 ASR client的示例音频:
@@ -54,28 +58,28 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
- `log_file`: log 文件. 默认:`./log/paddlespeech.log`
输出:
- ```bash
- [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
- [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
- [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
- [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
- [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1. 0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
- [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
- [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
- [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
- INFO: Started server process [4242]
- [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
- INFO: Waiting for application startup.
- [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
- INFO: Application startup complete.
- [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
- INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
- [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ ```text
+ [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
+ [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
+ [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
+ [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
+      [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
+      [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/chunk_conformer/checkpoints/avg_10.pdparams
+      [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/chunk_conformer/checkpoints/avg_10.pdparams
+ [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
+ [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
+ [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
+ [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
+ INFO: Started server process [4242]
+ [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
+ INFO: Waiting for application startup.
+ [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
+ INFO: Application startup complete.
+ [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
+ INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
```
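
The streaming client sends the wav file to the server in fixed-size PCM chunks, and each partial result in the logs above corresponds to a decoded chunk. A minimal sketch of the chunking, assuming hypothetical 16 kHz, 16-bit mono audio and an illustrative 100 ms chunk size (the actual chunk size is set by the service configuration, not here):

```python
def pcm_chunks(pcm: bytes, sample_rate: int = 16000, chunk_ms: int = 100):
    """Yield successive PCM chunks of chunk_ms milliseconds.

    Assumes 16-bit mono samples (2 bytes per sample); the last chunk may
    be shorter. The chunk duration is illustrative only, not the value
    used by the PaddleSpeech streaming server.
    """
    bytes_per_chunk = sample_rate * 2 * chunk_ms // 1000
    for start in range(0, len(pcm), bytes_per_chunk):
        yield pcm[start:start + bytes_per_chunk]

# One second of silence -> ten 100 ms chunks of 3200 bytes each.
chunks = list(pcm_chunks(b'\x00' * 32000))
print(len(chunks), len(chunks[0]))
```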
- Python API
@@ -90,29 +94,29 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
log_file="./log/paddlespeech.log")
```
- 输出:
- ```bash
- [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
- [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
- [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
- [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
- [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1. 0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/ chunk_conformer/checkpoints/avg_10.pdparams
- [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
- [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
- [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
- [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
- INFO: Started server process [4242]
- [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
- INFO: Waiting for application startup.
- [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
- INFO: Application startup complete.
- [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
- INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
- [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ 输出:
+ ```text
+ [2022-05-14 04:56:13,086] [ INFO] - create the online asr engine instance
+ [2022-05-14 04:56:13,086] [ INFO] - paddlespeech_server set the device: cpu
+ [2022-05-14 04:56:13,087] [ INFO] - Load the pretrained model, tag = conformer_online_wenetspeech-zh-16k
+ [2022-05-14 04:56:13,087] [ INFO] - File /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar.gz md5 checking...
+      [2022-05-14 04:56:17,542] [ INFO] - Use pretrained model stored in: /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar
+ [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/model.yaml
+      [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/chunk_conformer/checkpoints/avg_10.pdparams
+      [2022-05-14 04:56:17,543] [ INFO] - /root/.paddlespeech/models/conformer_online_wenetspeech-zh-16k/asr1_chunk_conformer_wenetspeech_ckpt_1.0.0a.model.tar/exp/chunk_conformer/checkpoints/avg_10.pdparams
+ [2022-05-14 04:56:17,852] [ INFO] - start to create the stream conformer asr engine
+ [2022-05-14 04:56:17,863] [ INFO] - model name: conformer_online
+ [2022-05-14 04:56:22,756] [ INFO] - create the transformer like model success
+ [2022-05-14 04:56:22,758] [ INFO] - Initialize ASR server engine successfully.
+ INFO: Started server process [4242]
+ [2022-05-14 04:56:22] [INFO] [server.py:75] Started server process [4242]
+ INFO: Waiting for application startup.
+ [2022-05-14 04:56:22] [INFO] [on.py:45] Waiting for application startup.
+ INFO: Application startup complete.
+ [2022-05-14 04:56:22] [INFO] [on.py:59] Application startup complete.
+ INFO: Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
+ [2022-05-14 04:56:22] [INFO] [server.py:211] Uvicorn running on http://0.0.0.0:8090 (Press CTRL+C to quit)
```
### 4. ASR 客户端使用方法
@@ -120,98 +124,97 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
**注意:** 初次使用客户端时响应时间会略长
- 命令行 (推荐使用)
- 若 `127.0.0.1` 不能访问,则需要使用实际服务 IP 地址
+ 若 `127.0.0.1` 不能访问,则需要使用实际服务 IP 地址
- ```
- paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8090 --input ./zh.wav
- ```
+ ```bash
+ paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8090 --input ./zh.wav
+ ```
- 使用帮助:
-
- ```bash
- paddlespeech_client asr_online --help
- ```
+ 使用帮助:
+
+ ```bash
+ paddlespeech_client asr_online --help
+ ```
- 参数:
- - `server_ip`: 服务端ip地址,默认: 127.0.0.1。
- - `port`: 服务端口,默认: 8090。
- - `input`(必须输入): 用于识别的音频文件。
- - `sample_rate`: 音频采样率,默认值:16000。
- - `lang`: 模型语言,默认值:zh_cn。
- - `audio_format`: 音频格式,默认值:wav。
- - `punc.server_ip` 标点预测服务的ip。默认是None。
- - `punc.server_port` 标点预测服务的端口port。默认是None。
-
- 输出:
-
- ```bash
- [2022-05-06 21:10:35,598] [ INFO] - Start to do streaming asr client
- [2022-05-06 21:10:35,600] [ INFO] - asr websocket client start
- [2022-05-06 21:10:35,600] [ INFO] - endpoint: ws://127.0.0.1:8390/paddlespeech/asr/streaming
- [2022-05-06 21:10:35,600] [ INFO] - start to process the wavscp: ./zh.wav
- [2022-05-06 21:10:35,670] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
- [2022-05-06 21:10:35,699] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,713] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,726] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,738] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,750] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,762] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,774] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:35,786] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,387] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,398] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,407] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,416] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,425] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,434] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,442] [ INFO] - client receive msg={'result': ''}
- [2022-05-06 21:10:36,930] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,938] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,946] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,954] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,962] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,970] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,977] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:36,985] [ INFO] - client receive msg={'result': '我认为跑'}
- [2022-05-06 21:10:37,484] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,492] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,500] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,508] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,517] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,525] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:37,532] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
- [2022-05-06 21:10:38,050] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,058] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,066] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,073] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,081] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,089] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,097] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,105] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
- [2022-05-06 21:10:38,630] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,639] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,647] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,655] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,663] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,671] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:38,679] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
- [2022-05-06 21:10:39,216] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,224] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,232] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,240] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,248] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,256] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,264] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,272] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
- [2022-05-06 21:10:39,885] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,896] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,905] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,915] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,924] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:39,934] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
- [2022-05-06 21:10:44,827] [ INFO] - client final receive msg={'status': 'ok', 'signal': 'finished', 'result': '我认为跑步最重要的就是给我带来了身体健康', 'times': [{'w': '我', 'bg': 0.0, 'ed': 0.7000000000000001}, {'w': '认', 'bg': 0.7000000000000001, 'ed': 0.84}, {'w': '为', 'bg': 0.84, 'ed': 1.0}, {'w': '跑', 'bg': 1.0, 'ed': 1.18}, {'w': '步', 'bg': 1.18, 'ed': 1.36}, {'w': '最', 'bg': 1.36, 'ed': 1.5}, {'w': '重', 'bg': 1.5, 'ed': 1.6400000000000001}, {'w': '要', 'bg': 1.6400000000000001, 'ed': 1.78}, {'w': '的', 'bg': 1.78, 'ed': 1.9000000000000001}, {'w': '就', 'bg': 1.9000000000000001, 'ed': 2.06}, {'w': '是', 'bg': 2.06, 'ed': 2.62}, {'w': '给', 'bg': 2.62, 'ed': 3.16}, {'w': '我', 'bg': 3.16, 'ed': 3.3200000000000003}, {'w': '带', 'bg': 3.3200000000000003, 'ed': 3.48}, {'w': '来', 'bg': 3.48, 'ed': 3.62}, {'w': '了', 'bg': 3.62, 'ed': 3.7600000000000002}, {'w': '身', 'bg': 3.7600000000000002, 'ed': 3.9}, {'w': '体', 'bg': 3.9, 'ed': 4.0600000000000005}, {'w': '健', 'bg': 4.0600000000000005, 'ed': 4.26}, {'w': '康', 'bg': 4.26, 'ed': 4.96}]}
- [2022-05-06 21:10:44,827] [ INFO] - audio duration: 4.9968125, elapsed time: 9.225094079971313, RTF=1.846195765794957
- [2022-05-06 21:10:44,828] [ INFO] - asr websocket client finished : 我认为跑步最重要的就是给我带来了身体健康
+ Parameters:
+ - `server_ip`: server IP address, default: 127.0.0.1.
+ - `port`: server port, default: 8090.
+ - `input` (required): the audio file to recognize.
+ - `sample_rate`: audio sample rate, default: 16000.
+ - `lang`: model language, default: zh_cn.
+ - `audio_format`: audio format, default: wav.
+ - `punc.server_ip`: IP of the punctuation prediction service, default: None.
+ - `punc.server_port`: port of the punctuation prediction service, default: None.
+
+ Output:
+ ```text
+ [2022-05-06 21:10:35,598] [ INFO] - Start to do streaming asr client
+ [2022-05-06 21:10:35,600] [ INFO] - asr websocket client start
+ [2022-05-06 21:10:35,600] [ INFO] - endpoint: ws://127.0.0.1:8390/paddlespeech/asr/streaming
+ [2022-05-06 21:10:35,600] [ INFO] - start to process the wavscp: ./zh.wav
+ [2022-05-06 21:10:35,670] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
+ [2022-05-06 21:10:35,699] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,713] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,726] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,738] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,750] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,762] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,774] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:35,786] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,387] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,398] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,407] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,416] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,425] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,434] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,442] [ INFO] - client receive msg={'result': ''}
+ [2022-05-06 21:10:36,930] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,938] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,946] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,954] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,962] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,970] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,977] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:36,985] [ INFO] - client receive msg={'result': '我认为跑'}
+ [2022-05-06 21:10:37,484] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,492] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,500] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,508] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,517] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,525] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:37,532] [ INFO] - client receive msg={'result': '我认为跑步最重要的'}
+ [2022-05-06 21:10:38,050] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,058] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,066] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,073] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,081] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,089] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,097] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,105] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是'}
+ [2022-05-06 21:10:38,630] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,639] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,647] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,655] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,663] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,671] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:38,679] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给'}
+ [2022-05-06 21:10:39,216] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,224] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,232] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,240] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,248] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,256] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,264] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,272] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了'}
+ [2022-05-06 21:10:39,885] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,896] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,905] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,915] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,924] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:39,934] [ INFO] - client receive msg={'result': '我认为跑步最重要的就是给我带来了身体健康'}
+ [2022-05-06 21:10:44,827] [ INFO] - client final receive msg={'status': 'ok', 'signal': 'finished', 'result': '我认为跑步最重要的就是给我带来了身体健康', 'times': [{'w': '我', 'bg': 0.0, 'ed': 0.7000000000000001}, {'w': '认', 'bg': 0.7000000000000001, 'ed': 0.84}, {'w': '为', 'bg': 0.84, 'ed': 1.0}, {'w': '跑', 'bg': 1.0, 'ed': 1.18}, {'w': '步', 'bg': 1.18, 'ed': 1.36}, {'w': '最', 'bg': 1.36, 'ed': 1.5}, {'w': '重', 'bg': 1.5, 'ed': 1.6400000000000001}, {'w': '要', 'bg': 1.6400000000000001, 'ed': 1.78}, {'w': '的', 'bg': 1.78, 'ed': 1.9000000000000001}, {'w': '就', 'bg': 1.9000000000000001, 'ed': 2.06}, {'w': '是', 'bg': 2.06, 'ed': 2.62}, {'w': '给', 'bg': 2.62, 'ed': 3.16}, {'w': '我', 'bg': 3.16, 'ed': 3.3200000000000003}, {'w': '带', 'bg': 3.3200000000000003, 'ed': 3.48}, {'w': '来', 'bg': 3.48, 'ed': 3.62}, {'w': '了', 'bg': 3.62, 'ed': 3.7600000000000002}, {'w': '身', 'bg': 3.7600000000000002, 'ed': 3.9}, {'w': '体', 'bg': 3.9, 'ed': 4.0600000000000005}, {'w': '健', 'bg': 4.0600000000000005, 'ed': 4.26}, {'w': '康', 'bg': 4.26, 'ed': 4.96}]}
+ [2022-05-06 21:10:44,827] [ INFO] - audio duration: 4.9968125, elapsed time: 9.225094079971313, RTF=1.846195765794957
+ [2022-05-06 21:10:44,828] [ INFO] - asr websocket client finished : 我认为跑步最重要的就是给我带来了身体健康
```
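The messages in the log above follow a simple pattern: a `server_ready` signal once the connection is accepted, a stream of partial `result` updates, and a final message with `signal: finished` carrying the full transcript and word timestamps. A minimal sketch of how a client might classify these messages (the protocol shape is inferred from the log; the helper name is illustrative, not part of the PaddleSpeech API):

```python
import json

def classify_message(raw: str):
    """Classify one server message from the streaming ASR websocket.

    Illustrative sketch based on the log above, not the official client.
    Returns (kind, payload): kind is 'ready', 'partial', or 'final'.
    """
    msg = json.loads(raw)
    if msg.get("signal") == "server_ready":
        return "ready", None
    if msg.get("signal") == "finished":
        return "final", msg.get("result")
    return "partial", msg.get("result", "")

kind, text = classify_message('{"status": "ok", "signal": "server_ready"}')
```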
- Python API
@@ -230,7 +233,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
```
Output:
- ```bash
+ ```text
[2022-05-06 21:14:03,137] [ INFO] - asr websocket client start
[2022-05-06 21:14:03,137] [ INFO] - endpoint: ws://127.0.0.1:8390/paddlespeech/asr/streaming
[2022-05-06 21:14:03,149] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
@@ -297,34 +300,29 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
[2022-05-06 21:14:12,159] [ INFO] - audio duration: 4.9968125, elapsed time: 9.019973039627075, RTF=1.8051453881103354
[2022-05-06 21:14:12,160] [ INFO] - asr websocket client finished
```
-
-
-
## Punctuation Prediction
### 1. Server Usage
- Command line
**Note:** The service is deployed on the `cpu` device by default; it can be deployed on `gpu` by changing the `device` parameter in the server config file.
- ``` bash
- 在 PaddleSpeech/demos/streaming_asr_server 目录下启动标点预测服务
+ ```bash
+ # Start the punctuation prediction service under the PaddleSpeech/demos/streaming_asr_server directory
paddlespeech_server start --config_file conf/punc_application.yaml
```
-
- 使用方法:
-
+ Usage:
```bash
paddlespeech_server start --help
```
- 参数:
+ Parameters:
- `config_file`: configuration file of the service.
- `log_file`: log file.
- 输出:
- ``` bash
+ Output:
+ ```text
[2022-05-02 17:59:26,285] [ INFO] - Create the TextEngine Instance
[2022-05-02 17:59:26,285] [ INFO] - Init the text engine
[2022-05-02 17:59:26,285] [ INFO] - Text Engine set the device: gpu:0
@@ -356,26 +354,26 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
log_file="./log/paddlespeech.log")
```
- 输出
- ```
- [2022-05-02 18:09:02,542] [ INFO] - Create the TextEngine Instance
- [2022-05-02 18:09:02,543] [ INFO] - Init the text engine
- [2022-05-02 18:09:02,543] [ INFO] - Text Engine set the device: gpu:0
- [2022-05-02 18:09:02,545] [ INFO] - File /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar.gz md5 checking...
- [2022-05-02 18:09:06,919] [ INFO] - Use pretrained model stored in: /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar
- W0502 18:09:07.523002 22615 device_context.cc:447] Please NOTE: device: 0, GPU Compute Capability: 6.1, Driver API Version: 10.2, Runtime API Version: 10.2
- W0502 18:09:07.527882 22615 device_context.cc:465] device: 0, cuDNN Version: 7.6.
- [2022-05-02 18:09:10,900] [ INFO] - Already cached /home/users/xiongxinlei/.paddlenlp/models/ernie-1.0/vocab.txt
- [2022-05-02 18:09:10,913] [ INFO] - Init the text engine successfully
- INFO: Started server process [22615]
- [2022-05-02 18:09:10] [INFO] [server.py:75] Started server process [22615]
- INFO: Waiting for application startup.
- [2022-05-02 18:09:10] [INFO] [on.py:45] Waiting for application startup.
- INFO: Application startup complete.
- [2022-05-02 18:09:10] [INFO] [on.py:59] Application startup complete.
- INFO: Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
- [2022-05-02 18:09:10] [INFO] [server.py:206] Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
- ```
+ Output:
+ ```text
+ [2022-05-02 18:09:02,542] [ INFO] - Create the TextEngine Instance
+ [2022-05-02 18:09:02,543] [ INFO] - Init the text engine
+ [2022-05-02 18:09:02,543] [ INFO] - Text Engine set the device: gpu:0
+ [2022-05-02 18:09:02,545] [ INFO] - File /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar.gz md5 checking...
+ [2022-05-02 18:09:06,919] [ INFO] - Use pretrained model stored in: /home/users/xiongxinlei/.paddlespeech/models/ernie_linear_p3_wudao-punc-zh/ernie_linear_p3_wudao-punc-zh.tar
+ W0502 18:09:07.523002 22615 device_context.cc:447] Please NOTE: device: 0, GPU Compute Capability: 6.1, Driver API Version: 10.2, Runtime API Version: 10.2
+ W0502 18:09:07.527882 22615 device_context.cc:465] device: 0, cuDNN Version: 7.6.
+ [2022-05-02 18:09:10,900] [ INFO] - Already cached /home/users/xiongxinlei/.paddlenlp/models/ernie-1.0/vocab.txt
+ [2022-05-02 18:09:10,913] [ INFO] - Init the text engine successfully
+ INFO: Started server process [22615]
+ [2022-05-02 18:09:10] [INFO] [server.py:75] Started server process [22615]
+ INFO: Waiting for application startup.
+ [2022-05-02 18:09:10] [INFO] [on.py:45] Waiting for application startup.
+ INFO: Application startup complete.
+ [2022-05-02 18:09:10] [INFO] [on.py:59] Application startup complete.
+ INFO: Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
+ [2022-05-02 18:09:10] [INFO] [server.py:206] Uvicorn running on http://0.0.0.0:8190 (Press CTRL+C to quit)
+ ```
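The startup log ends with Uvicorn listening on a TCP port (8190 above). When scripting against the service, one simple readiness check is to poll that port before sending the first request. A hedged sketch (the helper name and timeout are illustrative, not part of PaddleSpeech):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout_s: float = 30.0) -> bool:
    """Poll until a TCP port accepts connections (illustrative helper)."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # server is accepting connections
        except OSError:
            time.sleep(0.5)  # not up yet, retry
    return False

# e.g. wait_for_port("127.0.0.1", 8190) before calling the text client
```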
### 2. Punctuation Prediction Client Usage
**Note:** The first request from the client takes slightly longer to respond.
@@ -384,17 +382,17 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
- paddlespeech_client text --server_ip 127.0.0.1 --port 8190 --input "我认为跑步最重要的就是给我带来了身体健康"
- ```
-
- 输出
+ ```bash
+ paddlespeech_client text --server_ip 127.0.0.1 --port 8190 --input "我认为跑步最重要的就是给我带来了身体健康"
```
+
+ Output:
+ ```text
[2022-05-02 18:12:29,767] [ INFO] - The punc text: 我认为跑步最重要的就是给我带来了身体健康。
[2022-05-02 18:12:29,767] [ INFO] - Response time 0.096548 s.
```
-- Python3 API
+- Python API
```python
from paddlespeech.server.bin.paddlespeech_client import TextClientExecutor
@@ -407,12 +405,11 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
print(res)
```
- 输出:
- ``` bash
+ Output:
+ ```text
我认为跑步最重要的就是给我带来了身体健康。
```
-
## Joint Streaming ASR and Punctuation Prediction
**Note:** Services are deployed on the `cpu` device by default; speech recognition and punctuation prediction can be deployed on different `gpu` devices by changing the `device` parameter in the server config files.
@@ -420,7 +417,7 @@ wget -c https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav
### 1. Start the Services
-``` bash
+```bash
# Note: streaming ASR and punctuation prediction are assigned to different GPUs via their config files
bash server.sh
```
@@ -430,11 +427,11 @@ bash server.sh
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8290 --punc.server_ip 127.0.0.1 --punc.port 8190 --input ./zh.wav
```
- 输出:
- ```
+ Output:
+ ```text
[2022-05-07 11:21:47,060] [ INFO] - asr websocket client start
[2022-05-07 11:21:47,060] [ INFO] - endpoint: ws://127.0.0.1:8490/paddlespeech/asr/streaming
[2022-05-07 11:21:47,080] [ INFO] - client receive msg={"status": "ok", "signal": "server_ready"}
@@ -508,11 +505,11 @@ bash server.sh
If `127.0.0.1` is not accessible, you need to use the actual service IP address.
- ```
+ ```bash
python3 websocket_client.py --server_ip 127.0.0.1 --port 8290 --punc.server_ip 127.0.0.1 --punc.port 8190 --wavfile ./zh.wav
```
- 输出:
- ```
+ Output:
+ ```text
[2022-05-07 11:11:02,984] [ INFO] - Start to do streaming asr client
[2022-05-07 11:11:02,985] [ INFO] - asr websocket client start
[2022-05-07 11:11:02,985] [ INFO] - endpoint: ws://127.0.0.1:8490/paddlespeech/asr/streaming
@@ -581,5 +578,3 @@ bash server.sh
[2022-05-07 11:11:18,915] [ INFO] - audio duration: 4.9968125, elapsed time: 15.928460597991943, RTF=3.187724293835709
[2022-05-07 11:11:18,916] [ INFO] - asr websocket client finished : 我认为跑步最重要的就是给我带来了身体健康
```
-
-
diff --git a/demos/streaming_asr_server/conf/ws_ds2_application.yaml b/demos/streaming_asr_server/conf/ws_ds2_application.yaml
index e36a829cc..ac20b2a23 100644
--- a/demos/streaming_asr_server/conf/ws_ds2_application.yaml
+++ b/demos/streaming_asr_server/conf/ws_ds2_application.yaml
@@ -18,12 +18,13 @@ engine_list: ['asr_online-onnx']
# ENGINE CONFIG #
#################################################################################
+
################################### ASR #########################################
-################### speech task: asr; engine_type: online-inference #######################
-asr_online-inference:
+################### speech task: asr; engine_type: online-onnx #######################
+asr_online-onnx:
model_type: 'deepspeech2online_wenetspeech'
- am_model: # the pdmodel file of am static model [optional]
- am_params: # the pdiparams file of am static model [optional]
+ am_model: # the pdmodel file of onnx am static model [optional]
+ am_params: # the pdiparams file of am static model [optional]
lang: 'zh'
sample_rate: 16000
cfg_path:
@@ -32,11 +33,14 @@ asr_online-inference:
force_yes: True
device: 'cpu' # cpu or gpu:id
+ # https://onnxruntime.ai/docs/api/python/api_summary.html#inferencesession
am_predictor_conf:
- device: # set 'gpu:id' or 'cpu'
- switch_ir_optim: True
- glog_info: False # True -> print glog
- summary: True # False -> do not show predictor config
+ device: 'cpu' # set 'gpu:id' or 'cpu'
+ graph_optimization_level: 0
+ intra_op_num_threads: 0 # Sets the number of threads used to parallelize the execution within nodes.
+ inter_op_num_threads: 0 # Sets the number of threads used to parallelize the execution of the graph (across nodes).
+ log_severity_level: 2 # Log severity level. Applies to session load, initialization, etc. 0:Verbose, 1:Info, 2:Warning, 3:Error, 4:Fatal. Default is 2.
+ log_verbosity_level: 0 # VLOG level if DEBUG build and session_log_severity_level is 0. Applies to session load, initialization, etc. Default is 0.
chunk_buffer_conf:
frame_duration_ms: 85
@@ -49,13 +53,12 @@ asr_online-inference:
shift_ms: 10 # ms
-
################################### ASR #########################################
-################### speech task: asr; engine_type: online-onnx #######################
-asr_online-onnx:
+################### speech task: asr; engine_type: online-inference #######################
+asr_online-inference:
model_type: 'deepspeech2online_wenetspeech'
- am_model: # the pdmodel file of onnx am static model [optional]
- am_params: # the pdiparams file of am static model [optional]
+ am_model: # the pdmodel file of am static model [optional]
+ am_params: # the pdiparams file of am static model [optional]
lang: 'zh'
sample_rate: 16000
cfg_path:
@@ -64,21 +67,18 @@ asr_online-onnx:
force_yes: True
device: 'cpu' # cpu or gpu:id
- # https://onnxruntime.ai/docs/api/python/api_summary.html#inferencesession
am_predictor_conf:
- device: 'cpu' # set 'gpu:id' or 'cpu'
- graph_optimization_level: 0
- intra_op_num_threads: 0 # Sets the number of threads used to parallelize the execution within nodes.
- inter_op_num_threads: 0 # Sets the number of threads used to parallelize the execution of the graph (across nodes).
- log_severity_level: 2 # Log severity level. Applies to session load, initialization, etc. 0:Verbose, 1:Info, 2:Warning. 3:Error, 4:Fatal. Default is 2.
- log_verbosity_level: 0 # VLOG level if DEBUG build and session_log_severity_level is 0. Applies to session load, initialization, etc. Default is 0.
+ device: # set 'gpu:id' or 'cpu'
+ switch_ir_optim: True
+ glog_info: False # True -> print glog
+ summary: True # False -> do not show predictor config
chunk_buffer_conf:
- frame_duration_ms: 80
+ frame_duration_ms: 85
shift_ms: 40
sample_rate: 16000
sample_width: 2
window_n: 7 # frame
shift_n: 4 # frame
window_ms: 25 # ms
- shift_ms: 10 # ms
+ shift_ms: 10 # ms
\ No newline at end of file
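Under `chunk_buffer_conf`, the amount of audio the client streams per chunk follows from `frame_duration_ms`, `sample_rate`, and `sample_width`. Assuming the chunks are raw PCM (an assumption about how these three values combine; the helper below is an illustrative sketch, not project code), the byte count per chunk works out as:

```python
def chunk_size_bytes(frame_duration_ms: int,
                     sample_rate: int = 16000,
                     sample_width: int = 2) -> int:
    """Bytes of raw PCM per streamed chunk (illustrative sketch)."""
    samples = sample_rate * frame_duration_ms // 1000  # samples per chunk
    return samples * sample_width                      # bytes per chunk

# 85 ms at 16 kHz, 16-bit mono: 1360 samples, i.e. 2720 bytes per chunk
print(chunk_size_bytes(85))
```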
diff --git a/demos/streaming_asr_server/punc_server.py b/demos/streaming_asr_server/local/punc_server.py
similarity index 100%
rename from demos/streaming_asr_server/punc_server.py
rename to demos/streaming_asr_server/local/punc_server.py
diff --git a/demos/streaming_asr_server/local/rtf_from_log.py b/demos/streaming_asr_server/local/rtf_from_log.py
index a5634388b..4b89b48fd 100755
--- a/demos/streaming_asr_server/local/rtf_from_log.py
+++ b/demos/streaming_asr_server/local/rtf_from_log.py
@@ -33,7 +33,8 @@ if __name__ == '__main__':
P = 0.0
n = 0
for m in rtfs:
- n += 1
+ # not accurate: the log may contain duplicate entries
+ n += 1
T += m['T']
P += m['P']
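The loop above accumulates total audio time `T` and processing time `P` across log records, with the caveat that duplicated log lines inflate both sums. A hedged sketch of the aggregation with a simple de-duplication pass (the record shape `{'T': ..., 'P': ...}` follows the script; the dedup key is an assumption):

```python
def overall_rtf(records):
    """Aggregate per-utterance audio time T and processing time P into one
    real-time factor RTF = P / T, skipping exact duplicate records first
    (the dedup key is an assumption, not what the script does)."""
    seen = set()
    T = P = 0.0
    for m in records:
        key = (m["T"], m["P"])
        if key in seen:
            continue  # likely a duplicated log line
        seen.add(key)
        T += m["T"]
        P += m["P"]
    return P / T

# the duplicated second record is ignored, so RTF matches the single utterance
rtf = overall_rtf([{"T": 4.9968125, "P": 9.225094079971313},
                   {"T": 4.9968125, "P": 9.225094079971313}])
```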
diff --git a/demos/streaming_asr_server/streaming_asr_server.py b/demos/streaming_asr_server/local/streaming_asr_server.py
similarity index 100%
rename from demos/streaming_asr_server/streaming_asr_server.py
rename to demos/streaming_asr_server/local/streaming_asr_server.py
diff --git a/demos/streaming_asr_server/local/websocket_client.py b/demos/streaming_asr_server/local/websocket_client.py
index 51ae7a2f4..8b70eb2d6 100644
--- a/demos/streaming_asr_server/local/websocket_client.py
+++ b/demos/streaming_asr_server/local/websocket_client.py
@@ -18,7 +18,6 @@
import argparse
import asyncio
import codecs
-import logging
import os
from paddlespeech.cli.log import logger
@@ -44,7 +43,7 @@ def main(args):
# support to process batch audios from wav.scp
if args.wavscp and os.path.exists(args.wavscp):
- logging.info(f"start to process the wavscp: {args.wavscp}")
+ logger.info(f"start to process the wavscp: {args.wavscp}")
with codecs.open(args.wavscp, 'r', encoding='utf-8') as f,\
codecs.open("result.txt", 'w', encoding='utf-8') as w:
for line in f:
diff --git a/demos/streaming_asr_server/run.sh b/demos/streaming_asr_server/run.sh
old mode 100644
new mode 100755
diff --git a/demos/streaming_asr_server/server.sh b/demos/streaming_asr_server/server.sh
index f532546e7..961cb046a 100755
--- a/demos/streaming_asr_server/server.sh
+++ b/demos/streaming_asr_server/server.sh
@@ -1,9 +1,8 @@
-export CUDA_VISIBLE_DEVICE=0,1,2,3
- export CUDA_VISIBLE_DEVICE=0,1,2,3
+#export CUDA_VISIBLE_DEVICE=0,1,2,3
-# nohup python3 punc_server.py --config_file conf/punc_application.yaml > punc.log 2>&1 &
+# nohup python3 local/punc_server.py --config_file conf/punc_application.yaml > punc.log 2>&1 &
paddlespeech_server start --config_file conf/punc_application.yaml &> punc.log &
-# nohup python3 streaming_asr_server.py --config_file conf/ws_conformer_wenetspeech_application.yaml > streaming_asr.log 2>&1 &
+# nohup python3 local/streaming_asr_server.py --config_file conf/ws_conformer_wenetspeech_application.yaml > streaming_asr.log 2>&1 &
paddlespeech_server start --config_file conf/ws_conformer_wenetspeech_application.yaml &> streaming_asr.log &
diff --git a/demos/streaming_asr_server/test.sh b/demos/streaming_asr_server/test.sh
index 67a5ec4c5..386c7f894 100755
--- a/demos/streaming_asr_server/test.sh
+++ b/demos/streaming_asr_server/test.sh
@@ -7,5 +7,5 @@ paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8090 --input ./zh.wa
# read the wav and call streaming and punc service
# If `127.0.0.1` is not accessible, you need to use the actual service IP address.
-paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8290 --punc.server_ip 127.0.0.1 --punc.port 8190 --input ./zh.wav
+paddlespeech_client asr_online --server_ip 127.0.0.1 --port 8090 --punc.server_ip 127.0.0.1 --punc.port 8190 --input ./zh.wav
diff --git a/demos/streaming_asr_server/web/app.py b/demos/streaming_asr_server/web/app.py
deleted file mode 100644
index 22993c08e..000000000
--- a/demos/streaming_asr_server/web/app.py
+++ /dev/null
@@ -1,23 +0,0 @@
-#!/usr/bin/env python3
-# -*- coding: utf-8 -*-
-# Copyright 2021 Mobvoi Inc. All Rights Reserved.
-# Author: zhendong.peng@mobvoi.com (Zhendong Peng)
-import argparse
-
-from flask import Flask
-from flask import render_template
-
-parser = argparse.ArgumentParser(description='training your network')
-parser.add_argument('--port', default=19999, type=int, help='port id')
-args = parser.parse_args()
-
-app = Flask(__name__)
-
-
-@app.route('/')
-def index():
- return render_template('index.html')
-
-
-if __name__ == '__main__':
- app.run(host='0.0.0.0', port=args.port, debug=True)
diff --git a/demos/streaming_asr_server/web/favicon.ico b/demos/streaming_asr_server/web/favicon.ico
new file mode 100644
index 000000000..342038720
Binary files /dev/null and b/demos/streaming_asr_server/web/favicon.ico differ
diff --git a/demos/streaming_asr_server/web/index.html b/demos/streaming_asr_server/web/index.html
new file mode 100644
index 000000000..33c676c55
--- /dev/null
+++ b/demos/streaming_asr_server/web/index.html
@@ -0,0 +1,218 @@
+
+
+
+
+
+
+ 飞桨PaddleSpeech
+
+
+
+
+
+
+
+
diff --git a/demos/streaming_asr_server/web/paddle_web_demo.png b/demos/streaming_asr_server/web/paddle_web_demo.png
index 214edffd0..db4b63ab9 100644
Binary files a/demos/streaming_asr_server/web/paddle_web_demo.png and b/demos/streaming_asr_server/web/paddle_web_demo.png differ
diff --git a/demos/streaming_asr_server/web/readme.md b/demos/streaming_asr_server/web/readme.md
index 8310a2571..bef421711 100644
--- a/demos/streaming_asr_server/web/readme.md
+++ b/demos/streaming_asr_server/web/readme.md
@@ -1,18 +1,20 @@
# PaddleSpeech Serving Web Demo
-- 感谢[wenet](https://github.com/wenet-e2e/wenet)团队的前端demo代码.
+
+step 1: Start the streaming ASR server
-## 使用方法
-### 1. 在本地电脑启动网页服务
- ```
- python app.py
+```bash
+# start the streaming ASR service
+cd PaddleSpeech/demos/streaming_asr_server
+paddlespeech_server start --config_file conf/ws_conformer_wenetspeech_application_faster.yaml
+```
- ```
+step 2: Open `index.html` under the `web` directory in Google Chrome
-### 2. 本地电脑浏览器
+step 3: Click `Connect` to verify that the WebSocket connects successfully
+
+step 4: Click start recording (allow microphone access when prompted)
-在浏览器中输入127.0.0.1:19999 即可看到相关网页Demo。
-
diff --git a/demos/streaming_asr_server/web/static/css/font-awesome.min.css b/demos/streaming_asr_server/web/static/css/font-awesome.min.css
deleted file mode 100644
index 540440ce8..000000000
--- a/demos/streaming_asr_server/web/static/css/font-awesome.min.css
+++ /dev/null
@@ -1,4 +0,0 @@
-/*!
- * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome
- * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License)
- */@font-face{font-family:'FontAwesome';src:url('../fonts/fontawesome-webfont.eot?v=4.7.0');src:url('../fonts/fontawesome-webfont.eot?#iefix&v=4.7.0') format('embedded-opentype'),url('../fonts/fontawesome-webfont.woff2?v=4.7.0') format('woff2'),url('../fonts/fontawesome-webfont.woff?v=4.7.0') format('woff'),url('../fonts/fontawesome-webfont.ttf?v=4.7.0') format('truetype'),url('../fonts/fontawesome-webfont.svg?v=4.7.0#fontawesomeregular') format('svg');font-weight:normal;font-style:normal}.fa{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571429em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14285714em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14285714em;width:2.14285714em;top:.14285714em;text-align:center}.fa-li.fa-lg{left:-1.85714286em}.fa-border{padding:.2em .25em .15em;border:solid .08em #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa.fa-pull-left{margin-right:.3em}.fa.fa-pull-right{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left{margin-right:.3em}.fa.pull-right{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s infinite linear;animation:fa-spin 2s infinite linear}.fa-pulse{-webkit-animation:fa-spin 1s infinite steps(8);animation:fa-spin 1s infinite steps(8)}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}100%{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes 
fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}100%{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scale(-1, 1);-ms-transform:scale(-1, 1);transform:scale(-1, 1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scale(1, -1);-ms-transform:scale(1, -1);transform:scale(1, -1)}:root .fa-rotate-90,:root .fa-rotate-180,:root .fa-rotate-270,:root .fa-flip-horizontal,:root 
.fa-flip-vertical{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:"\f000"}.fa-music:before{content:"\f001"}.fa-search:before{content:"\f002"}.fa-envelope-o:before{content:"\f003"}.fa-heart:before{content:"\f004"}.fa-star:before{content:"\f005"}.fa-star-o:before{content:"\f006"}.fa-user:before{content:"\f007"}.fa-film:before{content:"\f008"}.fa-th-large:before{content:"\f009"}.fa-th:before{content:"\f00a"}.fa-th-list:before{content:"\f00b"}.fa-check:before{content:"\f00c"}.fa-remove:before,.fa-close:before,.fa-times:before{content:"\f00d"}.fa-search-plus:before{content:"\f00e"}.fa-search-minus:before{content:"\f010"}.fa-power-off:before{content:"\f011"}.fa-signal:before{content:"\f012"}.fa-gear:before,.fa-cog:before{content:"\f013"}.fa-trash-o:before{content:"\f014"}.fa-home:before{content:"\f015"}.fa-file-o:before{content:"\f016"}.fa-clock-o:before{content:"\f017"}.fa-road:before{content:"\f018"}.fa-download:before{content:"\f019"}.fa-arrow-circle-o-down:before{content:"\f01a"}.fa-arrow-circle-o-up:before{content:"\f01b"}.fa-inbox:before{content:"\f01c"}.fa-play-circle-o:before{content:"\f01d"}.fa-rotate-right:before,.fa-repeat:before{content:"\f01e"}.fa-refresh:before{content:"\f021"}.fa-list-alt:before{content:"\f022"}.fa-lock:before{content:"\f023"}.fa-flag:before{content:"\f024"}.fa-headphones:before{content:"\f025"}.fa-volume-off:before{content:"\f026"}.fa-volume-down:before{content:"\f027"}.fa-volume-up:before{content:"\f028"}.fa-qrcode:before{content:"\f029"}.fa-barcode:before{content:"\f02a"}.fa-tag:before{content:"\f02b"}.fa-tags:before{content:"\f02c"}.fa-book:before{content:"\f02d"}.fa-bookmark:before{content:"\f02e"}.fa-print:before{content:"\f02f"}.fa-camera:before{content:"\f030"}.fa-font:before{c
ontent:"\f031"}.fa-bold:before{content:"\f032"}.fa-italic:before{content:"\f033"}.fa-text-height:before{content:"\f034"}.fa-text-width:before{content:"\f035"}.fa-align-left:before{content:"\f036"}.fa-align-center:before{content:"\f037"}.fa-align-right:before{content:"\f038"}.fa-align-justify:before{content:"\f039"}.fa-list:before{content:"\f03a"}.fa-dedent:before,.fa-outdent:before{content:"\f03b"}.fa-indent:before{content:"\f03c"}.fa-video-camera:before{content:"\f03d"}.fa-photo:before,.fa-image:before,.fa-picture-o:before{content:"\f03e"}.fa-pencil:before{content:"\f040"}.fa-map-marker:before{content:"\f041"}.fa-adjust:before{content:"\f042"}.fa-tint:before{content:"\f043"}.fa-edit:before,.fa-pencil-square-o:before{content:"\f044"}.fa-share-square-o:before{content:"\f045"}.fa-check-square-o:before{content:"\f046"}.fa-arrows:before{content:"\f047"}.fa-step-backward:before{content:"\f048"}.fa-fast-backward:before{content:"\f049"}.fa-backward:before{content:"\f04a"}.fa-play:before{content:"\f04b"}.fa-pause:before{content:"\f04c"}.fa-stop:before{content:"\f04d"}.fa-forward:before{content:"\f04e"}.fa-fast-forward:before{content:"\f050"}.fa-step-forward:before{content:"\f051"}.fa-eject:before{content:"\f052"}.fa-chevron-left:before{content:"\f053"}.fa-chevron-right:before{content:"\f054"}.fa-plus-circle:before{content:"\f055"}.fa-minus-circle:before{content:"\f056"}.fa-times-circle:before{content:"\f057"}.fa-check-circle:before{content:"\f058"}.fa-question-circle:before{content:"\f059"}.fa-info-circle:before{content:"\f05a"}.fa-crosshairs:before{content:"\f05b"}.fa-times-circle-o:before{content:"\f05c"}.fa-check-circle-o:before{content:"\f05d"}.fa-ban:before{content:"\f05e"}.fa-arrow-left:before{content:"\f060"}.fa-arrow-right:before{content:"\f061"}.fa-arrow-up:before{content:"\f062"}.fa-arrow-down:before{content:"\f063"}.fa-mail-forward:before,.fa-share:before{content:"\f064"}.fa-expand:before{content:"\f065"}.fa-compress:before{content:"\f066"}.fa-plus:before{content
:"\f067"}.fa-minus:before{content:"\f068"}.fa-asterisk:before{content:"\f069"}.fa-exclamation-circle:before{content:"\f06a"}.fa-gift:before{content:"\f06b"}.fa-leaf:before{content:"\f06c"}.fa-fire:before{content:"\f06d"}.fa-eye:before{content:"\f06e"}.fa-eye-slash:before{content:"\f070"}.fa-warning:before,.fa-exclamation-triangle:before{content:"\f071"}.fa-plane:before{content:"\f072"}.fa-calendar:before{content:"\f073"}.fa-random:before{content:"\f074"}.fa-comment:before{content:"\f075"}.fa-magnet:before{content:"\f076"}.fa-chevron-up:before{content:"\f077"}.fa-chevron-down:before{content:"\f078"}.fa-retweet:before{content:"\f079"}.fa-shopping-cart:before{content:"\f07a"}.fa-folder:before{content:"\f07b"}.fa-folder-open:before{content:"\f07c"}.fa-arrows-v:before{content:"\f07d"}.fa-arrows-h:before{content:"\f07e"}.fa-bar-chart-o:before,.fa-bar-chart:before{content:"\f080"}.fa-twitter-square:before{content:"\f081"}.fa-facebook-square:before{content:"\f082"}.fa-camera-retro:before{content:"\f083"}.fa-key:before{content:"\f084"}.fa-gears:before,.fa-cogs:before{content:"\f085"}.fa-comments:before{content:"\f086"}.fa-thumbs-o-up:before{content:"\f087"}.fa-thumbs-o-down:before{content:"\f088"}.fa-star-half:before{content:"\f089"}.fa-heart-o:before{content:"\f08a"}.fa-sign-out:before{content:"\f08b"}.fa-linkedin-square:before{content:"\f08c"}.fa-thumb-tack:before{content:"\f08d"}.fa-external-link:before{content:"\f08e"}.fa-sign-in:before{content:"\f090"}.fa-trophy:before{content:"\f091"}.fa-github-square:before{content:"\f092"}.fa-upload:before{content:"\f093"}.fa-lemon-o:before{content:"\f094"}.fa-phone:before{content:"\f095"}.fa-square-o:before{content:"\f096"}.fa-bookmark-o:before{content:"\f097"}.fa-phone-square:before{content:"\f098"}.fa-twitter:before{content:"\f099"}.fa-facebook-f:before,.fa-facebook:before{content:"\f09a"}.fa-github:before{content:"\f09b"}.fa-unlock:before{content:"\f09c"}.fa-credit-card:before{content:"\f09d"}.fa-feed:before,.fa-rss:before{conten
t:"\f09e"}.fa-hdd-o:before{content:"\f0a0"}.fa-bullhorn:before{content:"\f0a1"}.fa-bell:before{content:"\f0f3"}.fa-certificate:before{content:"\f0a3"}.fa-hand-o-right:before{content:"\f0a4"}.fa-hand-o-left:before{content:"\f0a5"}.fa-hand-o-up:before{content:"\f0a6"}.fa-hand-o-down:before{content:"\f0a7"}.fa-arrow-circle-left:before{content:"\f0a8"}.fa-arrow-circle-right:before{content:"\f0a9"}.fa-arrow-circle-up:before{content:"\f0aa"}.fa-arrow-circle-down:before{content:"\f0ab"}.fa-globe:before{content:"\f0ac"}.fa-wrench:before{content:"\f0ad"}.fa-tasks:before{content:"\f0ae"}.fa-filter:before{content:"\f0b0"}.fa-briefcase:before{content:"\f0b1"}.fa-arrows-alt:before{content:"\f0b2"}.fa-group:before,.fa-users:before{content:"\f0c0"}.fa-chain:before,.fa-link:before{content:"\f0c1"}.fa-cloud:before{content:"\f0c2"}.fa-flask:before{content:"\f0c3"}.fa-cut:before,.fa-scissors:before{content:"\f0c4"}.fa-copy:before,.fa-files-o:before{content:"\f0c5"}.fa-paperclip:before{content:"\f0c6"}.fa-save:before,.fa-floppy-o:before{content:"\f0c7"}.fa-square:before{content:"\f0c8"}.fa-navicon:before,.fa-reorder:before,.fa-bars:before{content:"\f0c9"}.fa-list-ul:before{content:"\f0ca"}.fa-list-ol:before{content:"\f0cb"}.fa-strikethrough:before{content:"\f0cc"}.fa-underline:before{content:"\f0cd"}.fa-table:before{content:"\f0ce"}.fa-magic:before{content:"\f0d0"}.fa-truck:before{content:"\f0d1"}.fa-pinterest:before{content:"\f0d2"}.fa-pinterest-square:before{content:"\f0d3"}.fa-google-plus-square:before{content:"\f0d4"}.fa-google-plus:before{content:"\f0d5"}.fa-money:before{content:"\f0d6"}.fa-caret-down:before{content:"\f0d7"}.fa-caret-up:before{content:"\f0d8"}.fa-caret-left:before{content:"\f0d9"}.fa-caret-right:before{content:"\f0da"}.fa-columns:before{content:"\f0db"}.fa-unsorted:before,.fa-sort:before{content:"\f0dc"}.fa-sort-down:before,.fa-sort-desc:before{content:"\f0dd"}.fa-sort-up:before,.fa-sort-asc:before{content:"\f0de"}.fa-envelope:before{content:"\f0e0"}.fa-linkedin:b
efore{content:"\f0e1"}.fa-rotate-left:before,.fa-undo:before{content:"\f0e2"}.fa-legal:before,.fa-gavel:before{content:"\f0e3"}.fa-dashboard:before,.fa-tachometer:before{content:"\f0e4"}.fa-comment-o:before{content:"\f0e5"}.fa-comments-o:before{content:"\f0e6"}.fa-flash:before,.fa-bolt:before{content:"\f0e7"}.fa-sitemap:before{content:"\f0e8"}.fa-umbrella:before{content:"\f0e9"}.fa-paste:before,.fa-clipboard:before{content:"\f0ea"}.fa-lightbulb-o:before{content:"\f0eb"}.fa-exchange:before{content:"\f0ec"}.fa-cloud-download:before{content:"\f0ed"}.fa-cloud-upload:before{content:"\f0ee"}.fa-user-md:before{content:"\f0f0"}.fa-stethoscope:before{content:"\f0f1"}.fa-suitcase:before{content:"\f0f2"}.fa-bell-o:before{content:"\f0a2"}.fa-coffee:before{content:"\f0f4"}.fa-cutlery:before{content:"\f0f5"}.fa-file-text-o:before{content:"\f0f6"}.fa-building-o:before{content:"\f0f7"}.fa-hospital-o:before{content:"\f0f8"}.fa-ambulance:before{content:"\f0f9"}.fa-medkit:before{content:"\f0fa"}.fa-fighter-jet:before{content:"\f0fb"}.fa-beer:before{content:"\f0fc"}.fa-h-square:before{content:"\f0fd"}.fa-plus-square:before{content:"\f0fe"}.fa-angle-double-left:before{content:"\f100"}.fa-angle-double-right:before{content:"\f101"}.fa-angle-double-up:before{content:"\f102"}.fa-angle-double-down:before{content:"\f103"}.fa-angle-left:before{content:"\f104"}.fa-angle-right:before{content:"\f105"}.fa-angle-up:before{content:"\f106"}.fa-angle-down:before{content:"\f107"}.fa-desktop:before{content:"\f108"}.fa-laptop:before{content:"\f109"}.fa-tablet:before{content:"\f10a"}.fa-mobile-phone:before,.fa-mobile:before{content:"\f10b"}.fa-circle-o:before{content:"\f10c"}.fa-quote-left:before{content:"\f10d"}.fa-quote-right:before{content:"\f10e"}.fa-spinner:before{content:"\f110"}.fa-circle:before{content:"\f111"}.fa-mail-reply:before,.fa-reply:before{content:"\f112"}.fa-github-alt:before{content:"\f113"}.fa-folder-o:before{content:"\f114"}.fa-folder-open-o:before{content:"\f115"}.fa-smile-o:before{c
ontent:"\f118"}.fa-frown-o:before{content:"\f119"}.fa-meh-o:before{content:"\f11a"}.fa-gamepad:before{content:"\f11b"}.fa-keyboard-o:before{content:"\f11c"}.fa-flag-o:before{content:"\f11d"}.fa-flag-checkered:before{content:"\f11e"}.fa-terminal:before{content:"\f120"}.fa-code:before{content:"\f121"}.fa-mail-reply-all:before,.fa-reply-all:before{content:"\f122"}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:"\f123"}.fa-location-arrow:before{content:"\f124"}.fa-crop:before{content:"\f125"}.fa-code-fork:before{content:"\f126"}.fa-unlink:before,.fa-chain-broken:before{content:"\f127"}.fa-question:before{content:"\f128"}.fa-info:before{content:"\f129"}.fa-exclamation:before{content:"\f12a"}.fa-superscript:before{content:"\f12b"}.fa-subscript:before{content:"\f12c"}.fa-eraser:before{content:"\f12d"}.fa-puzzle-piece:before{content:"\f12e"}.fa-microphone:before{content:"\f130"}.fa-microphone-slash:before{content:"\f131"}.fa-shield:before{content:"\f132"}.fa-calendar-o:before{content:"\f133"}.fa-fire-extinguisher:before{content:"\f134"}.fa-rocket:before{content:"\f135"}.fa-maxcdn:before{content:"\f136"}.fa-chevron-circle-left:before{content:"\f137"}.fa-chevron-circle-right:before{content:"\f138"}.fa-chevron-circle-up:before{content:"\f139"}.fa-chevron-circle-down:before{content:"\f13a"}.fa-html5:before{content:"\f13b"}.fa-css3:before{content:"\f13c"}.fa-anchor:before{content:"\f13d"}.fa-unlock-alt:before{content:"\f13e"}.fa-bullseye:before{content:"\f140"}.fa-ellipsis-h:before{content:"\f141"}.fa-ellipsis-v:before{content:"\f142"}.fa-rss-square:before{content:"\f143"}.fa-play-circle:before{content:"\f144"}.fa-ticket:before{content:"\f145"}.fa-minus-square:before{content:"\f146"}.fa-minus-square-o:before{content:"\f147"}.fa-level-up:before{content:"\f148"}.fa-level-down:before{content:"\f149"}.fa-check-square:before{content:"\f14a"}.fa-pencil-square:before{content:"\f14b"}.fa-external-link-square:before{content:"\f14c"}.fa-share-square:bef
ore{content:"\f14d"}.fa-compass:before{content:"\f14e"}.fa-toggle-down:before,.fa-caret-square-o-down:before{content:"\f150"}.fa-toggle-up:before,.fa-caret-square-o-up:before{content:"\f151"}.fa-toggle-right:before,.fa-caret-square-o-right:before{content:"\f152"}.fa-euro:before,.fa-eur:before{content:"\f153"}.fa-gbp:before{content:"\f154"}.fa-dollar:before,.fa-usd:before{content:"\f155"}.fa-rupee:before,.fa-inr:before{content:"\f156"}.fa-cny:before,.fa-rmb:before,.fa-yen:before,.fa-jpy:before{content:"\f157"}.fa-ruble:before,.fa-rouble:before,.fa-rub:before{content:"\f158"}.fa-won:before,.fa-krw:before{content:"\f159"}.fa-bitcoin:before,.fa-btc:before{content:"\f15a"}.fa-file:before{content:"\f15b"}.fa-file-text:before{content:"\f15c"}.fa-sort-alpha-asc:before{content:"\f15d"}.fa-sort-alpha-desc:before{content:"\f15e"}.fa-sort-amount-asc:before{content:"\f160"}.fa-sort-amount-desc:before{content:"\f161"}.fa-sort-numeric-asc:before{content:"\f162"}.fa-sort-numeric-desc:before{content:"\f163"}.fa-thumbs-up:before{content:"\f164"}.fa-thumbs-down:before{content:"\f165"}.fa-youtube-square:before{content:"\f166"}.fa-youtube:before{content:"\f167"}.fa-xing:before{content:"\f168"}.fa-xing-square:before{content:"\f169"}.fa-youtube-play:before{content:"\f16a"}.fa-dropbox:before{content:"\f16b"}.fa-stack-overflow:before{content:"\f16c"}.fa-instagram:before{content:"\f16d"}.fa-flickr:before{content:"\f16e"}.fa-adn:before{content:"\f170"}.fa-bitbucket:before{content:"\f171"}.fa-bitbucket-square:before{content:"\f172"}.fa-tumblr:before{content:"\f173"}.fa-tumblr-square:before{content:"\f174"}.fa-long-arrow-down:before{content:"\f175"}.fa-long-arrow-up:before{content:"\f176"}.fa-long-arrow-left:before{content:"\f177"}.fa-long-arrow-right:before{content:"\f178"}.fa-apple:before{content:"\f179"}.fa-windows:before{content:"\f17a"}.fa-android:before{content:"\f17b"}.fa-linux:before{content:"\f17c"}.fa-dribbble:before{content:"\f17d"}.fa-skype:before{content:"\f17e"}.fa-foursquare:befo
re{content:"\f180"}.fa-trello:before{content:"\f181"}.fa-female:before{content:"\f182"}.fa-male:before{content:"\f183"}.fa-gittip:before,.fa-gratipay:before{content:"\f184"}.fa-sun-o:before{content:"\f185"}.fa-moon-o:before{content:"\f186"}.fa-archive:before{content:"\f187"}.fa-bug:before{content:"\f188"}.fa-vk:before{content:"\f189"}.fa-weibo:before{content:"\f18a"}.fa-renren:before{content:"\f18b"}.fa-pagelines:before{content:"\f18c"}.fa-stack-exchange:before{content:"\f18d"}.fa-arrow-circle-o-right:before{content:"\f18e"}.fa-arrow-circle-o-left:before{content:"\f190"}.fa-toggle-left:before,.fa-caret-square-o-left:before{content:"\f191"}.fa-dot-circle-o:before{content:"\f192"}.fa-wheelchair:before{content:"\f193"}.fa-vimeo-square:before{content:"\f194"}.fa-turkish-lira:before,.fa-try:before{content:"\f195"}.fa-plus-square-o:before{content:"\f196"}.fa-space-shuttle:before{content:"\f197"}.fa-slack:before{content:"\f198"}.fa-envelope-square:before{content:"\f199"}.fa-wordpress:before{content:"\f19a"}.fa-openid:before{content:"\f19b"}.fa-institution:before,.fa-bank:before,.fa-university:before{content:"\f19c"}.fa-mortar-board:before,.fa-graduation-cap:before{content:"\f19d"}.fa-yahoo:before{content:"\f19e"}.fa-google:before{content:"\f1a0"}.fa-reddit:before{content:"\f1a1"}.fa-reddit-square:before{content:"\f1a2"}.fa-stumbleupon-circle:before{content:"\f1a3"}.fa-stumbleupon:before{content:"\f1a4"}.fa-delicious:before{content:"\f1a5"}.fa-digg:before{content:"\f1a6"}.fa-pied-piper-pp:before{content:"\f1a7"}.fa-pied-piper-alt:before{content:"\f1a8"}.fa-drupal:before{content:"\f1a9"}.fa-joomla:before{content:"\f1aa"}.fa-language:before{content:"\f1ab"}.fa-fax:before{content:"\f1ac"}.fa-building:before{content:"\f1ad"}.fa-child:before{content:"\f1ae"}.fa-paw:before{content:"\f1b0"}.fa-spoon:before{content:"\f1b1"}.fa-cube:before{content:"\f1b2"}.fa-cubes:before{content:"\f1b3"}.fa-behance:before{content:"\f1b4"}.fa-behance-square:before{content:"\f1b5"}.fa-steam:before{co
ntent:"\f1b6"}.fa-steam-square:before{content:"\f1b7"}.fa-recycle:before{content:"\f1b8"}.fa-automobile:before,.fa-car:before{content:"\f1b9"}.fa-cab:before,.fa-taxi:before{content:"\f1ba"}.fa-tree:before{content:"\f1bb"}.fa-spotify:before{content:"\f1bc"}.fa-deviantart:before{content:"\f1bd"}.fa-soundcloud:before{content:"\f1be"}.fa-database:before{content:"\f1c0"}.fa-file-pdf-o:before{content:"\f1c1"}.fa-file-word-o:before{content:"\f1c2"}.fa-file-excel-o:before{content:"\f1c3"}.fa-file-powerpoint-o:before{content:"\f1c4"}.fa-file-photo-o:before,.fa-file-picture-o:before,.fa-file-image-o:before{content:"\f1c5"}.fa-file-zip-o:before,.fa-file-archive-o:before{content:"\f1c6"}.fa-file-sound-o:before,.fa-file-audio-o:before{content:"\f1c7"}.fa-file-movie-o:before,.fa-file-video-o:before{content:"\f1c8"}.fa-file-code-o:before{content:"\f1c9"}.fa-vine:before{content:"\f1ca"}.fa-codepen:before{content:"\f1cb"}.fa-jsfiddle:before{content:"\f1cc"}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-saver:before,.fa-support:before,.fa-life-ring:before{content:"\f1cd"}.fa-circle-o-notch:before{content:"\f1ce"}.fa-ra:before,.fa-resistance:before,.fa-rebel:before{content:"\f1d0"}.fa-ge:before,.fa-empire:before{content:"\f1d1"}.fa-git-square:before{content:"\f1d2"}.fa-git:before{content:"\f1d3"}.fa-y-combinator-square:before,.fa-yc-square:before,.fa-hacker-news:before{content:"\f1d4"}.fa-tencent-weibo:before{content:"\f1d5"}.fa-qq:before{content:"\f1d6"}.fa-wechat:before,.fa-weixin:before{content:"\f1d7"}.fa-send:before,.fa-paper-plane:before{content:"\f1d8"}.fa-send-o:before,.fa-paper-plane-o:before{content:"\f1d9"}.fa-history:before{content:"\f1da"}.fa-circle-thin:before{content:"\f1db"}.fa-header:before{content:"\f1dc"}.fa-paragraph:before{content:"\f1dd"}.fa-sliders:before{content:"\f1de"}.fa-share-alt:before{content:"\f1e0"}.fa-share-alt-square:before{content:"\f1e1"}.fa-bomb:before{content:"\f1e2"}.fa-soccer-ball-o:before,.fa-futbol-o:before{content:"\f1e3"}.fa-tty:before{c
ontent:"\f1e4"}.fa-binoculars:before{content:"\f1e5"}.fa-plug:before{content:"\f1e6"}.fa-slideshare:before{content:"\f1e7"}.fa-twitch:before{content:"\f1e8"}.fa-yelp:before{content:"\f1e9"}.fa-newspaper-o:before{content:"\f1ea"}.fa-wifi:before{content:"\f1eb"}.fa-calculator:before{content:"\f1ec"}.fa-paypal:before{content:"\f1ed"}.fa-google-wallet:before{content:"\f1ee"}.fa-cc-visa:before{content:"\f1f0"}.fa-cc-mastercard:before{content:"\f1f1"}.fa-cc-discover:before{content:"\f1f2"}.fa-cc-amex:before{content:"\f1f3"}.fa-cc-paypal:before{content:"\f1f4"}.fa-cc-stripe:before{content:"\f1f5"}.fa-bell-slash:before{content:"\f1f6"}.fa-bell-slash-o:before{content:"\f1f7"}.fa-trash:before{content:"\f1f8"}.fa-copyright:before{content:"\f1f9"}.fa-at:before{content:"\f1fa"}.fa-eyedropper:before{content:"\f1fb"}.fa-paint-brush:before{content:"\f1fc"}.fa-birthday-cake:before{content:"\f1fd"}.fa-area-chart:before{content:"\f1fe"}.fa-pie-chart:before{content:"\f200"}.fa-line-chart:before{content:"\f201"}.fa-lastfm:before{content:"\f202"}.fa-lastfm-square:before{content:"\f203"}.fa-toggle-off:before{content:"\f204"}.fa-toggle-on:before{content:"\f205"}.fa-bicycle:before{content:"\f206"}.fa-bus:before{content:"\f207"}.fa-ioxhost:before{content:"\f208"}.fa-angellist:before{content:"\f209"}.fa-cc:before{content:"\f20a"}.fa-shekel:before,.fa-sheqel:before,.fa-ils:before{content:"\f20b"}.fa-meanpath:before{content:"\f20c"}.fa-buysellads:before{content:"\f20d"}.fa-connectdevelop:before{content:"\f20e"}.fa-dashcube:before{content:"\f210"}.fa-forumbee:before{content:"\f211"}.fa-leanpub:before{content:"\f212"}.fa-sellsy:before{content:"\f213"}.fa-shirtsinbulk:before{content:"\f214"}.fa-simplybuilt:before{content:"\f215"}.fa-skyatlas:before{content:"\f216"}.fa-cart-plus:before{content:"\f217"}.fa-cart-arrow-down:before{content:"\f218"}.fa-diamond:before{content:"\f219"}.fa-ship:before{content:"\f21a"}.fa-user-secret:before{content:"\f21b"}.fa-motorcycle:before{content:"\f21c"}.fa-street-vi
ew:before{content:"\f21d"}.fa-heartbeat:before{content:"\f21e"}.fa-venus:before{content:"\f221"}.fa-mars:before{content:"\f222"}.fa-mercury:before{content:"\f223"}.fa-intersex:before,.fa-transgender:before{content:"\f224"}.fa-transgender-alt:before{content:"\f225"}.fa-venus-double:before{content:"\f226"}.fa-mars-double:before{content:"\f227"}.fa-venus-mars:before{content:"\f228"}.fa-mars-stroke:before{content:"\f229"}.fa-mars-stroke-v:before{content:"\f22a"}.fa-mars-stroke-h:before{content:"\f22b"}.fa-neuter:before{content:"\f22c"}.fa-genderless:before{content:"\f22d"}.fa-facebook-official:before{content:"\f230"}.fa-pinterest-p:before{content:"\f231"}.fa-whatsapp:before{content:"\f232"}.fa-server:before{content:"\f233"}.fa-user-plus:before{content:"\f234"}.fa-user-times:before{content:"\f235"}.fa-hotel:before,.fa-bed:before{content:"\f236"}.fa-viacoin:before{content:"\f237"}.fa-train:before{content:"\f238"}.fa-subway:before{content:"\f239"}.fa-medium:before{content:"\f23a"}.fa-yc:before,.fa-y-combinator:before{content:"\f23b"}.fa-optin-monster:before{content:"\f23c"}.fa-opencart:before{content:"\f23d"}.fa-expeditedssl:before{content:"\f23e"}.fa-battery-4:before,.fa-battery:before,.fa-battery-full:before{content:"\f240"}.fa-battery-3:before,.fa-battery-three-quarters:before{content:"\f241"}.fa-battery-2:before,.fa-battery-half:before{content:"\f242"}.fa-battery-1:before,.fa-battery-quarter:before{content:"\f243"}.fa-battery-0:before,.fa-battery-empty:before{content:"\f244"}.fa-mouse-pointer:before{content:"\f245"}.fa-i-cursor:before{content:"\f246"}.fa-object-group:before{content:"\f247"}.fa-object-ungroup:before{content:"\f248"}.fa-sticky-note:before{content:"\f249"}.fa-sticky-note-o:before{content:"\f24a"}.fa-cc-jcb:before{content:"\f24b"}.fa-cc-diners-club:before{content:"\f24c"}.fa-clone:before{content:"\f24d"}.fa-balance-scale:before{content:"\f24e"}.fa-hourglass-o:before{content:"\f250"}.fa-hourglass-1:before,.fa-hourglass-start:before{content:"\f251"}.fa-hourg
lass-2:before,.fa-hourglass-half:before{content:"\f252"}.fa-hourglass-3:before,.fa-hourglass-end:before{content:"\f253"}.fa-hourglass:before{content:"\f254"}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:"\f255"}.fa-hand-stop-o:before,.fa-hand-paper-o:before{content:"\f256"}.fa-hand-scissors-o:before{content:"\f257"}.fa-hand-lizard-o:before{content:"\f258"}.fa-hand-spock-o:before{content:"\f259"}.fa-hand-pointer-o:before{content:"\f25a"}.fa-hand-peace-o:before{content:"\f25b"}.fa-trademark:before{content:"\f25c"}.fa-registered:before{content:"\f25d"}.fa-creative-commons:before{content:"\f25e"}.fa-gg:before{content:"\f260"}.fa-gg-circle:before{content:"\f261"}.fa-tripadvisor:before{content:"\f262"}.fa-odnoklassniki:before{content:"\f263"}.fa-odnoklassniki-square:before{content:"\f264"}.fa-get-pocket:before{content:"\f265"}.fa-wikipedia-w:before{content:"\f266"}.fa-safari:before{content:"\f267"}.fa-chrome:before{content:"\f268"}.fa-firefox:before{content:"\f269"}.fa-opera:before{content:"\f26a"}.fa-internet-explorer:before{content:"\f26b"}.fa-tv:before,.fa-television:before{content:"\f26c"}.fa-contao:before{content:"\f26d"}.fa-500px:before{content:"\f26e"}.fa-amazon:before{content:"\f270"}.fa-calendar-plus-o:before{content:"\f271"}.fa-calendar-minus-o:before{content:"\f272"}.fa-calendar-times-o:before{content:"\f273"}.fa-calendar-check-o:before{content:"\f274"}.fa-industry:before{content:"\f275"}.fa-map-pin:before{content:"\f276"}.fa-map-signs:before{content:"\f277"}.fa-map-o:before{content:"\f278"}.fa-map:before{content:"\f279"}.fa-commenting:before{content:"\f27a"}.fa-commenting-o:before{content:"\f27b"}.fa-houzz:before{content:"\f27c"}.fa-vimeo:before{content:"\f27d"}.fa-black-tie:before{content:"\f27e"}.fa-fonticons:before{content:"\f280"}.fa-reddit-alien:before{content:"\f281"}.fa-edge:before{content:"\f282"}.fa-credit-card-alt:before{content:"\f283"}.fa-codiepie:before{content:"\f284"}.fa-modx:before{content:"\f285"}.fa-fort-awesome:before{content:"\f286"
}.fa-usb:before{content:"\f287"}.fa-product-hunt:before{content:"\f288"}.fa-mixcloud:before{content:"\f289"}.fa-scribd:before{content:"\f28a"}.fa-pause-circle:before{content:"\f28b"}.fa-pause-circle-o:before{content:"\f28c"}.fa-stop-circle:before{content:"\f28d"}.fa-stop-circle-o:before{content:"\f28e"}.fa-shopping-bag:before{content:"\f290"}.fa-shopping-basket:before{content:"\f291"}.fa-hashtag:before{content:"\f292"}.fa-bluetooth:before{content:"\f293"}.fa-bluetooth-b:before{content:"\f294"}.fa-percent:before{content:"\f295"}.fa-gitlab:before{content:"\f296"}.fa-wpbeginner:before{content:"\f297"}.fa-wpforms:before{content:"\f298"}.fa-envira:before{content:"\f299"}.fa-universal-access:before{content:"\f29a"}.fa-wheelchair-alt:before{content:"\f29b"}.fa-question-circle-o:before{content:"\f29c"}.fa-blind:before{content:"\f29d"}.fa-audio-description:before{content:"\f29e"}.fa-volume-control-phone:before{content:"\f2a0"}.fa-braille:before{content:"\f2a1"}.fa-assistive-listening-systems:before{content:"\f2a2"}.fa-asl-interpreting:before,.fa-american-sign-language-interpreting:before{content:"\f2a3"}.fa-deafness:before,.fa-hard-of-hearing:before,.fa-deaf:before{content:"\f2a4"}.fa-glide:before{content:"\f2a5"}.fa-glide-g:before{content:"\f2a6"}.fa-signing:before,.fa-sign-language:before{content:"\f2a7"}.fa-low-vision:before{content:"\f2a8"}.fa-viadeo:before{content:"\f2a9"}.fa-viadeo-square:before{content:"\f2aa"}.fa-snapchat:before{content:"\f2ab"}.fa-snapchat-ghost:before{content:"\f2ac"}.fa-snapchat-square:before{content:"\f2ad"}.fa-pied-piper:before{content:"\f2ae"}.fa-first-order:before{content:"\f2b0"}.fa-yoast:before{content:"\f2b1"}.fa-themeisle:before{content:"\f2b2"}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:"\f2b3"}.fa-fa:before,.fa-font-awesome:before{content:"\f2b4"}.fa-handshake-o:before{content:"\f2b5"}.fa-envelope-open:before{content:"\f2b6"}.fa-envelope-open-o:before{content:"\f2b7"}.fa-linode:before{content:"\f2b8"}.fa-address
-book:before{content:"\f2b9"}.fa-address-book-o:before{content:"\f2ba"}.fa-vcard:before,.fa-address-card:before{content:"\f2bb"}.fa-vcard-o:before,.fa-address-card-o:before{content:"\f2bc"}.fa-user-circle:before{content:"\f2bd"}.fa-user-circle-o:before{content:"\f2be"}.fa-user-o:before{content:"\f2c0"}.fa-id-badge:before{content:"\f2c1"}.fa-drivers-license:before,.fa-id-card:before{content:"\f2c2"}.fa-drivers-license-o:before,.fa-id-card-o:before{content:"\f2c3"}.fa-quora:before{content:"\f2c4"}.fa-free-code-camp:before{content:"\f2c5"}.fa-telegram:before{content:"\f2c6"}.fa-thermometer-4:before,.fa-thermometer:before,.fa-thermometer-full:before{content:"\f2c7"}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:"\f2c8"}.fa-thermometer-2:before,.fa-thermometer-half:before{content:"\f2c9"}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:"\f2ca"}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:"\f2cb"}.fa-shower:before{content:"\f2cc"}.fa-bathtub:before,.fa-s15:before,.fa-bath:before{content:"\f2cd"}.fa-podcast:before{content:"\f2ce"}.fa-window-maximize:before{content:"\f2d0"}.fa-window-minimize:before{content:"\f2d1"}.fa-window-restore:before{content:"\f2d2"}.fa-times-rectangle:before,.fa-window-close:before{content:"\f2d3"}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:"\f2d4"}.fa-bandcamp:before{content:"\f2d5"}.fa-grav:before{content:"\f2d6"}.fa-etsy:before{content:"\f2d7"}.fa-imdb:before{content:"\f2d8"}.fa-ravelry:before{content:"\f2d9"}.fa-eercast:before{content:"\f2da"}.fa-microchip:before{content:"\f2db"}.fa-snowflake-o:before{content:"\f2dc"}.fa-superpowers:before{content:"\f2dd"}.fa-wpexplorer:before{content:"\f2de"}.fa-meetup:before{content:"\f2e0"}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0, 0, 0, 0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}
diff --git a/demos/streaming_asr_server/web/static/css/style.css b/demos/streaming_asr_server/web/static/css/style.css
deleted file mode 100644
index a3040718b..000000000
--- a/demos/streaming_asr_server/web/static/css/style.css
+++ /dev/null
@@ -1,453 +0,0 @@
-/*
-* @Author: baipengxia
-* @Date: 2021-03-12 11:44:28
-* @Last Modified by: baipengxia
-* @Last Modified time: 2021-03-12 15:14:24
-*/
-
-/** COMMON RESET **/
-* {
- -webkit-tap-highlight-color: rgba(0, 0, 0, 0);
-}
-
-body,
-h1,
-h2,
-h3,
-h4,
-h5,
-h6,
-hr,
-p,
-dl,
-dt,
-dd,
-ul,
-ol,
-li,
-fieldset,
-lengend,
-button,
-input,
-textarea,
-th,
-td {
- margin: 0;
- padding: 0;
- color: #000;
-}
-
-body {
- font-size: 14px;
-}
-html, body {
- min-width: 1200px;
-}
-
-button,
-input,
-select,
-textarea {
- font-size: 14px;
-}
-
-h1 {
- font-size: 18px;
-}
-
-h2 {
- font-size: 14px;
-}
-
-h3 {
- font-size: 14px;
-}
-
-ul,
-ol,
-li {
- list-style: none;
-}
-
-a {
- text-decoration: none;
-}
-
-a:hover {
- text-decoration: none;
-}
-
-fieldset,
-img {
- border: none;
-}
-
-table {
- border-collapse: collapse;
- border-spacing: 0;
-}
-
-i {
- font-style: normal;
-}
-
-label {
- position: inherit;
-}
-
-.clearfix:after {
- content: ".";
- display: block;
- height: 0;
- clear: both;
- visibility: hidden;
-}
-
-.clearfix {
- zoom: 1;
- display: block;
-}
-
-html,
-body {
- font-family: Tahoma, Arial, 'microsoft yahei', 'Roboto', 'Droid Sans', 'Helvetica Neue', 'Droid Sans Fallback', 'Heiti SC', 'Hiragino Sans GB', 'Simsun', 'sans-self';
-}
-
-
-
-.audio-banner {
- width: 100%;
- overflow: auto;
- padding: 0;
- background: url('../image/voice-dictation.svg');
- background-size: cover;
-}
-.weaper {
- width: 1200px;
- height: 155px;
- margin: 72px auto;
-}
-.text-content {
- width: 670px;
- height: 100%;
- float: left;
-}
-.text-content .title {
- font-size: 34px;
- font-family: 'PingFangSC-Medium';
- font-weight: 500;
- color: rgba(255, 255, 255, 1);
- line-height: 48px;
-}
-.text-content .con {
- font-size: 16px;
- font-family: PingFangSC-Light;
- font-weight: 300;
- color: rgba(255, 255, 255, 1);
- line-height: 30px;
-}
-.img-con {
- width: 416px;
- height: 100%;
- float: right;
-}
-.img-con img {
- width: 100%;
- height: 100%;
-}
-.con-container {
- margin-top: 34px;
-}
-
-.audio-advantage {
- background: #f8f9fa;
-}
-.asr-advantage {
- width: 1200px;
- margin: 0 auto;
-}
-.asr-advantage h2 {
- text-align: center;
- font-size: 22px;
- padding: 30px 0 0 0;
-}
-.asr-advantage > ul > li {
- box-sizing: border-box;
- padding: 0 16px;
- width: 33%;
- text-align: center;
- margin-bottom: 35px;
-}
-.asr-advantage > ul > li .icons{
- margin-top: 10px;
- margin-bottom: 20px;
- width: 42px;
- height: 42px;
-}
-.service-item-content {
- margin-top: 35px;
- display: flex;
- justify-content: center;
- flex-wrap: wrap;
-}
-.service-item-content img {
- width: 160px;
- vertical-align: bottom;
-}
-.service-item-content > li {
- box-sizing: border-box;
- padding: 0 16px;
- width: 33%;
- text-align: center;
- margin-bottom: 35px;
-}
-.service-item-content > li .service-item-content-title {
- line-height: 1.5;
- font-weight: 700;
- margin-top: 10px;
-}
-.service-item-content > li .service-item-content-desc {
- margin-top: 5px;
- line-height: 1.8;
- color: #657384;
-}
-
-
-.audio-scene-con {
- width: 100%;
- padding-bottom: 84px;
- background: #fff;
-}
-.audio-scene {
- overflow: auto;
- width: 1200px;
- background: #fff;
- text-align: center;
- padding: 0;
- margin: 0 auto;
-}
-.audio-scene h2 {
- padding: 30px 0 0 0;
- font-size: 22px;
- text-align: center;
-}
-
-.audio-experience {
- width: 100%;
- height: 538px;
- background: #fff;
- padding: 0;
- margin: 0;
- overflow: auto;
-}
-.asr-box {
- width: 1200px;
- height: 394px;
- margin: 64px auto;
-}
-.asr-box h2 {
- font-size: 22px;
- text-align: center;
- margin-bottom: 64px;
-}
-.voice-container {
- position: relative;
- width: 1200px;
- height: 308px;
- background: rgba(255, 255, 255, 1);
- border-radius: 8px;
- border: 1px solid rgba(225, 225, 225, 1);
-}
-.voice-container .voice {
- height: 236px;
- width: 100%;
- border-radius: 8px;
-}
-.voice-container .voice textarea {
- height: 100%;
- width: 100%;
- border: none;
- outline: none;
- border-radius: 8px;
- padding: 25px;
- font-size: 14px;
- box-sizing: border-box;
- resize: none;
-}
-.voice-input {
- width: 100%;
- height: 72px;
- box-sizing: border-box;
- padding-left: 35px;
- background: rgba(242, 244, 245, 1);
- border-radius: 8px;
- line-height: 72px;
-}
-.voice-input .el-select {
- width: 492px;
-}
-.start-voice {
- display: inline-block;
- margin-left: 10px;
-}
-.start-voice .time {
- margin-right: 25px;
-}
-.asr-advantage > ul > li {
- margin-bottom: 77px;
-}
-#msg {
- width: 100%;
- line-height: 40px;
- font-size: 14px;
- margin-left: 330px;
-}
-#captcha {
- margin-left: 350px !important;
- display: inline-block;
- position: relative;
-}
-.black {
- position: fixed;
- width: 100%;
- height: 100%;
- z-index: 5;
- background: rgba(0, 0, 0, 0.5);
- top: 0;
- left: 0;
-}
-.container {
- position: fixed;
- z-index: 6;
- top: 25%;
- left: 10%;
-}
-.audio-scene-con {
- width: 100%;
- padding-bottom: 84px;
- background: #fff;
-}
-#sound {
- color: #fff;
- cursor: pointer;
- background: #147ede;
- padding: 10px;
- margin-top: 30px;
- margin-left: 135px;
- width: 176px;
- height: 30px !important;
- text-align: center;
- line-height: 30px !important;
- border-radius: 10px;
-}
-.con-ten {
- position: absolute;
- width: 100%;
- height: 100%;
- z-index: 5;
- background: #fff;
- opacity: 0.5;
- top: 0;
- left: 0;
-}
-.websocket-url {
- width: 320px;
- height: 20px;
- border: 1px solid #dcdfe6;
- line-height: 20px;
- padding: 10px;
- border-radius: 4px;
-}
-.voice-btn {
- color: #fff;
- background-color: #409eff;
- font-weight: 500;
- padding: 12px 20px;
- font-size: 14px;
- border-radius: 4px;
- border: 0;
- cursor: pointer;
-}
-.voice-btn.end {
- display: none;
-}
-.result-text {
- background: #fff;
- padding: 20px;
-}
-.voice-footer {
- border-top: 1px solid #dddede;
- background: #f7f9fa;
- text-align: center;
- margin-bottom: 8px;
- color: #333;
- font-size: 12px;
- padding: 20px 0;
-}
-
-/** line animate **/
-.time-box {
- display: none;
- margin-left: 10px;
- width: 300px;
-}
-.total-time {
- font-size: 14px;
- color: #545454;
-}
-.voice-btn.end.show,
-.time-box.show {
- display: inline;
-}
-.start-taste-line {
- margin-right: 20px;
- display: inline-block;
-}
-.start-taste-line hr {
- background-color: #187cff;
- width: 3px;
- height: 8px;
- margin: 0 3px;
- display: inline-block;
- border: none;
-}
-.hr {
- animation: note 0.2s ease-in-out;
- animation-iteration-count: infinite;
- animation-direction: alternate;
-}
-.hr-one {
- animation-delay: -0.9s;
-}
-.hr-two {
- animation-delay: -0.8s;
-}
-.hr-three {
- animation-delay: -0.7s;
-}
-.hr-four {
- animation-delay: -0.6s;
-}
-.hr-five {
- animation-delay: -0.5s;
-}
-.hr-six {
- animation-delay: -0.4s;
-}
-.hr-seven {
- animation-delay: -0.3s;
-}
-.hr-eight {
- animation-delay: -0.2s;
-}
-.hr-nine {
- animation-delay: -0.1s;
-}
-@keyframes note {
- from {
- transform: scaleY(1);
- }
- to {
- transform: scaleY(4);
- }
-}
\ No newline at end of file
diff --git a/demos/streaming_asr_server/web/static/fonts/FontAwesome.otf b/demos/streaming_asr_server/web/static/fonts/FontAwesome.otf
deleted file mode 100644
index 401ec0f36..000000000
Binary files a/demos/streaming_asr_server/web/static/fonts/FontAwesome.otf and /dev/null differ
diff --git a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.eot b/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.eot
deleted file mode 100644
index e9f60ca95..000000000
Binary files a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.eot and /dev/null differ
diff --git a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.svg b/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.svg
deleted file mode 100644
index 6cd0326be..000000000
--- a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.svg
+++ /dev/null
@@ -1,1951 +0,0 @@
-
-
-
-
-Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016
- By ,,,
-Copyright Dave Gandy 2016. All rights reserved.
-
-
diff --git a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.ttf b/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.ttf
deleted file mode 100644
index 35acda2fa..000000000
Binary files a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.ttf and /dev/null differ
diff --git a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.woff b/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.woff
deleted file mode 100644
index 400014a4b..000000000
Binary files a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.woff and /dev/null differ
diff --git a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.woff2 b/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.woff2
deleted file mode 100644
index 4d13fc604..000000000
Binary files a/demos/streaming_asr_server/web/static/fonts/fontawesome-webfont.woff2 and /dev/null differ
diff --git a/demos/streaming_asr_server/web/static/image/PaddleSpeech_logo.png b/demos/streaming_asr_server/web/static/image/PaddleSpeech_logo.png
deleted file mode 100644
index fb2527754..000000000
Binary files a/demos/streaming_asr_server/web/static/image/PaddleSpeech_logo.png and /dev/null differ
diff --git a/demos/streaming_asr_server/web/static/image/voice-dictation.svg b/demos/streaming_asr_server/web/static/image/voice-dictation.svg
deleted file mode 100644
index d35971499..000000000
--- a/demos/streaming_asr_server/web/static/image/voice-dictation.svg
+++ /dev/null
@@ -1,94 +0,0 @@
-    背景 (Background)
-    Created with Sketch.
-    [SVG path data omitted]
\ No newline at end of file
diff --git a/demos/streaming_asr_server/web/static/js/SoundRecognizer.js b/demos/streaming_asr_server/web/static/js/SoundRecognizer.js
deleted file mode 100644
index 5ef3d2e89..000000000
--- a/demos/streaming_asr_server/web/static/js/SoundRecognizer.js
+++ /dev/null
@@ -1,133 +0,0 @@
-SoundRecognizer = {
- rec: null,
- wave: null,
- SampleRate: 16000,
- testBitRate: 16,
- isCloseRecorder: false,
- SendInterval: 300,
- realTimeSendTryType: 'pcm',
- realTimeSendTryEncBusy: 0,
- realTimeSendTryTime: 0,
- realTimeSendTryNumber: 0,
- transferUploadNumberMax: 0,
- realTimeSendTryChunk: null,
- soundType: "pcm",
- init: function (config) {
- this.soundType = config.soundType || 'pcm';
- this.SampleRate = config.sampleRate || 16000;
- this.recwaveElm = config.recwaveElm || '';
- this.TransferUpload = config.translerCallBack || this.TransferProcess;
- this.initRecorder();
- },
- RealTimeSendTryReset: function (type) {
- this.realTimeSendTryType = type;
- this.realTimeSendTryTime = 0;
- },
- RealTimeSendTry: function (rec, isClose) {
- var that = this;
- var t1 = Date.now(), endT = 0, recImpl = Recorder.prototype;
- if (this.realTimeSendTryTime == 0) {
- this.realTimeSendTryTime = t1;
- this.realTimeSendTryEncBusy = 0;
- this.realTimeSendTryNumber = 0;
- this.transferUploadNumberMax = 0;
- this.realTimeSendTryChunk = null;
- }
- if (!isClose && t1 - this.realTimeSendTryTime < this.SendInterval) {
-            return;//only transmit once the buffer has accumulated for the configured interval
- }
- this.realTimeSendTryTime = t1;
- var number = ++this.realTimeSendTryNumber;
-
-        //reuse the SampleData function for continuous processing; sample-rate conversion comes along for free
- var chunk = Recorder.SampleData(rec.buffers, rec.srcSampleRate, this.SampleRate, this.realTimeSendTryChunk, { frameType: isClose ? "" : this.realTimeSendTryType });
-
-        //free already-processed buffers to keep memory usage low for long recordings; stop() must not be called at the end, since the data has already been cleared
- for (var i = this.realTimeSendTryChunk ? this.realTimeSendTryChunk.index : 0; i < chunk.index; i++) {
- rec.buffers[i] = null;
- }
- this.realTimeSendTryChunk = chunk;
-
-        //no new data, or the closing chunk is too small to mock-transcode
- if (chunk.data.length == 0 || isClose && chunk.data.length < 2000) {
- this.TransferUpload(number, null, 0, null, isClose);
- return;
- }
-        //handle congestion in the real-time encoding queue
- if (!isClose) {
- if (this.realTimeSendTryEncBusy >= 2) {
-                console.log("Encoding queue congested, dropped one frame", 1);
- return;
- }
- }
- this.realTimeSendTryEncBusy++;
-
-        //transcode to mp3/wav in real time via the mock method
- var encStartTime = Date.now();
- var recMock = Recorder({
- type: this.realTimeSendTryType
-            , sampleRate: this.SampleRate //sample rate
-            , bitRate: this.testBitRate //bit rate
- });
- recMock.mock(chunk.data, chunk.sampleRate);
- recMock.stop(function (blob, duration) {
- that.realTimeSendTryEncBusy && (that.realTimeSendTryEncBusy--);
- blob.encTime = Date.now() - encStartTime;
-
-            //push the data to upload as soon as transcoding finishes
- that.TransferUpload(number, blob, duration, recMock, isClose);
- }, function (msg) {
- that.realTimeSendTryEncBusy && (that.realTimeSendTryEncBusy--);
-            //transcoding error? it is unclear when this could ever happen!
-            console.log("Unexpected error: " + msg, 1);
- });
- },
- recordClose: function () {
- try {
- this.rec.close(function () {
- this.isCloseRecorder = true;
- });
-            this.RealTimeSendTry(this.rec, true);//final send
- } catch (ex) {
- // recordClose();
- }
- },
- recordEnd: function () {
- try {
- this.rec.stop(function (blob, time) {
- this.recordClose();
- }, function (s) {
- this.recordClose();
- });
- } catch (ex) {
- }
- },
- initRecorder: function () {
- var that = this;
- var rec = Recorder({
- type: that.soundType
- , bitRate: that.testBitRate
- , sampleRate: that.SampleRate
- , onProcess: function (buffers, level, time, sampleRate) {
- that.wave.input(buffers[buffers.length - 1], level, sampleRate);
-                that.RealTimeSendTry(rec, false);//feed into real-time processing; since the format is unknown, this call is kept simple and ignores buffers and bufferSampleRate, which are identical to rec.buffers anyway.
- }
- });
-
- rec.open(function () {
- that.wave = Recorder.FrequencyHistogramView({
- elem: that.recwaveElm, lineCount: 90
- , position: 0
- , minHeight: 1
- , stripeEnable: false
- });
- rec.start();
- that.isCloseRecorder = false;
-            that.RealTimeSendTryReset(that.soundType);//reset
- });
- this.rec = rec;
- },
- TransferProcess: function (number, blobOrNull, duration, blobRec, isClose) {
-
- }
-}
\ No newline at end of file
diff --git a/demos/streaming_asr_server/web/static/js/jquery-3.2.1.min.js b/demos/streaming_asr_server/web/static/js/jquery-3.2.1.min.js
deleted file mode 100644
index 644d35e27..000000000
--- a/demos/streaming_asr_server/web/static/js/jquery-3.2.1.min.js
+++ /dev/null
@@ -1,4 +0,0 @@
-/*! jQuery v3.2.1 | (c) JS Foundation and other contributors | jquery.org/license */
- Introduction to PaddleSpeech Serving
-
- PaddleSpeech is an open-source model library for speech built on the PaddlePaddle deep-learning platform, used to develop a variety of critical speech and audio tasks. PaddleSpeech Serving is a client/server (C/S) backend service for the speech models, built with Python and FastAPI; it aims to unify the speech operators under PaddleSpeech and expose them through a single backend service.
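The deleted page above describes PaddleSpeech Serving as a Python + FastAPI client/server backend. The sketch below illustrates that C/S pattern; to stay self-contained it uses only the Python standard library's `http.server` instead of FastAPI, and the route `/paddlespeech/asr` and the request/response fields are illustrative assumptions, not the actual serving API.

```python
# Minimal sketch of a client/server (C/S) speech backend in the spirit of
# PaddleSpeech Serving. The real service is built on FastAPI; this stand-in
# uses only the Python standard library so the example is runnable as-is.
# The endpoint path and JSON fields below are assumptions for illustration.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class SpeechHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # A real handler would dispatch to an ASR/TTS operator here; we echo a stub.
        body = json.dumps({
            "success": True,
            "result": {"transcription": "<asr output for %s>" % request.get("audio", "?")},
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging to keep demo output clean.
        pass


def query(port, payload):
    """Client side: POST a JSON payload and decode the JSON response."""
    req = urllib.request.Request(
        "http://127.0.0.1:%d/paddlespeech/asr" % port,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Port 0 lets the OS pick a free port; serve in a background thread.
    server = HTTPServer(("127.0.0.1", 0), SpeechHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(query(port, {"audio": "demo.wav"})["success"])  # prints: True
    server.shutdown()
```

A FastAPI version would replace the handler class with a decorated route function (`@app.post(...)`) and rely on Pydantic models for the request schema, but the request/response flow is the same.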