WeNet

We share net together. We borrowed a lot of code from ESPnet, and we referred to OpenTransformer for batch inference.

The main motivation of WeNet is to close the gap between research and production for end-to-end (E2E) speech recognition models, to reduce the effort of productionizing E2E models, and to explore better E2E models for production.

Highlights

  • Unified solution for streaming and non-streaming ASR: WeNet implements the Unified Two-Pass (U2) framework to achieve an accurate, fast, and unified E2E model, which makes it favorable for industry adoption.
  • Lightweight: WeNet is designed specifically for E2E speech recognition, with clean and simple code. It is based entirely on PyTorch and its ecosystem, with no dependency on Kaldi, which simplifies installation and usage.
  • Production ready: The Python code of WeNet meets the requirements of TorchScript, so a model trained with WeNet can be exported directly via Torch JIT and served with LibTorch for inference. There is no gap between the research model and the production model: neither model conversion nor additional code is required for inference.
  • Portable runtime: Several demos will be provided to show how to host WeNet trained models on different platforms, including server (x86) and embedded (ARM in Android platforms).
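The TorchScript export path described above can be sketched as follows. `TinyEncoder` here is a hypothetical stand-in for a WeNet-trained model, not the real architecture; the point is that any TorchScript-compatible module can be scripted, saved, and later loaded from Python or LibTorch (C++) without conversion steps:

```python
import os
import tempfile
import torch

class TinyEncoder(torch.nn.Module):
    """Hypothetical stand-in for a WeNet model; not the real architecture."""
    def __init__(self):
        super().__init__()
        self.proj = torch.nn.Linear(80, 4)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, time, feature_dim) acoustic features
        return self.proj(feats)

model = TinyEncoder()
# Succeeds because the module's code is TorchScript-compatible.
scripted = torch.jit.script(model)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "model.zip")
    scripted.save(path)              # exported artifact, also loadable from LibTorch
    loaded = torch.jit.load(path)
    print(loaded(torch.randn(1, 10, 80)).shape)  # torch.Size([1, 10, 4])
```

The same `.save()`/`torch.jit.load()` round trip is what lets the research checkpoint double as the production artifact.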

Performance Benchmark

Please see examples/$dataset/s0/README.md for WeNet benchmarks on different speech datasets.

Documentation

Visit Docs for the WeNet Sphinx documentation, or read the tutorials below:

Installation

  • Clone the repo

git clone https://github.com/mobvoi/wenet.git

  • Create a Conda environment and install the dependencies

conda create -n wenet python=3.8
conda activate wenet
pip install -r requirements.txt
conda install pytorch==1.6.0 cudatoolkit=10.1 torchaudio -c pytorch
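After installation, a quick sanity check that the environment is usable (the version pin comes from the commands above; adjust if you installed a different build):

```shell
# Confirm PyTorch and torchaudio import inside the wenet environment.
python -c "import torch; print(torch.__version__)"
python -c "import torchaudio" && echo "torchaudio OK"
```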

Discuss & Communication

In addition to discussion on GitHub issues, we have created a WeChat group for better discussion and quicker responses. Please scan the following QR code with WeChat to join the chat group.

Wenet chat group
