2023-02-04 02:58:47
https://twitter.com/alphacep/status/1621612504840273928
NeMo 1.15 is out now! There's a whole bunch of powerful ASR features in this release, including Hybrid CTC-RNNT models, Multiblank Transducers, Multi-Head Attention Adapters, Conformer-Longformer inference, and a Beam Search API!
First, we discuss Hybrid CTC-RNNT models. We can train a single model with both losses, then perform inference with either decoder. It turns out we can attain better CTC results, and the CTC head converges 40-50% faster when jointly trained.
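The core idea can be sketched in a few lines of pure Python (this is illustrative, not NeMo's API; the `ctc_weight` knob and its value are hypothetical): both decoders sit on one shared encoder, and their losses are mixed so gradients from both heads shape the encoder.

```python
# Toy sketch of hybrid CTC-RNNT training: one shared encoder feeds a CTC
# head and an RNNT head, and their losses are mixed with a weight.
# Illustrative only; ctc_weight is a hypothetical knob, not NeMo's API.

def hybrid_loss(ctc_loss: float, rnnt_loss: float, ctc_weight: float = 0.3) -> float:
    """Weighted sum of the two per-batch losses."""
    return ctc_weight * ctc_loss + (1.0 - ctc_weight) * rnnt_loss

# With ctc_weight=0.3: 0.3*2.0 + 0.7*1.0 ≈ 1.3
combined = hybrid_loss(2.0, 1.0)
```

At inference time either head can decode from the same encoder output, which is why one checkpoint serves both CTC and RNNT use cases.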
Next up, we have Multiblank Transducers supported in NeMo. This is an extension of the RNNT loss in which special "big blank" symbols can advance multiple timesteps at once, allowing for highly efficient inference - even at the sample level! Refer to the paper here
With this change, you can now easily train a multiblank RNNT model and obtain not only better WER but also much faster inference than regular RNNT models.
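A minimal pure-Python sketch of why this is faster (the symbols and blank durations here are hypothetical, and this is not NeMo's decoder): big blanks consume several encoder frames per decoding step, so the greedy loop runs far fewer iterations over the same audio.

```python
# Toy greedy-decoding sketch of the multiblank idea (not NeMo's decoder).
# Standard RNNT: a blank advances one frame, a token emission advances none.
# Multiblank adds "big blanks" that advance several frames in one step.

BLANK = "<b>"                          # standard blank: advance 1 frame
BIG_BLANKS = {"<b2>": 2, "<b4>": 4}    # hypothetical big blanks

def decode_path(decisions, num_frames):
    """Replay a sequence of decoder decisions; count steps and frames used."""
    hyp, t, steps = [], 0, 0
    for sym in decisions:
        steps += 1
        if sym == BLANK:
            t += 1
        elif sym in BIG_BLANKS:
            t += BIG_BLANKS[sym]       # jump multiple frames in one step
        else:
            hyp.append(sym)            # emitting a token consumes no frame
        if t >= num_frames:
            break
    return hyp, steps

# 10 frames covered in 5 decoding steps; a plain RNNT path over the same
# audio would need 10 single-frame blanks plus the 2 emissions.
hyp, steps = decode_path(["h", "<b4>", "i", "<b4>", "<b2>"], num_frames=10)
```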
Next up, we now support Multi-Head Attention Adapters in NeMo ASR. With this approach, any NeMo module can now be retrofitted with adapter modules. We see significant parameter efficiency compared to Houlsby Adapters. With the newly updated scripts for adapter training, you can now easily train either Linear adapters or MHA adapters from the same script. More details can be found in the PR
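To make the adapter idea concrete, here is a minimal residual bottleneck adapter (Houlsby-style) in pure Python; the weights and shapes are hypothetical and real NeMo adapters operate on tensors inside each module. The parameter efficiency comes from the bottleneck: an adapter adds roughly 2*d*b weights instead of the d*d of a full layer, with b much smaller than d.

```python
# Toy residual bottleneck adapter (Houlsby-style), pure-Python illustration;
# weights and shapes here are hypothetical, not NeMo's implementation.

def linear_adapter(x, w_down, w_up):
    """x: length-d vector; w_down: d x b; w_up: b x d (b << d bottleneck)."""
    # down-project to the bottleneck dimension b
    h = [sum(x[i] * w_down[i][j] for i in range(len(x)))
         for j in range(len(w_down[0]))]
    h = [max(0.0, v) for v in h]                  # ReLU nonlinearity
    # up-project back to the model dimension d
    up = [sum(h[j] * w_up[j][i] for j in range(len(h)))
          for i in range(len(x))]
    return [xi + ui for xi, ui in zip(x, up)]     # residual connection

# d=2, bottleneck b=1: the adapter nudges the input instead of replacing it
out = linear_adapter([2.0, 3.0], w_down=[[1.0], [0.0]], w_up=[[0.5, 0.0]])
```

The residual connection means a zero-initialized adapter leaves the frozen base model's behavior unchanged at the start of training.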
Long-form audio transcription has long been a challenge for Conformer-based ASR models because of the quadratic cost of the attention component. So we now support Longformer-based transcription - even for pre-trained models! You can use the transcribe_speech script for this! We find that if you further finetune the model after converting to Longformer attention, you can recover most of the WER and still get excellent long-audio transcription of up to 30-40 minutes in a single forward pass.
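The underlying idea, sketched in pure Python (the window size is a hypothetical value): replace full self-attention with a local window, so each frame attends only to its neighbors and the attention cost grows linearly with audio length instead of quadratically.

```python
# Toy local-attention mask in the spirit of Longformer (illustrative only).
# mask[i][j] is True iff position i may attend to position j; with a fixed
# window, each row has at most 2*window + 1 allowed positions regardless of
# sequence length, so cost is O(seq_len * window) rather than O(seq_len**2).

def local_attention_mask(seq_len, window):
    return [[abs(i - j) <= window for j in range(seq_len)]
            for i in range(seq_len)]

mask = local_attention_mask(seq_len=6, window=1)
```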
A long-requested feature is support for beam search in NeMo ASR in an easy-to-use way. So we unified CTC beam search with external libraries behind the simple model.transcribe() method! You can simply update the config, and then transcribe!
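As a toy illustration of beam search itself (pure Python; this omits CTC blank handling and prefix merging, and is not NeMo's decoder): at each frame every surviving hypothesis is extended by every symbol, and only the top-scoring prefixes are kept.

```python
import math
from heapq import nlargest

# Toy beam search over per-frame log-probabilities (illustrative only).
# Keeping beam_size hypotheses per frame trades a little search cost for
# better sequences than pure greedy decoding.

def beam_search(log_probs, beam_size):
    """log_probs[t][symbol] -> log p; returns the highest-scoring sequence."""
    beams = [("", 0.0)]                          # (prefix, total log prob)
    for frame in log_probs:
        candidates = [(prefix + sym, score + lp)
                      for prefix, score in beams
                      for sym, lp in frame.items()]
        beams = nlargest(beam_size, candidates, key=lambda b: b[1])
    return beams[0][0]

frames = [{"a": math.log(0.6), "b": math.log(0.4)},
          {"a": math.log(0.3), "b": math.log(0.7)}]
best = beam_search(frames, beam_size=2)          # "ab": 0.6 * 0.7 wins
```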
We also begin support for AIStore, a framework for terabyte-scale datasets, as a scalable solution for training ASR models on enormous real-world datasets.