SpecAugment, Optuna, N-gram language modelling, and bug fixes
Pre-release
EDIT: Overnight, PyTorch released a version that breaks this release. Do not use this version! I'll update and release v0.2.1.
Too many bits-and-bobs to count. I should keep a change log. The main features are:
- Optuna hooks for data loaders (see pydrobert-param)
- N-gram language modelling on the GPU. Can read in ARPA LM files. Good for shallow fusion of AM+LM scores.
- SpecAugment layer for, well, SpecAugment
- A variety of fixes and new features for my data loaders.
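For the curious, shallow fusion just interpolates the acoustic model's and language model's log-probabilities at each decoding step. A minimal sketch of the idea in plain PyTorch (the function name and `beta` weight here are illustrative, not this package's API):

```python
import torch


def shallow_fusion_step(am_logits, lm_logits, beta=0.3):
    """Fuse one step of AM and LM scores: log P_AM(v) + beta * log P_LM(v).

    am_logits, lm_logits: tensors of shape (batch, vocab).
    Returns the fused argmax tokens and the fused log-scores.
    """
    am_lp = torch.log_softmax(am_logits, dim=-1)
    lm_lp = torch.log_softmax(lm_logits, dim=-1)
    fused = am_lp + beta * lm_lp  # beta controls how much the LM steers decoding
    return fused.argmax(dim=-1), fused
```

With `beta=0` you recover pure AM decoding; cranking `beta` up lets the n-gram LM veto acoustically likely but linguistically implausible tokens.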
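And for anyone unfamiliar with SpecAugment itself: it regularizes by zeroing out random bands of time steps and frequency bins in the input features. A rough standalone sketch of the masking (parameter names are made up for illustration; they don't mirror the layer's actual signature):

```python
import torch


def spec_augment(feats, max_time_mask=20, max_freq_mask=8, n_masks=2):
    """Apply SpecAugment-style masking to a (time, freq) feature tensor.

    Zeroes n_masks random time bands (width <= max_time_mask) and
    n_masks random frequency bands (width <= max_freq_mask).
    """
    feats = feats.clone()  # leave the caller's tensor untouched
    T, F = feats.shape
    for _ in range(n_masks):
        t = int(torch.randint(0, max_time_mask + 1, (1,)))
        t0 = int(torch.randint(0, max(T - t, 1), (1,)))
        feats[t0:t0 + t, :] = 0.0  # time mask
        f = int(torch.randint(0, max_freq_mask + 1, (1,)))
        f0 = int(torch.randint(0, max(F - f, 1), (1,)))
        feats[:, f0:f0 + f] = 0.0  # frequency mask
    return feats
```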