Releases: sdrobert/pydrobert-pytorch

Refactored for Python 3, breaking backwards compatibility

11 Mar 21:44

A considerable amount of refactoring occurred for this build, chiefly to get
rid of Python 2.7 support. While the functionality did not change much for this
version, we have switched from a pkgutil-style pydrobert namespace to
PEP-420-style namespaces. As a result, this package is not
backwards-compatible with previous pydrobert packages!
If any of the following packages are installed, make sure they exceed these
version thresholds:

  • pydrobert-param >0.2.0
  • pydrobert-kaldi >0.5.3
  • pydrobert-speech >0.1.0
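The incompatibility comes down to what sits in `pydrobert/__init__.py`. A sketch of the two layouts (this is a package-file fragment, not standalone code; `__path__` only exists inside a package's `__init__.py`):

```python
# pkgutil-style namespace (the old layout): every distribution sharing the
# "pydrobert" namespace ships a pydrobert/__init__.py containing only:
__path__ = __import__("pkgutil").extend_path(__path__, __name__)

# PEP-420-style namespace (the new layout): pydrobert/__init__.py is simply
# deleted; Python 3.3+ treats the bare directory as an implicit namespace
# package. Mixing the two styles across installed pydrobert-* packages is
# what breaks imports, hence the version thresholds above.
```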

Miscellaneous other stuff:

  • Type hints everywhere
  • Shifted python source to src/
  • Black-formatted remaining source
  • Removed future dependency
  • Shifted most of the configuration to setup.cfg, leaving only a shell
    in setup.py to remain compatible with Conda builds
  • Added pyproject.toml for PEP 517
  • Added tox.ini for tox testing
  • Switched to AppVeyor for CI
  • Added changelog :D

Bug fix from v0.2.0

01 Mar 21:35
6523c92
Pre-release

This release is essentially identical to v0.2.0, modulo a bug fix.

Too many bits-and-bobs to count. I should keep a change log. Main features seem to be

  • Optuna hooks for data loaders (see pydrobert-param)
  • N-gram language modelling on the GPU. Can read in ARPA-LM files. Good for shallow fusion AM+LM.
  • SpecAugment layer for, well, SpecAugment
  • A variety of fixes and new features to my data loaders.
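SpecAugment's core operation is masking random bands of a spectrogram. A minimal frequency-masking sketch in plain Python (a toy stand-in, not the package's actual `SpecAugment` layer; the real thing also does time masking and time warping, on tensors):

```python
import random

def freq_mask(spec, max_width=3, seed=None):
    """Zero out one random contiguous band of frequency rows.

    ``spec`` is a list of rows (frequency bins), each a list of frames.
    """
    rng = random.Random(seed)
    n_freq = len(spec)
    width = rng.randint(0, min(max_width, n_freq))
    start = rng.randint(0, n_freq - width)
    return [
        [0.0] * len(row) if start <= i < start + width else list(row)
        for i, row in enumerate(spec)
    ]

spec = [[1.0] * 4 for _ in range(8)]   # 8 frequency bins x 4 frames
masked = freq_mask(spec, max_width=3, seed=0)
```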

This is the last release that will support Python 2.7.

SpecAugment, Optuna, N-gram language modelling, and bug fixes

25 Feb 17:22
b17a038

EDIT: Overnight, pytorch released a version that caused an error. Do not use this version! I'll update and release v0.2.1.

Too many bits-and-bobs to count. I should keep a change log. Main features seem to be

  • Optuna hooks for data loaders (see pydrobert-param)
  • N-gram language modelling on the GPU. Can read in ARPA-LM files. Good for shallow fusion AM+LM.
  • SpecAugment layer for, well, SpecAugment
  • A variety of fixes and new features to my data loaders.
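Shallow fusion just sums the acoustic model's and language model's log-probabilities per token, with a weight on the LM term. A minimal sketch of the scoring rule (generic, not this package's interface; `lm_weight` is an illustrative name):

```python
import math

def shallow_fusion(am_logprobs, lm_logprobs, lm_weight=0.3):
    """Combine per-token scores: score(v) = log p_AM(v) + w * log p_LM(v)."""
    return {
        tok: am_logprobs[tok] + lm_weight * lm_logprobs[tok]
        for tok in am_logprobs
    }

am = {"cat": math.log(0.6), "hat": math.log(0.4)}
lm = {"cat": math.log(0.2), "hat": math.log(0.8)}
scores = shallow_fusion(am, lm, lm_weight=1.0)
best = max(scores, key=scores.get)  # the LM flips the AM's preference here
```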

More features! Attention, edit-distance based objectives, and more

06 Aug 20:50

Welp, the previous version should've been 0.1.0, but now we're at 0.1.0.

Besides bug fixes, this version includes

  • The layers submodule, which currently includes attention mechanisms for seq2seq. This also includes transformer network attention
  • Convert from torch data dirs to NIST TRN and CTM, and back again
  • Edit-distance based losses and reward functions, including Optimal Character Distillation
  • Cleaner documentation on website, with tutorials
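For reference, the edit (Levenshtein) distance underlying those losses is the classic Wagner-Fischer dynamic program; the package's losses work with differentiable variants of it, but the hard distance looks like:

```python
def edit_distance(ref, hyp):
    """Minimum number of insertions, deletions, and substitutions
    turning ``ref`` into ``hyp``, computed row by row."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(
                prev[j] + 1,             # deletion
                cur[j - 1] + 1,          # insertion
                prev[j - 1] + (r != h),  # substitution (or free match)
            ))
        prev = cur
    return prev[-1]
```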

Lots of new stuff

27 Jun 00:18
Pre-release

Welp, I guess I never published the v0.0.1 tag... I just released it. What's new? Probably

  • pydrobert.torch.data has been cleaned up. There are lots of tools for transducing data. Plus there are examples.
  • pydrobert.torch.estimators has been added. The functions therein can be used for discretely sampling data. More examples.
  • Miscellaneous bug fixes
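The estimators in question are of the score-function (REINFORCE) family: for b ~ p_θ, ∇_θ E[f(b)] = E[f(b) ∇_θ log p_θ(b)], which lets you estimate gradients through a discrete sample. A toy sketch with a Bernoulli variable (plain Python, not the module's interface):

```python
import math
import random

def reinforce_grad(theta, f, n_samples=100_000, seed=0):
    """Monte Carlo estimate of d/dtheta E[f(b)], b ~ Bernoulli(sigmoid(theta)),
    via the score-function identity E[f(b) * d log p(b)/dtheta]."""
    rng = random.Random(seed)
    p = 1.0 / (1.0 + math.exp(-theta))
    total = 0.0
    for _ in range(n_samples):
        b = 1 if rng.random() < p else 0
        # for Bernoulli(sigmoid(theta)): d log p(b)/dtheta = b - p
        total += f(b) * (b - p)
    return total / n_samples

# At theta = 0 (p = 0.5) with f(b) = b, the exact gradient is p*(1-p) = 0.25
est = reinforce_grad(0.0, lambda b: float(b))
```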

Enjoy