
v0.0.12

@ziatdinovmax released this 25 Feb 02:00

This release introduces partially Bayesian Transformers and neuron-level control over model stochasticity.

Key Additions:

Partially Bayesian Transformers: Transformer neural networks are at the heart of modern AI systems and are increasingly used in the physical sciences. However, robust uncertainty quantification with Transformers remains challenging. Replacing all weights with probability distributions and using advanced sampling techniques works for smaller networks, but this approach is computationally prohibitive for Transformers. Our new partially Bayesian Transformer implementation lets you selectively make specific modules (embedding, attention, etc.) probabilistic while keeping the others deterministic, significantly reducing computational cost while still delivering reliable uncertainty quantification.
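As a sketch of the intended workflow (the names below, such as `PartialBayesianTransformer` and `probabilistic_modules`, are illustrative assumptions rather than this release's confirmed API, and the package is assumed to import as `neurobayes`):

```python
import numpy as np
import neurobayes as nb  # package import name assumed

# Toy regression data
X_train = np.random.randn(128, 8)
y_train = np.sin(X_train).sum(axis=-1)

# Hypothetical usage: make only the embedding and attention modules
# probabilistic; all remaining weights stay deterministic, which keeps
# sampling tractable compared to a fully Bayesian Transformer.
model = nb.PartialBayesianTransformer(
    probabilistic_modules=["embedding", "attention"],  # illustrative argument
)
model.fit(X_train, y_train)

# Predictions come with uncertainty estimates derived from posterior
# samples over the probabilistic modules only.
y_mean, y_var = model.predict(X_train)
```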

Fine-grained Stochasticity Control: Even when only some layers are probabilistic, training can remain resource-intensive. You can now specify exactly which weights within a given layer should be stochastic, giving finer control over the trade-off between computational cost and uncertainty quantification.
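To make the mechanism concrete, here is a minimal conceptual sketch in NumPyro (assuming a JAX/NumPyro backend) of a dense layer in which only selected output neurons carry probabilistic weights; it illustrates the idea, not this package's internals:

```python
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist

def partially_stochastic_dense(x, w_det, b, stochastic_idx, name="dense"):
    # Sample probabilistic weights only for the selected output neurons;
    # all other weights keep their fixed deterministic values in w_det.
    w_stoch = numpyro.sample(
        f"{name}_w_stoch",
        dist.Normal(0.0, 1.0)
        .expand([w_det.shape[0], len(stochastic_idx)])
        .to_event(2),
    )
    # Splice the sampled columns into the otherwise deterministic matrix.
    w = w_det.at[:, jnp.array(stochastic_idx)].set(w_stoch)
    return x @ w + b

# Quick check with a seeded trace: 8 inputs -> 4 outputs, with only
# neurons 0 and 2 carrying stochastic weights.
x = jnp.ones((2, 8))
w_det = jnp.zeros((8, 4))
b = jnp.zeros(4)
with numpyro.handlers.seed(rng_seed=0):
    out = partially_stochastic_dense(x, w_det, b, stochastic_idx=[0, 2])
print(out.shape)  # (2, 4)
```

Because posterior sampling scales with the number of stochastic parameters, shrinking that set from whole layers to chosen neurons is what makes the cost-versus-uncertainty trade-off tunable.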

What's Changed

Full Changelog: 0.0.10...0.0.12