TorchSnapshot (Beta Release)


A performant, memory-efficient checkpointing library for PyTorch applications, designed with large, complex distributed workloads in mind.

Install

Requires Python >= 3.8 and PyTorch >= 2.0.0

From pip:

# Stable
pip install torchsnapshot
# Nightly
pip install --pre torchsnapshot-nightly

From conda:

conda install -c conda-forge torchsnapshot

From source:

git clone https://github.com/pytorch/torchsnapshot
cd torchsnapshot
pip install -r requirements.txt
python setup.py install

Why TorchSnapshot

Performance

  • TorchSnapshot provides a fast checkpointing implementation employing various optimizations, including zero-copy serialization for most tensor types, overlapped device-to-host copy and storage I/O, and parallelized storage I/O.
  • TorchSnapshot greatly speeds up checkpointing for DistributedDataParallel workloads by distributing the write load across all ranks (benchmark).
  • When host memory is abundant, TorchSnapshot allows training to resume before all storage I/O completes, reducing the time blocked by checkpoint saving (see the sketch after this list).
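
A minimal sketch of the distributed and asynchronous behavior described above, assuming the Snapshot.async_take API from the project docs; the process-group setup and toy model are illustrative (e.g. a job launched via torchrun):

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torchsnapshot import Snapshot

# Illustrative setup; replace with your own model and optimizer
dist.init_process_group(backend="gloo")
model = DDP(torch.nn.Linear(128, 64))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
app_state = {"model": model, "optimizer": optimizer}

# Collective call: every rank participates and the write load is spread
# across ranks. async_take returns once tensors are staged in host
# memory, so training can resume while storage I/O completes.
pending = Snapshot.async_take(path="/path/to/snapshot", app_state=app_state)

# ... training continues here ...

pending.wait()  # block until the snapshot is fully persisted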

Memory Usage

  • TorchSnapshot's memory usage adapts to the host's available resources, greatly reducing the chance of out-of-memory issues when saving and loading checkpoints.
  • TorchSnapshot supports efficient random access to individual objects within a snapshot, even when the snapshot is stored in cloud object storage (see the sketch after this list).
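
A minimal sketch of random access, assuming the Snapshot.read_object API from the project docs; the manifest path below is hypothetical and depends on the app_state keys used when the snapshot was taken:

from torchsnapshot import Snapshot

snapshot = Snapshot(path="/path/to/snapshot")

# Fetch a single tensor without materializing the rest of the snapshot;
# the path mirrors the snapshot manifest (hypothetical example)
weight = snapshot.read_object(path="0/model/weight")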

Usability

  • Simple APIs that are consistent between distributed and non-distributed workloads.
  • Out-of-the-box integration with commonly used cloud object storage systems (see the sketch after this list).
  • Automatic resharding (elasticity) on world size change for supported workloads (more details).
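
The cloud integration surfaces through the snapshot path. A minimal sketch, assuming the s3:// URI convention of the project's storage plugins; the bucket and prefix are placeholders, and app_state is as in the Getting Started example below:

from torchsnapshot import Snapshot

# Writing directly to object storage: the URI scheme selects the
# storage backend; bucket and prefix here are placeholders
snapshot = Snapshot.take(path="s3://my-bucket/my-snapshot", app_state=app_state)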

Security

  • Secure tensor serialization without pickle dependency [WIP].

Getting Started

import torch
from torchsnapshot import Snapshot

# Any stateful objects can be captured, e.g. a model and its optimizer
# (the toy model here is illustrative)
model = torch.nn.Linear(128, 64)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Taking a snapshot
app_state = {"model": model, "optimizer": optimizer}
snapshot = Snapshot.take(path="/path/to/snapshot", app_state=app_state)

# Restoring from a snapshot
snapshot.restore(app_state=app_state)
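
Restoring does not require the same process that took the snapshot. A later process can attach to an existing snapshot by path; a minimal sketch, reusing the app_state above:

from torchsnapshot import Snapshot

# Attach to a previously taken snapshot and restore state in place
snapshot = Snapshot(path="/path/to/snapshot")
snapshot.restore(app_state=app_state)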

See the documentation for more details.

License

torchsnapshot is BSD licensed, as found in the LICENSE file.
