snakeoil.compression can't cleanly handle if the parallelization compressor changes during runtime #98

Open
ferringb opened this issue Jan 19, 2024 · 0 comments
See the lbzip2 detection in snakeoil.compression:

import multiprocessing

from snakeoil import process

# Detection runs once, at module import time.
try:
    lbzip2_path = process.find_binary("lbzip2")
    lbzip2_compress_args = (f"-n{multiprocessing.cpu_count()}",)
    lbzip2_decompress_args = lbzip2_compress_args
    parallelizable = True
except process.CommandNotFound:
    lbzip2_path = None
    parallelizable = False
    lbzip2_compress_args = lbzip2_decompress_args = ()

Note that lbzip2 is detected at module import time, and the code uses that result from then on.

This breaks if someone does something like merging bzip2 and unmerging lbzip2 without the module being reloaded. Conversely, if a merge adds lbzip2, the running process still won't use it without an explicit module reload.

This can be fixed by moving the binary check to invocation time; that would also allow pbzip2 support to be added properly. A rough sketch follows.
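This is a minimal sketch only, assuming a hypothetical _find_lbzip2() helper called from the compress/decompress entry points; the real change would live in snakeoil.compression's bzip2 backend, and pbzip2 would need its own flag handling.

import multiprocessing

from snakeoil import process


def _find_lbzip2():
    # Detect lbzip2 at invocation time rather than caching the result
    # at module import.
    try:
        path = process.find_binary("lbzip2")
    except process.CommandNotFound:
        # lbzip2 not installed (or removed since import); fall back to
        # plain, unparallelized bzip2.
        return None, (), False
    return path, (f"-n{multiprocessing.cpu_count()}",), True


# Called on each invocation, so merging or unmerging lbzip2 takes effect
# without reloading the module:
lbzip2_path, lbzip2_args, parallelizable = _find_lbzip2()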

ferringb added the bug label Jan 19, 2024