
Batched implementation #26

Open
jacarvalho opened this issue Jun 15, 2023 · 2 comments

Comments

@jacarvalho

Hey,
First, thanks a lot for the repo!
Is there a way to use cholespy with a batch of matrices? If not, what would you suggest as the best approach?
Thanks!

@bathal1
Collaborator

bathal1 commented Jun 16, 2023

Hi,

Cholespy supports batches of vectors, but only for a fixed matrix. The reason is that the amount and structure of the computation needed to solve the system depend on the matrix's sparsity pattern, so solving systems with different sparsity patterns is a poor fit for parallel execution on the GPU.
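To illustrate the "batches of vectors, fixed matrix" pattern being described, here is a minimal SciPy sketch (not cholespy's own API): the matrix is factorized once, and the factorization is then reused for a whole batch of right-hand sides at no extra factorization cost.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# A fixed sparse SPD matrix (a shifted 1-D Laplacian) and a batch of k RHS vectors.
n, k = 6, 4
A = sp.diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
B = np.random.default_rng(0).standard_normal((n, k))

# Factorize once; sparse LU here plays the role a Cholesky factorization
# plays in cholespy.
lu = splu(A)

# Reuse the factorization: solve A @ X = B for all k columns in one call.
X = lu.solve(B)
```

The expensive, sparsity-dependent step is the factorization; applying it to extra right-hand-side columns is cheap and regular, which is why batching the vectors parallelizes well while batching matrices with different sparsity patterns does not.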

If you have batches of matrices, chances are that you are not using a sparse representation, so this package is probably not the right one for your application.

@jacarvalho
Author

I have a batch of sparse matrices that, unfortunately, are not stored in a sparse format, since PyTorch still has very poor support for operations on such matrices. (It would be a lot of work to restructure the code at this point, just for a marginal gain.)
One idea I had was to build a block-diagonal matrix from the batch of sparse matrices; this would increase memory usage, but hopefully solving it with your method would still be faster than using torch.cholesky_solve.
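The block-diagonal idea can be sketched with SciPy (again as a stand-in for cholespy's API, since the thread does not show actual code): stacking the batch into one block-diagonal system and solving it once gives the same answer as solving each block independently, because the blocks do not couple.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

rng = np.random.default_rng(1)
n = 4

# Two small sparse SPD matrices standing in for a batch.
mats = []
for _ in range(2):
    M = sp.random(n, n, density=0.5, random_state=rng, format="csc")
    A = sp.csc_matrix(M @ M.T + n * sp.identity(n))  # M M^T + n I is SPD
    mats.append(A)

b = rng.standard_normal(2 * n)

# One block-diagonal system, solved in a single call...
A_big = sp.block_diag(mats, format="csc")
x_big = spsolve(A_big, b)

# ...matches solving each block against its slice of b independently.
x_ref = np.concatenate(
    [spsolve(Ai, b[i * n:(i + 1) * n]) for i, Ai in enumerate(mats)]
)
```

The trade-off is as stated above: the block-diagonal matrix costs more memory (and the solver must still factorize the union of all sparsity patterns once), but after that single factorization every batch element is handled in one solve.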
