
Have a "revert to factory default" setting for that resets LBT to point to the BLAS that shipped with Julia (for 1.6 and earlier) #58

Closed
carstenbauer opened this issue Dec 17, 2020 · 17 comments

Comments

@carstenbauer
Member

Is it possible to switch back to OpenBLAS? If this is a feature it should be documented.

I see that there is an enable_openblas_startup() function in install.jl: https://github.com/JuliaComputing/MKL.jl/blob/b6283ca3def34ee6329023e8e8ec2700c2328ff8/src/install.jl#L104

How should one call this function to revert the changes?

@carstenbauer
Member Author

carstenbauer commented Dec 17, 2020

I (seem to have) managed to revert to OpenBLAS by setting ENV["USEBLAS64"] = true (otherwise there is an error) and changing the enable_mkl_startup() call in build.jl to enable_openblas_startup(). Afterwards I have

julia> BLAS.vendor()
:openblas64

again.

It would be nice to have this work without having to adjust build.jl.
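For concreteness, a minimal sketch of the manual revert described above, for MKL.jl on Julia 1.6 and earlier. It assumes you have already edited MKL.jl's deps/build.jl so that it calls enable_openblas_startup() instead of enable_mkl_startup(); whether Pkg.build alone is enough beyond that edit is an assumption:

# Sketch only: relies on the build.jl edit described above.
ENV["USEBLAS64"] = "true"   # otherwise the build errors out, as noted above
using Pkg
Pkg.build("MKL")            # re-runs build.jl, rebuilding the system image against OpenBLAS
# After restarting Julia:
using LinearAlgebra
BLAS.vendor()               # expected: :openblas64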

@ViralBShah
Contributor

I revert by deleting and reinstalling Julia - since it doesn't affect the packages or anything. I'd rather get the default build from julialang.org instead of building my own system image, if I am using the standard configuration.

@carstenbauer
Member Author

Well, you are still free to reinstall Julia if we add this option. Note that one might be using MKL.jl in combination with a self-compiled Julia, in which case you would have to recompile it entirely.

I'd rather get the default build from julialang.org instead of building my own system image, if I am using the standard configuration.

May I ask why? In what way is the default system image better than a self-built one?

@ViralBShah
Contributor

ViralBShah commented Dec 21, 2020

The default system image is built with a great amount of care. So unless I want something out of the ordinary, I personally prefer to use the download provided at julialang.org.

@carstenbauer
Member Author

With the new libblastrampoline infrastructure of Julia 1.7 this becomes irrelevant.

@ViralBShah
Contributor

I will note that with the LBT stuff, we currently allow loading MKL to override OpenBLAS, but do we need a function to go in the reverse direction? Just restarting Julia will give you OpenBLAS by default (unless you load MKL).

@carstenbauer
Member Author

I think that would be great, if only for comparing OpenBLAS and MKL performance when benchmarking a function. For symmetry we might want to add two functions openblas() and mkl()?

@ViralBShah
Contributor

The switch is currently done in the __init__() of MKL.jl. I plan to try out BLIS when I come back to this next. I am thinking of perhaps a switch_blas API, which in base Julia gets you back to OpenBLAS; MKL.jl, BLIS.jl, etc. can then extend it for their own purposes.

@staticfloat - Any thoughts?

@staticfloat
Member

I'm thinking we should take these lines, wrap them in a function like reset_vendor(), and let users call LinearAlgebra.reset_vendor(). So to switch to MKL you call using MKL (or, once it's loaded, MKL.use_mkl() or something), and if you want to reset, you call LinearAlgebra.reset_vendor().
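For illustration, a rough sketch of what such a reset_vendor() might look like on Julia 1.7+ with LBT; the function name and the use of OpenBLAS_jll here are assumptions, not an existing API:

using LinearAlgebra, OpenBLAS_jll

function reset_vendor()
    # Point libblastrampoline back at the OpenBLAS that ships with Julia,
    # clearing whatever backend (e.g. MKL) was forwarded previously.
    BLAS.lbt_forward(OpenBLAS_jll.libopenblas_path; clear=true)
end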

@carstenbauer
Member Author

Can we reopen this so we don't forget about it? The last few posts support reopening.

I just stumbled over this again in the context of setting up a centralised Julia depot on an HPC cluster. The depot contains a centralised MKL.jl installation. It also contains a central startup.jl into which I want to put using MKL (as this is what most users want when loading the Julia-Intel module). However, this would currently force users to use MKL, since there is no go_back_to_using_openblas() function. Leaving the using MKL to the user is annoying as well, since they potentially have to type it in every session.
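As an illustration of the cluster setup above, a hypothetical central startup.jl that loads MKL by default but lets users opt out via an environment variable (the variable name JULIA_USE_MKL is made up for this sketch):

# Hypothetical central startup.jl; JULIA_USE_MKL is an invented opt-out knob.
if get(ENV, "JULIA_USE_MKL", "true") == "true"
    try
        using MKL     # forwards BLAS/LAPACK to MKL via LBT on Julia 1.7+
    catch err
        @warn "Could not load MKL; staying on the default BLAS" exception=err
    end
end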

@carstenbauer
Member Author

Do we want the function reset_vendor() to live in LinearAlgebra or MKL.jl?

@ViralBShah
Contributor

MKL.jl, I think. That will get it into people's hands sooner.

@ViralBShah
Contributor

I'm changing my mind on this. I think reset_vendor() should live in LinearAlgebra so that you can get back the default BLAS that shipped with Julia. PRs welcome in Base!

@ViralBShah changed the title from "Reverting back to OpenBLAS?" to the current wording on Sep 18, 2021, and added "(for 1.6 and earlier)" on Sep 19, 2021.
@ViralBShah
Contributor

At this point, given that we have moved to LBT since 1.7, I think the easiest way to reset on 1.6 is to reinstall Julia.

@carstenbauer
Member Author

I agree, but for LBT (i.e. Julia >= 1.7) we still need a way to revert to the default BLAS within a session. In fact, I still want a way to switch back and forth. Locally, I have a Julia PR that provides a BLAS.reset_config(); I hope to find the time to finish it soon. The only thing left then is a PR to MKL.jl which adds an MKL.activate() or whatever we want to call it.

@giordano
Contributor

I usually call LinearAlgebra.__init__(), or you can directly call what that function does.
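For reference, a quick session sketch of that workaround on Julia 1.7+; BLAS.get_config() is used here only to inspect which backend LBT currently forwards to:

using LinearAlgebra
using MKL                  # LBT now forwards BLAS/LAPACK calls to MKL
BLAS.get_config()          # should list MKL as the active backend
LinearAlgebra.__init__()   # re-runs the default setup, forwarding back to OpenBLAS
BLAS.get_config()          # should list OpenBLAS again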

@carstenbauer
Member Author

Oh I see that we have #90 for tracking these efforts. So let's close this one in favor of #90.

PS: @giordano, yeah, my reset_config() is based on the code in __init__().
