As explained in numpy/numpy#13136 (comment), we might want to control the libraries that are specific to a given Python module (e.g. only numpy.linalg.lapack_lite, rather than all the Python packages that have been imported in the current Python program).
Filtering by specific Python module should also further reduce the overhead of the context manager. This will require some refactoring though.
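A minimal sketch of what module-level filtering could look like, operating on the list of dicts that `threadpoolctl.threadpool_info()` returns. The `filter_by_prefix` helper and the sample data are assumptions for illustration, not part of the real API:

```python
# Sample entries mimicking the dicts returned by threadpool_info().
# In a real run these would be introspected from the loaded runtimes.
SAMPLE_INFO = [
    {"filepath": "/usr/lib/libopenblas.so.0", "prefix": "libopenblas",
     "user_api": "blas", "num_threads": 8},
    {"filepath": "/usr/lib/libgomp.so.1", "prefix": "libgomp",
     "user_api": "openmp", "num_threads": 8},
]

def filter_by_prefix(info, prefixes):
    """Keep only the runtimes whose library prefix is in `prefixes`."""
    return [entry for entry in info if entry["prefix"] in prefixes]

# Limiting only the BLAS used by e.g. numpy.linalg would then touch a
# smaller set of libraries than a process-wide context manager:
blas_only = filter_by_prefix(SAMPLE_INFO, {"libopenblas"})
print([e["prefix"] for e in blas_only])
```

Iterating over a pre-filtered subset rather than every runtime loaded in the process is also where the reduced context-manager overhead would come from.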
As a complement, we should make it possible to check whether two Python modules share a common thread pool, so we know whether or not to limit the inner threadpool size to avoid oversubscription in case of nested parallelism.
Example: the sklearn kmeans implementation in Cython (with an OpenMP prange) might want to call into scipy BLAS. If the latter is OpenBLAS compiled with libgomp, and the outer prange loop is also handled by the same instance of the libgomp runtime, we do not want to limit the number of threads of libgomp to 1. Instead we want to let the libgomp runtime handle the nested parallelism itself.
On the other hand, if the inner BLAS call does not use OpenMP, or uses a different OpenMP runtime than the outer prange loop, then we want to resize the inner threadpool to 1 thread to protect against oversubscription for the duration of the prange loop.
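The decision rule above could be sketched as follows, again over `threadpool_info()`-style dicts. The `shares_openmp_runtime` and `inner_thread_limit` helpers, and the sample library paths, are hypothetical illustrations of the idea, not an existing API:

```python
def shares_openmp_runtime(outer_info, inner_info):
    """True if both sides load the same OpenMP shared library."""
    outer = {e["filepath"] for e in outer_info if e["user_api"] == "openmp"}
    inner = {e["filepath"] for e in inner_info if e["user_api"] == "openmp"}
    return bool(outer & inner)

def inner_thread_limit(outer_info, inner_info, n_threads):
    # Same runtime instance on both sides (e.g. libgomp for both the
    # prange loop and OpenBLAS): let OpenMP manage the nesting itself.
    if shares_openmp_runtime(outer_info, inner_info):
        return n_threads
    # Distinct runtimes (or no OpenMP at all in the inner call):
    # clamp the inner pool to one thread to avoid oversubscription.
    return 1

cython_prange = [{"filepath": "/usr/lib/libgomp.so.1", "user_api": "openmp"}]
openblas_gomp = [{"filepath": "/usr/lib/libgomp.so.1", "user_api": "openmp"}]
mkl_iomp = [{"filepath": "/opt/mkl/libiomp5.so", "user_api": "openmp"}]

print(inner_thread_limit(cython_prange, openblas_gomp, 8))  # same libgomp
print(inner_thread_limit(cython_prange, mkl_iomp, 8))       # different runtime
```

Comparing the resolved filepaths of the loaded runtime libraries is one plausible way to detect "same instance"; two libraries loaded from the same path on disk will normally resolve to a single DSO in the process.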