
Using SacessOptimizer on objectives with parameter priors leads to pickle error #1465

Open · stephanmg opened this issue Sep 16, 2024 · 2 comments · Fixed by #1467
Labels: bug (Something isn't working)

stephanmg commented Sep 16, 2024

Bug description
Using a benchmark model with parameter priors leads to a pickle error; see the log below.

Expected behavior
No pickle error; parameter estimation should start.

To reproduce
Pick, for instance, the Schwen_PONE2014 benchmark model and try to load and estimate it with the scipy or sacess optimizer, e.g. as in the sketch below.
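A minimal sketch of such a run (the benchmark-collection path and the optimizer settings are placeholder assumptions, not taken from the report):

```python
import petab
import pypesto.petab
from pypesto.optimize.ess import SacessOptimizer

# Hypothetical local path to a checkout of the PEtab benchmark collection.
petab_yaml = "Benchmark-Models-PEtab/Benchmark-Models/Schwen_PONE2014/Schwen_PONE2014.yaml"

# Import the PEtab problem; the parameter priors from parameters.tsv end up
# in the pyPESTO objective.
petab_problem = petab.Problem.from_yaml(petab_yaml)
importer = pypesto.petab.PetabImporter(petab_problem)
problem = importer.create_problem()

# Starting the sacess worker processes pickles the objective and fails with
# AttributeError: Can't pickle local object '_prior_densities.<locals>.log_f'
optimizer = SacessOptimizer(num_workers=2, max_walltime_s=60)
result = optimizer.minimize(problem)
```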

"Remedy": Remove parameter priors from parameters.tsv works, but obviously then runs parameter estimation without prior information which is not desirable.

The latest pyPESTO version confirmed working for me was 0.4.0, which might be a starting point for a git bisect.

Environment

  • Operating system: Unicorn cluster
  • pypesto version: 0.5.3
  • Python version: 3.11.1

Log:

main()
Traceback (most recent call last):
  File "/home/sgrein/optimizer_benchmark_sacess_reimpl/optimize_new.py", line 303, in <module>
    main()
  File "/home/sgrein/optimizer_benchmark_sacess_reimpl/optimize_new.py", line 286, in main
    problem, result = run_optimization(
                      ^^^^^^^^^^^^^^^^^
  File "/home/sgrein/optimizer_benchmark_sacess_reimpl/optimize_new.py", line 112, in run_optimization
    result = optimizer.minimize(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/sgrein/optimizer_benchmark_sacess_reimpl/myvenv/lib/python3.11/site-packages/pypesto/optimize/ess/sacess.py", line 253, in minimize
    p.start()
  File "/opt/ohpc/pub/spack/develop/opt/spack/linux-rocky8-zen2/gcc-12.2.0/python-3.11.1-efo4f5x5f4rnsqletfnwby5wslhnru2o/lib/python3.11/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
  File "/opt/ohpc/pub/spack/develop/opt/spack/linux-rocky8-zen2/gcc-12.2.0/python-3.11.1-efo4f5x5f4rnsqletfnwby5wslhnru2o/lib/python3.11/multiprocessing/context.py", line 288, in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
  File "/opt/ohpc/pub/spack/develop/opt/spack/linux-rocky8-zen2/gcc-12.2.0/python-3.11.1-efo4f5x5f4rnsqletfnwby5wslhnru2o/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/opt/ohpc/pub/spack/develop/opt/spack/linux-rocky8-zen2/gcc-12.2.0/python-3.11.1-efo4f5x5f4rnsqletfnwby5wslhnru2o/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/opt/ohpc/pub/spack/develop/opt/spack/linux-rocky8-zen2/gcc-12.2.0/python-3.11.1-efo4f5x5f4rnsqletfnwby5wslhnru2o/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/opt/ohpc/pub/spack/develop/opt/spack/linux-rocky8-zen2/gcc-12.2.0/python-3.11.1-efo4f5x5f4rnsqletfnwby5wslhnru2o/lib/python3.11/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object '_prior_densities.<locals>.log_f'
stephanmg added the bug label Sep 16, 2024
dweindl changed the title from "Using benchmark models with parameter priors leads to pickle error" to "Using SacessOptimizer on objectives with parameter priors leads to pickle error" Sep 16, 2024
dweindl self-assigned this Sep 16, 2024
dweindl commented Sep 16, 2024

This is specifically a problem with SacessOptimizer.

This problem was probably introduced in #1353 (pypesto 0.5.0).

As a quick workaround, you can try SacessOptimizer(..., mp_start_method="fork") if supported in your setup. This might cause other issues, but it worked for me in a simple test.
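For example (a sketch; arguments other than mp_start_method are placeholders):

```python
from pypesto.optimize.ess import SacessOptimizer

# "fork" lets the worker processes inherit the objective instead of
# pickling it; it is only available on POSIX systems and may conflict
# with threaded code in some setups.
optimizer = SacessOptimizer(
    num_workers=2,
    max_walltime_s=60,
    mp_start_method="fork",
)
result = optimizer.minimize(problem)
```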

For a proper fix, SacessOptimizer needs to use cloudpickle instead of pickle, as is already the case in MultiProcessEngine.
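To illustrate the difference (a standalone sketch, not pyPESTO code): the standard pickle module cannot serialize local functions such as the prior closure named in the traceback above, whereas cloudpickle serializes them by value:

```python
import pickle
import cloudpickle

def _prior_densities():
    # Stand-in for the closure that pyPESTO builds for a parameter prior.
    def log_f(x):
        return -0.5 * x**2
    return log_f

log_f = _prior_densities()

try:
    pickle.dumps(log_f)
except AttributeError as err:
    # AttributeError: Can't pickle local object '_prior_densities.<locals>.log_f'
    print("pickle failed:", err)

# cloudpickle serializes the function by value, so the round trip succeeds.
restored = pickle.loads(cloudpickle.dumps(log_f))
print(restored(1.0))  # -0.5
```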

dweindl added a commit to dweindl/pyPESTO that referenced this issue Sep 16, 2024
Cloudpickle is able to handle more complex objects than pickle.

See ICB-DCM#1465
Closes ICB-DCM#1465

If the new processes are forked, we could skip the pickling, but at this point, I don't think that's necessary.
stephanmg commented
Workaround works for me.

dweindl linked a pull request Sep 24, 2024 that will close this issue
dweindl added a commit that referenced this issue Sep 26, 2024
Cloudpickle is able to handle more complex objects than pickle.

See #1465
Closes #1465