Create Parameters with arbitrary shape #303

quaquel opened this issue Oct 31, 2023 · 0 comments

quaquel commented Oct 31, 2023

What is the problem
Parameters (i.e., levers, constants, and uncertainties) currently must be scalars in the workbench. This means that if you have 100 identical parameters (e.g., as in the intertemporal version of the lake problem), you need to create 100 separate parameters, all with the same lower and upper bound but with slightly different names. It also means that, inside your model, you need to collect these 100 values and put them back into the appropriate container.

So, taking the lake problem as an example, the levers are currently created with

    lake_model.levers = [RealParameter(f"l{i}", 0, 0.1) for i in range(lake_model.time_horizon)]

and, inside the model function, the sampled values are collected with something like

    decisions = [kwargs.pop(f'l{i}') for i in range(time_horizon)]
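
Putting the two snippets together, the current pattern looks roughly like the sketch below (condensed for illustration; the real lake model function takes additional arguments and returns several outcomes):

    import numpy as np
    from ema_workbench import Model, RealParameter

    def lake_problem(**kwargs):
        # the model function has to reassemble the 100 scalar levers itself
        decisions = np.array([kwargs.pop(f"l{i}") for i in range(100)])
        ...  # run the actual simulation using decisions

    lake_model = Model("lakeproblem", function=lake_problem)
    lake_model.levers = [RealParameter(f"l{i}", 0, 0.1) for i in range(100)]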

Envisioned solution
What if, instead of having to create all these parameters yourself and collect their sampled values back into the appropriate container, all of this could be offloaded to the workbench? In fact, Rhodium supports this for Levers, which take an optional length keyword argument. I suggest generalizing this idea by adding an optional shape keyword argument. So, building on the lake problem example, imagine that the following code just works:

    lake_model.levers = [RealParameter("decisions", 0, 0.1, shape=(100,))]

We could then use decisions directly as a keyword argument of the lake model function, removing the need for the second code block.
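
Under this proposal, the model side could look something like the following sketch (how exactly the shaped values are passed to the function is of course open for discussion):

    from ema_workbench import Model, RealParameter

    def lake_problem(decisions=None, **kwargs):
        # decisions arrives as a single array of 100 sampled values;
        # no per-scalar bookkeeping is needed inside the model function
        ...

    lake_model = Model("lakeproblem", function=lake_problem)
    lake_model.levers = [RealParameter("decisions", 0, 0.1, shape=(100,))]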

Implementation details
To make something like this work, there needs to be a distinction between the public-facing parameters and their internal scalar representation. Executing the above code would trigger the generation of a hundred implicit scalar parameters, so they can be sampled and optimized as before. After the experiments have been created, the 'parent' parameters would be called to process these samples and apply any transformations; the resulting modified experiment would then be run. This processing would most likely be a generalization of what already happens with categorical parameters, where integers are mapped back to the corresponding category.
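
As a very rough sketch of this expand/collapse step (ShapedRealParameter, expand, and collapse are purely illustrative names, not existing API):

    import numpy as np

    class ShapedRealParameter:
        """Illustrative stand-in for a RealParameter with a shape keyword."""

        def __init__(self, name, lower_bound, upper_bound, shape):
            self.name = name
            self.lower_bound = lower_bound
            self.upper_bound = upper_bound
            self.shape = shape

        def expand(self):
            # public-facing parameter -> implicit scalar parameters that the
            # existing samplers and optimizers can work with
            n = int(np.prod(self.shape))
            return [
                (f"{self.name}_{i}", self.lower_bound, self.upper_bound)
                for i in range(n)
            ]

        def collapse(self, experiment):
            # after sampling, gather the scalar values back into an array of
            # the requested shape, analogous to mapping integers back to their
            # category for categorical parameters
            n = int(np.prod(self.shape))
            values = [experiment.pop(f"{self.name}_{i}") for i in range(n)]
            experiment[self.name] = np.asarray(values).reshape(self.shape)
            return experiment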
