examples: Speedup the lake_problem function by ~30x (#301)
Speed up the lake_problem function by around 30x.

Benchmark: average duration of 100 lake_model experiments: 0.940 seconds before vs 0.030 seconds after (speedup: 31.41x).
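
For reference, a rough way to reproduce this kind of timing locally is sketched below. This is only an illustrative sketch: the import path is assumed from the file changed in this commit, the decisions are random releases in the usual 0 to 0.1 lever range, and the original benchmark may have been run differently (for example through the workbench itself).

import time

import numpy as np

# Assumed import path, based on the file changed in this commit; adjust to your checkout.
from ema_workbench.examples.example_lake_model import lake_problem

rng = np.random.default_rng(42)
n_experiments = 100

start = time.perf_counter()
for _ in range(n_experiments):
    # lake_problem expects the release decisions as kwargs keyed "0" .. "99".
    decisions = {str(i): rng.uniform(0.0, 0.1) for i in range(100)}
    lake_problem(**decisions)
elapsed = time.perf_counter() - start

print(f"Duration of {n_experiments} experiments: {elapsed:.3f} seconds")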

I validated the output and it is consistent with the previous output.
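
One way such a consistency check could be set up is sketched below. lake_problem_old and lake_problem_new are hypothetical stand-ins for the pre- and post-commit implementations imported side by side; they are not names that exist in this repository.

import numpy as np

decisions = {str(i): 0.05 for i in range(100)}  # constant release every year

# Seed before each call so both versions see the same random stream.
np.random.seed(1234)
old = np.array(lake_problem_old(**decisions))  # hypothetical: pre-commit version
np.random.seed(1234)
new = np.array(lake_problem_new(**decisions))  # hypothetical: post-commit version

# Exact agreement only holds if both versions draw random numbers in the same
# order; otherwise, compare averages over many replications instead.
print(np.allclose(old, new, rtol=1e-6, atol=1e-8))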

The improved performance might allow more and broader analysis to be done in the exercises.

Also print a warning when looking up the decisions raises a KeyError.
EwoutH authored Oct 30, 2023
1 parent ff192a1 commit 9ccdb99
Showing 1 changed file with 30 additions and 20 deletions.
ema_workbench/examples/example_lake_model.py
@@ -35,37 +35,47 @@ def lake_problem(
     try:
         decisions = [kwargs[str(i)] for i in range(100)]
     except KeyError:
         decisions = [0] * 100
+        print("No valid decisions found, using 0 water release every year as default")
 
-    Pcrit = brentq(lambda x: x**q / (1 + x**q) - b * x, 0.01, 1.5)
     nvars = len(decisions)
-    X = np.zeros((nvars,))
-    average_daily_P = np.zeros((nvars,))
     decisions = np.array(decisions)
-    reliability = 0.0
 
-    for _ in range(nsamples):
-        X[0] = 0.0
+    # Calculate the critical pollution level (Pcrit)
+    Pcrit = brentq(lambda x: x**q / (1 + x**q) - b * x, 0.01, 1.5)
 
-        natural_inflows = np.random.lognormal(
-            math.log(mean**2 / math.sqrt(stdev**2 + mean**2)),
-            math.sqrt(math.log(1.0 + stdev**2 / mean**2)),
-            size=nvars,
+    # Generate natural inflows using lognormal distribution
+    natural_inflows = np.random.lognormal(
+        mean=math.log(mean**2 / math.sqrt(stdev**2 + mean**2)),
+        sigma=math.sqrt(math.log(1.0 + stdev**2 / mean**2)),
+        size=(nsamples, nvars),
     )
 
+    # Initialize the pollution level matrix X
+    X = np.zeros((nsamples, nvars))
+
+    # Loop through time to compute the pollution levels
+    for t in range(1, nvars):
+        X[:, t] = (
+            (1 - b) * X[:, t - 1]
+            + (X[:, t - 1] ** q / (1 + X[:, t - 1] ** q))
+            + decisions[t - 1]
+            + natural_inflows[:, t - 1]
+        )
 
-        for t in range(1, nvars):
-            X[t] = (
-                (1 - b) * X[t - 1]
-                + X[t - 1] ** q / (1 + X[t - 1] ** q)
-                + decisions[t - 1]
-                + natural_inflows[t - 1]
-            )
-            average_daily_P[t] += X[t] / float(nsamples)
+    # Calculate the average daily pollution for each time step
+    average_daily_P = np.mean(X, axis=0)
 
-        reliability += np.sum(X < Pcrit) / float(nsamples * nvars)
+    # Calculate the reliability (probability of the pollution level being below Pcrit)
+    reliability = np.sum(X < Pcrit) / float(nsamples * nvars)
+
+    # Calculate the maximum pollution level (max_P)
     max_P = np.max(average_daily_P)
+
+    # Calculate the utility by discounting the decisions using the discount factor (delta)
     utility = np.sum(alpha * decisions * np.power(delta, np.arange(nvars)))
-    inertia = np.sum(np.absolute(np.diff(decisions)) < 0.02) / float(nvars - 1)
+
+    # Calculate the inertia (the fraction of time steps with changes larger than 0.02)
+    inertia = np.sum(np.abs(np.diff(decisions)) > 0.02) / float(nvars - 1)
 
     return max_P, utility, inertia, reliability
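
For context, the vectorized function is called by the workbench exactly as before; a typical experiment setup looks roughly like the sketch below. The parameter ranges and the number of scenarios are illustrative assumptions and are not part of this commit.

from ema_workbench import Model, RealParameter, ScalarOutcome, perform_experiments

# lake_problem is the function changed in this commit (import or define it first).
lake_model = Model("lakeproblem", function=lake_problem)

# Uncertain model parameters (illustrative ranges).
lake_model.uncertainties = [
    RealParameter("b", 0.1, 0.45),
    RealParameter("q", 2.0, 4.5),
    RealParameter("mean", 0.01, 0.05),
    RealParameter("stdev", 0.001, 0.005),
    RealParameter("delta", 0.93, 0.99),
]

# One release decision lever per time step, passed to lake_problem as kwargs "0" .. "99".
lake_model.levers = [RealParameter(str(i), 0.0, 0.1) for i in range(100)]

lake_model.outcomes = [
    ScalarOutcome("max_P"),
    ScalarOutcome("utility"),
    ScalarOutcome("inertia"),
    ScalarOutcome("reliability"),
]

experiments, outcomes = perform_experiments(lake_model, scenarios=100)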
