Armijo: Work with ISTA and FISTA and new default #1934

Merged
merged 19 commits into from
Oct 8, 2024
Merged
Show file tree
Hide file tree
Changes from 16 commits
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
13 changes: 12 additions & 1 deletion Wrappers/Python/cil/optimisation/algorithms/FISTA.py
@@ -213,8 +213,19 @@ def update_objective(self):
         .. math:: f(x) + g(x)
 
         """
-        self.loss.append(self.f(self.x_old) + self.g(self.x_old))
+        self.loss.append(self.objective_function(self.x_old))
 
+    def objective_function(self, x):
+        """ Calculates the objective
+
+        .. math:: f(x) + g(x)
+
+        Parameters
+        ----------
+        x : DataContainer
+
+        """
+        return self.f(x) + self.g(x)
 
 class FISTA(ISTA):

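For context (not part of the diff): routing `update_objective` through a public `objective_function(x)` means a step-size rule only needs an object exposing `solution` and `objective_function`; it does not have to know whether the algorithm is ISTA, FISTA or GD. A minimal sketch of that duck-typed contract, using a hypothetical stand-in problem rather than the real CIL classes:

import numpy

# Hypothetical stand-in (not a CIL class): exposes the two members the Armijo
# rule relies on, `solution` and `objective_function(x)` = f(x) + g(x) with g = 0.
class ToyProblem:
    def __init__(self, b):
        self.b = numpy.asarray(b, dtype=float)
        self.solution = numpy.zeros_like(self.b)    # current iterate

    def objective_function(self, x):
        return 0.5 * numpy.sum((x - self.b) ** 2)   # f(x) = 0.5 * ||x - b||^2

algo = ToyProblem([1.0, 2.0, 3.0])
print(algo.objective_function(algo.solution))       # 7.0 at the zero initial guess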
14 changes: 11 additions & 3 deletions Wrappers/Python/cil/optimisation/utilities/StepSizeMethods.py
@@ -82,6 +82,9 @@ class ArmijoStepSizeRule(StepSizeRule):
         The amount the step_size is reduced if the criterion is not met
     max_iterations: integer, optional, default is numpy.ceil (2 * numpy.log10(alpha) / numpy.log10(2))
         The maximum number of iterations to find a suitable step size
+    warmstart: Boolean, default is True
+        If `warmstart = True`, the initial step size at each Armijo iteration is the calculated step size from the last iteration. If `warmstart = False`, at each Armijo iteration the initial step size is reset to the original, large `alpha`.
+        In the case of *well-behaved* convex functions, `warmstart = True` is likely to be computationally less expensive. In the case of non-convex functions, or particularly tricky functions, setting `warmstart = False` may be beneficial.
 
     Reference
     ------------
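Usage note, not from the diff: assuming the class is exposed from `cil.optimisation.utilities` as with CIL's other step-size utilities, the new flag is chosen at construction time and the rule is passed to an algorithm through its `step_size` argument, for example:

# Sketch only; the import path and the `step_size` argument are assumed from CIL's usual API.
from cil.optimisation.utilities import ArmijoStepSizeRule

rule = ArmijoStepSizeRule(alpha=1e6, beta=0.5, warmstart=True)            # default: reuse last accepted step
cautious_rule = ArmijoStepSizeRule(alpha=1e6, beta=0.5, warmstart=False)  # reset to the large alpha every call

# e.g. GD(initial=x0, f=f, step_size=rule) or ISTA(initial=x0, f=f, g=g, step_size=rule)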
@@ -91,21 +94,23 @@ class ArmijoStepSizeRule(StepSizeRule):
 
     """
 
-    def __init__(self, alpha=1e6, beta=0.5, max_iterations=None):
+    def __init__(self, alpha=1e6, beta=0.5, max_iterations=None, warmstart=True):
         '''Initialises the step size rule
         '''
 
         self.alpha_orig = alpha
         if self.alpha_orig is None: # Can be removed when alpha and beta are deprecated in GD
             self.alpha_orig = 1e6
 
+        self.alpha = self.alpha_orig
         self.beta = beta
         if self.beta is None: # Can be removed when alpha and beta are deprecated in GD
             self.beta = 0.5
 
         self.max_iterations = max_iterations
         if self.max_iterations is None:
             self.max_iterations = numpy.ceil(2 * numpy.log10(self.alpha_orig) / numpy.log10(2))
 
+        self.warmstart=warmstart
 
     def get_step_size(self, algorithm):
         """
@@ -117,7 +122,9 @@ def get_step_size(self, algorithm):
 
         """
         k = 0
-        self.alpha = self.alpha_orig
+        if not self.warmstart:
+            self.alpha = self.alpha_orig
+
         f_x = algorithm.objective_function(algorithm.solution)
 
         self.x_armijo = algorithm.solution.copy()
@@ -137,6 +144,7 @@ def get_step_size(self, algorithm):
         if k == self.max_iterations:
             raise ValueError(
                 'Could not find a proper step_size in {} loops. Consider increasing alpha or max_iterations.'.format(self.max_iterations))
+
         return self.alpha
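To make the control flow of `get_step_size` concrete, here is a self-contained numpy sketch of the same backtracking-with-warm-start idea; the function name `armijo_step` and the sufficient-decrease constant 0.5 are illustrative choices, not CIL code:

import numpy

def armijo_step(x, grad, objective, alpha, beta=0.5, max_iterations=40):
    """Shrink alpha by beta until f(x - alpha*grad) <= f(x) - 0.5*alpha*||grad||^2."""
    f_x = objective(x)
    norm2 = numpy.sum(grad * grad)
    for _ in range(int(max_iterations)):
        if objective(x - alpha * grad) <= f_x - 0.5 * alpha * norm2:
            return alpha                               # accepted step size
        alpha *= beta                                  # criterion failed: backtrack
    raise ValueError('Could not find a proper step_size; consider increasing alpha or max_iterations.')

# Minimise f(x) = 0.5*||x - b||^2; warm start means the accepted alpha seeds the next call.
b = numpy.array([1.0, 2.0, 3.0])
objective = lambda x: 0.5 * numpy.sum((x - b) ** 2)
x, alpha = numpy.zeros_like(b), 1e6                    # large initial alpha, as in the rule's default
for _ in range(5):
    grad = x - b
    alpha = armijo_step(x, grad, objective, alpha)      # warmstart=True behaviour
    x = x - alpha * grad
print(x)                                                # converges towards b

With the warm start, only the first call pays the ~20 backtracks needed to bring 1e6 below 1 (1e6 * 0.5^20 is roughly 0.95); later calls typically accept immediately, whereas resetting to the original alpha each iteration (warmstart=False) repeats that cost every time.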