pysr3.solvers module
Implements general-purpose numerical solvers, such as proximal gradient descent (PGD).
- class pysr3.solvers.FakePGDSolver(tol=0.0001, max_iter=1000, fixed_step_len=1, update_prox_every=1)
Bases:
object
This class is designed for situations where the oracle can provide the optimal solution by itself, e.g. when it is available analytically. It is also used for PracticalSR3 methods, where the relaxed variables are updated together with the original ones inside the oracle’s subroutine.
Initializes the solver.
- Parameters:
tol (float) – tolerance for internal routines
max_iter (int) – maximal number of iterations for internal routines
fixed_step_len (float) – step-size
update_prox_every (int) – how often the oracle should update the relaxed variable (once every update_prox_every steps).
- optimize(x0, oracle=None, regularizer: Regularizer | None = None, logger: Logger | None = None, **kwargs)
Minimizes the composite loss
Loss(x) = oracle(x) + regularizer(x)
- Parameters:
x0 (ndarray) – starting point of the optimizer.
oracle (LinearLMEOracle) – provides the value and the gradient of the smooth part of the loss.
regularizer (Regularizer) – provides the value and the proximal operator of the non-smooth part of the loss.
logger (Logger) – logs the progress (loss, convergence, etc).
- Returns:
x (ndarray) – the found minimizer.
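The pattern behind FakePGDSolver can be illustrated with a self-contained sketch. The toy oracle below (AnalyticOracle, with its optimal_solution method) and the fake_pgd loop are hypothetical stand-ins written for this example only; they are not pysr3 classes, but they show the idea of an oracle that solves the smooth subproblem by itself instead of exposing gradients.

```python
import numpy as np

# Hypothetical toy oracle for illustration: the smooth part is
# f(x) = 0.5 * ||x - b||^2, whose minimizer is known in closed form,
# so the "solver" can simply ask the oracle for the answer instead
# of taking gradient steps.
class AnalyticOracle:
    def __init__(self, b):
        self.b = np.asarray(b, dtype=float)

    def value_function(self, x):
        return 0.5 * np.sum((x - self.b) ** 2)

    def optimal_solution(self):
        # The oracle "solves itself": argmin_x f(x) = b
        return self.b.copy()

def fake_pgd(oracle, x0, regularizer=None):
    # Mimics the FakePGDSolver idea: delegate the minimization to the
    # oracle, optionally applying the regularizer's proximal operator
    # to the result (here once, since one step suffices).
    x = oracle.optimal_solution()
    if regularizer is not None:
        x = regularizer.prox(x, alpha=1.0)
    return x

x_opt = fake_pgd(AnalyticOracle([1.0, -2.0, 3.0]), x0=np.zeros(3))
```

With no regularizer, the returned point is exactly the oracle's analytic minimizer.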
- class pysr3.solvers.PGDSolver(tol=0.0001, max_iter=1000, stepping='fixed', fixed_step_len=1)
Bases:
object
Implements a general Proximal Gradient Descent solver.
Creates an instance of the solver.
- Parameters:
tol (float) – Tolerance for the stopping criterion: the solver stops when norm(x - x0), the distance between consecutive iterates, is less than tol.
max_iter (int) – Maximum number of iterations that the solver is allowed to make.
stepping (str) – Stepping policy. Can be either “line-search” or “fixed”.
fixed_step_len (float) – Initial step size. If stepping=”fixed”, this step size is used at every iteration. If stepping=”line-search”, the line search starts shrinking the step from this value.
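A generic backtracking (Armijo-style) line search illustrates what the ”line-search” stepping policy suggests: start from fixed_step_len and shrink until a sufficient-decrease condition holds. This is a standard sketch of the technique, not necessarily pysr3’s exact acceptance rule; the function name and constants here are assumptions for the example.

```python
import numpy as np

def backtracking_step(f, grad_f, x, step0=1.0, beta=0.5, max_shrinks=30):
    # Start from step0 (the analogue of fixed_step_len) and halve the
    # step until an Armijo-style sufficient-decrease check passes.
    step = step0
    fx, g = f(x), grad_f(x)
    for _ in range(max_shrinks):
        x_new = x - step * g
        if f(x_new) <= fx - 0.5 * step * (g @ g):  # sufficient decrease
            return step, x_new
        step *= beta
    return step, x - step * g

# Usage on a toy quadratic f(x) = 0.5 * ||x||^2
f = lambda x: 0.5 * float(x @ x)
grad_f = lambda x: x
step, x_new = backtracking_step(f, grad_f, np.array([2.0]), step0=1.0)
```

On this quadratic the full step step0 = 1.0 already satisfies the decrease condition, so no shrinking occurs.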
- optimize(x0, oracle: LinearLMEOracle | None = None, regularizer: Regularizer | None = None, logger: Logger | None = None)
Minimizes the composite loss
Loss(x) = oracle(x) + regularizer(x)
- Parameters:
x0 (ndarray) – starting point of the optimizer.
oracle (LinearLMEOracle) – provides the value and the gradient of the smooth part of the loss.
regularizer (Regularizer) – provides the value and the proximal operator of the non-smooth part of the loss.
logger (Logger) – logs the progress (loss, convergence, etc).
- Returns:
x (ndarray) – the found minimizer.
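The loop that PGDSolver implements can be sketched in a few lines. The oracle and regularizer interfaces below are simplified stand-ins (a gradient callable and a proximal operator), not pysr3’s LinearLMEOracle or Regularizer classes; the example minimizes a lasso-style loss 0.5*||Ax - b||^2 + lam*||x||_1 with a fixed step and the norm-of-difference stopping rule described above.

```python
import numpy as np

def soft_threshold(v, lam):
    # Proximal operator of lam * ||x||_1
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def pgd(gradient, prox, x0, step=0.1, tol=1e-4, max_iter=1000):
    # Fixed-step proximal gradient descent:
    #   x_{k+1} = prox(x_k - step * grad(x_k), step)
    # stopping when consecutive iterates are within tol of each other.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = prox(x - step * gradient(x), step)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy lasso problem with a noiseless, sparse ground truth
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
lam = 0.1
grad = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L for the smooth part
x_hat = pgd(grad, lambda v, t: soft_threshold(v, lam * t), np.zeros(5), step=step)
```

The step size 1/L (inverse Lipschitz constant of the smooth gradient) plays the role of fixed_step_len; the recovered x_hat is close to x_true up to the small shrinkage the l1 penalty introduces.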