Generators¶
Abstract Base Generator¶
class glompo.generators.basegenerator.BaseGenerator[source]¶
Base generator from which all generators must inherit to be compatible with GloMPO.
logger¶
logging.Logger instance into which status messages may be added.
Type: logging.Logger
generate(manager: GloMPOManager) → numpy.ndarray[source]¶
Returns a vector representing a location in input space. The returned array serves as a starting point for an optimizer.
Parameters: manager – GloMPOManager instance which is managing the optimization. Its attributes can be accessed when determining the convergence criteria.
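To illustrate the interface shape, the toy class below mimics a generator that always proposes the midpoint of the bounds. It is a hypothetical sketch only: a real generator would subclass glompo.generators.basegenerator.BaseGenerator (not shown here) and could inspect the manager's attributes inside generate().

```python
import logging
import numpy as np

class MidpointGenerator:
    """Toy generator that always proposes the midpoint of the bounds.

    Illustrative only; a real GloMPO generator inherits from BaseGenerator.
    """

    def __init__(self, bounds):
        self.bounds = np.array(bounds, dtype=float)  # shape (n_params, 2)
        self.logger = logging.getLogger(self.__class__.__name__)

    def generate(self, manager) -> np.ndarray:
        # A real implementation may use manager attributes (e.g. the current
        # incumbent) when choosing a start point; this sketch ignores them.
        self.logger.debug("Generating midpoint start vector")
        return self.bounds.mean(axis=1)

gen = MidpointGenerator([(0, 2), (-1, 1)])
print(gen.generate(manager=None))  # [1. 0.]
```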
Simple Generators¶
For convenience, GloMPO comes bundled with several simple generators already included.
class glompo.generators.ExploitExploreGenerator(bounds: Sequence[Tuple[float, float]], max_func_calls: int, focus: float = 1)[source]¶
Bases: glompo.generators.basegenerator.BaseGenerator
This generator blends a randomly generated point with the location of an existing optimizer. The optimizer is chosen by roulette selection.
Parameters: - bounds – Min and max bounds for each parameter.
- max_func_calls – Maximum function calls allowed for the optimization, at and beyond this point there is a 100% chance that a previously evaluated point will be returned by the generator. If the optimization is not limited by the number of function calls, provide an estimate.
- focus – The blend parameter between random point and incumbent points.
Notes
focus is used as follows:
p = (f_calls / max_f_calls) ** focus
At p = 0 the random point is taken. At p = 1 the incumbent is chosen. If focus < 1, points are more like the incumbent; if focus > 1, points are more like the random point. The default, focus = 1, gives a linear growth from random to incumbent. The new point is calculated as:
new_pt = p * incumbent_pt + (1 - p) * random_pt
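The blending rule above can be sketched directly (the function name is illustrative, not GloMPO's internal API):

```python
import numpy as np

def blend(f_calls, max_f_calls, incumbent_pt, random_pt, focus=1.0):
    # p grows from 0 to 1 as the function-call budget is consumed.
    p = (f_calls / max_f_calls) ** focus
    return p * np.asarray(incumbent_pt) + (1 - p) * np.asarray(random_pt)

incumbent = np.array([1.0, 1.0])
rand_pt = np.array([0.0, 0.0])
print(blend(0, 100, incumbent, rand_pt))    # pure random point: [0. 0.]
print(blend(100, 100, incumbent, rand_pt))  # pure incumbent:    [1. 1.]
print(blend(50, 100, incumbent, rand_pt))   # halfway blend:     [0.5 0.5]
```

With focus < 1, p rises quickly, so intermediate points lean toward the incumbent; with focus > 1, p rises slowly and points stay closer to random for longer.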
class glompo.generators.IncumbentGenerator(bounds: Sequence[Tuple[float, float]])[source]¶
Bases: glompo.generators.basegenerator.BaseGenerator
Starts a new optimizer at GloMPOManager.result['x']. A random vector is generated if this is indeterminate.
class glompo.generators.PerturbationGenerator(x0: Sequence[float], bounds: Sequence[Tuple[float, float]], scale: Sequence[float])[source]¶
Bases: glompo.generators.basegenerator.BaseGenerator
Randomly generates parameter vectors near a given point. Draws samples from a truncated multivariate normal distribution centered on a provided vector and bound by the given bounds. Good for parametrisation efforts where a good candidate is already available; note, however, that this may drastically limit the exploratory nature of GloMPO.
Parameters: - x0 – Center point for each parameter
- bounds – Min and max bounds for each parameter
- scale – Standard deviation of each parameter. Used here to control how wide the generator should explore around the mean.
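A minimal sketch of the sampling idea, assuming independent per-parameter draws with rejection of out-of-bounds values (the real generator uses a truncated multivariate normal internally; names here are illustrative):

```python
import numpy as np

def perturb(x0, bounds, scale, rng):
    """Draw a normal sample around x0, resampling coordinates until in bounds."""
    x0, scale = np.asarray(x0, float), np.asarray(scale, float)
    lower = np.array([b[0] for b in bounds], float)
    upper = np.array([b[1] for b in bounds], float)
    sample = rng.normal(x0, scale)
    # Resample only the out-of-bounds coordinates until all lie inside.
    bad = (sample < lower) | (sample > upper)
    while bad.any():
        sample[bad] = rng.normal(x0[bad], scale[bad])
        bad = (sample < lower) | (sample > upper)
    return sample

rng = np.random.default_rng(0)
pt = perturb([0.5, 0.0], [(0, 1), (-1, 1)], [0.1, 0.5], rng)
print(pt)  # a vector near [0.5, 0.0], guaranteed inside the bounds
```

Smaller scale values concentrate samples tightly around x0; larger values explore more widely, at the cost of more rejections near the bounds.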
class glompo.generators.RandomGenerator(bounds: Sequence[Tuple[float, float]])[source]¶
Bases: glompo.generators.basegenerator.BaseGenerator
Generates random points. Points are drawn from a uniform distribution within given bounds.
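The sampling this generator performs amounts to an element-wise uniform draw, which can be sketched as:

```python
import numpy as np

# Each row is a (min, max) pair for one parameter.
bounds = np.array([(0.0, 1.0), (-5.0, 5.0), (10.0, 20.0)])
rng = np.random.default_rng(42)

# One random start vector, each element uniform within its (min, max) pair.
point = rng.uniform(bounds[:, 0], bounds[:, 1])
print(point)
```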
class glompo.generators.SinglePointGenerator(bounds: Sequence[Tuple[float, float]], x: Optional[Sequence[float]] = None)[source]¶
Bases: glompo.generators.basegenerator.BaseGenerator
Always returns the same point, either provided during initialisation or otherwise randomly generated.
Advanced Generators¶
GloMPO also comes bundled with two more advanced sampling strategies.
class glompo.generators.annealing.AnnealingGenerator(bounds: Sequence[Tuple[float, float]], task, qa: float = -5.0, qv: float = 2.62, initial_temp: float = 5230, restart_temp_ratio: float = 2e-05, seed: Union[None, int, Iterable, float] = None)[source]¶
Wrapper around the core of scipy.optimize.dual_annealing().
The algorithm is adapted directly from SciPy and uses its internal code. Each generation performs several function evaluations to select a location from which to start a new optimizer. The dual-annealing methodology is followed as closely as possible, but given GloMPO's parallel and asynchronous behaviour, some adjustments are needed.
For each generation:
- The 'state' of the optimizer is updated to the best seen thus far by any of the manager's children.
- The temperature is decreased. If it reaches a critically low level, it is reset back to the initial temperature.
- The internal annealing chain is run. If the 'state' is different from the start location of the previous optimizer, it is returned.
- Otherwise, the annealing chain is repeated (a maximum of 5 times). If a new location is still not found, the temperature is reset and the procedure returns to Step 3.
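The temperature reset rule from the steps above can be sketched as follows. The decay factor here is a placeholder, not SciPy's actual visiting schedule; only the restart_temp_ratio reset logic reflects the documented behaviour:

```python
initial_temp = 5230.0        # documented default
restart_temp_ratio = 2e-05   # documented default

def step_temperature(temperature, decay=0.5):
    """Decrease the temperature; reset it when it becomes critically low."""
    temperature *= decay  # illustrative decay, not SciPy's schedule
    if temperature / initial_temp < restart_temp_ratio:
        return initial_temp  # critically low: reset to the initial value
    return temperature

t = initial_temp
for _ in range(30):
    t = step_temperature(t)
print(t)  # temperature after 30 steps, having been reset once along the way
```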
Danger
This generator performs function evaluations every time generate() is called! This is not the typical GloMPO design intention. If one is using slow functions, this could significantly impede the manager!
Parameters: - bounds – Sequence of (min, max) pairs for each parameter in the search space.
- task – The optimization function.
- qa – The accept distribution parameter.
- qv – The visiting distribution parameter.
- initial_temp – Initial temperature. Larger values generate larger step sizes.
- restart_temp_ratio – The value of temperature / initial_temp at which the temperature is reset to the initial value.
- seed – Seed for the random number generator, for reproducibility.
class glompo.generators.basinhopping.BasinHoppingGenerator(temperature: float = 1, max_step_size: float = 0.5, target_accept_rate=0.5, interval=5, factor=0.9)[source]¶
Monte-Carlo sampling strategy used by the basin-hopping algorithm. Represents the 'outer' algorithm used by basin-hopping to select locations at which to start local optimizers.
Parameters: - temperature – Parameter for the accept or reject criterion. Higher values mean larger jumps will be accepted from the generator’s starting point.
- max_step_size – Maximum jump size allowed.
- target_accept_rate – The target acceptance rate. The rate is calculated as the ratio between the number of optimizers which found a better minimum than the manager’s incumbent, and the number of optimizers started.
- interval – The number of generate() calls between adjustments of the step size based on the acceptance rate.
- factor – The factor by which the step size is adjusted when an adjustment is done.
Notes
The generator attempts to closely follow the basin-hopping sampling strategy; however, due to GloMPO's inherent parallelism, several adjustments are made. The generation algorithm works as follows:
- Calls to generate() return a random vector if the manager does not yet have an incumbent solution.
- If an incumbent exists, the generator's 'location' is placed there.
- From the other children, one is uniformly randomly selected, and its best solution taken. There is a Monte-Carlo chance that the generator's 'location' will be moved to this location. In this way some diversity is maintained and an optimizer may be started in a different region.
- If the number of optimizers started is a multiple of interval, then the step size is grown or shrunk depending on whether the realised acceptance rate is above or below the target acceptance rate.
- A vector is returned which adds or subtracts a uniform random value between zero and the step size to each element of the generator's 'location'.
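The displacement and step-size adaptation described above can be sketched as below. Function and variable names are illustrative, not GloMPO's internals; the grow/shrink direction mirrors SciPy's basin-hopping convention, where the step is divided by factor when accepting too often and multiplied by it when accepting rarely:

```python
import numpy as np

def propose(location, step_size, rng):
    """Add a uniform random value in [-step_size, step_size] to each element."""
    return location + rng.uniform(-step_size, step_size, size=len(location))

def adjust_step(step_size, accept_rate, target_accept_rate=0.5, factor=0.9):
    """Grow the step when accepting too often, shrink it when accepting rarely."""
    if accept_rate > target_accept_rate:
        return step_size / factor  # accepting too often: take bigger jumps
    return step_size * factor      # accepting too rarely: take smaller jumps

rng = np.random.default_rng(1)
loc = np.zeros(3)
step = 0.5
print(propose(loc, step, rng))             # random vector within [-0.5, 0.5]^3
print(adjust_step(step, accept_rate=0.8))  # step grows: 0.5 / 0.9 ≈ 0.556
print(adjust_step(step, accept_rate=0.2))  # step shrinks: 0.5 * 0.9 = 0.45
```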
References
Adapted from the SciPy basin-hopping implementation: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.basinhopping.html#scipy.optimize.basinhopping