111 changes: 111 additions & 0 deletions docs/source/_snippets/user_guide/optimizers.py
@@ -225,6 +225,117 @@ def objective(params):
# [end:optuna_tpe]


# ============================================================================
# Scipy Backend
# ============================================================================

# [start:scipy_imports]
from hyperactive.opt.scipy import (
    ScipyDifferentialEvolution,  # Global: population-based
    ScipyDualAnnealing,          # Global: simulated annealing variant
    ScipyBasinhopping,           # Global: random perturbations + local search
    ScipySHGO,                   # Global: finds multiple local minima
    ScipyDirect,                 # Global: deterministic DIRECT algorithm
    ScipyNelderMead,             # Local: simplex-based
    ScipyPowell,                 # Local: conjugate direction method
)
# [end:scipy_imports]


# Scipy optimizers use continuous search spaces: (low, high) bounds tuples
# instead of discrete value arrays.
scipy_search_space = {
    "x": (-5.0, 5.0),
    "y": (-5.0, 5.0),
}


# [start:scipy_differential_evolution]
from hyperactive.opt.scipy import ScipyDifferentialEvolution

optimizer = ScipyDifferentialEvolution(
    param_space=scipy_search_space,
    n_iter=100,
    experiment=objective,
    strategy="best1bin",  # DE mutation strategy (scipy's default)
    random_state=42,
)
# [end:scipy_differential_evolution]


# [start:scipy_dual_annealing]
from hyperactive.opt.scipy import ScipyDualAnnealing

optimizer = ScipyDualAnnealing(
    param_space=scipy_search_space,
    n_iter=100,
    experiment=objective,
    random_state=42,
)
# [end:scipy_dual_annealing]


# [start:scipy_basinhopping]
from hyperactive.opt.scipy import ScipyBasinhopping

optimizer = ScipyBasinhopping(
    param_space=scipy_search_space,
    n_iter=50,
    experiment=objective,
    minimizer_method="Nelder-Mead",  # local minimizer run after each hop
    random_state=42,
)
# [end:scipy_basinhopping]


# [start:scipy_shgo]
from hyperactive.opt.scipy import ScipySHGO

optimizer = ScipySHGO(
    param_space=scipy_search_space,
    n_iter=3,  # outer iterations of the SHGO loop
    experiment=objective,
    n=50,  # sampling points per iteration
    sampling_method="simplicial",  # scipy also accepts "sobol"
)
# [end:scipy_shgo]


# [start:scipy_direct]
from hyperactive.opt.scipy import ScipyDirect

optimizer = ScipyDirect(
    param_space=scipy_search_space,
    n_iter=200,
    experiment=objective,
    locally_biased=True,  # use the locally biased DIRECT_L variant
)
# [end:scipy_direct]


# [start:scipy_nelder_mead]
from hyperactive.opt.scipy import ScipyNelderMead

optimizer = ScipyNelderMead(
    param_space=scipy_search_space,
    n_iter=200,
    experiment=objective,
    random_state=42,
)
# [end:scipy_nelder_mead]


# [start:scipy_powell]
from hyperactive.opt.scipy import ScipyPowell

optimizer = ScipyPowell(
    param_space=scipy_search_space,
    n_iter=200,
    experiment=objective,
    random_state=42,
)
# [end:scipy_powell]
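

# Running a scipy optimizer follows the same pattern as the other backends:
# construct it with a continuous param_space, then call solve(). A minimal
# sketch, reusing `objective` from the top of this file and the optimizer
# built above:
best_params = optimizer.solve()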


# ============================================================================
# Configuration Examples
# ============================================================================
5 changes: 4 additions & 1 deletion docs/source/api_reference/optimizers/index.rst
@@ -8,7 +8,7 @@ The :mod:`hyperactive.opt` module contains optimization algorithms for hyperpara
All optimizers inherit from :class:`~hyperactive.base.BaseOptimizer` and share the same interface:
the ``solve()`` method to run optimization, and configuration via the ``experiment`` and ``search_space`` parameters.
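
A minimal sketch of this shared interface, using the Scipy backend as an
example (note that the Scipy wrappers name the space parameter
``param_space``):

.. code-block:: python

   from hyperactive.opt.scipy import ScipyDifferentialEvolution

   optimizer = ScipyDifferentialEvolution(
       param_space={"x": (-5.0, 5.0)},
       experiment=objective,  # any callable score function
   )
   best_params = optimizer.solve()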

Hyperactive provides optimizers from four backends:

.. list-table::
   :widths: 25 75
@@ -20,6 +20,8 @@ Hyperactive provides optimizers from three backends:
     - Native gradient-free optimization algorithms (21 optimizers)
   * - :doc:`optuna`
     - Interface to Optuna's samplers (8 optimizers)
   * - :doc:`scipy`
     - ``scipy.optimize`` algorithms for continuous spaces (7 optimizers)
   * - :doc:`sklearn`
     - sklearn-compatible search interfaces (2 optimizers)

@@ -28,4 +30,5 @@ Hyperactive provides optimizers from three backends:

   gfo
   optuna
   scipy
   sklearn
37 changes: 37 additions & 0 deletions docs/source/api_reference/optimizers/scipy.rst
@@ -0,0 +1,37 @@
.. _optimizers_scipy_ref:

Scipy
=====

.. currentmodule:: hyperactive.opt

The Scipy backend provides an interface to `scipy.optimize <https://docs.scipy.org/doc/scipy/reference/optimize.html>`_
algorithms for continuous parameter optimization.

.. note::

   Scipy optimizers only support **continuous parameter spaces** (tuples).
   For discrete or categorical parameters, use the GFO or Optuna backends.
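
For example, a valid continuous space gives ``(low, high)`` bounds per
parameter (mirroring the user-guide snippet):

.. code-block:: python

   param_space = {
       "x": (-5.0, 5.0),
       "y": (-5.0, 5.0),
   }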

Global Optimizers
-----------------

.. autosummary::
   :toctree: ../auto_generated/
   :template: class.rst

   ScipyDifferentialEvolution
   ScipyDualAnnealing
   ScipyBasinhopping
   ScipySHGO
   ScipyDirect

Local Optimizers
----------------

.. autosummary::
   :toctree: ../auto_generated/
   :template: class.rst

   ScipyNelderMead
   ScipyPowell
5 changes: 5 additions & 0 deletions docs/source/examples.rst
@@ -18,6 +18,7 @@ on GitHub.
   examples/population_based
   examples/sequential_model_based
   examples/optuna_backend
   examples/scipy_backend
   examples/sklearn_backend
   examples/integrations
   examples/other
@@ -61,6 +62,10 @@ Backend Examples
   Examples using Optuna's samplers including TPE, CMA-ES, NSGA-II/III,
   and Gaussian Process optimization.

:ref:`examples_scipy_backend`
   Examples using scipy.optimize algorithms including Differential Evolution,
   Dual Annealing, Basin-hopping, SHGO, DIRECT, Nelder-Mead, and Powell.

:ref:`examples_sklearn_backend`
   Scikit-learn compatible interfaces as drop-in replacements for
   GridSearchCV and RandomizedSearchCV.
104 changes: 104 additions & 0 deletions docs/source/examples/scipy_backend.rst
@@ -0,0 +1,104 @@
.. _examples_scipy_backend:

=============
Scipy Backend
=============

Hyperactive provides wrappers for scipy.optimize algorithms, enabling
well-tested, production-grade optimization for continuous parameter spaces.

.. note::

   Scipy must be installed separately:

   .. code-block:: bash

      pip install scipy
      # or
      pip install hyperactive[all_extras]


Available Optimizers
--------------------

The Scipy backend provides 7 optimizers divided into global and local methods.

**Global Optimizers** (5 algorithms):

.. list-table::
   :header-rows: 1
   :widths: 30 70

   * - Optimizer
     - Description
   * - ``ScipyDifferentialEvolution``
     - Population-based global optimizer. Robust for multi-modal landscapes.
   * - ``ScipyDualAnnealing``
     - Combines classical simulated annealing with local search.
   * - ``ScipyBasinhopping``
     - Random perturbations followed by local minimization; suited to rugged landscapes with many local minima.
   * - ``ScipySHGO``
     - Simplicial Homology Global Optimization. Finds multiple local minima.
   * - ``ScipyDirect``
     - Deterministic DIRECT algorithm. No random seed required.

**Local Optimizers** (2 algorithms):

.. list-table::
   :header-rows: 1
   :widths: 30 70

   * - Optimizer
     - Description
   * - ``ScipyNelderMead``
     - Simplex-based optimizer. Fast for smooth functions.
   * - ``ScipyPowell``
     - Conjugate direction method. Often faster than Nelder-Mead.


Quick Example
-------------

Scipy optimizers require continuous parameter spaces defined as tuples:

.. code-block:: python

   from hyperactive.opt.scipy import ScipyDifferentialEvolution

   # Define a continuous search space (tuples, not arrays)
   param_space = {
       "x": (-5.0, 5.0),
       "y": (-5.0, 5.0),
   }

   def objective(params):
       x, y = params["x"], params["y"]
       return -(x**2 + y**2)  # Hyperactive maximizes, so negate to minimize

   optimizer = ScipyDifferentialEvolution(
       param_space=param_space,
       n_iter=100,
       experiment=objective,
       random_state=42,
   )

   best_params = optimizer.solve()
   print(f"Best parameters: {best_params}")
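
Local optimizers are constructed the same way. For example, ``ScipyPowell``
(mirroring the user-guide snippet) reuses the space and objective from above:

.. code-block:: python

   from hyperactive.opt.scipy import ScipyPowell

   optimizer = ScipyPowell(
       param_space=param_space,
       n_iter=200,
       experiment=objective,
       random_state=42,
   )
   best_params = optimizer.solve()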


When to Use Scipy Backend
-------------------------

The Scipy backend is useful when:

- **Continuous parameters only**: Your search space has no categorical or discrete values
- **Production-grade algorithms**: You need well-tested, reliable implementations
- **Specific scipy features**: You want scipy's differential evolution or simulated annealing
- **Deterministic optimization**: Use ``ScipyDirect`` for reproducible results without random seeds (see the sketch below)
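
For instance, ``ScipyDirect`` (configured as in the user guide) returns the
same result on every run without a seed:

.. code-block:: python

   from hyperactive.opt.scipy import ScipyDirect

   optimizer = ScipyDirect(
       param_space=param_space,  # continuous bounds, as above
       n_iter=200,
       experiment=objective,
       locally_biased=True,
   )
   best_params = optimizer.solve()  # deterministic: no random_state needed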


See Also
--------

- :ref:`user_guide_optimizers_scipy` - Detailed guide with all optimizer examples
- :ref:`optimizers_scipy_ref` - API reference for all Scipy optimizers
23 changes: 19 additions & 4 deletions docs/source/user_guide/optimizers/index.rst
@@ -4,7 +4,7 @@
Optimizers
==========

Hyperactive provides 38 algorithms across 5 categories and 4 backends.
Optimizers navigate the search space to find optimal parameters. Each implements a
different strategy for balancing exploration (trying diverse regions) and exploitation
(refining promising solutions). Local search methods like Hill Climbing work well for
@@ -20,10 +20,10 @@ Algorithm Landscape

   <div class="theme-aware-diagram">
     <img src="../../_static/diagrams/optimizer_taxonomy_light.svg"
          alt="Hyperactive optimizer taxonomy showing 38 algorithms across GFO, Optuna, Scipy, and sklearn backends"
          class="only-light" />
     <img src="../../_static/diagrams/optimizer_taxonomy_dark.svg"
          alt="Hyperactive optimizer taxonomy showing 38 algorithms across GFO, Optuna, Scipy, and sklearn backends"
          class="only-dark" />
   </div>

@@ -133,6 +133,17 @@ Algorithm Categories

   *TPEOptimizer, CmaEsOptimizer, GPOptimizer, NSGAIIOptimizer, and more*

.. grid-item-card:: Scipy Backend
   :link: scipy
   :link-type: doc
   :class-card: sd-border-secondary

   **7 algorithms**
   ^^^
   ``scipy.optimize`` algorithms for continuous parameter spaces.

   *DifferentialEvolution, DualAnnealing, Basinhopping, SHGO, Direct, NelderMead, Powell*

----

Scenario Reference
@@ -163,8 +174,11 @@ Detailed recommendations based on problem characteristics:
     - ``GridSearch``
     - Exhaustive coverage when feasible
   * - Continuous parameters
     - ``BayesianOptimizer``, ``CmaEsOptimizer``, ``ScipyDifferentialEvolution``
     - Designed for smooth, continuous spaces
   * - Continuous only (scipy)
     - ``ScipyDualAnnealing``, ``ScipyBasinhopping``, ``ScipyNelderMead``
     - Production-grade scipy.optimize implementations
   * - Mixed parameter types
     - ``TPEOptimizer``, ``RandomSearch``
     - Handle categorical + continuous well
@@ -191,4 +205,5 @@ All optimizers share common parameters and configuration options.
   population_based
   sequential_model_based
   optuna
   scipy
   configuration