
Commit 824072a

[DOC] add multiple pages and small corrections

1 parent 653efc8

24 files changed

Lines changed: 1550 additions & 8 deletions

docs/source/_snippets/user_guide/optimizers.py

Lines changed: 12 additions & 0 deletions
@@ -56,6 +56,18 @@ def objective(params):
 # [end:repulsing_hill_climbing]
 
 
+# [start:stochastic_hill_climbing]
+from hyperactive.opt.gfo import StochasticHillClimbing
+
+optimizer = StochasticHillClimbing(
+    search_space=search_space,
+    n_iter=100,
+    experiment=objective,
+    p_accept=0.3,  # Probability of accepting worse solutions
+)
+# [end:stochastic_hill_climbing]
+
+
 # [start:downhill_simplex]
 from hyperactive.opt.gfo import DownhillSimplexOptimizer
 
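
Note: the snippet fragment above assumes ``search_space`` and ``objective`` defined earlier in the snippet file. A self-contained sketch of the same call, with a toy search space and objective (``solve()``, ``best_params_``, and ``best_score_`` used as in the FAQ pages added by this commit):

    from hyperactive.opt.gfo import StochasticHillClimbing

    # Toy one-dimensional problem: maximum at x = 0
    search_space = {"x": list(range(-10, 11))}

    def objective(params):
        return -(params["x"] ** 2)

    optimizer = StochasticHillClimbing(
        search_space=search_space,
        n_iter=100,
        experiment=objective,
        p_accept=0.3,  # probability of accepting worse solutions
    )

    best_params = optimizer.solve()
    print(best_params, optimizer.best_score_)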

docs/source/_templates/class.rst

Lines changed: 0 additions & 2 deletions
@@ -5,8 +5,6 @@
 
 .. autoclass:: {{ objname }}
 
-.. include:: {{module}}.{{objname}}.examples
-
 .. raw:: html
 
    <div class="clearer"></div>

docs/source/_templates/function.rst

Lines changed: 0 additions & 2 deletions
@@ -5,8 +5,6 @@
 
 .. autofunction:: {{ objname }}
 
-.. include:: {{module}}.{{objname}}.examples
-
 .. raw:: html
 
    <div class="clearer"></div>

docs/source/examples/integrations.rst

Lines changed: 2 additions & 2 deletions
@@ -33,8 +33,8 @@ For time series forecasting and classification with sktime:
    pip install hyperactive[sktime-integration]
 
 
-Installation
-------------
+Installing Extras
+-----------------
 
 Install integration extras as needed:
 

docs/source/faq.rst

Lines changed: 40 additions & 0 deletions
@@ -0,0 +1,40 @@
+.. _faq:
+
+==========================
+Frequently Asked Questions
+==========================
+
+This section answers common questions about Hyperactive. For migration from v4,
+see the :ref:`user_guide_migration`.
+
+.. toctree::
+   :maxdepth: 1
+
+   faq/getting_started
+   faq/search_space
+   faq/common_issues
+   faq/advanced_usage
+   faq/integrations
+   faq/getting_help
+
+
+Overview
+--------
+
+:ref:`faq_getting_started`
+    Choosing optimizers, iteration counts, and understanding maximization.
+
+:ref:`faq_search_space`
+    Defining continuous, discrete, and mixed parameter spaces.
+
+:ref:`faq_common_issues`
+    Slow optimization, reproducibility, handling errors.
+
+:ref:`faq_advanced_usage`
+    Parallel execution, callbacks, parameter constraints.
+
+:ref:`faq_integrations`
+    Using Hyperactive with PyTorch, XGBoost, and other frameworks.
+
+:ref:`faq_getting_help`
+    Where to report bugs and get support.

docs/source/faq/advanced_usage.rst

Lines changed: 69 additions & 0 deletions
@@ -0,0 +1,69 @@
+.. _faq_advanced_usage:
+
+==============
+Advanced Usage
+==============
+
+Can I run optimizations in parallel?
+------------------------------------
+
+Currently, Hyperactive v5 runs single optimizer instances.
+For parallel evaluation of candidates, consider using Optuna
+backend optimizers, which support parallel trials:
+
+.. code-block:: python
+
+    from hyperactive.opt.optuna import TPEOptimizer
+
+    optimizer = TPEOptimizer(
+        search_space=search_space,
+        n_iter=100,
+        experiment=objective,
+        # Optuna handles parallelization
+    )
+
+
+Can I save and resume optimization?
+-----------------------------------
+
+This feature is planned but not yet available in v5. As a workaround,
+you can log results during optimization and use them as initial points
+for a new run.
+
+
+Are callbacks supported?
+------------------------
+
+User-defined callbacks during optimization are not currently supported in v5.
+The Optuna backend has internal early-stopping callbacks, but there's no
+general callback interface for tracking progress or modifying behavior during
+optimization.
+
+For progress monitoring, you can add logging inside your objective function:
+
+.. code-block:: python
+
+    iteration = 0
+
+    def objective(params):
+        global iteration
+        iteration += 1
+        score = evaluate_model(params)
+        print(f"Iteration {iteration}: score={score:.4f}")
+        return score
+
+
+How do I add constraints between parameters?
+--------------------------------------------
+
+Handle constraints in your objective function by returning a poor score
+for invalid combinations:
+
+.. code-block:: python
+
+    def objective(params):
+        # Constraint: min_samples_split must be >= min_samples_leaf
+        if params["min_samples_split"] < params["min_samples_leaf"]:
+            return -np.inf  # Invalid configuration
+
+        return evaluate_model(params)
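
Note: a minimal sketch of the logging half of the save/resume workaround described above, in plain Python (no Hyperactive-specific persistence API exists yet; ``evaluate_model`` is assumed, as in the page's other examples):

    import json

    history = []  # records every evaluated configuration

    def objective(params):
        score = evaluate_model(params)  # assumed helper, as elsewhere on the page
        history.append({"params": params, "score": score})
        return score

    # After optimizer.solve(), persist the history so a later run
    # can seed its search with the best configurations found so far.
    with open("search_history.json", "w") as f:
        json.dump(history, f)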

docs/source/faq/common_issues.rst

Lines changed: 72 additions & 0 deletions
@@ -0,0 +1,72 @@
+.. _faq_common_issues:
+
+=============
+Common Issues
+=============
+
+Why is my optimization slow?
+----------------------------
+
+**Slow objective function**: The optimizer only controls the search strategy.
+If each evaluation takes a long time, consider:
+
+- Reducing cross-validation folds
+- Using a subset of the training data for tuning
+- Simplifying your model during the search
+
+**Large search space**: More combinations require more iterations.
+Consider reducing parameter granularity or using smarter optimizers
+like Bayesian optimization.
+
+**Too many iterations**: Start with fewer iterations and increase
+if needed.
+
+
+Why does my score vary between runs?
+------------------------------------
+
+Optimization algorithms are stochastic. To get reproducible results,
+set a random seed:
+
+.. code-block:: python
+
+    optimizer = HillClimbing(
+        search_space=search_space,
+        n_iter=100,
+        experiment=objective,
+        random_state=42,  # Set seed for reproducibility
+    )
+
+
+My objective function returns NaN or raises exceptions
+------------------------------------------------------
+
+Handle invalid configurations in your objective function:
+
+.. code-block:: python
+
+    def objective(params):
+        try:
+            score = evaluate_model(params)
+            if np.isnan(score):
+                return -np.inf  # Return worst possible score
+            return score
+        except Exception:
+            return -np.inf  # Return worst possible score on error
+
+
+How do I see what parameters were tried?
+----------------------------------------
+
+Access the search history after optimization:
+
+.. code-block:: python
+
+    best_params = optimizer.solve()
+
+    # Access results
+    print(f"Best parameters: {optimizer.best_params_}")
+    print(f"Best score: {optimizer.best_score_}")
+
+    # Full search history (if available)
+    # Check optimizer attributes for search_data or similar
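
Note: to illustrate the first two speed tips above (fewer folds, a data subsample), a hedged scikit-learn sketch; ``make_model``, ``X``, and ``y`` are hypothetical stand-ins for your model factory and training data:

    from sklearn.model_selection import cross_val_score

    def objective(params):
        model = make_model(**params)  # hypothetical model factory
        # Tune on a 1000-row subsample with 3 folds instead of the full data
        scores = cross_val_score(model, X[:1000], y[:1000], cv=3)
        return scores.mean()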

docs/source/faq/getting_help.rst

Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@
+.. _faq_getting_help:
+
+============
+Getting Help
+============
+
+Where can I report bugs or request features?
+--------------------------------------------
+
+Open an issue on `GitHub <https://github.com/SimonBlanke/Hyperactive/issues>`_.
+
+
+Where can I get help?
+---------------------
+
+- Check the :ref:`examples` for code samples
+- Read the :ref:`user_guide` for detailed explanations
+- Join the `Discord <https://discord.gg/7uKdHfdcJG>`_ community
+- Search or ask on `GitHub Discussions <https://github.com/SimonBlanke/Hyperactive/discussions>`_
docs/source/faq/getting_started.rst

Lines changed: 53 additions & 0 deletions

@@ -0,0 +1,53 @@
+.. _faq_getting_started:
+
+===============
+Getting Started
+===============
+
+Which optimizer should I use?
+-----------------------------
+
+For most problems, start with one of these recommendations:
+
+**Small search spaces (<100 combinations)**
+    Use :class:`~hyperactive.opt.gfo.GridSearch` to exhaustively evaluate all options.
+
+**General-purpose optimization**
+    :class:`~hyperactive.opt.gfo.BayesianOptimizer` works well for expensive
+    objective functions where you want to minimize evaluations.
+
+**Fast, simple problems**
+    :class:`~hyperactive.opt.gfo.HillClimbing` or
+    :class:`~hyperactive.opt.gfo.RandomSearch` are good starting points.
+
+**High-dimensional spaces**
+    Population-based methods like :class:`~hyperactive.opt.gfo.ParticleSwarmOptimizer`
+    or :class:`~hyperactive.opt.gfo.EvolutionStrategyOptimizer` handle many
+    parameters well.
+
+See :ref:`user_guide_optimizers` for detailed guidance on choosing optimizers.
+
+
+How many iterations do I need?
+------------------------------
+
+This depends on your search space size and objective function:
+
+- **Rule of thumb**: Start with ``n_iter = 10 * number_of_parameters``
+- **Expensive functions**: Use fewer iterations with Bayesian optimization
+- **Fast functions**: Use more iterations with simpler optimizers
+
+You can monitor progress and stop early if the score plateaus.
+
+
+Does Hyperactive minimize or maximize?
+--------------------------------------
+
+**Hyperactive maximizes** the objective function. If you want to minimize,
+return the negative of your metric:
+
+.. code-block:: python
+
+    def objective(params):
+        error = compute_error(params)
+        return -error  # Negate to minimize
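
Note: the iteration rule of thumb above, spelled out for a concrete (purely illustrative) search space:

    search_space = {
        "n_estimators": list(range(50, 500, 50)),
        "max_depth": list(range(2, 12)),
        "learning_rate": [0.01, 0.05, 0.1, 0.3],
    }

    # Rule of thumb from this page: ~10 iterations per tuned parameter
    n_iter = 10 * len(search_space)  # 3 parameters -> start with 30 iterations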

docs/source/faq/integrations.rst

Lines changed: 64 additions & 0 deletions
@@ -0,0 +1,64 @@
+.. _faq_integrations:
+
+============
+Integrations
+============
+
+Can I use Hyperactive with PyTorch (not Lightning)?
+---------------------------------------------------
+
+Yes, create a custom objective function:
+
+.. code-block:: python
+
+    import torch
+
+    def objective(params):
+        model = MyPyTorchModel(
+            hidden_size=params["hidden_size"],
+            dropout=params["dropout"],
+        )
+        # Train and evaluate your model
+        train_model(model, train_loader)
+        accuracy = evaluate_model(model, val_loader)
+        return accuracy
+
+
+How does Hyperactive compare to Optuna?
+---------------------------------------
+
+**Hyperactive with native GFO backend**:
+
+- Simple, unified API
+- Wide variety of optimization algorithms
+- Great for hyperparameter tuning
+
+**Hyperactive with Optuna backend**:
+
+- Access Optuna's algorithms through Hyperactive's interface
+- Combine the strengths of both libraries
+
+**Pure Optuna**:
+
+- More features (pruning, distributed execution, database storage)
+- Larger community and ecosystem
+- More configuration options
+
+Choose based on your needs: Hyperactive for simplicity, Optuna for
+advanced features.
+
+
+Can I use Hyperactive with other ML frameworks?
+-----------------------------------------------
+
+Yes, any framework works with custom objective functions:
+
+.. code-block:: python
+
+    # XGBoost example
+    import xgboost as xgb
+
+    def objective(params):
+        model = xgb.XGBClassifier(**params)
+        scores = cross_val_score(model, X, y, cv=3)
+        return scores.mean()
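
Note: the XGBoost snippet above leaves its imports and data implicit; a self-contained version might look like this (toy data via scikit-learn, assumed for illustration):

    import xgboost as xgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, random_state=0)  # toy dataset

    def objective(params):
        model = xgb.XGBClassifier(**params)
        scores = cross_val_score(model, X, y, cv=3)
        return scores.mean()  # mean CV accuracy; Hyperactive maximizes this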
