
Commit a43d069
[DOC] cleanup readme
1 parent fafe4d3 commit a43d069

File tree: 1 file changed (+22, -256 lines)


README.md

Lines changed: 22 additions & 256 deletions
@@ -29,7 +29,28 @@ package alongside direct interfaces to Optuna and scikit-learn optimizers, suppo
 pip install hyperactive
 ```
 
-## :zap: Quickstart
+## Key Concepts
+
+### Experiment-Based Architecture
+
+Hyperactive v5 introduces a clean separation between optimization algorithms and optimization problems through the **experiment abstraction**:
+
+- **Experiments** define *what* to optimize (the objective function and evaluation logic)
+- **Optimizers** define *how* to optimize (the search strategy and algorithm)
+
+This design allows you to:
+- Mix and match any optimizer with any experiment type
+- Create reusable experiment definitions for common ML tasks
+- Easily switch between different optimization strategies
+- Build complex optimization workflows with consistent interfaces
+
+**Built-in experiments include:**
+- `SklearnCvExperiment` - Cross-validation for sklearn estimators
+- `SktimeForecastingExperiment` - Time series forecasting optimization
+- Custom function experiments (pass any callable as experiment)
+
+
+## Quickstart
 
 ### Maximizing a custom function
 
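The experiment/optimizer separation described in the added README section can be sketched in plain Python. The classes below are illustrative stand-ins written for this note, not Hyperactive's actual classes or import paths:

```python
import random

class FunctionExperiment:
    """'What' to optimize: wraps the objective function and its evaluation logic."""
    def __init__(self, func):
        self.func = func

    def score(self, params: dict) -> float:
        return self.func(params)

class RandomSearchOptimizer:
    """'How' to optimize: proposes candidate parameters and keeps the best one."""
    def __init__(self, search_space: dict, n_trials: int = 100, seed: int = 0):
        self.search_space = search_space
        self.n_trials = n_trials
        self.rng = random.Random(seed)

    def run(self, experiment) -> dict:
        best_params, best_score = None, float("-inf")
        for _ in range(self.n_trials):
            # Sample one candidate value per search-space dimension
            params = {k: self.rng.choice(v) for k, v in self.search_space.items()}
            score = experiment.score(params)
            if score > best_score:
                best_params, best_score = params, score
        return best_params

# Any experiment can be paired with any optimizer.
experiment = FunctionExperiment(lambda p: -(p["x"] ** 2))
optimizer = RandomSearchOptimizer({"x": list(range(-10, 11))}, n_trials=500)
print(optimizer.run(experiment))  # best params found, e.g. {'x': 0}
```

Because the optimizer only talks to the experiment through `score`, swapping in a different search strategy or a different objective requires no changes to the other side — the same decoupling the experiment abstraction provides.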
@@ -140,261 +161,11 @@ best_params = tuned_svc.best_params_
 best_estimator = tuned_svc.best_estimator_
 ```
 
-## :bulb: Key Concepts
-
-### Experiment-Based Architecture
-
-Hyperactive v5 introduces a clean separation between optimization algorithms and optimization problems through the **experiment abstraction**:
-
-- **Experiments** define *what* to optimize (the objective function and evaluation logic)
-- **Optimizers** define *how* to optimize (the search strategy and algorithm)
-
-This design allows you to:
-- Mix and match any optimizer with any experiment type
-- Create reusable experiment definitions for common ML tasks
-- Easily switch between different optimization strategies
-- Build complex optimization workflows with consistent interfaces
-
-**Built-in experiments include:**
-- `SklearnCvExperiment` - Cross-validation for sklearn estimators
-- `SktimeForecastingExperiment` - Time series forecasting optimization
-- Custom function experiments (pass any callable as experiment)
-
-<img src="./docs/images/bayes_convex.gif" align="right" width="500">
-
-## Overview
-
-<h3 align="center">
-Hyperactive features a collection of optimization algorithms that can be used for a variety of optimization problems. The following table shows examples of its capabilities:
-</h3>
-
-
-<br>
-
-<table>
-<tbody>
-<tr align="center" valign="center">
-<td>
-<strong>Optimization Techniques</strong>
-<img src="./docs/images/blue.jpg"/>
-</td>
-<td>
-<strong>Framework Integrations</strong>
-<img src="./docs/images/blue.jpg"/>
-</td>
-<td>
-<strong>Optimization Applications</strong>
-<img src="./docs/images/blue.jpg"/>
-</td>
-</tr>
-<tr/>
-<tr valign="top">
-<td>
-<a><b>Local Search:</b></a>
-<ul>
-<li><a href="./examples/gfo/hill_climbing_example.py">Hill Climbing</a></li>
-<li><a href="./examples/gfo/repulsing_hill_climbing_example.py">Repulsing Hill Climbing</a></li>
-<li><a href="./examples/gfo/simulated_annealing_example.py">Simulated Annealing</a></li>
-<li><a href="./examples/gfo/downhill_simplex_example.py">Downhill Simplex Optimizer</a></li>
-</ul><br>
-<a><b>Global Search:</b></a>
-<ul>
-<li><a href="./examples/gfo/random_search_example.py">Random Search</a></li>
-<li><a href="./examples/gfo/grid_search_example.py">Grid Search</a></li>
-<li><a href="./examples/gfo/random_restart_hill_climbing_example.py">Random Restart Hill Climbing</a></li>
-<li><a href="./examples/gfo/stochastic_hill_climbing_example.py">Stochastic Hill Climbing</a></li>
-<li><a href="./examples/gfo/powells_method_example.py">Powell's Method</a></li>
-<li><a href="./examples/gfo/pattern_search_example.py">Pattern Search</a></li>
-</ul><br>
-<a><b>Population Methods:</b></a>
-<ul>
-<li><a href="./examples/gfo/parallel_tempering_example.py">Parallel Tempering</a></li>
-<li><a href="./examples/gfo/particle_swarm_example.py">Particle Swarm Optimizer</a></li>
-<li><a href="./examples/gfo/spiral_optimization_example.py">Spiral Optimization</a></li>
-<li><a href="./examples/gfo/genetic_algorithm_example.py">Genetic Algorithm</a></li>
-<li><a href="./examples/gfo/evolution_strategy_example.py">Evolution Strategy</a></li>
-<li><a href="./examples/gfo/differential_evolution_example.py">Differential Evolution</a></li>
-</ul><br>
-<a><b>Sequential Methods:</b></a>
-<ul>
-<li><a href="./examples/gfo/bayesian_optimization_example.py">Bayesian Optimization</a></li>
-<li><a href="./examples/gfo/lipschitz_optimizer_example.py">Lipschitz Optimization</a></li>
-<li><a href="./examples/gfo/direct_algorithm_example.py">Direct Algorithm</a></li>
-<li><a href="./examples/gfo/tree_structured_parzen_estimators_example.py">Tree of Parzen Estimators</a></li>
-<li><a href="./examples/gfo/forest_optimizer_example.py">Forest Optimizer</a>
-[<a href="#references">ref</a>]</li>
-</ul><br>
-<a><b>Optuna Backend:</b></a>
-<ul>
-<li><a href="./examples/optuna/tpe_sampler_example.py">TPE Optimizer</a></li>
-<li><a href="./examples/optuna/random_sampler_example.py">Random Optimizer</a></li>
-<li><a href="./examples/optuna/cmaes_sampler_example.py">CMA-ES Optimizer</a></li>
-<li><a href="./examples/optuna/gp_sampler_example.py">Gaussian Process Optimizer</a></li>
-<li><a href="./examples/optuna/grid_sampler_example.py">Grid Optimizer</a></li>
-<li><a href="./examples/optuna/nsga_ii_sampler_example.py">NSGA-II Optimizer</a></li>
-<li><a href="./examples/optuna/nsga_iii_sampler_example.py">NSGA-III Optimizer</a></li>
-<li><a href="./examples/optuna/qmc_sampler_example.py">QMC Optimizer</a></li>
-</ul>
-</td>
-<td>
-<a><b>AI and Machine Learning:</b></a>
-<ul>
-<li><a href="./examples/integrations/README.md">scikit-learn</a></li>
-<li><a href="./examples/integrations/README.md">sktime forecasting</a></li>
-<li><a href="./examples/integrations/README.md">sktime time series classification</a></li>
-</ul>
-</td>
-<td>
-
-</td>
-</tr>
-</tbody>
-</table>
-
-The examples above are not necessarily based on realistic datasets or training procedures.
-Their purpose is fast execution of the proposed solution and giving the user ideas for interesting use cases.
-
-
-<br>
-
-## Sideprojects and Tools
-
-The following packages are designed to support Hyperactive and expand its use cases.
-
-| Package | Description |
-|---------|-------------|
-| [Search-Data-Collector](https://github.com/SimonBlanke/search-data-collector) | Simple tool to save search data during or after the optimization run into CSV files. |
-| [Search-Data-Explorer](https://github.com/SimonBlanke/search-data-explorer) | Visualize search data with Plotly inside a Streamlit dashboard. |
-
-
-<br>
-
-## FAQ
-
-#### Known Errors + Solutions
-
-<details>
-<summary><b> Read this before opening a bug-issue </b></summary>
-
-<br>
-
-- <b>Are you sure the bug is located in Hyperactive?</b>
-
-  The error might be located in the optimization backend.
-  Look at the error message from the command line. <b>If</b> one of the last messages looks like this:
-  - File "/.../gradient_free_optimizers/...", line ...
-
-  <b>Then</b> you should post the bug report in:
-  - https://github.com/SimonBlanke/Gradient-Free-Optimizers
-
-  <br><b>Otherwise</b> you can post the bug report in Hyperactive.
-
-- <b>Do you have the correct Hyperactive version?</b>
-
-  With every major version update (e.g. v2.2 -> v3.0) the API of Hyperactive changes.
-  Check which version of Hyperactive you have. If your major version is older, you have two options:
-
-  <b>Recommended:</b> You could just update your Hyperactive version with:
-  ```bash
-  pip install hyperactive --upgrade
-  ```
-  This way you can use all the new documentation and examples from the current repository.
-
-  Or you could continue using the old version and use an old repository branch as documentation.
-  You can do that by selecting the corresponding branch (top right of the repository; the default is "main").
-  So if your major version is older (e.g. v2.1.0), you can select the 2.x.x branch to get the old repository for that version.
-
-- <b>Provide example code for error reproduction</b>
-
-  To understand and fix the issue, I need example code that reproduces the error.
-  I must be able to just copy the code into a .py file and execute it to reproduce the error.
-
-</details>
-
-
-<details>
-<summary> MemoryError: Unable to allocate ... for an array with shape (...) </summary>
-
-<br>
-
-This is expected behavior of the current implementation of SMB optimizers. For all sequential model-based algorithms you have to keep an eye on the size of the search space:
-```python
-search_space_size = 1
-for value_ in search_space.values():
-    search_space_size *= len(value_)
-
-print("search_space_size", search_space_size)
-```
-Reduce the search space size to resolve this error.
-
-</details>
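The size computation in the FAQ entry above can be run as-is against a small search space. The space below is made up purely for illustration; only the size formula (product of the value counts per dimension) comes from the FAQ text:

```python
import numpy as np

# Hypothetical search space: each key maps to the candidate values for one dimension
search_space = {
    "n_estimators": list(range(10, 1000, 10)),      # 99 values
    "max_depth": list(range(1, 26)),                # 25 values
    "min_samples_split": np.arange(0.1, 1.0, 0.1),  # 9 values
}

# Total number of grid points = product of the value counts per dimension
search_space_size = 1
for value_ in search_space.values():
    search_space_size *= len(value_)

print("search_space_size", search_space_size)  # 99 * 25 * 9 = 22275
```

Even this modest space already has tens of thousands of points; dropping one dimension or coarsening its step is usually enough to bring a sequential model-based run back under the memory limit.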
-
-
-<details>
-<summary> TypeError: cannot pickle '_thread.RLock' object </summary>
-
-<br>
-
-This typically means your search space or parameter suggestions include non-serializable
-objects (e.g., classes, bound methods, lambdas, local functions, locks). Ensure that all
-values in `search_space`/`param_space` are plain Python/scientific types such as ints,
-floats, strings, lists/tuples, or numpy arrays. Avoid closures and non-top-level callables
-in parameter values.
-
-Hyperactive v5 does not expose a global “distribution” switch. If you parallelize outside
-Hyperactive (e.g., with joblib/dask/ray), choose an appropriate backend and make sure the
-objective and arguments are picklable for process-based backends.
-
-</details>
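The picklability requirement described in the FAQ entry above is easy to check up front with the standard-library `pickle` module. The two example spaces below are hypothetical:

```python
import pickle

# A search space of plain, picklable types: safe for process-based backends
good_space = {"alpha": [0.1, 1.0, 10.0], "kernel": ["linear", "rbf"]}
pickle.dumps(good_space)  # succeeds, so workers can receive it

# A search space containing a lambda is NOT picklable
bad_space = {"scoring": [lambda y, y_pred: 0.0]}
try:
    pickle.dumps(bad_space)
except Exception as err:
    # pickling a lambda fails; this is the kind of value to remove
    print(type(err).__name__)
```

Running `pickle.dumps` on your search space before starting a long optimization run surfaces this class of error immediately instead of deep inside a worker process.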
-
-
-<details>
-<summary> Command line full of warnings </summary>
-
-<br>
-
-These are very often warnings from sklearn or numpy. They do not correlate with bad performance from Hyperactive. Your code will most likely run fine, but those warnings are very difficult to silence.
-
-It should help to put this at the very top of your script:
-```python
-def warn(*args, **kwargs):
-    pass
-
-
-import warnings
-
-warnings.warn = warn
-```
-
-</details>
 
 
-<details>
-<summary> Warning: Not enough initial positions for population size </summary>
-
-<br>
-
-This warning occurs because the optimizer needs more initial positions to generate a
-population for the search. In v5, initial positions are controlled via the optimizer’s
-`initialize` parameter.
-```python
-# This is how it looks per default
-initialize = {"grid": 4, "random": 2, "vertices": 4}
-
-# You could set it to this for a maximum population of 20
-initialize = {"grid": 4, "random": 12, "vertices": 4}
-```
-
-</details>
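Per the FAQ entry above, the initial population is the sum of the positions contributed by each initialization strategy (4 + 12 + 4 = 20 in the second example). A tiny helper makes this explicit; the function name is ours, not part of Hyperactive:

```python
def init_population_size(initialize: dict) -> int:
    """Total number of initial positions implied by an `initialize` dict
    (one position per count, summed over all strategies)."""
    return sum(initialize.values())

default_init = {"grid": 4, "random": 2, "vertices": 4}
larger_init = {"grid": 4, "random": 12, "vertices": 4}

print(init_population_size(default_init))  # 10
print(init_population_size(larger_init))   # 20
```

If the warning appears, raise the counts (the `random` entry is the easiest knob) until the sum reaches the population size your optimizer needs.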
 
 
 
-<br>
-
-## References
-
-#### [dto] [Scikit-Optimize](https://github.com/scikit-optimize/scikit-optimize/blob/master/skopt/learning/forest.py)
-
 <br>
 
 ## Citing Hyperactive
@@ -407,8 +178,3 @@ initialize = {"grid": 4, "random": 12, "vertices": 4}
 }
 
 
-<br>
-
-## License
-
-[![LICENSE](https://img.shields.io/github/license/SimonBlanke/Hyperactive?style=for-the-badge)](https://github.com/SimonBlanke/Hyperactive/blob/main/LICENSE)
