This repository contains the codebase for STORM-OPF: Robustness Benchmarking for AC-OPF Methods.
The core implementation of the Scenario Generator Module and the Evaluator Module is located in src/.
All baseline implementations are provided under src/baselines. Each baseline directory includes its own README.md file with instructions for training and evaluation.
The overall STORM-OPF workflow is as follows:
1. **Choose the correct environment.**
   Using virtual environments is recommended, since different baselines may require different dependencies. In this implementation, all baselines use the same environment, specified in `env_opf.yml`, except for the OPF-DNN baseline, which requires a separate environment defined in `src/baselines/OPF_DNN/env_opf_dnn.yml`.
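For example, the environments can be created with conda (assuming conda is installed; the environment names below are assumptions derived from the file names and may differ from those declared inside the `.yml` files):

```shell
# Create the shared environment used by most baselines
conda env create -f env_opf.yml
conda activate env_opf

# The OPF-DNN baseline needs its own, separate environment
conda env create -f src/baselines/OPF_DNN/env_opf_dnn.yml
conda activate env_opf_dnn
```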
2. **Generate stress-test scenarios.**
   First, choose the stress scenario(s) on which you want to evaluate an OPF solver. In this work, we define 10 scenarios, including the `in_distribution` case. If you want to use these predefined scenarios, run `src/create_scenario_bank.py` to generate the scenario networks. See `src/README.md` for additional details. To add a new stress scenario:
   - add the scenario name to `config.json`, and
   - register and implement the scenario in `src/stress_test_scenarios.py`.
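As an illustration, registering a new scenario might follow a pattern like the sketch below. The registry, decorator, and scenario details here are hypothetical — the actual interface is defined in `src/stress_test_scenarios.py` and may differ:

```python
# Hypothetical scenario registry; the real registration mechanism
# lives in src/stress_test_scenarios.py and may look different.
SCENARIO_REGISTRY = {}

def register_scenario(name):
    """Decorator that adds a scenario-builder function to the registry.

    The name should match an entry added to config.json.
    """
    def decorator(fn):
        SCENARIO_REGISTRY[name] = fn
        return fn
    return decorator

@register_scenario("line_outage")  # hypothetical scenario name
def line_outage(base_network, outage_fraction=0.05):
    """Return a stressed copy of the network with a fraction of lines removed."""
    stressed = dict(base_network)  # placeholder for a real network copy
    n_out = int(len(stressed["lines"]) * outage_fraction)
    stressed["lines"] = stressed["lines"][n_out:]
    return stressed
```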
3. **Train and evaluate baselines.**
   The repository currently includes 5 baselines. To train or evaluate any of them, refer to the corresponding `README.md` file in that baseline's folder under `src/baselines`.
4. **Add a new baseline.**
   To integrate a new baseline into STORM-OPF:
   - place the baseline implementation inside `src/baselines`,
   - use the generated training and test scenarios to prepare the required datasets, and
   - extend the appropriate evaluator from the relevant `src/global_evaluation_...` file, depending on the OPF setting (AC-OPF, SC-ACOPF, DC-OPF, or SC-DCOPF).
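The evaluator-extension step could look roughly like the sketch below. The base class, method names, and return values are hypothetical stand-ins — the actual evaluator interfaces are defined in the `src/global_evaluation_...` files:

```python
# Hypothetical base class standing in for an evaluator from one of the
# src/global_evaluation_... files; the real API may differ.
class ACOPFEvaluator:
    def load_solution(self, scenario):
        """Run the baseline on one scenario and return its solution/metrics."""
        raise NotImplementedError

    def evaluate(self, scenarios):
        """Collect results for the baseline over a set of stress scenarios."""
        return {s: self.load_solution(s) for s in scenarios}

class MyBaselineEvaluator(ACOPFEvaluator):
    """Evaluator for a new baseline placed under src/baselines/."""

    def load_solution(self, scenario):
        # Replace with code that runs the new baseline on the scenario
        # network and returns its OPF solution and metrics.
        return {"scenario": scenario, "cost": 0.0}
```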
