- `OFFLINE_PROVIDERS` - Use this type if your provider offers static machine configurations that may be collected and
  published on a daily basis. Examples: `aws`, `gcp`, `azure`, etc. These providers offer many machine configurations,
  but they are not updated frequently.
- `ONLINE_PROVIDERS` - Use this type if your provider offers dynamic machine configurations that are available at the
  very moment when you fetch configurations (e.g., GPU marketplaces). Examples: `tensordock`, `vast`, etc.
Add your provider in the following places:

- Either `OFFLINE_PROVIDERS` or `ONLINE_PROVIDERS` in `src/gpuhunt/_internal/catalog.py`.
- The `python -m gpuhunt` command in `src/gpuhunt/__main__.py`.
- (offline providers) The CI workflow in `.github/workflows/catalogs.yml`.
- (online providers) The default catalog in `src/gpuhunt/_internal/default.py`.
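To make the first step concrete, here is a minimal sketch of what registering a provider in `src/gpuhunt/_internal/catalog.py` might look like. It assumes the two registries are plain lists of provider name strings; the real module may organize them differently, and `myprovider` is a placeholder name.

```python
# Hypothetical sketch of src/gpuhunt/_internal/catalog.py registration.
# Assumes OFFLINE_PROVIDERS / ONLINE_PROVIDERS are lists of provider names.
OFFLINE_PROVIDERS = ["aws", "azure", "gcp"]
ONLINE_PROVIDERS = ["tensordock", "vast"]

# Register a new offline provider (collected and published daily)
# by adding its name to the offline registry:
OFFLINE_PROVIDERS.append("myprovider")  # "myprovider" is a placeholder
```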
### 1.5. Add data quality tests
For offline providers, you can add data quality tests under `src/integrity_tests/`.
Data quality tests are run after collecting offline catalogs to ensure their integrity.
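The shape of such a test can be sketched as below. This is an illustrative example only: the catalog rows, the column names, and the check itself are assumptions for demonstration, not the real gpuhunt schema (see the linked `test_datacrunch.py` for an actual test).

```python
# Hypothetical data quality test in the spirit of src/integrity_tests/.
# The CSV content and field names below are illustrative assumptions.
import csv
import io

CATALOG_CSV = """instance_name,price,gpu_count
g4dn.xlarge,0.526,1
p3.2xlarge,3.06,1
"""

def test_prices_are_positive():
    # Parse the (illustrative) collected catalog and verify every row
    # has a strictly positive price.
    rows = list(csv.DictReader(io.StringIO(CATALOG_CSV)))
    assert rows, "catalog must not be empty"
    for row in rows:
        assert float(row["price"]) > 0
```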
Refer to examples: [test_datacrunch.py](https://github.com/dstackai/gpuhunt/blob/main/src/integrity_tests/test_datacrunch.py),
[azure](https://github.com/dstackai/dstack/blob/master/src/dstack/_internal/core/backends/azure/__init__.py), etc.
##### 2.4.4. Create the backend compute class
Under the backend directory you've created, create the `compute.py` file and define the
backend compute class there (it should extend `dstack._internal.core.backends.base.compute.Compute`).
You'll have to implement `get_offers`, `run_job`, and `terminate_instance`.
You may need to implement `update_provisioning_data`; see its docstring for details.

For VM-based backends, also implement the `create_instance` method and add the backend name to
`BACKENDS_WITH_CREATE_INSTANCE_SUPPORT` (`src/dstack/_internal/server/services/runs.py`).
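The method names above can be sketched as a skeleton like the following. Note this is a simplified stand-in, not dstack's real API: the actual `Compute` base class lives in `dstack._internal.core.backends.base.compute` and its method signatures differ, so a local stub is used here to keep the sketch self-contained.

```python
# Illustrative backend compute skeleton. The base class below is a local
# stand-in for dstack._internal.core.backends.base.compute.Compute, and the
# simplified signatures are assumptions for demonstration.
class Compute:  # stand-in base class
    pass

class MyBackendCompute(Compute):
    def get_offers(self, requirements=None):
        # Return instance offers matching the given requirements.
        return []

    def run_job(self, job):
        # Provision an instance and start the job on it.
        raise NotImplementedError

    def terminate_instance(self, instance_id, region=None):
        # Tear down the instance once the job is finished.
        raise NotImplementedError
```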
Refer to examples: [azure](https://github.com/dstackai/dstack/blob/master/src/dstack/_internal/core/backends/azure/config.py), etc.
##### 2.4.7. Import config model classes
Ensure the config model classes are imported
into [`src/dstack/_internal/core/models/backends/__init__.py`](https://github.com/dstackai/dstack/blob/master/src/dstack/_internal/core/models/backends/__init__.py).
##### 2.4.8. Create the configurator class
Create the file with the backend name under [`src/dstack/_internal/server/services/backends/configurators`](https://github.com/dstackai/dstack/blob/master/src/dstack/_internal/server/services/backends/configurators)
and define the backend configurator class (it must extend `dstack._internal.server.services.backends.configurators.base.Configurator`).
Refer to examples: [datacrunch](https://github.com/dstackai/dstack/blob/master/src/dstack/_internal/server/services/backends/configurators/datacrunch.py),
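A configurator skeleton might look roughly like this. Again, this is a sketch under stated assumptions: the real `Configurator` base class lives in `dstack._internal.server.services.backends.configurators.base` and its actual interface may differ, so a local stand-in is used, and `mybackend` and the `api_key` check are hypothetical.

```python
# Illustrative configurator skeleton. The base class is a local stand-in for
# dstack._internal.server.services.backends.configurators.base.Configurator;
# the backend name and validation logic are assumptions for demonstration.
class Configurator:  # stand-in base class
    TYPE: str = ""

class MyBackendConfigurator(Configurator):
    TYPE = "mybackend"  # hypothetical backend name

    def validate_config(self, config):
        # Check credentials and other settings before the backend is saved.
        if not config.get("api_key"):
            raise ValueError("api_key is required")
        return config
```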