index_url: "https://pypi.org/simple" # Optional custom PyPI index
notebook_scoped_libraries: false # Set to true for notebook-scoped installation
```
### Configuration
grants:
  manage: []
```
#### Python Packages
You can install Python packages for your models using the `packages` configuration. There are two ways to install packages:
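For example, a minimal sketch of a model-level `packages` configuration (package names and pins here are illustrative, and exact key placement may vary with your adapter version):

```yaml
models:
  - name: my_python_model
    config:
      packages:
        - "numpy==1.26.4"
        - "holidays"
```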
##### Cluster-level installation (default)
By default, packages are installed at the cluster level using Databricks libraries. This is the traditional approach, where packages are installed when the cluster starts.

**Benefits:**

- Packages are available for the entire cluster lifecycle
- Faster model execution (no installation overhead per run)
**Limitations:**
- Requires cluster restart to update packages
- All tasks on the cluster share the same package versions
##### Notebook-scoped installation
When `notebook_scoped_libraries: true`, packages are installed at the notebook level using `%pip install` magic commands. This prepends installation commands to your compiled code.

**Note:** For Databricks Runtime 13.0 and above, `dbutils.library.restartPython()` is automatically added after package installation to ensure packages are properly loaded.
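With this setting enabled, the generated notebook begins with installation commands along the lines of the following sketch (package names are illustrative; the exact generated commands depend on your configuration and adapter version):

```python
# Prepended by the adapter when notebook_scoped_libraries is true:
%pip install numpy==1.26.4 holidays
dbutils.library.restartPython()  # DBR 13.0+ only: reload Python so new packages are importable

# ...your compiled model code follows...
```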
#### Post hooks
It is possible to add Python hooks by using the `config.python_job_config.post_hook_tasks`