docs/interpret/python/examples/quantile-regression.ipynb
+3 −27 (3 additions & 27 deletions)
@@ -3,23 +3,7 @@
   {
    "cell_type": "markdown",
    "metadata": {},
-   "source": [
-    "# EBM Internals - Quantile Regression\n",
-    "\n",
-    "This notebook covers quantile regression using pinball loss with Explainable Boosting Machines. For standard regression internals, see [Part 1](./ebm-internals-regression.ipynb). For classification, see [Part 2](./ebm-internals-classification.ipynb). For multiclass, see [Part 3](./ebm-internals-multiclass.ipynb).\n",
-    "\n",
-    "Standard regression models (e.g. with RMSE) predict the conditional mean of the target. Quantile regression instead predicts a specific quantile (e.g. median, 10th percentile, 90th percentile). This is useful for:\n",
-    "\n",
-    "- **Prediction intervals**: Fit models at the 10th and 90th percentiles to get an 80% prediction interval.\n",
-    "- **Asymmetric risk**: When over-predicting and under-predicting have different costs.\n",
-    "- **Robustness**: Median regression (alpha=0.5) is more robust to outliers than mean regression.\n",
-    "\n",
-    "EBMs support quantile regression via the `\"quantile\"` objective, which uses the pinball loss (also called the quantile loss). The pinball loss for quantile alpha is:\n",
-    "\n",
-    "$$L(y, \\hat{y}) = \\begin{cases} \\alpha \\cdot (y - \\hat{y}) & \\text{if } y \\geq \\hat{y} \\\\ (1 - \\alpha) \\cdot (\\hat{y} - y) & \\text{if } y < \\hat{y} \\end{cases}$$\n",
-    "\n",
-    "This loss penalizes under-predictions by a factor of alpha and over-predictions by a factor of (1 - alpha), causing the model to learn the alpha-quantile of the conditional distribution."
-   ]
+   "source": "# EBM Internals - Quantile Regression\n\nThis notebook covers quantile regression using pinball loss with Explainable Boosting Machines.\n\nStandard regression models (e.g. with RMSE) predict the conditional mean of the target. Quantile regression instead predicts a specific quantile (e.g. median, 10th percentile, 90th percentile). This is useful for:\n\n- **Prediction intervals**: Fit models at the 10th and 90th percentiles to get an 80% prediction interval.\n- **Asymmetric risk**: When over-predicting and under-predicting have different costs.\n- **Robustness**: Median regression (alpha=0.5) is more robust to outliers than mean regression.\n\nEBMs support quantile regression via the `\"quantile\"` objective, which uses the pinball loss (also called the quantile loss). The pinball loss for quantile alpha is:\n\n$$L(y, \\hat{y}) = \\begin{cases} \\alpha \\cdot (y - \\hat{y}) & \\text{if } y \\geq \\hat{y} \\\\ (1 - \\alpha) \\cdot (\\hat{y} - y) & \\text{if } y < \\hat{y} \\end{cases}$$\n\nThis loss penalizes under-predictions by a factor of alpha and over-predictions by a factor of (1 - alpha), causing the model to learn the alpha-quantile of the conditional distribution."
   },
   {
    "cell_type": "code",
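The piecewise pinball loss defined in the markdown cell above can be sketched as a small standalone function. This is an illustrative sketch of the formula only, not code from the interpret library; the function name `pinball_loss` is hypothetical.

```python
def pinball_loss(y, y_hat, alpha):
    """Pinball (quantile) loss for a single observation at quantile level alpha."""
    if y >= y_hat:
        # Under-prediction: residual weighted by alpha
        return alpha * (y - y_hat)
    # Over-prediction: residual weighted by (1 - alpha)
    return (1 - alpha) * (y_hat - y)

# At alpha=0.9, under-predicting by 1 costs 0.9, while over-predicting
# by 1 costs only 0.1, pushing the fitted value toward the 90th percentile.
print(pinball_loss(5.0, 4.0, 0.9))  # → 0.9
```

At alpha=0.5 the two branches weigh residuals equally, recovering (half of) the absolute error used in median regression.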
@@ -285,15 +269,7 @@
   {
    "cell_type": "markdown",
    "metadata": {},
-   "source": [
-    "## Summary\n",
-    "\n",
-    "- Use `objective=\"quantile:alpha=0.5\"` for median regression, or any alpha in (0, 1) for other quantiles.\n",
-    "- The prediction mechanism is identical to standard regression EBMs (intercept + additive score lookups). Only the training loss function changes.\n",
-    "- Fitting multiple quantile models (e.g. alpha=0.1 and alpha=0.9) provides interpretable prediction intervals.\n",
-    "- All EBM interpretability tools (global/local explanations) work with quantile models.\n",
-    "- For the complete prediction code that handles interactions, missing values, and all model types, see [Part 3](./ebm-internals-multiclass.ipynb)."
-   ]
+   "source": "## Summary\n\n- Use `objective=\"quantile:alpha=0.5\"` for median regression, or any alpha in (0, 1) for other quantiles.\n- The prediction mechanism is identical to standard regression EBMs (intercept + additive score lookups). Only the training loss function changes.\n- Fitting multiple quantile models (e.g. alpha=0.1 and alpha=0.9) provides interpretable prediction intervals.\n- All EBM interpretability tools (global/local explanations) work with quantile models."
   },
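The notebook's claim that minimizing pinball loss "causes the model to learn the alpha-quantile" can be checked numerically: among constant predictions, the one minimizing mean pinball loss over a sample lands at an empirical alpha-quantile. A minimal pure-Python sketch (helper names `mean_pinball` and `best` are illustrative, not interpret API):

```python
def mean_pinball(y_values, y_hat, alpha):
    """Mean pinball loss of a constant prediction y_hat over a sample."""
    total = 0.0
    for y in y_values:
        diff = y - y_hat
        # alpha-weighted under-prediction, (1 - alpha)-weighted over-prediction
        total += alpha * diff if diff >= 0 else (alpha - 1) * diff
    return total / len(y_values)

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]

# Searching constant predictions over the sample points: for alpha=0.85 the
# minimizer is 9.0, the empirical 0.85-quantile of this sample.
best = min(data, key=lambda c: mean_pinball(data, c, 0.85))
print(best)  # → 9.0
```

The same mechanism applied per boosting step is what lets an EBM with the `"quantile"` objective fit conditional quantiles instead of conditional means.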