
Commit cfba126
fix documentation issue, and format with ruff
1 parent 635300d

2 files changed

Lines changed: 4 additions & 28 deletions


docs/interpret/python/examples/quantile-regression.ipynb

Lines changed: 3 additions & 27 deletions
@@ -3,23 +3,7 @@
  {
   "cell_type": "markdown",
   "metadata": {},
-  "source": [
-   "# EBM Internals - Quantile Regression\n",
-   "\n",
-   "This notebook covers quantile regression using pinball loss with Explainable Boosting Machines. For standard regression internals, see [Part 1](./ebm-internals-regression.ipynb). For classification, see [Part 2](./ebm-internals-classification.ipynb). For multiclass, see [Part 3](./ebm-internals-multiclass.ipynb).\n",
-   "\n",
-   "Standard regression models (e.g. with RMSE) predict the conditional mean of the target. Quantile regression instead predicts a specific quantile (e.g. median, 10th percentile, 90th percentile). This is useful for:\n",
-   "\n",
-   "- **Prediction intervals**: Fit models at the 10th and 90th percentiles to get an 80% prediction interval.\n",
-   "- **Asymmetric risk**: When over-predicting and under-predicting have different costs.\n",
-   "- **Robustness**: Median regression (alpha=0.5) is more robust to outliers than mean regression.\n",
-   "\n",
-   "EBMs support quantile regression via the `\"quantile\"` objective, which uses the pinball loss (also called the quantile loss). The pinball loss for quantile alpha is:\n",
-   "\n",
-   "$$L(y, \\hat{y}) = \\begin{cases} \\alpha \\cdot (y - \\hat{y}) & \\text{if } y \\geq \\hat{y} \\\\ (1 - \\alpha) \\cdot (\\hat{y} - y) & \\text{if } y < \\hat{y} \\end{cases}$$\n",
-   "\n",
-   "This loss penalizes under-predictions by a factor of alpha and over-predictions by a factor of (1 - alpha), causing the model to learn the alpha-quantile of the conditional distribution."
-  ]
+  "source": "# EBM Internals - Quantile Regression\n\nThis notebook covers quantile regression using pinball loss with Explainable Boosting Machines.\n\nStandard regression models (e.g. with RMSE) predict the conditional mean of the target. Quantile regression instead predicts a specific quantile (e.g. median, 10th percentile, 90th percentile). This is useful for:\n\n- **Prediction intervals**: Fit models at the 10th and 90th percentiles to get an 80% prediction interval.\n- **Asymmetric risk**: When over-predicting and under-predicting have different costs.\n- **Robustness**: Median regression (alpha=0.5) is more robust to outliers than mean regression.\n\nEBMs support quantile regression via the `\"quantile\"` objective, which uses the pinball loss (also called the quantile loss). The pinball loss for quantile alpha is:\n\n$$L(y, \\hat{y}) = \\begin{cases} \\alpha \\cdot (y - \\hat{y}) & \\text{if } y \\geq \\hat{y} \\\\ (1 - \\alpha) \\cdot (\\hat{y} - y) & \\text{if } y < \\hat{y} \\end{cases}$$\n\nThis loss penalizes under-predictions by a factor of alpha and over-predictions by a factor of (1 - alpha), causing the model to learn the alpha-quantile of the conditional distribution."
  },
  {
   "cell_type": "code",
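The pinball loss defined in the markdown cell above can be sanity-checked with a few lines of plain Python. This is an illustrative sketch under the formula's own definitions, not the implementation used inside interpret; `pinball_loss` is a hypothetical helper name:

```python
# Illustrative sketch of the pinball (quantile) loss from the notebook's
# markdown cell; not interpret's internal implementation.
def pinball_loss(y, y_hat, alpha):
    """Penalize under-prediction by alpha and over-prediction by (1 - alpha)."""
    diff = y - y_hat
    return alpha * diff if diff >= 0 else (1 - alpha) * (-diff)

# Under-predicting by 2.0 at alpha=0.75 costs 0.75 * 2.0:
print(pinball_loss(10.0, 8.0, 0.75))  # 1.5
# The same error in the over-prediction direction costs only 0.25 * 2.0:
print(pinball_loss(8.0, 10.0, 0.75))  # 0.5
```

The asymmetry is the whole mechanism: with alpha above 0.5 the model pays more for predicting too low, so it learns to predict high in the conditional distribution.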
@@ -285,15 +269,7 @@
  {
   "cell_type": "markdown",
   "metadata": {},
-  "source": [
-   "## Summary\n",
-   "\n",
-   "- Use `objective=\"quantile:alpha=0.5\"` for median regression, or any alpha in (0, 1) for other quantiles.\n",
-   "- The prediction mechanism is identical to standard regression EBMs (intercept + additive score lookups). Only the training loss function changes.\n",
-   "- Fitting multiple quantile models (e.g. alpha=0.1 and alpha=0.9) provides interpretable prediction intervals.\n",
-   "- All EBM interpretability tools (global/local explanations) work with quantile models.\n",
-   "- For the complete prediction code that handles interactions, missing values, and all model types, see [Part 3](./ebm-internals-multiclass.ipynb)."
-  ]
+  "source": "## Summary\n\n- Use `objective=\"quantile:alpha=0.5\"` for median regression, or any alpha in (0, 1) for other quantiles.\n- The prediction mechanism is identical to standard regression EBMs (intercept + additive score lookups). Only the training loss function changes.\n- Fitting multiple quantile models (e.g. alpha=0.1 and alpha=0.9) provides interpretable prediction intervals.\n- All EBM interpretability tools (global/local explanations) work with quantile models."
  },
 ],
 "metadata": {
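The summary's claim that minimizing pinball loss "learns the alpha-quantile" can be verified numerically: the constant prediction that minimizes the average pinball loss over a sample is that sample's empirical alpha-quantile. A small self-contained check (`avg_pinball` is a hypothetical helper, not part of the library):

```python
# Sanity check: the constant q minimizing average pinball loss over a
# sample is the sample's alpha-quantile (here alpha=0.5, i.e. the median).
def avg_pinball(data, q, alpha):
    return sum(
        alpha * (y - q) if y >= q else (1 - alpha) * (q - y) for y in data
    ) / len(data)

data = [float(v) for v in range(11)]  # 0.0 .. 10.0, median is 5.0
best = min(data, key=lambda q: avg_pinball(data, q, alpha=0.5))
print(best)  # 5.0
```

Searching only over the sample points is enough here because the average pinball loss is piecewise linear in q, so a minimizer always lies at a data point.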
@@ -317,4 +293,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 4
-}
+}

python/interpret-core/interpret/utils/_misc.py

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ def normalize_objective(objective):
                FutureWarning,
                stacklevel=2,
            )
-            objective = new_name + objective[len(old_name):]
+            objective = new_name + objective[len(old_name) :]
            break
    return objective
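The change above is formatting only (ruff's preferred spacing around the slice colon); the behavior of the slice is unchanged. As context for what that line does, here is a hypothetical reconstruction of the prefix-renaming step: the `RENAMES` mapping below is invented for illustration and is not the real deprecation table in `_misc.py`:

```python
# Hypothetical sketch of the prefix-renaming loop around the changed line:
# when an objective string starts with a deprecated name, swap the prefix
# while preserving any parameter suffix such as ":alpha=0.9".
RENAMES = {"quantile_regression": "quantile"}  # illustrative mapping only

def normalize_objective(objective):
    for old_name, new_name in RENAMES.items():
        if objective.startswith(old_name):
            # Keep everything after the old prefix (e.g. ":alpha=0.9").
            objective = new_name + objective[len(old_name) :]
            break
    return objective

print(normalize_objective("quantile_regression:alpha=0.9"))  # quantile:alpha=0.9
```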
