The control vs email campaigns analysis produces the following comparison:
.. image:: ../_static/hillstorm_dte_control.png
   :alt: Hillstrom Email Campaigns vs Control Analysis
   :width: 800px
   :align: center
**Interpreting the Control Comparison Results**: These plots show how each email campaign performs against the no-email control group across different spending levels:

**Women's Email vs Control**:

- **Positive DTE values** indicate that the Women's email campaign increases the probability of spending at those levels compared to no email
- **Distribution pattern** shows where the Women's email is most effective in driving customer spending
- **Confidence intervals** reveal the statistical significance of the treatment effects

**Men's Email vs Control**:

- **Comparative effectiveness** can be assessed by comparing the magnitude and pattern of effects across the two panels
- **Different spending ranges** may show varying campaign effectiveness
- **Statistical significance** is indicated by confidence intervals that do not cross zero

**Key Control Analysis Findings**:

1. **Campaign Effectiveness**: Both campaigns show positive effects compared to no email, confirming that email marketing drives incremental spending

2. **Differential Patterns**: The shape and magnitude of effects differ between campaigns, revealing:

   - Which campaign has stronger overall effects
   - Different spending ranges where each campaign excels
   - Varying confidence in treatment effects across spending levels

3. **Business Implications**:

   - **ROI Assessment**: Compare effect sizes to determine which campaign provides the better return on investment
   - **Customer Segmentation**: Identify spending ranges where each campaign is most and least effective
   - **Resource Allocation**: Make data-driven decisions on campaign budget allocation

4. **Statistical Rigor**: Confidence intervals indicate where observed differences are statistically reliable versus potentially due to sampling variation

This analysis answers the fundamental question "Do email campaigns work?" and establishes the baseline effectiveness of each campaign against no email.
Direct Campaign Comparison: Men's vs Women's Email
--------------------------------------------------

The analysis produces the following distribution treatment effects visualization:

   :width: 800px
   :align: center
**Interpreting the Campaign Comparison Results**: The plot shows the distribution treatment effects (DTE) comparing Women's vs Men's email campaigns across different spending levels. Key observations:
- **Positive DTE values** (above zero line) indicate that Women's email campaign increases the probability of spending at that level compared to Men's campaign
- **Confidence intervals** (shaded areas) show statistical uncertainty - where intervals don't cross zero, effects are statistically significant
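One compact way to summarize such a head-to-head distributional gap (a hedged illustration, not the method behind the plot) is the two-sample Kolmogorov-Smirnov statistic, which reports the largest absolute gap between the two arms' empirical CDFs. Arm names and the simulated data below are placeholders:

```python
# Hedged sketch: summarizing the Women's-vs-Men's distributional gap with
# a two-sample Kolmogorov-Smirnov test. Simulated placeholder data.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
mens = rng.exponential(scale=22.0, size=2000)
womens = rng.exponential(scale=25.0, size=2000)

result = ks_2samp(womens, mens)
# statistic = max |F_womens(y) - F_mens(y)| over all spending levels y;
# a small p-value suggests the two spending distributions genuinely differ
print(f"KS statistic={result.statistic:.3f}, p-value={result.pvalue:.4f}")
```

Unlike the DTE curve, this collapses the comparison to a single number, so it is best read as a complement to, not a replacement for, the level-by-level plot.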
Revenue Category Analysis with PTE
----------------------------------

The Probability Treatment Effects analysis produces the following visualization:
This granular analysis helps marketers understand not just which campaign generates more revenue overall, but specifically which spending behaviors each campaign drives.
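A probability treatment effect over revenue categories can be sketched as the change in the share of customers landing in each spend bucket. This is a minimal illustration, not the package's own computation; the bucket edges and simulated data are assumptions:

```python
# Hedged sketch of a probability treatment effect (PTE) over revenue
# categories: the shift in probability mass per spend bucket. Illustrative.
import numpy as np

rng = np.random.default_rng(1)
control = rng.exponential(scale=20.0, size=2000)
treated = rng.exponential(scale=24.0, size=2000)

# Hypothetical revenue buckets: [0,10), [10,25), [25,50), [50,100), [100,inf)
edges = np.array([0, 10, 25, 50, 100, np.inf])

def bucket_probs(sample, edges):
    """Share of observations falling in each [edges[i], edges[i+1]) bucket."""
    counts, _ = np.histogram(sample, bins=edges)
    return counts / len(sample)

# PTE per bucket: treated share minus control share (sums to zero, since
# mass gained in high-spend buckets must come from low-spend buckets)
pte = bucket_probs(treated, edges) - bucket_probs(control, edges)
for i, effect in enumerate(pte):
    print(f"bucket {edges[i]:>5.0f}-{edges[i + 1]:>5.0f}: {effect:+.3f}")
```

Reading the per-bucket signs shows which spending behaviors a campaign drives: negative mass in low buckets paired with positive mass in high buckets indicates customers being moved up the spend distribution.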
**Key Findings**: Using the real Hillstrom dataset with 64,000 customers, the distributional analysis reveals nuanced patterns in how email campaigns affect customer spending. The analysis goes beyond simple average comparisons to show how treatment effects vary across the entire spending distribution, providing insights into which customer segments respond best to different campaign types. This demonstrates the power of distribution treatment effect analysis for understanding heterogeneous responses in digital marketing experiments.