observability(docs): adopt phase terminology across framework
Changed "stage" to "phase" throughout observability framework to better
reflect the non-linear, iterative nature of the model. Phases can be
revisited and worked on concurrently, unlike sequential stages.
Changes:
- Framework page: Updated all section headings from "Stage X:" to "Phase"
format, removed numbering from navigation cards, updated prose
- All 5 phase guides: Added phase context to subtitle frontmatter
(e.g., "This is the INSTRUMENT phase of the observability framework")
- Removed numbered stage references throughout
Also includes from earlier consistency review:
- Framework: Added Test Suites deprecated label and Simulations pre-release label
- Instrumentation: Removed Call Analysis recommendation, reordered nav cards,
added back-link, added inter-stage bridge, removed decorative emoji
- Testing strategies: Added prerequisite reference to instrumentation
- Extraction patterns: Removed decorative emojis from comparison table
fern/observability/extraction-patterns.mdx (+6 −4)
@@ -1,9 +1,11 @@
 ---
 title: Choosing your extraction pattern
-subtitle: Understand the three architectural patterns for getting data out of Vapi
+subtitle: Understand the three architectural patterns for getting data out of Vapi. This is the **EXTRACT phase** of the [observability framework](/observability/framework).
 slug: observability/extraction-patterns
 ---
 
+<span className="internal-note">This page is in Rough Draft stage</span>
+
 ## Why extraction is an architectural choice
 
 Unlike traditional observability platforms (DataDog, New Relic) where data flows automatically from instrumentation to monitoring, **Vapi requires you to choose how data gets extracted** for analysis.
@@ -31,9 +33,9 @@ Vapi offers three architectural patterns for extracting observability data from
-| **Dashboard Native** | Use Vapi's built-in Boards with scalar Structured Outputs for real-time dashboards | ⚡ Minimal (no infrastructure) | Basic (scalar fields only) | Solo founders, non-technical teams, startups |
-| **Webhook-to-External** | Build custom post-call processing that captures data via webhooks and exports to your data warehouse | 🛠️ High (requires backend infrastructure) | Rich (full object schemas, nested data) | Engineering teams, enterprises with existing data platforms |
-| **Hybrid** | Combine both approaches - use Boards for operational metrics, webhooks for deep analysis | ⚙️ Medium (partial infrastructure) | Flexible (mix of scalar and object data) | Growing teams balancing simplicity and power |
+| **Dashboard Native** | Use Vapi's built-in Boards with scalar Structured Outputs for real-time dashboards | Minimal (no infrastructure) | Basic (scalar fields only) | Solo founders, non-technical teams, startups |
+| **Webhook-to-External** | Build custom post-call processing that captures data via webhooks and exports to your data warehouse | High (requires backend infrastructure) | Rich (full object schemas, nested data) | Engineering teams, enterprises with existing data platforms |
+| **Hybrid** | Combine both approaches - use Boards for operational metrics, webhooks for deep analysis | Medium (partial infrastructure) | Flexible (mix of scalar and object data) | Growing teams balancing simplicity and power |
 
 **How to choose**: Start with Dashboard Native (fastest setup). Migrate to Hybrid or Webhook-to-External as your analytics needs grow or when you need features like Scorecard visualization or external BI tools.
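To make the Webhook-to-External row concrete, here is a minimal Python sketch of the post-call processing step: flattening an end-of-call webhook payload into a row for a data warehouse. The payload field names (`message`, `endedReason`, `analysis`, and so on) are illustrative assumptions modeled on typical server-message shapes, not a verified Vapi schema — check your actual webhook payloads before relying on them.

```python
# Sketch of the Webhook-to-External pattern: flatten a (hypothetical)
# end-of-call webhook payload into a flat row suitable for a warehouse.
# Field names below are assumptions, not the exact Vapi schema.

def flatten_call_report(payload: dict) -> dict:
    """Map an end-of-call payload to a flat warehouse row."""
    message = payload.get("message", {})
    analysis = message.get("analysis", {})
    return {
        "call_id": message.get("call", {}).get("id"),
        "ended_reason": message.get("endedReason"),
        "duration_seconds": message.get("durationSeconds"),
        "success": analysis.get("successEvaluation"),
        "summary": analysis.get("summary"),
    }

sample = {
    "message": {
        "call": {"id": "call_123"},
        "endedReason": "customer-ended-call",
        "durationSeconds": 42.0,
        "analysis": {"successEvaluation": "true", "summary": "Booked a demo."},
    }
}
print(flatten_call_report(sample)["call_id"])  # call_123
```

The flat row is what makes this pattern "Rich": nested objects in the payload stay available, and you choose which fields to project out for each destination table.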
fern/observability/instrumentation.mdx (+21 −11)
@@ -1,6 +1,6 @@
 ---
 title: Instrumentation
-subtitle: Configure your assistant to capture operational and business metrics
+subtitle: Configure your assistant to capture operational and business metrics. This is the **INSTRUMENT phase** of the [observability framework](/observability/framework).
 slug: observability/instrumentation
 ---
 
@@ -39,6 +39,8 @@ Think of instrumentation as installing sensors in your assistant:
 - What metrics will help you debug failures?
 - What data do you need for compliance or reporting?
 
+The schemas you define here become the assertions your Evals validate in the [TEST phase](/observability/testing-strategies).
+
 The "Instrumentation tools at a glance" section below shows how to configure custom instrumentation.
 
 ---
@@ -48,7 +50,7 @@ The "Instrumentation tools at a glance" section below shows how to configure cus
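The instrumentation change above hinges on the distinction between scalar and object Structured Outputs. A hedged sketch, with hypothetical field names: a scalar-only schema is the kind the Dashboard Native pattern can chart directly, and a small helper can verify a schema stays scalar.

```python
# Sketch of "scalar" instrumentation: a Structured Output defined as a
# JSON Schema whose properties are all flat scalars (no nested objects
# or arrays). Field names are hypothetical, for illustration only.

booking_output_schema = {
    "type": "object",
    "properties": {
        "appointment_booked": {"type": "boolean"},
        "caller_intent": {"type": "string"},
        "sentiment_score": {"type": "number"},
    },
    "required": ["appointment_booked", "caller_intent"],
}

def is_scalar_only(schema: dict) -> bool:
    """True if every property is a flat scalar (no objects/arrays)."""
    return all(
        prop.get("type") not in ("object", "array")
        for prop in schema.get("properties", {}).values()
    )
```

A check like `is_scalar_only` is one way to decide, per output, whether it belongs in a dashboard (scalar) or should flow through the webhook path (nested).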
fern/observability/monitoring.mdx (+1 −1)
@@ -1,6 +1,6 @@
 ---
 title: Monitoring & Operating
-subtitle: Visualize trends, track operational health, and ensure production reliability
+subtitle: Visualize trends, track operational health, and ensure production reliability. This is the **MONITOR phase** of the [observability framework](/observability/framework).
@@ -59,7 +59,7 @@ Vapi's observability tools support a 5-stage progression:
 
 ### This is a maturity progression, not a linear checklist
 
-You don't complete one stage and never return to it. Observability is **continuous**:
+You don't complete one phase and never return to it. Observability is **continuous**:
 
 - **Instrument** as you build new features
 - **Test** after every change
@@ -69,17 +69,17 @@ You don't complete one phase and never return to it. Observability is **continuo
 
 **For teams just starting**: Begin with INSTRUMENT + TEST (validate before production). Add EXTRACT + MONITOR as you scale. OPTIMIZE becomes natural once you have data flowing.
 
-<span className="vapi-validation">Is "maturity model" the right framing? Should we emphasize iteration more explicitly? How do customer segments (startups vs enterprises) typically progress through these stages?</span>
+<span className="vapi-validation">Is "maturity model" the right framing? Should we emphasize iteration more explicitly? How do customer segments (startups vs enterprises) typically progress through these phases?</span>
 
 ---
 
 ## How this framework maps to Vapi tools
 
-Each stage uses specific Vapi features. Here's a quick reference:
+Each phase uses specific Vapi features. Here's a quick reference:
 
-### Stage 1: INSTRUMENT
+### INSTRUMENT Phase
 
 Configure your assistant to capture operational and business metrics.
 
@@ -89,17 +89,17 @@ Configure your assistant to capture operational and business metrics.
 
 ---
 
-### Stage 2: TEST
+### TEST Phase
 
 Validate your assistant works correctly before production deployment.
 
-**What you'll use**: Evals, Simulations, Test Suites
+**What you'll use**: Evals, Simulations (Pre-release), Test Suites (⚠️ Deprecated)
fern/observability/optimization-workflows.mdx (+2 −1)
@@ -1,11 +1,12 @@
 ---
 title: Optimization workflows
-subtitle: Use observability data to continuously improve your assistant
+subtitle: Use observability data to continuously improve your assistant. This is the **OPTIMIZE phase** of the [observability framework](/observability/framework).
 slug: observability/optimization-workflows
 ---
 
 <span className="internal-note">This page is in Skeleton Draft stage - structure and scope for review, detailed content to be developed in iteration 2</span>
 
+
 ## What is optimization?
 
 **Optimization** is the continuous improvement loop: using observability data to refine prompts, improve tool calls, and enhance conversation flows.
fern/observability/testing-strategies.mdx (+3 −1)
@@ -1,6 +1,6 @@
 ---
 title: Testing strategies
-subtitle: Validate your assistant works correctly before deploying to production
+subtitle: Validate your assistant works correctly before deploying to production. This is the **TEST phase** of the [observability framework](/observability/framework).
 slug: observability/testing-strategies
 ---
 
@@ -18,6 +18,8 @@ Unlike traditional software testing (unit tests, integration tests), voice AI te
 - **Edge cases** — How does the system handle interruptions, unclear requests, or unexpected inputs?
 - **Regression** — Do changes break existing functionality?
 
+Testing assumes you've already instrumented your assistant with Structured Outputs (see [Instrumentation](/observability/instrumentation)).
+
 <span className="vapi-validation">What other specific validation and/or testing uniqueness have clients reported when working with voice AI testing?</span>
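The instrumentation-to-testing handoff added above can be sketched as a tiny eval loop: the Structured Output captured on a call becomes the assertion target. The field names and expectations here are illustrative, not Vapi's Evals API.

```python
# Sketch of the INSTRUMENT -> TEST link: compare a call's captured
# Structured Output against expected values. Names are hypothetical.

def run_eval(structured_output: dict, expectations: dict) -> list[str]:
    """Return failure messages; an empty list means the eval passed."""
    failures = []
    for field, expected in expectations.items():
        actual = structured_output.get(field)
        if actual != expected:
            failures.append(f"{field}: expected {expected!r}, got {actual!r}")
    return failures

# A regression-style check: after a prompt change, the booking flow
# should still produce the same structured result.
result = run_eval(
    {"appointment_booked": True, "caller_intent": "book_demo"},
    {"appointment_booked": True, "caller_intent": "book_demo"},
)
print(result)  # []
```

Because the same schema drives both phases, a schema change in the INSTRUMENT phase immediately surfaces as eval failures in the TEST phase, which is the bridge the docs describe.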