
Commit 14c47d8

Authored by lucaspimentel, with Claude and Copilot as co-authors
[Azure Functions] Fix span parenting in ASP.NET Core integration (#7628)
## Summary of changes

Fixes incorrect span parenting in isolated Azure Functions that use the ASP.NET Core integration for HTTP triggers. Worker-process spans are now correctly parented to the ASP.NET Core request span instead of the root host span.

## Reason for change

When using isolated Azure Functions with ASP.NET Core integration and HTTP proxying enabled, spans created in the worker process were parented to the wrong span, producing disconnected or incorrectly structured traces. This made it difficult to understand the complete request flow and latency attribution.

**Current (incorrect) behavior:**

```
ROOT: azure_functions.invoke: GET /api/httptest [HOST]
├─ http.request: GET localhost:40521/api/HttpTest [HOST → WORKER]
└─ azure_functions.invoke: Http HttpTest [WORKER]  ❌ wrong parent
   └─ (worker child spans)
```

**Fixed (correct) behavior:**

```
ROOT: azure_functions.invoke: GET /api/httptest [HOST]
└─ http.request: GET localhost:40521/api/HttpTest [HOST → WORKER]
   └─ aspnet_core.request [WORKER]
      └─ azure_functions.invoke: Http HttpTest [WORKER]  ✅ correct parent
         └─ (worker child spans)
```

## Implementation details

The root cause is that `AsyncLocal` context does not flow correctly through Azure Functions middleware, leaving the worker's `azure_functions.invoke` span with no local parent. The instrumentation would then fall back to extracting trace context from gRPC message headers, which contained **stale context** (the host's root span context), resulting in incorrect parenting.

**The fix**: use `HttpContext.Items` as an explicit bridge to pass the scope between the ASP.NET Core and Azure Functions middleware layers (a sketch of this bridge follows the description below):

1. **Store the scope in `HttpContext.Items`** (`AspNetCoreHttpRequestHandler.cs:159-171`)
   - After creating the `aspnet_core.request` scope, store it in `HttpContext.Items[HttpContextActiveScopeKey]`
   - Only done when running in an Azure Functions isolated worker
2. **Skip stale gRPC header extraction** (`AzureFunctionsCommon.cs:243-259`)
   - Detect ASP.NET Core integration by checking for the `"HttpRequestContext"` key in `FunctionContext.Items`
   - Skip extracting trace context from gRPC message headers (which contain the stale host root span context)
   - Only extract headers in non-ASP.NET Core mode (timer triggers, non-proxying HTTP triggers)
3. **Retrieve the scope from `HttpContext.Items`** (`AzureFunctionsCommon.cs:287-367`)
   - When `tracer.InternalActiveScope` is null (`AsyncLocal` didn't flow), call `GetAspNetCoreScope()`
   - Get the `HttpContext` from `FunctionContext.Items["HttpRequestContext"]` (set by `FunctionsHttpProxyingMiddleware`)
   - Get the scope from `HttpContext.Items[HttpContextActiveScopeKey]`
   - Use the retrieved scope as the parent if found; otherwise fall back to the extracted context or create a root span
4. **Add an `Items` property to `IFunctionContext`** (`IFunctionContext.cs:21`)
   - Added `IDictionary<object, object?>? Items { get; }` for duck-typed access to `FunctionContext.Items`

This preserves existing behavior for non-proxying scenarios (timer triggers, non-ASP.NET Core HTTP triggers) while fixing the proxying case.

## Test coverage

Covered by existing tests in `AzureFunctionsTests.cs`. Test snapshots were updated to reflect the correct span counts and hierarchy.
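To make the `HttpContext.Items` bridge concrete, here is a minimal, self-contained sketch of the mechanism described above. It is illustrative only, not the tracer's actual code: `Scope`, `FakeHttpContext`, and the method shapes are simplified stand-ins, and only the two dictionary keys match the real implementation.

```csharp
using System;
using System.Collections.Generic;

internal sealed class Scope
{
    public string SpanName { get; }
    public Scope(string spanName) => SpanName = spanName;
}

internal sealed class FakeHttpContext
{
    public IDictionary<object, object?> Items { get; } = new Dictionary<object, object?>();
}

internal static class ScopeBridge
{
    private const string ActiveScopeKey = "__Datadog.AspNetCoreHttpRequestHandler.ActiveScope";
    private const string HttpRequestContextKey = "HttpRequestContext";

    // Step 1: after creating aspnet_core.request, the ASP.NET Core handler
    // stashes the scope where the Functions middleware can find it.
    public static void StoreAspNetCoreScope(FakeHttpContext httpContext, Scope scope)
        => httpContext.Items[ActiveScopeKey] = scope;

    // Steps 2-3: the Functions middleware detects ASP.NET Core integration via
    // the "HttpRequestContext" key (so it skips the stale gRPC headers) and
    // retrieves the scope when AsyncLocal did not flow.
    public static Scope? TryGetAspNetCoreScope(IDictionary<object, object?> functionContextItems)
    {
        if (functionContextItems.TryGetValue(HttpRequestContextKey, out var value)
            && value is FakeHttpContext httpContext
            && httpContext.Items.TryGetValue(ActiveScopeKey, out var stored))
        {
            return stored as Scope;
        }

        return null; // caller falls back to extracted context, or creates a root span
    }

    public static void Main()
    {
        var httpContext = new FakeHttpContext();
        StoreAspNetCoreScope(httpContext, new Scope("aspnet_core.request"));

        // FunctionsHttpProxyingMiddleware places the HttpContext into FunctionContext.Items.
        var functionContextItems = new Dictionary<object, object?> { [HttpRequestContextKey] = httpContext };

        var parent = TryGetAspNetCoreScope(functionContextItems);
        Console.WriteLine(parent?.SpanName); // "aspnet_core.request"
    }
}
```

The key point is that the dictionary travels with the request itself, so it survives the middleware boundaries where `AsyncLocal` does not.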
## Screenshots

### Before

<img width="3267" height="293" alt="image" src="https://github.com/user-attachments/assets/f0942ae5-1c27-4f7c-a1e7-d6b3559ba486" />

https://dd-dotnet.datadoghq.com/apm/trace/69129102000000009bd07f7872769a84

### After

<img width="3264" height="246" alt="image" src="https://github.com/user-attachments/assets/4074aa46-5577-44fc-b0da-bc0ed9b44c05" />

https://dd-dotnet.datadoghq.com/apm/trace/69150c5500000000cb655a7e34732dd6

More recent examples with these changes:

- https://dd-dotnet.datadoghq.com/apm/trace/697be6d40000000000710c25e093bbc4
- https://dd-dotnet.datadoghq.com/apm/trace/6994c3a200000000008c4ab320c9c4b7

## Other details

Fixes APMSVLS-58

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
1 parent (8840657) · commit 14c47d8

11 files changed: 962 additions & 447 deletions


tracer/src/Datadog.Trace/ClrProfiler/AutoInstrumentation/Azure/Functions/AzureFunctionsCommon.cs

Lines changed: 199 additions & 42 deletions
Large diffs are not rendered by default.

tracer/src/Datadog.Trace/ClrProfiler/AutoInstrumentation/Azure/Functions/Isolated/FunctionExecutionMiddlewareInvokeIntegration.cs

Lines changed: 7 additions & 0 deletions
```diff
@@ -45,6 +45,13 @@ internal static CallTargetState OnMethodBegin<TTarget, TFunctionContext>(TTarget
     [PreserveContext]
     internal static TReturn OnAsyncMethodEnd<TTarget, TReturn>(TTarget instance, TReturn returnValue, Exception exception, in CallTargetState state)
     {
+        // The worker's FunctionExecutionMiddleware catches this exception internally,
+        // so the aspnet_core.request span otherwise records status 200. Annotate it here.
+        if (exception is not null && state.State is Scope aspNetCoreScope)
+        {
+            AzureFunctionsCommon.SetExceptionOnAspNetCoreScope(aspNetCoreScope, exception, Tracer.Instance);
+        }
+
         state.Scope?.DisposeWithException(exception);
         return returnValue;
     }
```
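`SetExceptionOnAspNetCoreScope` itself is not shown in this commit's rendered diff. As a rough sketch, assuming the helper essentially records the exception on the stored `aspnet_core.request` span (the `MiniSpan`/`MiniScope` types below are illustrative stand-ins, not the tracer's real types):

```csharp
using System;

internal sealed class MiniSpan
{
    public bool Error { get; private set; }
    public string? ErrorMsg { get; private set; }
    public string? ErrorType { get; private set; }
    public string? ErrorStack { get; private set; }

    // Mirrors the usual "record an exception on a span" shape.
    public void SetException(Exception ex)
    {
        Error = true;
        ErrorMsg = ex.Message;
        ErrorType = ex.GetType().FullName;
        ErrorStack = ex.ToString();
    }
}

internal sealed class MiniScope
{
    public MiniSpan Span { get; } = new();
}

internal static class ExceptionAnnotationSketch
{
    // Without this, the worker middleware swallows the exception and the
    // aspnet_core.request span closes as if the request succeeded (status 200).
    public static void SetExceptionOnAspNetCoreScope(MiniScope scope, Exception exception)
        => scope.Span.SetException(exception);
}
```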

tracer/src/Datadog.Trace/ClrProfiler/AutoInstrumentation/Azure/Functions/Isolated/IFunctionContext.cs

Lines changed: 2 additions & 0 deletions
```diff
@@ -18,6 +18,8 @@ internal interface IFunctionContext
     FunctionDefinitionStruct FunctionDefinition { get; }

     IEnumerable<KeyValuePair<Type, object?>>? Features { get; }
+
+    IDictionary<object, object?>? Items { get; }
 }

 #endif
```
tracer/src/Datadog.Trace/ClrProfiler/AutoInstrumentation/Azure/Functions/IHttpContextItems.cs (new file)

Lines changed: 23 additions & 0 deletions

```csharp
// <copyright file="IHttpContextItems.cs" company="Datadog">
// Unless explicitly stated otherwise all files in this repository are licensed under the Apache 2 License.
// This product includes software developed at Datadog (https://www.datadoghq.com/). Copyright 2017 Datadog, Inc.
// </copyright>

#if !NETFRAMEWORK
#nullable enable

using System.Collections.Generic;

namespace Datadog.Trace.ClrProfiler.AutoInstrumentation.Azure.Functions;

/// <summary>
/// Duck type for Microsoft.AspNetCore.Http.HttpContext,
/// used to avoid a hard assembly reference to Microsoft.AspNetCore.Http.Abstractions
/// which may not be available in non-ASP.NET Core Azure Functions workers.
/// </summary>
internal interface IHttpContextItems
{
    IDictionary<object, object?> Items { get; }
}

#endif
```
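For context, the tracer's duck-typing layer generates proxies so that an interface like `IHttpContextItems` can be satisfied by the runtime `HttpContext` object without a compile-time assembly reference. A simplified, reflection-based stand-in of that idea (the real implementation emits IL proxies, but the effect is the same):

```csharp
using System.Collections.Generic;
using System.Reflection;

internal interface IHttpContextItemsLike
{
    IDictionary<object, object?> Items { get; }
}

// Stand-in for a duck-typing proxy: forwards the structurally matching Items
// property of a runtime HttpContext instance without referencing
// Microsoft.AspNetCore.Http.Abstractions at compile time.
internal sealed class ReflectionItemsProxy : IHttpContextItemsLike
{
    private readonly object _httpContext;
    private readonly PropertyInfo _items;

    public ReflectionItemsProxy(object httpContext)
    {
        _httpContext = httpContext;
        _items = httpContext.GetType().GetProperty("Items")!;
    }

    public IDictionary<object, object?> Items
        => (IDictionary<object, object?>)_items.GetValue(_httpContext)!;
}
```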

tracer/src/Datadog.Trace/ClrProfiler/Instrumentation.cs

Lines changed: 47 additions & 4 deletions
```diff
@@ -85,7 +85,7 @@ private static void PropagateStableConfiguration()
     var tracerSettings = tracer.Settings;
     var mutableSettings = tracerSettings.Manager.InitialMutableSettings;

-    NativeInterop.SharedConfig config = new NativeInterop.SharedConfig
+    var config = new NativeInterop.SharedConfig
     {
         ProfilingEnabled = profilerSettings.ProfilerState switch
         {
@@ -525,10 +525,53 @@ private static AspNetCoreDiagnosticObserver GetAspNetCoreDiagnosticObserver()
     [Pure]
     private static bool SkipAspNetCoreDiagnosticObserver()
     {
-        // this is extremely simple now, but will get more complex soon...
-        return AzureInfo.Instance.IsAzureFunction;
+        // Enable AspNetCoreDiagnosticObserver in:
+        // - outside Azure Functions
+        // - Isolated functions worker processes with extension v4
+        //   (to create aspnet_core.request spans that azure_functions.invoke can parent to)
+
+        // Skip AspNetCoreDiagnosticObserver in Azure Functions:
+        // - In-process functions (due to AssemblyLoadContext issues)
+        // - Isolated functions host process (to avoid duplicate spans)
+        // - Isolated functions worker process with extension v1 (FUNCTIONS_EXTENSION_VERSION="~1")
+
+        if (!AzureInfo.Instance.IsAzureFunction)
+        {
+            // We only skip AspNetCoreDiagnosticObserver in Azure Functions.
+            // Don't skip it outside Azure Functions.
+            return false;
+        }
+
+        // FUNCTIONS_WORKER_RUNTIME == "dotnet-isolated"
+        if (!AzureInfo.Instance.IsIsolatedFunction)
+        {
+            // Skip AspNetCoreDiagnosticObserver in in-process Azure Functions
+            Log.Debug("Skipping AspNetCoreDiagnosticObserver: running in an in-process Azure Function.");
+            return true;
+        }
+
+        if (AzureInfo.Instance.IsIsolatedFunctionHostProcess)
+        {
+            // Skip AspNetCoreDiagnosticObserver in Azure Functions _host_ processes
+            Log.Debug("Skipping AspNetCoreDiagnosticObserver: running in an isolated Azure Function host process.");
+            return true;
+        }
+
+        // FUNCTIONS_EXTENSION_VERSION
+        var azureFunctionsExtensionVersion = AzureInfo.Instance.AzureFunctionsExtensionVersion;
+
+        if (azureFunctionsExtensionVersion != "~4")
+        {
+            // Skip AspNetCoreDiagnosticObserver in v1 isolated functions (v2 and v3 are not supported at all)
+            // to keep the previous behavior
+            Log.Debug("Skipping AspNetCoreDiagnosticObserver: running in Azure Function with extension version {AzureFunctionsExtensionVersion}.", azureFunctionsExtensionVersion);
+            return true;
+        }
+
+        // do not skip when running in an isolated Azure Functions worker process with extension v4
+        return false;
     }
-#endif
+#endif // #if !NETFRAMEWORK

     private static void InitializeDebugger(TracerSettings tracerSettings)
     {
```
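Read together, the branches above reduce to a single predicate. A condensed restatement for reference (a sketch for clarity, not the shipped code):

```csharp
internal static class SkipLogicSketch
{
    // Skip the observer inside Azure Functions unless this is an isolated
    // *worker* process running extension v4 - the one case that needs
    // aspnet_core.request spans for azure_functions.invoke to parent to.
    public static bool SkipAspNetCoreDiagnosticObserver(
        bool isAzureFunction,
        bool isIsolatedFunction,
        bool isHostProcess,
        string? extensionVersion)
        => isAzureFunction
           && (!isIsolatedFunction || isHostProcess || extensionVersion != "~4");
}
```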

tracer/src/Datadog.Trace/PlatformHelpers/AspNetCoreHttpRequestHandler.cs

Lines changed: 18 additions & 4 deletions
```diff
@@ -5,10 +5,8 @@

 #if !NETFRAMEWORK
 using System;
-using System.Collections.Generic;
 using System.Collections.ObjectModel;
 using System.Diagnostics.CodeAnalysis;
-using System.Linq;
 using Datadog.Trace.Activity;
 using Datadog.Trace.Activity.DuckTypes;
 using Datadog.Trace.Activity.Helpers;
@@ -20,21 +18,22 @@
 using Datadog.Trace.DataStreamsMonitoring.TransactionTracking;
 using Datadog.Trace.DiagnosticListeners;
 using Datadog.Trace.DuckTyping;
-using Datadog.Trace.ExtensionMethods;
 using Datadog.Trace.Headers;
-using Datadog.Trace.Iast;
 using Datadog.Trace.Logging;
 using Datadog.Trace.Propagators;
+using Datadog.Trace.Serverless;
 using Datadog.Trace.Tagging;
 using Datadog.Trace.Util;
 using Datadog.Trace.Util.Http;
+using Datadog.Trace.Vendors.Serilog.Events;
 using Microsoft.AspNetCore.Http;

 namespace Datadog.Trace.PlatformHelpers
 {
     internal sealed class AspNetCoreHttpRequestHandler
     {
         internal const string HttpContextTrackingKey = "__Datadog.AspNetCoreHttpRequestHandler.Tracking";
+        internal const string HttpContextActiveScopeKey = "__Datadog.AspNetCoreHttpRequestHandler.ActiveScope";

         private readonly IDatadogLogger _log;
         private readonly IntegrationId _integrationId;
@@ -174,6 +173,21 @@ private Scope StartAspNetCorePipelineScope(Tracer tracer, Security security, Ias
             httpContext.Items[HttpContextTrackingKey] = new RequestTrackingFeature(originalPath, scope, proxyContext?.Scope);
 #endif

+            if (AzureInfo.Instance.IsAzureFunction)
+            {
+                // Store scope in HttpContext.Items for Azure Functions middleware to retrieve
+                httpContext.Items[HttpContextActiveScopeKey] = scope;
+
+                if (_log.IsEnabled(LogEventLevel.Debug) && scope.Span.Context is { } spanContext)
+                {
+                    _log.Debug(
+                        "AspNetCore: Stored scope in HttpContext.Items, {TraceId}-{SpanId}, path: {Path}",
+                        spanContext.RawTraceId,
+                        spanContext.RawSpanId,
+                        request.Path);
+                }
+            }
+
             if (tracer.Settings.IpHeaderEnabled || security.AppsecEnabled)
             {
                 var peerIp = new Headers.Ip.IpInfo(httpContext.Connection.RemoteIpAddress?.ToString(), httpContext.Connection.RemotePort);
```

tracer/src/Datadog.Trace/Tagging/AzureFunctionsTags.cs

Lines changed: 16 additions & 7 deletions
```diff
@@ -34,14 +34,13 @@ internal sealed partial class AzureFunctionsTags : InstrumentationTags
         public string TriggerType { get; set; } = "Unknown";

         internal static void SetRootSpanTags(
-            Span span,
+            ITags tags,
             string shortName,
             string fullName,
             string bindingSource,
             string triggerType)
         {
-            var tags = span.Tags;
-            if (span.Tags is AspNetCoreTags aspNetTags)
+            if (tags is AspNetCoreTags aspNetTags)
             {
                 aspNetTags.InstrumentationName = ComponentName;
             }
@@ -51,10 +50,20 @@ internal static void SetRootSpanTags(
                 tags.SetTag(Tags.InstrumentationName, ComponentName);
             }

-            tags.SetTag(ShortNameTagName, shortName);
-            tags.SetTag(FullNameTagName, fullName);
-            tags.SetTag(BindingSourceTagName, bindingSource);
-            tags.SetTag(TriggerTypeTagName, triggerType);
+            if (tags is AzureFunctionsTags azureFunctionsTags)
+            {
+                azureFunctionsTags.ShortName = shortName;
+                azureFunctionsTags.FullName = fullName;
+                azureFunctionsTags.BindingSource = bindingSource;
+                azureFunctionsTags.TriggerType = triggerType;
+            }
+            else
+            {
+                tags.SetTag(ShortNameTagName, shortName);
+                tags.SetTag(FullNameTagName, fullName);
+                tags.SetTag(BindingSourceTagName, bindingSource);
+                tags.SetTag(TriggerTypeTagName, triggerType);
+            }
         }
     }
 }
```
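The reshaped method prefers the strongly-typed properties when the concrete `AzureFunctionsTags` type is in hand and keeps the string-keyed `SetTag` path as a fallback. A rough illustration of the pattern, with stand-in types and placeholder tag keys (not the tracer's tagging internals):

```csharp
using System.Collections.Generic;

internal class TagsBase
{
    private readonly Dictionary<string, string?> _extra = new();

    // String-keyed path: a dictionary lookup/insert on every call.
    public virtual void SetTag(string key, string? value) => _extra[key] = value;
}

internal sealed class TypedAzureFunctionsTags : TagsBase
{
    // Typed path: plain property writes, no per-call string keys.
    public string? ShortName { get; set; }
    public string? TriggerType { get; set; }
}

internal static class TagSetter
{
    public static void SetRootSpanTags(TagsBase tags, string shortName, string triggerType)
    {
        if (tags is TypedAzureFunctionsTags typed)
        {
            typed.ShortName = shortName;
            typed.TriggerType = triggerType;
        }
        else
        {
            // Placeholder tag keys for illustration only.
            tags.SetTag("example.function.short_name", shortName);
            tags.SetTag("example.function.trigger_type", triggerType);
        }
    }
}
```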

tracer/test/Datadog.Trace.ClrProfiler.IntegrationTests/AzureFunctionsTests.cs

Lines changed: 28 additions & 26 deletions
```diff
@@ -145,15 +145,15 @@ public InProcessRuntimeV3(ITestOutputHelper output)
         public async Task SubmitsTraces()
         {
             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true);
+
             using (await RunAzureFunctionAndWaitForExit(agent))
             {
                 const int expectedSpanCount = 21;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);
                 var filteredSpans = spans.Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase)).ToImmutableList();

                 using var s = new AssertionScope();
-                filteredSpans.Count.Should().Be(expectedSpanCount);
-
+                filteredSpans.Should().HaveCount(expectedSpanCount);
                 await AssertInProcessSpans(filteredSpans);
             }
         }
@@ -179,14 +179,15 @@ public InProcessRuntimeV4(ITestOutputHelper output)
         public async Task SubmitsTraces()
         {
             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true, useStatsD: true);
+
             using (await RunAzureFunctionAndWaitForExit(agent, framework: "net6.0"))
             {
                 const int expectedSpanCount = 21;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);
                 var filteredSpans = spans.Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase)).ToImmutableList();

                 using var s = new AssertionScope();
-
+                filteredSpans.Should().HaveCount(expectedSpanCount);
                 await AssertInProcessSpans(filteredSpans);
             }
         }
@@ -213,13 +214,15 @@ public IsolatedRuntimeV4SdkV1(ITestOutputHelper output)
         public async Task SubmitsTraces()
         {
             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true);
+
             using (await RunAzureFunctionAndWaitForExit(agent, expectedExitCode: -1))
             {
                 const int expectedSpanCount = 21;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);
                 var filteredSpans = spans.Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase)).ToImmutableList();
-                using var s = new AssertionScope();

+                using var s = new AssertionScope();
+                filteredSpans.Should().HaveCount(expectedSpanCount);
                 await AssertIsolatedSpans(filteredSpans, $"{nameof(AzureFunctionsTests)}.Isolated.V4.Sdk1");
             }
         }
@@ -245,16 +248,14 @@ public async Task SubmitsTraces()
             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true);
             using (await RunAzureFunctionAndWaitForExit(agent, expectedExitCode: -1))
             {
-                const int expectedSpanCount = 26;
+                const int expectedSpanCount = 31;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);

                 var filteredSpans = FilterOutSocketsHttpHandler(spans);

                 using var s = new AssertionScope();
-
+                spans.Should().HaveCount(expectedSpanCount);
                 await AssertIsolatedSpans(filteredSpans.ToImmutableList(), $"{nameof(AzureFunctionsTests)}.Isolated.V4.AspNetCore1");
-
-                spans.Count.Should().Be(expectedSpanCount);
             }
         }
@@ -283,27 +284,27 @@ public async Task SubmitsTraces()
             // so we will enable them with a lot of logging
             SetEnvironmentVariable("DD_LOGS_DIRECT_SUBMISSION_AZURE_FUNCTIONS_HOST_ENABLED", "true");
             SetEnvironmentVariable("DD_LOGS_DIRECT_SUBMISSION_MINIMUM_LEVEL", "VERBOSE");
-            var hostName = "integration_ilogger_az_tests";
+            const string hostName = "integration_ilogger_az_tests";
+
             using var logsIntake = new MockLogsIntake();
             EnableDirectLogSubmission(logsIntake.Port, nameof(IntegrationId.ILogger), hostName);
+
             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true);
+
             using (await RunAzureFunctionAndWaitForExit(agent, expectedExitCode: -1))
             {
                 const int expectedSpanCount = 21;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);
                 var filteredSpans = spans.Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase)).ToImmutableList();

                 using var s = new AssertionScope();
+                filteredSpans.Should().HaveCount(expectedSpanCount);
                 await AssertIsolatedSpans(filteredSpans);

-                filteredSpans.Count.Should().Be(expectedSpanCount);
-
-                var logs = logsIntake.Logs;
-
                 // ~327 (ish) logs but we kill func.exe so some logs are lost
                 // and since sometimes the batch of logs can be 100+ it can be a LOT of logs that we lose
                 // so just check that we have much more than when we have host logs disabled
-                logs.Should().HaveCountGreaterThanOrEqualTo(200);
+                logsIntake.Logs.Should().HaveCountGreaterThanOrEqualTo(200);
             }
         }
@@ -331,25 +332,27 @@ public async Task SubmitsTraces()
         {
             SetEnvironmentVariable("DD_LOGS_DIRECT_SUBMISSION_AZURE_FUNCTIONS_HOST_ENABLED", "false");
             SetEnvironmentVariable("DD_LOGS_DIRECT_SUBMISSION_MINIMUM_LEVEL", "VERBOSE");
-            var hostName = "integration_ilogger_az_tests";
+            const string hostName = "integration_ilogger_az_tests";
+
             using var logsIntake = new MockLogsIntake();
             EnableDirectLogSubmission(logsIntake.Port, nameof(IntegrationId.ILogger), hostName);

             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true);
+
             using (await RunAzureFunctionAndWaitForExit(agent, expectedExitCode: -1))
             {
                 const int expectedSpanCount = 21;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);
-
                 var filteredSpans = spans.Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase)).ToImmutableList();

+                using var s = new AssertionScope();
+                filteredSpans.Should().HaveCount(expectedSpanCount);
                 await AssertIsolatedSpans(filteredSpans, filename: $"{nameof(AzureFunctionsTests)}.Isolated.V4.HostLogsDisabled");
-                filteredSpans.Count.Should().Be(expectedSpanCount);

-                var logs = logsIntake.Logs;
                 // we expect some logs still from the worker process
                 // this just seems flaky I THINK because of killing the func.exe process (even though we aren't using the host logs)
                 // commonly see 13, 14, 15, 16 logs, but IF we were logging the host logs we'd see 300+
+                var logs = logsIntake.Logs;
                 logs.Should().HaveCountGreaterThan(10);
                 logs.Should().HaveCountLessThanOrEqualTo(20);
             }
@@ -374,9 +377,10 @@ public IsolatedRuntimeV4AspNetCore(ITestOutputHelper output)
         public async Task SubmitsTraces()
         {
             using var agent = EnvironmentHelper.GetMockAgent(useTelemetry: true);
+
             using (await RunAzureFunctionAndWaitForExit(agent, expectedExitCode: -1))
             {
-                const int expectedSpanCount = 26;
+                const int expectedSpanCount = 31;
                 var spans = await agent.WaitForSpansAsync(expectedSpanCount);

                 // There are _additional_ spans created for these compared to the non-AspNetCore version
@@ -385,15 +389,13 @@ public async Task SubmitsTraces()
                 // because of this they cause a lot of flake in the snapshots where they shift places
                 // opting to just scrub them from the snapshots - we also don't think that the spans provide much
                 // value so they may be removed from being traced.
-                var filteredSpans = FilterOutSocketsHttpHandler(spans);
-
-                filteredSpans = filteredSpans.Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase)).ToImmutableList();
+                var filteredSpans = FilterOutSocketsHttpHandler(spans)
+                    .Where(s => !s.Resource.Equals("Timer ExitApp", StringComparison.OrdinalIgnoreCase))
+                    .ToImmutableList();

                 using var s = new AssertionScope();
-
-                await AssertIsolatedSpans(filteredSpans.ToImmutableList(), $"{nameof(AzureFunctionsTests)}.Isolated.V4.AspNetCore");
-
-                spans.Count.Should().Be(expectedSpanCount);
+                spans.Should().HaveCount(expectedSpanCount);
+                await AssertIsolatedSpans(filteredSpans, $"{nameof(AzureFunctionsTests)}.Isolated.V4.AspNetCore");
             }
         }
```
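Most of these test edits swap `collection.Count.Should().Be(n)` for `collection.Should().HaveCount(n)` and move the assertion inside the `AssertionScope`. The check is equivalent; the failure output is richer, as this small standalone FluentAssertions example (with a hypothetical `spans` list) shows:

```csharp
using System.Collections.Generic;
using FluentAssertions;

var spans = new List<string> { "azure_functions.invoke", "http.request" };

// Passes here; on failure it reports the expected count, the actual count,
// AND the collection's items, e.g.:
//   "Expected spans to contain 3 item(s), but found 2: {...}"
spans.Should().HaveCount(2);

// By contrast, the old style only reports the integers on failure:
//   "Expected spans.Count to be 3, but found 2."
spans.Count.Should().Be(2);
```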

tracer/test/Datadog.Trace.Tests/ClrProfiler/AutoInstrumentation/Azure/Functions/AzureFunctionsCommonTests.cs

Lines changed: 2 additions & 0 deletions
```diff
@@ -122,6 +122,8 @@ private class MockFunctionContext : IFunctionContext
         public FunctionDefinitionStruct FunctionDefinition { get; set; }

         public IEnumerable<KeyValuePair<Type, object?>>? Features { get; set; }
+
+        public IDictionary<object, object?>? Items { get; }
     }

     // This duck types with tracer/src/Datadog.Trace/ClrProfiler/AutoInstrumentation/Azure/Functions/Isolated/GrpcBindingsFeatureStruct.cs
```
