utf-8 conversion issue with use_otlp_decoding: true #24984

@stepbeta

Description

A note for the community

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Problem

When configuring a source with the new use_otlp_decoding: true parameter and then sending the incoming logs to an OTLP sink, something seems to get wrongly converted between the source and the sink.

Setup

The way I tried to reproduce the issue is the following:

  • Logs generator: a small Go program that just generates logs/traces using the go.opentelemetry.io packages
  • Client: otel collector that receives the logs/traces generated and sends them to Vector
  • Server: a Vector instance that receives and enriches the logs, then sends them to a Loki instance

I run the generator and the client locally with docker-compose. The server runs in a Kubernetes cluster.

Data

When I run the above setup I can see this in the docker-compose logs:

otel-collector  | 2026-03-23T12:27:03.502Z      info    Logs    {"resource": {"service.instance.id": "4eb2c0c3-4630-431e-8729-016272c66123", "service.name": "otelcol-contrib", "service.version": "0.148.0"}, "otelcol.component.id": "debug", "otelcol.component.kind": "exporter", "otelcol.signal": "logs", "resource logs": 1, "log records": 1}
otel-collector  | 2026-03-23T12:27:03.502Z      info    ResourceLog #0
otel-collector  | Resource SchemaURL: 
otel-collector  | Resource attributes:
otel-collector  |      -> environment: Str(prod)
otel-collector  |      -> service.name: Str(myservice-c)
otel-collector  | ScopeLogs #0
otel-collector  | ScopeLogs SchemaURL: 
otel-collector  | InstrumentationScope otlptest 
otel-collector  | LogRecord #0
otel-collector  | ObservedTimestamp: 2026-03-23 12:27:02.630813412 +0000 UTC
otel-collector  | Timestamp: 2026-03-23 12:27:02.630805424 +0000 UTC
otel-collector  | SeverityText: INFO
otel-collector  | SeverityNumber: Info(9)
otel-collector  | Body: Str(Log Event: lgHy2ivoVnAv)
otel-collector  | Attributes:
otel-collector  |      -> log.source: Str(otlptest-app)
otel-collector  | Trace ID: c8a406fc7e45bf889c742d74cf325f74
otel-collector  | Span ID: 50c796eee611c0b7
otel-collector  | Flags: 1
otel-collector  |       {"resource": {"service.instance.id": "4eb2c0c3-4630-431e-8729-016272c66123", "service.name": "otelcol-contrib", "service.version": "0.148.0"}, "otelcol.component.id": "debug", "otelcol.component.kind": "exporter", "otelcol.signal": "logs"}
otlptest-app    | 2026/03/23 12:27:03 Emitted log: Log Event: n05w91Hg9vb4 (trace=bcc56e153a645f7dc475030f52db1ea5, span=90cd88b0657304bd)

But then, Vector logs this error:

vector 2026-03-23T12:27:00.796565Z ERROR sink{component_kind="sink" component_id=loki_opentelemetry component_type=opentelemetry}: vector::sinks::util::retries: Not retriable; dropping the request. reason="Http status: 400 Bad Request"

When I check the loki-write instance that receives the logs from Vector, I see this error instead:

level=error ts=2026-03-23T12:26:51.777908375Z caller=manager.go:49 component=distributor path=write msg="write operation failed" details="couldn't parse push request: proto: illegal wireType 7" org_id=fake

Note: there are several of these, so apologies if the timestamps don't match perfectly, but the messages are always the same.
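For what it's worth, protobuf wire types only range from 0 to 5, so "illegal wireType 7" usually means the decoder is reading garbage where a tag byte should be. One plausible cause (an assumption on my part, not something I've verified in Vector's code) is a lossy UTF-8 re-encoding of the protobuf payload: every invalid byte becomes the replacement character U+FFFD, whose first UTF-8 byte is 0xEF, and 0xEF read as a tag byte yields wire type 7. A minimal Go sketch:

```go
package main

import "fmt"

// wireType extracts the protobuf wire type from a tag byte: the low 3 bits.
func wireType(tag byte) byte { return tag & 0x07 }

func main() {
	// Valid protobuf wire types are 0-5; 6 and 7 are illegal.
	// If raw protobuf bytes are lossily re-encoded as UTF-8, each invalid
	// byte becomes U+FFFD (encoded as 0xEF 0xBF 0xBD). A decoder that then
	// reads 0xEF as a tag byte sees wire type 0xEF & 0x07 = 7.
	fmt.Println(wireType(0xEF)) // 7
}
```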

Troubleshooting

I tried tapping the logs at the source and these are the results:

// vector tap -d 3000 --outputs-of open_telemetry_raw.logs
{"resourceLogs":[{"resource":{"attributes":[{"key":"environment","value":{"stringValue":"staging"}},{"key":"service.name","value":{"stringValue":"myservice-c"}}]},"scopeLogs":[{"logRecords":[{"attributes":[{"key":"log.source","value":{"stringValue":"otlptest-app"}},{"key":"trace_id","value":{"stringValue":"57cf79e7bf8c68ca1c6861237b4b01a8"}},{"key":"span_id","value":{"stringValue":"60e36398f172f4c7"}}],"body":{"stringValue":"Log Event: JvywOLPNUb8m"},"flags":1,"observedTimeUnixNano":1774269925426890371,"severityNumber":"SEVERITY_NUMBER_INFO","severityText":"INFO","spanId":"`�c��r��","timeUnixNano":1774269925426887030,"traceId":"W�y翌h�\u001cha#{K\u0001"}],"scope":{"name":"otlptest"}}]}],"timestamp":"2026-03-23T12:45:26.710771761Z"}
{"resourceLogs":[{"resource":{"attributes":[{"key":"environment","value":{"stringValue":"staging"}},{"key":"service.name","value":{"stringValue":"myservice-c"}}]},"scopeLogs":[{"logRecords":[{"attributes":[{"key":"log.source","value":{"stringValue":"otlptest-app"}},{"key":"trace_id","value":{"stringValue":"b9dd26a54863f75578d83132ec27e97b"}},{"key":"span_id","value":{"stringValue":"3d7723e23bf0fd56"}}],"body":{"stringValue":"Log Event: 6Jsw6hmRs3qB"},"flags":1,"observedTimeUnixNano":1774269926427036740,"severityNumber":"SEVERITY_NUMBER_INFO","severityText":"INFO","spanId":"=w#�;��V","timeUnixNano":1774269926427032519,"traceId":"��&�Hc�Ux�12�'�{"}],"scope":{"name":"otlptest"}}]}],"timestamp":"2026-03-23T12:45:27.714935300Z"}

As you can see, the top-level spanId/traceId fields here are not hex-encoded strings but look like raw byte arrays.
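For context, the OTLP/JSON mapping renders traceId/spanId as hex (base16) strings, while the underlying protobuf fields hold the raw 16/8 bytes. The tap output looks like the raw bytes are being passed through as if they were a UTF-8 string. A small Go sketch of the difference, using the trace ID bytes from the first record above:

```go
package main

import (
	"encoding/hex"
	"fmt"
)

// hexTraceID renders a raw trace ID the way OTLP/JSON expects: base16.
func hexTraceID(id []byte) string { return hex.EncodeToString(id) }

func main() {
	// The 16 raw bytes of the trace ID from the first tapped record.
	traceID := []byte{
		0x57, 0xcf, 0x79, 0xe7, 0xbf, 0x8c, 0x68, 0xca,
		0x1c, 0x68, 0x61, 0x23, 0x7b, 0x4b, 0x01, 0xa8,
	}

	// Hex encoding, matching the trace_id attribute added by the collector:
	fmt.Println(hexTraceID(traceID)) // 57cf79e7bf8c68ca1c6861237b4b01a8

	// Raw bytes forced through a string conversion, mangling every byte
	// that is not valid UTF-8 -- matching the garbled top-level traceId
	// ("W�y翌h…") in the tap output:
	fmt.Printf("%q\n", string(traceID))
}
```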

Configuration

api:
  address: 0.0.0.0:8686
  enabled: true
  playground: false
data_dir: /vector-data-dir
sinks:
  loki_opentelemetry:
    buffer:
      max_size: 3221225472
      type: disk
      when_full: drop_newest
    inputs:
    - enriched_logs_raw
    protocol:
      acknowledgements:
        enabled: false
      encoding:
        codec: otlp
      framing:
        method: varint_length_delimited
      method: post
      request:
        headers:
          content-type: application/x-protobuf
        retry_attempts: 20
      type: http
      uri: http://loki-gateway:80/otlp/v1/logs
    type: opentelemetry
sources:
  open_telemetry_raw:
    grpc:
      address: 0.0.0.0:4320
    http:
      address: 0.0.0.0:4319
    type: opentelemetry
    use_otlp_decoding: true
transforms:
  enriched_logs_raw:
    file: /vector-data-dir/vrl-scripts/enrichment-otlp/enrichment-otlp.vrl
    inputs:
    - open_telemetry_raw.logs
    type: remap

Version

  • Docker image: 0.54.0-alpine
  • Helm version: 0.51.0

Additional Context

I'm adding the generator/client code in case it's needed to reproduce:

Generator/client code

Generator

// file: main.go
package main

import (
	"context"
	"log"
	"math/rand"
	"os"
	"os/signal"
	"time"

	"go.opentelemetry.io/otel/attribute"
	"go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploggrpc"
	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc"
	otellog "go.opentelemetry.io/otel/log"
	sdklog "go.opentelemetry.io/otel/sdk/log"
	"go.opentelemetry.io/otel/sdk/resource"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

var (
	allowedEnvironments = []string{"dev", "staging", "prod"}
	allowedServices     = []string{"myservice-a", "myservice-b", "myservice-c"}
)

func randomString(list []string) string {
	return list[rand.Intn(len(list))]
}

func main() {
	ctx, cancel := signal.NotifyContext(context.Background(), os.Interrupt)
	defer cancel()

	rnd := rand.New(rand.NewSource(time.Now().UnixNano()))

	// Pick random but consistent resource values for this instance
	env := randomString(allowedEnvironments)
	service := randomString(allowedServices)

	// Create resource with service info (this populates the OTLP resource section)
	res, err := resource.New(ctx,
		resource.WithAttributes(
			attribute.String("service.name", service),
			attribute.String("environment", env),
		),
	)
	if err != nil {
		log.Fatalf("Failed to create resource: %v", err)
	}

	// OTLP gRPC endpoint
	endpoint := os.Getenv("OTELCOL_GRPC_ENDPOINT")
	if endpoint == "" {
		endpoint = "otelcol:4317"
	}

	// --- Trace Exporter Setup ---
	traceExp, err := otlptracegrpc.New(ctx,
		otlptracegrpc.WithEndpoint(endpoint),
		otlptracegrpc.WithInsecure(),
	)
	if err != nil {
		log.Fatalf("failed to create trace exporter: %v", err)
	}
	tp := sdktrace.NewTracerProvider(
		sdktrace.WithBatcher(traceExp),
		sdktrace.WithResource(res),
	)
	defer func() { _ = tp.Shutdown(ctx) }()

	// --- Log Exporter Setup (OTLP native logs) ---
	logExp, err := otlploggrpc.New(ctx,
		otlploggrpc.WithEndpoint(endpoint),
		otlploggrpc.WithInsecure(),
	)
	if err != nil {
		log.Fatalf("failed to create log exporter: %v", err)
	}
	lp := sdklog.NewLoggerProvider(
		sdklog.WithProcessor(sdklog.NewBatchProcessor(logExp)),
		sdklog.WithResource(res),
	)
	defer func() { _ = lp.Shutdown(ctx) }()

	logger := lp.Logger("otlptest")
	tracer := tp.Tracer("otlptest")

	for {
		select {
		case <-ctx.Done():
			return
		default:
			// Start a new span
			spanCtx, span := tracer.Start(ctx, "otlp-log-event")
			traceID := span.SpanContext().TraceID()
			spanID := span.SpanContext().SpanID()

			// Generate log message
			message := "Log Event: " + randString(12, rnd)

			// Build a proper OTLP log record
			var record otellog.Record
			record.SetTimestamp(time.Now())
			record.SetBody(otellog.StringValue(message))
			record.SetSeverity(otellog.SeverityInfo)
			record.SetSeverityText("INFO")

			// Add any custom attributes (not duplicating resource or trace context)
			record.AddAttributes(
				otellog.String("log.source", "otlptest-app"),
			)

			// Emit with spanCtx so the SDK automatically populates traceId/spanId
			// from the active span — no manual duplication needed
			logger.Emit(spanCtx, record)

			// Also add as span event for trace correlation
			span.AddEvent("log") // Only add the message as an attribute, trace/span context is automatic

			span.End()

			log.Printf("Emitted log: %s (trace=%s, span=%s)", message, traceID, spanID)
			time.Sleep(1000 * time.Millisecond)
		}
	}
}

func randString(n int, rnd *rand.Rand) string {
	const letters = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
	b := make([]byte, n)
	for i := range b {
		b[i] = letters[rnd.Intn(len(letters))]
	}
	return string(b)
}
// file: go.mod
module otlptest

go 1.25.1

require (
	go.opentelemetry.io/otel v1.42.0
	go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploggrpc v0.18.0
	go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.42.0
	go.opentelemetry.io/otel/log v0.18.0
	go.opentelemetry.io/otel/sdk v1.42.0
	go.opentelemetry.io/otel/sdk/log v0.18.0
)

require (
	github.com/cenkalti/backoff/v5 v5.0.3 // indirect
	github.com/cespare/xxhash/v2 v2.3.0 // indirect
	github.com/go-logr/logr v1.4.3 // indirect
	github.com/go-logr/stdr v1.2.2 // indirect
	github.com/google/uuid v1.6.0 // indirect
	github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0 // indirect
	go.opentelemetry.io/auto/sdk v1.2.1 // indirect
	go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.42.0 // indirect
	go.opentelemetry.io/otel/metric v1.42.0 // indirect
	go.opentelemetry.io/otel/trace v1.42.0 // indirect
	go.opentelemetry.io/proto/otlp v1.9.0 // indirect
	golang.org/x/net v0.51.0 // indirect
	golang.org/x/sys v0.41.0 // indirect
	golang.org/x/text v0.34.0 // indirect
	google.golang.org/genproto/googleapis/api v0.0.0-20260209200024-4cfbd4190f57 // indirect
	google.golang.org/genproto/googleapis/rpc v0.0.0-20260209200024-4cfbd4190f57 // indirect
	google.golang.org/grpc v1.79.2 // indirect
	google.golang.org/protobuf v1.36.11 // indirect
)
// file: go.sum
github.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=
github.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F97BxZthm/crw=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=
github.com/go-logr/logr v1.4.3 h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=
github.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=
github.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6rSs7xps=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0 h1:HWRh5R2+9EifMyIHV7ZV+MIZqgz+PMpZ14Jynv3O2Zs=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0/go.mod h1:JfhWUomR1baixubs02l85lZYYOm7LV6om4ceouMv45c=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
go.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=
go.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=
go.opentelemetry.io/otel v1.42.0 h1:lSQGzTgVR3+sgJDAU/7/ZMjN9Z+vUip7leaqBKy4sho=
go.opentelemetry.io/otel v1.42.0/go.mod h1:lJNsdRMxCUIWuMlVJWzecSMuNjE7dOYyWlqOXWkdqCc=
go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploggrpc v0.18.0 h1:deI9UQMoGFgrg5iLPgzueqFPHevDl+28YKfSpPTI6rY=
go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploggrpc v0.18.0/go.mod h1:PFx9NgpNUKXdf7J4Q3agRxMs3Y07QhTCVipKmLsMKnU=
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.42.0 h1:THuZiwpQZuHPul65w4WcwEnkX2QIuMT+UFoOrygtoJw=
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.42.0/go.mod h1:J2pvYM5NGHofZ2/Ru6zw/TNWnEQp5crgyDeSrYpXkAw=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.42.0 h1:zWWrB1U6nqhS/k6zYB74CjRpuiitRtLLi68VcgmOEto=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.42.0/go.mod h1:2qXPNBX1OVRC0IwOnfo1ljoid+RD0QK3443EaqVlsOU=
go.opentelemetry.io/otel/log v0.18.0 h1:XgeQIIBjZZrliksMEbcwMZefoOSMI1hdjiLEiiB0bAg=
go.opentelemetry.io/otel/log v0.18.0/go.mod h1:KEV1kad0NofR3ycsiDH4Yjcoj0+8206I6Ox2QYFSNgI=
go.opentelemetry.io/otel/metric v1.42.0 h1:2jXG+3oZLNXEPfNmnpxKDeZsFI5o4J+nz6xUlaFdF/4=
go.opentelemetry.io/otel/metric v1.42.0/go.mod h1:RlUN/7vTU7Ao/diDkEpQpnz3/92J9ko05BIwxYa2SSI=
go.opentelemetry.io/otel/sdk v1.42.0 h1:LyC8+jqk6UJwdrI/8VydAq/hvkFKNHZVIWuslJXYsDo=
go.opentelemetry.io/otel/sdk v1.42.0/go.mod h1:rGHCAxd9DAph0joO4W6OPwxjNTYWghRWmkHuGbayMts=
go.opentelemetry.io/otel/sdk/log v0.18.0 h1:n8OyZr7t7otkeTnPTbDNom6rW16TBYGtvyy2Gk6buQw=
go.opentelemetry.io/otel/sdk/log v0.18.0/go.mod h1:C0+wxkTwKpOCZLrlJ3pewPiiQwpzycPI/u6W0Z9fuYk=
go.opentelemetry.io/otel/sdk/log/logtest v0.18.0 h1:l3mYuPsuBx6UKE47BVcPrZoZ0q/KER57vbj2qkgDLXA=
go.opentelemetry.io/otel/sdk/log/logtest v0.18.0/go.mod h1:7cHtiVJpZebB3wybTa4NG+FUo5NPe3PROz1FqB0+qdw=
go.opentelemetry.io/otel/sdk/metric v1.42.0 h1:D/1QR46Clz6ajyZ3G8SgNlTJKBdGp84q9RKCAZ3YGuA=
go.opentelemetry.io/otel/sdk/metric v1.42.0/go.mod h1:Ua6AAlDKdZ7tdvaQKfSmnFTdHx37+J4ba8MwVCYM5hc=
go.opentelemetry.io/otel/trace v1.42.0 h1:OUCgIPt+mzOnaUTpOQcBiM/PLQ/Op7oq6g4LenLmOYY=
go.opentelemetry.io/otel/trace v1.42.0/go.mod h1:f3K9S+IFqnumBkKhRJMeaZeNk9epyhnCmQh/EysQCdc=
go.opentelemetry.io/proto/otlp v1.9.0 h1:l706jCMITVouPOqEnii2fIAuO3IVGBRPV5ICjceRb/A=
go.opentelemetry.io/proto/otlp v1.9.0/go.mod h1:xE+Cx5E/eEHw+ISFkwPLwCZefwVjY+pqKg1qcK03+/4=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
golang.org/x/net v0.51.0 h1:94R/GTO7mt3/4wIKpcR5gkGmRLOuE/2hNGeWq/GBIFo=
golang.org/x/net v0.51.0/go.mod h1:aamm+2QF5ogm02fjy5Bb7CQ0WMt1/WVM7FtyaTLlA9Y=
golang.org/x/sys v0.41.0 h1:Ivj+2Cp/ylzLiEU89QhWblYnOE9zerudt9Ftecq2C6k=
golang.org/x/sys v0.41.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.34.0 h1:oL/Qq0Kdaqxa1KbNeMKwQq0reLCCaFtqu2eNuSeNHbk=
golang.org/x/text v0.34.0/go.mod h1:homfLqTYRFyVYemLBFl5GgL/DWEiH5wcsQ5gSh1yziA=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
google.golang.org/genproto/googleapis/api v0.0.0-20260209200024-4cfbd4190f57 h1:JLQynH/LBHfCTSbDWl+py8C+Rg/k1OVH3xfcaiANuF0=
google.golang.org/genproto/googleapis/api v0.0.0-20260209200024-4cfbd4190f57/go.mod h1:kSJwQxqmFXeo79zOmbrALdflXQeAYcUbgS7PbpMknCY=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260209200024-4cfbd4190f57 h1:mWPCjDEyshlQYzBpMNHaEof6UX1PmHcaUODUywQ0uac=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260209200024-4cfbd4190f57/go.mod h1:j9x/tPzZkyxcgEFkiKEEGxfvyumM01BEtsW8xzOahRQ=
google.golang.org/grpc v1.79.2 h1:fRMD94s2tITpyJGtBBn7MkMseNpOZU8ZxgC3MMBaXRU=
google.golang.org/grpc v1.79.2/go.mod h1:KmT0Kjez+0dde/v2j9vzwoAScgEPx/Bw1CYChhHLrHQ=
google.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=
google.golang.org/protobuf v1.36.11/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

Docker

# file: Dockerfile
# syntax=docker/dockerfile:1
FROM golang:1.25.1 AS builder

WORKDIR /app
COPY . .
RUN go build -o otlptestbin main.go

FROM debian:bookworm-slim
WORKDIR /app
COPY --from=builder /app/otlptestbin /app/otlptestbin
ENTRYPOINT ["/app/otlptestbin"]
# file: docker-compose.yaml
services:
  otlptest:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: otlptest-app
    restart: unless-stopped
    environment:
      - OTELCOL_GRPC_ENDPOINT=otelcol:4317
    depends_on:
      - otelcol
    volumes:
      - otlplogs:/shared
    networks:
      - otlpnet

  otelcol:
    image: otel/opentelemetry-collector-contrib:latest
    container_name: otel-collector
    restart: unless-stopped
    command: ["--config=/etc/otelcol-config.yaml"]
    volumes:
      - ./otelcol-config.yaml:/etc/otelcol-config.yaml:ro
      - otlplogs:/shared
    ports:
      - "4317:4317"
      - "4318:4318"
    networks:
      - otlpnet

volumes:
  otlplogs:
    driver: local

networks:
  otlpnet:
    driver: bridge

Opentelemetry

# file: otelcol-config.yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
exporters:
  debug:
    verbosity: detailed
  otlphttp:
    endpoint: "https://my-logs.example.com/ingest/otlp/"
    sending_queue:
      enabled: true
      num_consumers: 3
      queue_size: 5000
processors:
  batch: {}
  transform/logs:
    log_statements:
      - context: log
        statements:
          - set(attributes["trace_id"], trace_id.string)
          - set(attributes["span_id"], span_id.string)
service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [transform/logs, batch]
      exporters: [debug, otlphttp]
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug, otlphttp]

The following is the enrichment code I'm using. As you can see, I do not touch the spanId/traceId:

collectorTs = {
    "key": "logs.collector.timestamp",
    "value": {
        "stringValue": format_timestamp!(now(), format:"%s%f")
    }
}
.resourceLogs = map_values(array!(get!(.resourceLogs, []))) -> |rl| {
    ogResAttrs = get(rl.resource.attributes, []) ?? []
    if is_null(ogResAttrs) {
        ogResAttrs = []
    }
    res_attr_obj = {}
    # flatten attributes for easier checks
    for_each(array!(ogResAttrs)) -> |_atIdx, at|{
        flattened_at = flatten(object!(at))
        val = get(flattened_at, ["value.stringValue"]) ?? ""
        k = get(flattened_at, ["key"]) ?? ""
        res_attr_obj = set!(res_attr_obj, [k], val)
    }
    
    hostIp = get(res_attr_obj, ["host.ip"]) ?? ""
    monsrvrow, err = get_enrichment_table_record("monitoring_server", { "ID1": hostIp })
    if err != null {
        # could not find enrichment data via IP address, let's try with the hostname
        hostName = get(res_attr_obj, ["host.name"]) ?? ""
        monsrvrow, err = get_enrichment_table_record("monitoring_server", { "ID2": hostName })
    }
    if err == null {
        if (res_attr_obj.my_env == null || res_attr_obj.my_env == "") {
            env_res = monsrvrow."Environment"
            if (env_res != null || env_res != "") {
                res_attr_obj.my_env = env_res
            }
        }
    } else {
        log("Could not find enrichment data from 'monitoring_server' table", level: "info", rate_limit_secs: 60)
    }

    newAttrs = []
    for_each(object(res_attr_obj)) -> |key,value|{
        newAttrs = push(array(newAttrs), {
            "key": key,
            "value": {
                "stringValue": value
            }
        })
    }
    rl.resource.attributes = newAttrs
    
    rl.scopeLogs = map_values(array!(get!(rl.scopeLogs, []))) -> |sl| {
        sl.logRecords = map_values(array!(get!(sl.logRecords, []))) -> |lr| {
            existing_attrs = get!(lr.attributes, [])
            lr.attributes = push(array!(existing_attrs), collectorTs)
            lr
        }
        sl
    }
    rl
}

References

#22696 and #24316
