feat: Datadog log adapter for ingestion #1

@leo-aa88

Description

Summary

Add a Datadog ingestion path so operators can pull logs from Datadog into the same pipeline as the file-based `raglogs ingest` flow (normalize → fingerprint → Postgres/pgvector).

This is the suggested next implementation focus, prioritized ahead of the Loki/Kubernetes adapters.

Motivation

  • README Roadmap lists a Datadog adapter.
  • Many teams do not have log files on disk; Datadog is a common source of truth.

Scope (proposal)

  • New adapter under src/.../adapters/ (or a documented subpackage) that yields ParsedLogLine records, matching the file adapter's contract.
  • CLI entry point, e.g. raglogs ingest-datadog or raglogs ingest --source datadog with required flags/env (API key, site, query/window, service tags).
  • Configuration via .env / env vars (e.g. API key, app key if needed, site, default query window).
  • Document operational limits: Datadog rate limits, maximum rows per run, and pagination behavior.
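A minimal sketch of the adapter's mapping step, assuming the Datadog Logs Search API v2 event shape (an `attributes` dict where the log level is called `status`). The `ParsedLogLine` dataclass here is a stand-in for the project's real class, with field names guessed from the README aliases:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, Dict, Optional

# Stand-in for the project's real ParsedLogLine; field names follow the
# README aliases and are an assumption, not the actual class definition.
@dataclass
class ParsedLogLine:
    timestamp: Optional[datetime]
    message: str
    level: Optional[str]
    service: Optional[str]
    raw: Dict[str, Any] = field(default_factory=dict)

def map_datadog_event(event: Dict[str, Any]) -> ParsedLogLine:
    """Map one Logs Search API v2 event (a dict with an 'attributes' key)
    to a ParsedLogLine. Datadog reports the log level as 'status'."""
    attrs = event.get("attributes", {})
    ts = attrs.get("timestamp")
    return ParsedLogLine(
        # Datadog timestamps are ISO 8601 with a trailing 'Z'.
        timestamp=datetime.fromisoformat(ts.replace("Z", "+00:00")) if ts else None,
        message=attrs.get("message", ""),
        level=attrs.get("status"),
        service=attrs.get("service"),
        raw=event,
    )
```

Keeping this mapping pure (dict in, record out) is what makes the unit tests below possible without any network access.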

Acceptance criteria

  • Can ingest a bounded time window from Datadog into the existing DB schema without manual export files.
  • Unit tests for adapter parsing/mapping; integration test optional (mock HTTP or recorded fixtures).
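One way to keep the mock-HTTP test cheap is to inject the transport instead of monkeypatching an HTTP client. A sketch under the assumption that the search API paginates via a `meta.page.after` cursor; names like `fetch_events` and `fake_transport` are illustrative, not existing code:

```python
from typing import Any, Callable, Dict, Iterator, List

# A transport takes the JSON request body and returns the parsed JSON
# response; production wraps an HTTP POST, tests serve recorded fixtures.
Transport = Callable[[Dict[str, Any]], Dict[str, Any]]

def fetch_events(query: str, transport: Transport, limit: int = 100) -> Iterator[Dict[str, Any]]:
    """Page through search results until no 'after' cursor is returned
    (cursor location assumed from the Logs Search API v2 response shape)."""
    cursor = None
    while True:
        body: Dict[str, Any] = {"filter": {"query": query}, "page": {"limit": limit}}
        if cursor:
            body["page"]["cursor"] = cursor
        payload = transport(body)
        yield from payload.get("data", [])
        cursor = payload.get("meta", {}).get("page", {}).get("after")
        if cursor is None:
            break

def fake_transport(pages: List[Dict[str, Any]]) -> Transport:
    """Serve a list of recorded response pages, one per call."""
    calls = iter(pages)
    return lambda body: next(calls)
```

A test can then feed two recorded pages and assert that the cursor loop visits both, which covers the bounded-window acceptance criterion without touching Datadog.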

Notes

  • Align field mapping with README JSON aliases (timestamp, message, level, service, env, trace/request IDs, host) where Datadog attributes provide them.
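The alias mapping could live in one declarative table. The Datadog attribute paths below are assumptions to be checked against real events (custom attributes usually nest under a second `attributes` level), and `lookup` is a hypothetical helper:

```python
from typing import Any, Dict, Optional

# Hypothetical alias table: README JSON alias -> dotted path into a
# Datadog Logs API v2 event. Paths are assumptions, not confirmed names.
DATADOG_FIELD_ALIASES: Dict[str, str] = {
    "timestamp": "attributes.timestamp",
    "message": "attributes.message",
    "level": "attributes.status",            # Datadog's name for the level
    "service": "attributes.service",
    "env": "attributes.attributes.env",
    "trace_id": "attributes.attributes.dd.trace_id",
    "host": "attributes.host",
}

def lookup(event: Dict[str, Any], dotted_path: str) -> Optional[Any]:
    """Walk a dotted path into a nested dict; return None on any miss so
    absent Datadog attributes degrade to empty fields, not errors."""
    node: Any = event
    for key in dotted_path.split("."):
        if not isinstance(node, dict) or key not in node:
            return None
        node = node[key]
    return node
```

Keeping the mapping in data rather than code makes it easy to document in the README next to the existing JSON aliases.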

Labels: enhancement (New feature or request)
