
Commit 1b07d2c

api-clients-generation-pipeline[bot] and ci.datadog-api-spec authored
Adding compression optional field to Amazon S3 source (#3205)
Co-authored-by: ci.datadog-api-spec <packages@datadoghq.com>
1 parent 999d497 commit 1b07d2c

8 files changed

Lines changed: 146 additions & 3 deletions

.generator/schemas/v2/openapi.yaml

Lines changed: 17 additions & 1 deletion
```diff
@@ -43045,12 +43045,14 @@ components:
   ObservabilityPipelineAmazonS3Source:
     description: |-
       The `amazon_s3` source ingests logs from an Amazon S3 bucket.
-      It supports AWS authentication and TLS encryption.
+      It supports AWS authentication, TLS encryption, and configurable compression.
 
       **Supported pipeline types:** logs
     properties:
       auth:
         $ref: "#/components/schemas/ObservabilityPipelineAwsAuth"
+      compression:
+        $ref: "#/components/schemas/ObservabilityPipelineAmazonS3SourceCompression"
       id:
         description: The unique identifier for this component. Used in other parts of the pipeline to reference this component (for example, as the `input` to downstream components).
         example: aws-s3-source
@@ -43073,6 +43075,20 @@ components:
       - region
     type: object
     x-pipeline-types: [logs]
+  ObservabilityPipelineAmazonS3SourceCompression:
+    description: Compression format for objects retrieved from the S3 bucket. Use `auto` to detect compression from the object's Content-Encoding header or file extension.
+    enum:
+      - auto
+      - none
+      - gzip
+      - zstd
+    example: gzip
+    type: string
+    x-enum-varnames:
+      - AUTO
+      - NONE
+      - GZIP
+      - ZSTD
   ObservabilityPipelineAmazonS3SourceType:
     default: amazon_s3
     description: The source type. Always `amazon_s3`.
```
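The schema above constrains the new optional `compression` field to four string values. As a plain-Ruby sketch of that constraint (stdlib only, independent of the generated client; `validate_s3_source` is a hypothetical helper, not a client API):

```ruby
# Allowed values for the new `compression` field, per the schema enum above.
S3_SOURCE_COMPRESSION = %w[auto none gzip zstd].freeze

# Hypothetical helper (not part of the generated client): reject a source
# config hash whose compression value falls outside the schema's enum.
# The field is optional, so a missing value passes.
def validate_s3_source(source)
  compression = source[:compression]
  return true if compression.nil?
  unless S3_SOURCE_COMPRESSION.include?(compression)
    raise ArgumentError, "compression must be one of #{S3_SOURCE_COMPRESSION.join(', ')}"
  end
  true
end

validate_s3_source(id: "aws-s3-source", type: "amazon_s3",
                   region: "us-east-1", compression: "gzip")
```

The real enforcement happens server-side when the pipeline is validated; this only mirrors the enum for illustration.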
```diff
@@ -0,0 +1 @@
+2026-04-08T12:44:25.060Z
```

cassettes/features/v2/observability_pipelines/Validate-an-observability-pipeline-with-amazon-S3-source-compression-returns-OK-response.yml

Lines changed: 27 additions & 0 deletions
Some generated files are not rendered by default.
Lines changed: 51 additions & 0 deletions
```ruby
# Validate an observability pipeline with amazon S3 source compression returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new

body = DatadogAPIClient::V2::ObservabilityPipelineSpec.new({
  data: DatadogAPIClient::V2::ObservabilityPipelineSpecData.new({
    attributes: DatadogAPIClient::V2::ObservabilityPipelineDataAttributes.new({
      config: DatadogAPIClient::V2::ObservabilityPipelineConfig.new({
        destinations: [
          DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestination.new({
            id: "datadog-logs-destination",
            inputs: [
              "my-processor-group",
            ],
            type: DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS,
          }),
        ],
        processor_groups: [
          DatadogAPIClient::V2::ObservabilityPipelineConfigProcessorGroup.new({
            enabled: true,
            id: "my-processor-group",
            include: "service:my-service",
            inputs: [
              "amazon-s3-source",
            ],
            processors: [
              DatadogAPIClient::V2::ObservabilityPipelineFilterProcessor.new({
                enabled: true,
                id: "filter-processor",
                include: "service:my-service",
                type: DatadogAPIClient::V2::ObservabilityPipelineFilterProcessorType::FILTER,
              }),
            ],
          }),
        ],
        sources: [
          DatadogAPIClient::V2::ObservabilityPipelineAmazonS3Source.new({
            id: "amazon-s3-source",
            type: DatadogAPIClient::V2::ObservabilityPipelineAmazonS3SourceType::AMAZON_S3,
            region: "us-east-1",
            compression: DatadogAPIClient::V2::ObservabilityPipelineAmazonS3SourceCompression::GZIP,
          }),
        ],
      }),
      name: "Pipeline with S3 Source Compression",
    }),
    type: "pipelines",
  }),
})
p api_instance.validate_pipeline(body)
```

features/v2/observability_pipelines.feature

Lines changed: 8 additions & 0 deletions
```diff
@@ -207,6 +207,14 @@ Feature: Observability Pipelines
     Then the response status is 200 OK
     And the response "errors" has length 0
 
+  @team:DataDog/observability-pipelines
+  Scenario: Validate an observability pipeline with amazon S3 source compression returns "OK" response
+    Given new "ValidatePipeline" request
+    And body with value {"data": {"attributes": {"config": {"destinations": [{"id": "datadog-logs-destination", "inputs": ["my-processor-group"], "type": "datadog_logs"}], "processor_groups": [{"enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": ["amazon-s3-source"], "processors": [{"enabled": true, "id": "filter-processor", "include": "service:my-service", "type": "filter"}]}], "sources": [{"id": "amazon-s3-source", "type": "amazon_s3", "region": "us-east-1", "compression": "gzip"}]}, "name": "Pipeline with S3 Source Compression"}, "type": "pipelines"}}
+    When the request is sent
+    Then the response status is 200 OK
+    And the response "errors" has length 0
+
   @team:DataDog/observability-pipelines
   Scenario: Validate an observability pipeline with destination secret key returns "OK" response
     Given new "ValidatePipeline" request
```
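The scenario body is the raw JSON counterpart of the Ruby example file; a quick stdlib-only check (no client gem required) that the new field is where the client will look for it:

```ruby
require "json"

# The scenario's request body, verbatim from the feature file above.
body = JSON.parse(<<~JSON)
  {"data": {"attributes": {"config": {"destinations": [{"id": "datadog-logs-destination", "inputs": ["my-processor-group"], "type": "datadog_logs"}], "processor_groups": [{"enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": ["amazon-s3-source"], "processors": [{"enabled": true, "id": "filter-processor", "include": "service:my-service", "type": "filter"}]}], "sources": [{"id": "amazon-s3-source", "type": "amazon_s3", "region": "us-east-1", "compression": "gzip"}]}, "name": "Pipeline with S3 Source Compression"}, "type": "pipelines"}}
JSON

# Pull out the S3 source and its new compression setting.
source = body.dig("data", "attributes", "config", "sources").first
compression = source["compression"]
```

After parsing, `compression` is `"gzip"`, matching the `example` value in the schema diff.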

lib/datadog_api_client/inflector.rb

Lines changed: 1 addition & 0 deletions
```diff
@@ -3758,6 +3758,7 @@ def overrides
       "v2.observability_pipeline_amazon_s3_generic_encoding_parquet" => "ObservabilityPipelineAmazonS3GenericEncodingParquet",
       "v2.observability_pipeline_amazon_s3_generic_encoding_parquet_type" => "ObservabilityPipelineAmazonS3GenericEncodingParquetType",
       "v2.observability_pipeline_amazon_s3_source" => "ObservabilityPipelineAmazonS3Source",
+      "v2.observability_pipeline_amazon_s3_source_compression" => "ObservabilityPipelineAmazonS3SourceCompression",
       "v2.observability_pipeline_amazon_s3_source_type" => "ObservabilityPipelineAmazonS3SourceType",
       "v2.observability_pipeline_amazon_security_lake_destination" => "ObservabilityPipelineAmazonSecurityLakeDestination",
       "v2.observability_pipeline_amazon_security_lake_destination_type" => "ObservabilityPipelineAmazonSecurityLakeDestinationType",
```
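The `overrides` entry maps the snake_case file stem to the generated class name, since naive camelization can get acronym-heavy names wrong. A plain-Ruby miniature of that lookup (a sketch under assumptions: `class_name_for` and the two-entry table are hypothetical; the real client likely wires a much larger table into its autoloader's inflector):

```ruby
# Hypothetical miniature of the override table: explicit
# snake_case => CamelCase mappings consulted before generic inflection.
OVERRIDES = {
  "v2.observability_pipeline_amazon_s3_source" => "ObservabilityPipelineAmazonS3Source",
  "v2.observability_pipeline_amazon_s3_source_compression" => "ObservabilityPipelineAmazonS3SourceCompression",
}.freeze

# Fall back to naive camelization when no override exists.
def class_name_for(key)
  OVERRIDES.fetch(key) do
    key.sub(/\Av2\./, "").split("_").map(&:capitalize).join
  end
end
```

This is why every new model, including the compression enum, needs its own entry: without one, the loader would have to guess the class name from the file name.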

lib/datadog_api_client/v2/models/observability_pipeline_amazon_s3_source.rb

Lines changed: 12 additions & 2 deletions
```diff
@@ -18,7 +18,7 @@
 module DatadogAPIClient::V2
   # The `amazon_s3` source ingests logs from an Amazon S3 bucket.
-  # It supports AWS authentication and TLS encryption.
+  # It supports AWS authentication, TLS encryption, and configurable compression.
   #
   # **Supported pipeline types:** logs
   class ObservabilityPipelineAmazonS3Source
@@ -28,6 +28,9 @@ class ObservabilityPipelineAmazonS3Source
     # If omitted, the system’s default credentials are used (for example, the IAM role and environment variables).
     attr_accessor :auth
 
+    # Compression format for objects retrieved from the S3 bucket. Use `auto` to detect compression from the object's Content-Encoding header or file extension.
+    attr_accessor :compression
+
     # The unique identifier for this component. Used in other parts of the pipeline to reference this component (for example, as the `input` to downstream components).
     attr_reader :id
 
@@ -50,6 +53,7 @@ class ObservabilityPipelineAmazonS3Source
     def self.attribute_map
       {
         :'auth' => :'auth',
+        :'compression' => :'compression',
         :'id' => :'id',
         :'region' => :'region',
         :'tls' => :'tls',
@@ -63,6 +67,7 @@ def self.attribute_map
     def self.openapi_types
       {
         :'auth' => :'ObservabilityPipelineAwsAuth',
+        :'compression' => :'ObservabilityPipelineAmazonS3SourceCompression',
         :'id' => :'String',
         :'region' => :'String',
         :'tls' => :'ObservabilityPipelineTls',
@@ -93,6 +98,10 @@ def initialize(attributes = {})
       self.auth = attributes[:'auth']
     end
 
+    if attributes.key?(:'compression')
+      self.compression = attributes[:'compression']
+    end
+
     if attributes.key?(:'id')
       self.id = attributes[:'id']
     end
@@ -181,6 +190,7 @@ def ==(o)
     return true if self.equal?(o)
     self.class == o.class &&
       auth == o.auth &&
+      compression == o.compression &&
       id == o.id &&
       region == o.region &&
       tls == o.tls &&
@@ -193,7 +203,7 @@ def ==(o)
   # @return [Integer] Hash code
   # @!visibility private
   def hash
-    [auth, id, region, tls, type, url_key, additional_properties].hash
+    [auth, compression, id, region, tls, type, url_key, additional_properties].hash
   end
 end
```
Lines changed: 29 additions & 0 deletions
```ruby
=begin
#Datadog API V2 Collection

#Collection of all Datadog Public endpoints.

The version of the OpenAPI document: 1.0
Contact: support@datadoghq.com
Generated by: https://github.com/DataDog/datadog-api-client-ruby/tree/master/.generator

 Unless explicitly stated otherwise all files in this repository are licensed under the Apache-2.0 License.
 This product includes software developed at Datadog (https://www.datadoghq.com/).
 Copyright 2020-Present Datadog, Inc.

=end
require 'date'
require 'time'

module DatadogAPIClient::V2
  # Compression format for objects retrieved from the S3 bucket. Use `auto` to detect compression from the object's Content-Encoding header or file extension.
  class ObservabilityPipelineAmazonS3SourceCompression
    include BaseEnumModel

    AUTO = "auto".freeze
    NONE = "none".freeze
    GZIP = "gzip".freeze
    ZSTD = "zstd".freeze
  end
end
```
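The `auto` value's detection logic runs server-side, not in this client, so the generated code above only declares the enum. As a hedged sketch of the heuristic the schema text describes (names, tables, and precedence here are assumptions for illustration only):

```ruby
# Hypothetical sketch of `auto` detection as described in the schema:
# prefer the object's Content-Encoding header, fall back to the file
# extension, and default to no compression. Not part of the client.
EXTENSION_FORMATS = { ".gz" => "gzip", ".zst" => "zstd" }.freeze
HEADER_FORMATS   = { "gzip" => "gzip", "zstd" => "zstd" }.freeze

def detect_compression(object_key, content_encoding: nil)
  return HEADER_FORMATS[content_encoding] if HEADER_FORMATS.key?(content_encoding)
  EXTENSION_FORMATS.fetch(File.extname(object_key), "none")
end

detect_compression("logs/app.log.gz")                        # "gzip" via extension
detect_compression("logs/app.log", content_encoding: "zstd") # "zstd" via header
```

Setting an explicit `gzip` or `zstd` value skips any such detection, and `none` reads objects as-is.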
