docs: outputs: update Zerobus documentation for clarity and consistency
- Revised the description to use "through" instead of "via" for improved readability.
- Clarified the OAuth2 terminology to "OAuth 2.0" for consistency.
- Ensured consistent language throughout the documentation regarding Zerobus and its configuration parameters.
Signed-off-by: mats <mats.kazuki@gmail.com>
pipeline/outputs/zerobus.md (6 additions, 6 deletions)
@@ -1,5 +1,5 @@
 ---
-description: Send logs to Databricks via Zerobus
+description: Send logs to Databricks through Zerobus
 ---
 
 # Zerobus
@@ -8,9 +8,9 @@ description: Send logs to Databricks via Zerobus
 **Supported event types:** `logs`
 {% endhint %}
 
-The _Zerobus_ output plugin lets you ingest log records into a [Databricks](https://www.databricks.com/) table through the Zerobus streaming ingestion interface. Records are converted to JSON and sent via the Zerobus SDK using gRPC.
+The _Zerobus_ output plugin lets you ingest log records into a [Databricks](https://www.databricks.com/) table through the Zerobus streaming ingestion interface. Records are converted to JSON and sent using the Zerobus SDK over gRPC.
 
-Before you begin, you need a Databricks workspace with a Unity Catalog table configured for Zerobus ingestion, and an OAuth2 service principal (client ID and client secret) with appropriate permissions.
+Before you begin, you need a Databricks workspace with a Unity Catalog table configured for Zerobus ingestion, and an OAuth 2.0 service principal (client ID and client secret) with appropriate permissions.
 
 ## Configuration parameters
 
@@ -19,16 +19,16 @@ Before you begin, you need a Databricks workspace with a Unity Catalog table con
 |`endpoint`| Zerobus gRPC endpoint URL. If no scheme is provided, `https://` is automatically prepended. |_none_|
 |`workspace_url`| Databricks workspace URL. If no scheme is provided, `https://` is automatically prepended. |_none_|
 |`table_name`| Fully qualified Unity Catalog table name in `catalog.schema.table` format. |_none_|
-|`client_id`| OAuth2 client ID for authentication. |_none_|
-|`client_secret`| OAuth2 client secret for authentication. |_none_|
+|`client_id`| OAuth 2.0 client ID for authentication. |_none_|
+|`client_secret`| OAuth 2.0 client secret for authentication. |_none_|
 |`add_tag`| If enabled, the Fluent Bit tag is added as a `_tag` field in each record. |`true`|
 |`time_key`| Key name for the injected timestamp. The timestamp is formatted as RFC 3339 with nanosecond precision. Set to an empty string to disable timestamp injection. |`_time`|
 |`log_key`| Comma-separated list of record keys to include in the output. When unset, all keys are included. |_none_|
 |`raw_log_key`| If set, the full original record (before filtering by `log_key`) is stored as a JSON string under this key name. |_none_|
 
 ## Get started
 
-To send log records to Databricks via Zerobus, configure the plugin with your Zerobus endpoint, workspace URL, table name, and OAuth2 credentials.
+To send log records to Databricks through Zerobus, configure the plugin with your Zerobus endpoint, workspace URL, table name, and OAuth 2.0 credentials.
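For context on the parameters this change documents, a minimal Fluent Bit `[OUTPUT]` section for the plugin might look like the sketch below. All values are placeholders, not a working setup: the endpoint, workspace URL, table name, and OAuth 2.0 credentials shown here are hypothetical, and since the plugin prepends `https://` automatically, the scheme is omitted.

```
[OUTPUT]
    # Route all records to the Zerobus output plugin
    name           zerobus
    match          *
    # Placeholder endpoint and workspace; https:// is prepended automatically
    endpoint       <your-zerobus-endpoint>
    workspace_url  <your-workspace>.cloud.databricks.com
    # Fully qualified Unity Catalog table: catalog.schema.table
    table_name     main.default.logs
    # OAuth 2.0 service principal credentials (placeholders)
    client_id      <oauth-client-id>
    client_secret  <oauth-client-secret>
```

Optional keys such as `add_tag`, `time_key`, `log_key`, and `raw_log_key` can be added to the same section to control record shaping, as described in the parameter table.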