Commit 8fd9703

docs: outputs: update Zerobus documentation for clarity and consistency
- Revised the description to use "through" instead of "via" for improved readability.
- Clarified the OAuth2 terminology to "OAuth 2.0" for consistency.
- Ensured consistent language throughout the documentation regarding Zerobus and its configuration parameters.

Signed-off-by: mats <mats.kazuki@gmail.com>
1 parent 1c4b72f commit 8fd9703

File tree

1 file changed

+6
-6
lines changed


pipeline/outputs/zerobus.md

Lines changed: 6 additions & 6 deletions
@@ -1,5 +1,5 @@
 ---
-description: Send logs to Databricks via Zerobus
+description: Send logs to Databricks through Zerobus
 ---

 # Zerobus
@@ -8,9 +8,9 @@ description: Send logs to Databricks via Zerobus
 **Supported event types:** `logs`
 {% endhint %}

-The _Zerobus_ output plugin lets you ingest log records into a [Databricks](https://www.databricks.com/) table through the Zerobus streaming ingestion interface. Records are converted to JSON and sent via the Zerobus SDK using gRPC.
+The _Zerobus_ output plugin lets you ingest log records into a [Databricks](https://www.databricks.com/) table through the Zerobus streaming ingestion interface. Records are converted to JSON and sent using the Zerobus SDK over gRPC.

-Before you begin, you need a Databricks workspace with a Unity Catalog table configured for Zerobus ingestion, and an OAuth2 service principal (client ID and client secret) with appropriate permissions.
+Before you begin, you need a Databricks workspace with a Unity Catalog table configured for Zerobus ingestion, and an OAuth 2.0 service principal (client ID and client secret) with appropriate permissions.

 ## Configuration parameters

@@ -19,16 +19,16 @@ Before you begin, you need a Databricks workspace with a Unity Catalog table con
 | `endpoint` | Zerobus gRPC endpoint URL. If no scheme is provided, `https://` is automatically prepended. | _none_ |
 | `workspace_url` | Databricks workspace URL. If no scheme is provided, `https://` is automatically prepended. | _none_ |
 | `table_name` | Fully qualified Unity Catalog table name in `catalog.schema.table` format. | _none_ |
-| `client_id` | OAuth2 client ID for authentication. | _none_ |
-| `client_secret` | OAuth2 client secret for authentication. | _none_ |
+| `client_id` | OAuth 2.0 client ID for authentication. | _none_ |
+| `client_secret` | OAuth 2.0 client secret for authentication. | _none_ |
 | `add_tag` | If enabled, the Fluent Bit tag is added as a `_tag` field in each record. | `true` |
 | `time_key` | Key name for the injected timestamp. The timestamp is formatted as RFC 3339 with nanosecond precision. Set to an empty string to disable timestamp injection. | `_time` |
 | `log_key` | Comma-separated list of record keys to include in the output. When unset, all keys are included. | _none_ |
 | `raw_log_key` | If set, the full original record (before filtering by `log_key`) is stored as a JSON string under this key name. | _none_ |

 ## Get started

-To send log records to Databricks via Zerobus, configure the plugin with your Zerobus endpoint, workspace URL, table name, and OAuth2 credentials.
+To send log records to Databricks through Zerobus, configure the plugin with your Zerobus endpoint, workspace URL, table name, and OAuth 2.0 credentials.

 ### Configuration file
