Commit 1f58c9b

Author: Sattar Gyulmamedov

    clear rst from md in consumer & broker reference docs

1 parent 0de1cd4

7 files changed: 42 additions & 83 deletions

mddocs/docs/en/reference/broker/index.md

Lines changed: 8 additions & 9 deletions
@@ -1,8 +1,6 @@
-(message-broker)=
+# Message Broker { #message-broker }
 
-# Message Broker
-
-Message broker is component used by OpenLineage to store all received events. Then these avents are handled by {ref}`message-consumer`, in batches.
+Message broker is a component used by OpenLineage to store all received events. These events are then handled by [`message-consumer`][message-consumer], in batches.
 
 Currently, Data.Rentgen supports only [Apache Kafka](https://kafka.apache.org/) as message broker.
 
@@ -13,7 +11,7 @@ Other popular OpenLineage server implementations use HTTP protocol for receiving
 - Kafka is designed to be scalable. If performance level is not enough, just add another broker to the cluster. For HTTP servers it's not that simple,
   as this requires load balancing on reverse proxy side or DNS side.
 - Kafka is designed to receive A LOT of events per second, like millions, and store them on disk as fast as possible. So no events are lost
-  even if {ref}`message-consumer` is overloaded - events are already on disk, and will be handled later.
+  even if [`message-consumer`][message-consumer] is overloaded - events are already on disk, and will be handled later.
 - ETL scripts are mostly run on schedule The usual pattern is almost zero events during the day, but huge spikes at every whole hour
   (e.g. at 00:00, 01:00, 03:00, 12:00). Kafka is used as an intermediate buffer which smooths these spikes.
 - Events stored in Kafka can be read in batches, even if OpenLineage integration initially send them one-by-one.
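The batch-read point in the hunk above can be sketched independently of Kafka (plain Python, not Data.Rentgen code): events arrive one at a time, but the consumer drains them in bounded batches, so hourly spikes are absorbed by the buffer rather than by the handler.

```python
from collections import deque


def drain_batches(buffer: deque, max_batch: int):
    """Drain a buffer of individually-produced events in bounded batches,
    mimicking how a Kafka consumer fetches many records per poll."""
    while buffer:
        # len() is evaluated before popping, so the batch size is bounded
        batch = [buffer.popleft() for _ in range(min(max_batch, len(buffer)))]
        yield batch


# A spike of 10 events produced one-by-one is consumed as batches of <= 3.
spike = deque(f"event-{i}" for i in range(10))
batches = list(drain_batches(spike, max_batch=3))
print([len(b) for b in batches])  # [3, 3, 3, 1]
```

The handler then pays per-batch overhead (e.g. one database transaction) instead of per-event overhead, which is the advantage the list above describes.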
@@ -38,23 +36,24 @@ Other popular OpenLineage server implementations use HTTP protocol for receiving
 
 ```console
 $ docker compose --profile broker up -d --wait
+...
 ```
 
 `docker-compose` will download Apache Kafka image, create container and volume, and then start container.
 
 Image entrypoint will create database if volume is empty.
 Options can be set via `.env` file or `environment` section in `docker-compose.yml`
 
-```{eval-rst}
-.. dropdown:: ``docker-compose.yml``
+=== "docker-compose.yml"
 
+    ```yaml
     .. literalinclude:: ../../../docker-compose.yml
        :emphasize-lines: 101-117,177
     ```
 
-```{eval-rst}
-.. dropdown:: ``.env.docker``
+=== ".env.docker"
 
+    ```yaml
     .. literalinclude:: ../../../.env.docker
        :emphasize-lines: 7-20
     ```
Lines changed: 2 additions & 6 deletions

@@ -1,7 +1,3 @@
-(configuration-consumer-specific)=
+# Consumer-specific settings { #configuration-consumer-specific }
 
-# Consumer-specific settings
-
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.consumer.ConsumerSettings
-```
+::: data_rentgen.consumer.settings.consumer.ConsumerSettings
Lines changed: 2 additions & 17 deletions

@@ -1,18 +1,3 @@
-(configuration-consumer)=
+# Consumer configuration { #configuration-consumer }
 
-# Consumer configuration
-
-```{toctree}
-:caption: Configuration
-:hidden: true
-:maxdepth: 1
-
-kafka
-consumer-specific
-producer-specific
-logging
-```
-
-```{eval-rst}
-.. autopydantic_settings:: data_rentgen.consumer.settings.ConsumerApplicationSettings
-```
+::: data_rentgen.consumer.settings.ConsumerApplicationSettings
Lines changed: 7 additions & 21 deletions

@@ -1,27 +1,13 @@
-(configuration-consumer-kafka)=
+# Kafka settings { #configuration-consumer-kafka }
 
-# Kafka settings
+::: data_rentgen.consumer.settings.kafka.KafkaSettings
 
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.kafka.KafkaSettings
-```
+::: data_rentgen.consumer.settings.security.scram.KafkaSecurityScram256Settings
 
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.security.scram.KafkaSecurityScram256Settings
-```
+::: data_rentgen.consumer.settings.security.scram.KafkaSecurityScram512Settings
 
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.security.scram.KafkaSecurityScram512Settings
-```
+::: data_rentgen.consumer.settings.security.plain.KafkaSecurityPlaintextSettings
 
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.security.plain.KafkaSecurityPlaintextSettings
-```
+::: data_rentgen.consumer.settings.security.gssapi.KafkaSecurityGSSAPISettings
 
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.security.gssapi.KafkaSecurityGSSAPISettings
-```
-
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.security.anonymous.KafkaSecurityAnonymousSettings
-```
+::: data_rentgen.consumer.settings.security.anonymous.KafkaSecurityAnonymousSettings
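The `:::` directives above render pydantic settings models. To illustrate the general pattern such models follow — mapping prefixed environment variables onto typed fields — here is a stdlib-only sketch; the field names and the `APP_KAFKA_` prefix are invented for illustration and are not the actual `KafkaSettings` API.

```python
import os
from dataclasses import dataclass


@dataclass
class KafkaSettingsSketch:
    """Toy stand-in for a pydantic settings model reading prefixed env vars."""

    bootstrap_servers: str = "localhost:9092"
    security_protocol: str = "PLAINTEXT"

    @classmethod
    def from_env(cls, prefix: str = "APP_KAFKA_") -> "KafkaSettingsSketch":
        kwargs = {}
        for field in ("bootstrap_servers", "security_protocol"):
            value = os.environ.get(prefix + field.upper())
            if value is not None:
                kwargs[field] = value  # env var overrides the default
        return cls(**kwargs)


os.environ["APP_KAFKA_BOOTSTRAP_SERVERS"] = "kafka:9092"
settings = KafkaSettingsSketch.from_env()
print(settings.bootstrap_servers, settings.security_protocol)  # kafka:9092 PLAINTEXT
```

A real pydantic model adds validation and nested settings on top of this; the env-prefix lookup with typed defaults is the core idea.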
Lines changed: 2 additions & 6 deletions

@@ -1,7 +1,3 @@
-(configuration-consumer-logging)=
+# Logging settings { #configuration-consumer-logging }
 
-# Logging settings
-
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.logging.settings.LoggingSettings
-```
+::: data_rentgen.logging.settings.LoggingSettings
Lines changed: 2 additions & 6 deletions

@@ -1,7 +1,3 @@
-(configuration-producer-specific)=
+# Producer-specific settings { #configuration-producer-specific }
 
-# Producer-specific settings
-
-```{eval-rst}
-.. autopydantic_model:: data_rentgen.consumer.settings.producer.ProducerSettings
-```
+::: data_rentgen.consumer.settings.producer.ProducerSettings
Lines changed: 19 additions & 18 deletions

@@ -1,9 +1,6 @@
-(message-consumer)=
+# Message Consumer { #message-consumer }
 
-# Message Consumer
-
-Data.Rentgen fetches messages from a {ref}`message-broker` using a [FastStream](https://faststream.airt.ai) based consumer, parses incoming messages,
-and creates all parsed entities in the {ref}`database`. Malformed messages are send back to broker, to different topic.
+Data.Rentgen fetches messages from a [`message-broker`][message-broker] using a [FastStream](https://faststream.airt.ai) based consumer, parses incoming messages, and creates all parsed entities in the [`database`][database]. Malformed messages are sent back to the broker, to a different topic.
 
 ## Install & run
 
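The malformed-message handling described above — bad events go back to the broker, to a different topic — is a standard dead-letter pattern. A minimal sketch, with a JSON payload standing in for OpenLineage events and an invented dead-letter topic name in the comment:

```python
import json


def route(raw_messages):
    """Parse each raw message; valid ones are handled, malformed ones are
    routed to a dead-letter collection instead of being dropped."""
    handled, dead_letter = [], []
    for raw in raw_messages:
        try:
            handled.append(json.loads(raw))
        except json.JSONDecodeError:
            # in a real consumer this would be produced to a dead-letter
            # topic (topic name here is hypothetical)
            dead_letter.append(raw)
    return handled, dead_letter


handled, dead = route(['{"eventType": "START"}', "{not json"])
print(len(handled), len(dead))  # 1 1
```

The point of the pattern: a single malformed event neither crashes the consumer nor gets silently lost, and the dead-letter topic can be inspected or replayed later.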
@@ -17,22 +14,23 @@ and creates all parsed entities in the {ref}`database`. Malformed messages are s
 
 ```console
 $ docker compose --profile consumer up -d --wait
+...
 ```
 
 `docker-compose` will download all necessary images, create containers, and then start consumer process.
 
 Options can be set via `.env` file or `environment` section in `docker-compose.yml`
 
-```{eval-rst}
-.. dropdown:: ``docker-compose.yml``
+=== "docker-compose.yml"
 
+    ```yaml
     .. literalinclude:: ../../../docker-compose.yml
        :emphasize-lines: 120-138
     ```
 
-```{eval-rst}
-.. dropdown:: ``.env.docker``
+=== ".env.docker"
 
+    ```yaml
     .. literalinclude:: ../../../.env.docker
        :emphasize-lines: 22-24,29-34
     ```
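The "`.env` file or `environment` section" option mentioned above follows Docker Compose precedence, where values set directly in the `environment` section take priority over file-based defaults. That merge can be sketched in a few lines of plain Python; the variable name here is illustrative, not a documented Data.Rentgen setting:

```python
def parse_dotenv(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and comments."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        result[key.strip()] = value.strip()
    return result


# File-based defaults (as in .env.docker)
dotenv = parse_dotenv("# defaults\nDATA_RENTGEN__KAFKA__BOOTSTRAP_SERVERS=localhost:9092\n")
# Inline overrides (as in the `environment` section)
environment = {"DATA_RENTGEN__KAFKA__BOOTSTRAP_SERVERS": "kafka:9092"}

effective = {**dotenv, **environment}  # later dict wins the merge
print(effective["DATA_RENTGEN__KAFKA__BOOTSTRAP_SERVERS"])  # kafka:9092
```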
@@ -41,48 +39,51 @@ and creates all parsed entities in the {ref}`database`. Malformed messages are s
 
 - Install Python 3.10 or above
 
-- Setup {ref}`database`, run migrations and create partitions
+- Setup [`database`][database], run migrations and create partitions
 
-- Setup {ref}`message-broker`
+- Setup [`message-broker`][message-broker]
 
 - Create virtual environment
 
   ```console
   $ python -m venv /some/.venv
+  ...
   $ source /some/.venv/bin/activate
+  ...
   ```
 
 - Install `data-rentgen` package with following *extra* dependencies:
 
   ```console
   $ pip install data-rentgen[consumer,postgres]
+  ...
   ```
 
-:::{note}
+!!! note
+
   For `SASL_GSSAPI` auth mechanism you also need to install system packages providing `kinit` and `kdestroy` binaries:
 
   ```console
   $ apt install libkrb5-dev krb5-user gcc make autoconf  # Debian-based
+  ...
   $ dnf install krb5-devel krb5-libs krb5-workstation gcc make autoconf  # CentOS, OracleLinux
+  ...
   ```
 
   And then install `gssapi` extra:
 
   ```console
   $ pip install data-rentgen[consumer,postgres,gssapi]
+  ...
   ```
-:::
 
 - Run consumer process
 
   ```console
   $ python -m data_rentgen.consumer
+  ...
   ```
 
 ## See also
 
-```{toctree}
-:maxdepth: 1
-
-configuration/index
-```
+[Consumer configuration](configuration/index.md)