Commit 780bfda

chore: CI improvements (#402)

* CI -> Moving lint to fail fast
* Removing node 20 from CI
* Improving CI
* Triggering CI for Kafka and core
* Trigger CI for sqs
* Fix lint
* Adding node 20 support note to readme

1 parent feace20

6 files changed: 75 additions & 34 deletions

.github/workflows/ci.common.yml

Lines changed: 3 additions & 3 deletions
@@ -27,6 +27,9 @@ jobs:
       - name: Build TS
         run: npm run build -- --filter=${{ inputs.package_name }}
 
+      - name: Run lint
+        run: npm run lint -- --filter=${{ inputs.package_name }}
+
       - name: Docker start
         run: npm run docker:start:ci -- --filter=${{ inputs.package_name }}
 
@@ -35,6 +38,3 @@ jobs:
 
       - name: Docker stop
         run: npm run docker:stop:ci -- --filter=${{ inputs.package_name }}
-
-      - name: Run lint
-        run: npm run lint -- --filter=${{ inputs.package_name }}

.github/workflows/ci.yml

Lines changed: 54 additions & 24 deletions
@@ -7,39 +7,69 @@ on:
   pull_request:
 
 jobs:
-  general:
-    strategy:
-      matrix:
-        node-version: [20.x, 22.x, 24.x]
-        package-name: [
-          '@message-queue-toolkit/amqp',
-          '@message-queue-toolkit/core',
-          '@message-queue-toolkit/metrics',
-          '@message-queue-toolkit/outbox-core',
-          '@message-queue-toolkit/redis-message-deduplication-store',
-          '@message-queue-toolkit/s3-payload-store',
-          '@message-queue-toolkit/schemas',
-          '@message-queue-toolkit/sns',
-          '@message-queue-toolkit/sqs',
-          '@message-queue-toolkit/gcp-pubsub',
-          '@message-queue-toolkit/gcs-payload-store'
-        ]
-    uses: ./.github/workflows/ci.common.yml
-    with:
-      node_version: ${{ matrix.node-version }}
-      package_name: ${{ matrix.package-name }}
+  changed-files-job:
+    name: Get changed packages
+    runs-on: ubuntu-latest
+    outputs:
+      packages: ${{ steps.detect.outputs.packages }}
+    steps:
+      - name: Get changed files
+        id: changed-files
+        uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46
+        with:
+          files: packages/**
+
+      - name: Detect changed packages
+        id: detect
+        env:
+          ALL_CHANGED_FILES: ${{ steps.changed-files.outputs.all_changed_files }}
+        run: |
+          declare -A PATH_TO_NAME=(
+            ["packages/amqp"]="@message-queue-toolkit/amqp"
+            ["packages/core"]="@message-queue-toolkit/core"
+            ["packages/gcp-pubsub"]="@message-queue-toolkit/gcp-pubsub"
+            ["packages/gcs-payload-store"]="@message-queue-toolkit/gcs-payload-store"
+            ["packages/kafka"]="@message-queue-toolkit/kafka"
+            ["packages/metrics"]="@message-queue-toolkit/metrics"
+            ["packages/outbox-core"]="@message-queue-toolkit/outbox-core"
+            ["packages/redis-message-deduplication-store"]="@message-queue-toolkit/redis-message-deduplication-store"
+            ["packages/s3-payload-store"]="@message-queue-toolkit/s3-payload-store"
+            ["packages/schemas"]="@message-queue-toolkit/schemas"
+            ["packages/sns"]="@message-queue-toolkit/sns"
+            ["packages/sqs"]="@message-queue-toolkit/sqs"
+          )
+
+          PACKAGES=()
+          for path in "${!PATH_TO_NAME[@]}"; do
+            if echo "$ALL_CHANGED_FILES" | grep -q "$path/"; then
+              PACKAGES+=("\"${PATH_TO_NAME[$path]}\"")
+            fi
+          done
 
-  kafka:
+          if [ ${#PACKAGES[@]} -eq 0 ]; then
+            echo 'packages=[]' >> $GITHUB_OUTPUT
+            echo "No packages changed"
+          else
+            JSON="[$(IFS=,; echo "${PACKAGES[*]}")]"
+            echo "packages=$JSON" >> $GITHUB_OUTPUT
+            echo "Changed packages: $JSON"
+          fi
+
+  general:
+    needs: [changed-files-job]
+    if: needs.changed-files-job.outputs.packages != '[]'
     strategy:
       matrix:
         node-version: [22.x, 24.x]
+        package-name: ${{ fromJson(needs.changed-files-job.outputs.packages) }}
     uses: ./.github/workflows/ci.common.yml
     with:
-      package_name: '@message-queue-toolkit/kafka'
       node_version: ${{ matrix.node-version }}
+      package_name: ${{ matrix.package-name }}
 
   automerge:
-    needs: [ general ]
+    needs: [general]
+    if: always() && (needs.general.result == 'success' || needs.general.result == 'skipped')
     runs-on: ubuntu-latest
     permissions:
       pull-requests: write
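The detection step above maps changed file paths to package names and emits them as a JSON array. As a local sketch of that same logic in TypeScript (a hypothetical standalone helper, not part of this commit; `detectChangedPackages` and the trimmed-down path map are assumptions for illustration):

```typescript
// Hypothetical sketch mirroring the CI detection step: map changed file
// paths to package names and emit a JSON array (path map abbreviated).
const PATH_TO_NAME: Record<string, string> = {
  'packages/amqp': '@message-queue-toolkit/amqp',
  'packages/core': '@message-queue-toolkit/core',
  'packages/sqs': '@message-queue-toolkit/sqs',
}

function detectChangedPackages(changedFiles: string[]): string {
  const packages = Object.entries(PATH_TO_NAME)
    // A file belongs to a package when its path contains "<path>/",
    // mirroring the grep -q "$path/" check in the workflow script.
    .filter(([path]) => changedFiles.some((file) => file.includes(`${path}/`)))
    .map(([, name]) => name)
  return JSON.stringify(packages)
}

console.log(detectChangedPackages(['packages/core/src/index.ts', 'README.md']))
// → ["@message-queue-toolkit/core"]
```

An empty array (`[]`) signals "nothing to build", which is why the `general` job guards with `if: needs.changed-files-job.outputs.packages != '[]'`.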

README.md

Lines changed: 3 additions & 0 deletions
@@ -6,6 +6,9 @@
 
 Useful utilities, interfaces and base classes for message queue handling.
 
+> **Note:** Node.js 20 is no longer officially supported. It is likely to work, but we are no longer maintaining it.
+Please use a later version of Node.js.
+
 ## Overview
 
 `message-queue-toolkit ` is an abstraction over several different queue systems, which implements common deserialization,

packages/core/README.md

Lines changed: 11 additions & 5 deletions
@@ -1,6 +1,7 @@
 # @message-queue-toolkit/core
 
-Core library for message-queue-toolkit. Provides foundational abstractions, utilities, and base classes for building message queue publishers and consumers.
+Core library for message-queue-toolkit. Provides foundational abstractions, utilities, and base classes for building
+message queue publishers and consumers.
 
 ## Table of Contents
 
@@ -74,13 +75,16 @@ type UserCreated = z.infer<typeof UserCreatedSchema>
 
 #### What is Message Type?
 
-The **message type** is a discriminator field that identifies what kind of event or command a message represents. It's used for:
+The **message type** is a discriminator field that identifies what kind of event or command a message represents. It's
+used for:
 
 1. **Routing**: Directing messages to the appropriate handler based on their type
 2. **Schema validation**: Selecting the correct Zod schema to validate the message
 3. **Observability**: Tracking metrics and logs per message type
 
-In a typical event-driven architecture, a single queue or topic may receive multiple types of messages. For example, a `user-events` queue might receive `user.created`, `user.updated`, and `user.deleted` events. The message type tells the consumer which handler should process each message.
+In a typical event-driven architecture, a single queue or topic may receive multiple types of messages. For example, a
+`user-events` queue might receive `user.created`, `user.updated`, and `user.deleted` events. The message type tells the
+consumer which handler should process each message.
 
 #### Configuration Options
 
@@ -126,7 +130,8 @@ const resolverConfig: MessageTypeResolverConfig = {
 }
 ```
 
-**Important:** The resolver function must always return a valid string. If the type cannot be determined, either return a default type or throw an error with a descriptive message.
+**Important:** The resolver function must always return a valid string. If the type cannot be determined, either return
+a default type or throw an error with a descriptive message.
 
 #### Real-World Examples by Platform
 
@@ -478,7 +483,8 @@ The third parameter to `addConfig` accepts these options:
 
 #### Explicit Message Type
 
-When using a custom resolver function (`messageTypeResolver: { resolver: fn }`), the message type cannot be automatically extracted from schemas at registration time. You must provide an explicit `messageType` for each handler:
+When using a custom resolver function (`messageTypeResolver: { resolver: fn }`), the message type cannot be automatically
+extracted from schemas at registration time. You must provide an explicit `messageType` for each handler:
 
 ```typescript
 const handlers = new MessageHandlerConfigBuilder<SupportedMessages, Context>()
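The README's rule that a resolver "must always return a valid string" can be illustrated with a minimal sketch. The `resolveMessageType` helper and the message shape below are assumptions for illustration, not the library's actual API:

```typescript
// Hypothetical resolver sketch: return the message's type field,
// or throw a descriptive error when the type cannot be determined.
type IncomingMessage = { type?: unknown }

function resolveMessageType(message: IncomingMessage): string {
  if (typeof message.type === 'string' && message.type.length > 0) {
    return message.type
  }
  // Per the README's guidance, either return a default type here or throw.
  throw new Error('Unable to resolve message type: missing or empty "type" field')
}

console.log(resolveMessageType({ type: 'user.created' }))
// → user.created
```

Throwing (rather than returning `undefined`) keeps an unresolvable message from being silently routed to the wrong handler.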

packages/kafka/README.md

Lines changed: 2 additions & 1 deletion
@@ -20,7 +20,8 @@ See [test consumer](test/consumer/PermissionConsumer.ts) for an example of imple
 
 ## Batch Processing
 
-Kafka supports batch processing for improved throughput. To enable it, set `batchProcessingEnabled` to `true` and configure `batchProcessingOptions`.
+Kafka supports batch processing for improved throughput. To enable it, set `batchProcessingEnabled` to `true` and
+configure `batchProcessingOptions`.
 
 When batch processing is enabled, message handlers receive an array of messages instead of a single message.
 
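The "array instead of a single message" contract can be sketched as follows; the `Message` shape and `handleBatch` signature are assumptions for illustration, not the toolkit's exact handler API:

```typescript
// Hypothetical batch handler: with batchProcessingEnabled, the handler
// receives Message[] rather than a single Message.
type Message = { id: string; type: string }

function handleBatch(messages: Message[]): number {
  let processed = 0
  for (const message of messages) {
    // Each message in the batch is handled individually here.
    if (message.id) processed += 1
  }
  return processed
}

console.log(handleBatch([
  { id: '1', type: 'user.created' },
  { id: '2', type: 'user.updated' },
]))
// → 2
```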

packages/sqs/README.md

Lines changed: 2 additions & 1 deletion
@@ -1,6 +1,7 @@
 # @message-queue-toolkit/sqs
 
-AWS SQS (Simple Queue Service) implementation for the message-queue-toolkit. Provides a robust, type-safe abstraction for publishing and consuming messages from both standard and FIFO SQS queues.
+AWS SQS (Simple Queue Service) implementation for the message-queue-toolkit. Provides a robust, type-safe abstraction
+for publishing and consuming messages from both standard and FIFO SQS queues.
 
 ## Table of Contents
 
