
Commit 0141730

Merge branch 'codex/fix-remaining-issues-and-raise-pr' into codex/implement-ray-for-ml-orchestration

2 parents: af1a409 + 4a420ec

17 files changed: 665 additions & 191 deletions

backend/.env.example

10 additions & 0 deletions

```diff
@@ -23,3 +23,13 @@ SENSITIVE_FIELDS=password,token,secret,api_key,authorization,credit_card,ssn
 # Development default: http://localhost:3000,http://localhost:5173,http://127.0.0.1:5173
 # Production example: https://yourdomain.com,https://app.yourdomain.com
 CORS_ORIGINS=http://localhost:3000,http://localhost:5173,http://127.0.0.1:5173
+
+# Event streaming (Kafka-compatible, supports Apache Kafka and Redpanda)
+ENABLE_EVENT_STREAMING=False
+EVENT_STREAM_BACKEND=kafka
+KAFKA_BOOTSTRAP_SERVERS=localhost:9092
+KAFKA_CLIENT_ID=flexiroaster-backend
+TOPIC_PIPELINE_CREATED=pipeline.created
+TOPIC_EXECUTION_STARTED=execution.started
+TOPIC_EXECUTION_FAILED=execution.failed
+TOPIC_EXECUTION_COMPLETED=execution.completed
```
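The new variables can be read straight from the environment. A minimal sketch of a loader for them, assuming plain `os.getenv` with the defaults from `.env.example` (the actual backend may use a settings library instead; `load_event_streaming_config` is a hypothetical helper, not code from this commit):

```python
import os

def load_event_streaming_config() -> dict:
    """Collect the event-streaming variables added in backend/.env.example.

    Falls back to the same defaults the example file documents, so the
    feature stays disabled unless ENABLE_EVENT_STREAMING is set to "true".
    """
    return {
        "enabled": os.getenv("ENABLE_EVENT_STREAMING", "False").lower() == "true",
        "backend": os.getenv("EVENT_STREAM_BACKEND", "kafka"),
        "bootstrap_servers": os.getenv("KAFKA_BOOTSTRAP_SERVERS", "localhost:9092"),
        "client_id": os.getenv("KAFKA_CLIENT_ID", "flexiroaster-backend"),
        "topics": {
            "pipeline_created": os.getenv("TOPIC_PIPELINE_CREATED", "pipeline.created"),
            "execution_started": os.getenv("TOPIC_EXECUTION_STARTED", "execution.started"),
            "execution_failed": os.getenv("TOPIC_EXECUTION_FAILED", "execution.failed"),
            "execution_completed": os.getenv("TOPIC_EXECUTION_COMPLETED", "execution.completed"),
        },
    }
```

Note that the example ships `ENABLE_EVENT_STREAMING=False`, so a fresh checkout runs with streaming off until it is explicitly enabled.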

backend/README.md

19 additions & 3 deletions

````diff
@@ -163,19 +163,35 @@ cp backend/.env.example backend/.env
 
 ## Event-Driven Architecture (Advanced Setup)
 
-FlexiRoaster supports Kafka-backed domain events for loose coupling, high scalability, audit-friendly workflows, and real-time analytics.
+FlexiRoaster supports Kafka-compatible domain events (Apache Kafka or Redpanda) for loose coupling, high scalability, audit-friendly workflows, and real-time analytics.
+
+#### Apache Kafka
+- Event-driven triggers
+- High-throughput ingestion
+- Real-time pipeline activation
+
+#### Redpanda
+- Kafka-compatible
+- Lower operational complexity
+
+Use this layer when:
+- Pipelines should trigger from events
+- You need real-time monitoring
+- You process millions of records
 
 ### Published topics
 - `pipeline.created`
 - `execution.started`
 - `execution.failed`
 - `execution.completed`
 
-### Enable Kafka publishing
+### Enable streaming publishing
 Set the following environment variables in `backend/.env`:
 
 ```env
 ENABLE_EVENT_STREAMING=true
+EVENT_STREAM_BACKEND=kafka
 KAFKA_BOOTSTRAP_SERVERS=localhost:9092
 KAFKA_CLIENT_ID=flexiroaster-backend
 TOPIC_PIPELINE_CREATED=pipeline.created
@@ -184,7 +200,7 @@ TOPIC_EXECUTION_FAILED=execution.failed
 TOPIC_EXECUTION_COMPLETED=execution.completed
 ```
 
-If Kafka is unavailable, the backend falls back to structured application logs for events so local development continues to work.
+If Kafka/Redpanda is unavailable, the backend falls back to structured application logs for events so local development continues to work.
 
 ## Next Steps
 
````
backend/api/routes/airflow.py

3 additions & 0 deletions

```diff
@@ -48,8 +48,11 @@ async def trigger_from_airflow(
             detail=f"Pipeline not found: {trigger_data.pipeline_id}",
         )
 
+    pipeline = pipelines_db[trigger_data.pipeline_id]
+
     execution = initialize_execution(
         trigger_data.pipeline_id,
+        user_id=pipeline.user_id,
         context={
             "triggered_by": "airflow",
             "airflow": {
```
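The change above looks up the pipeline record so the owner's `user_id` can be passed into `initialize_execution`. A minimal, self-contained reconstruction of that flow, assuming stub definitions for `Pipeline`, `pipelines_db`, and `initialize_execution` (only the names and the two added lines come from the diff; everything else is scaffolding for illustration):

```python
from dataclasses import dataclass

@dataclass
class Pipeline:
    user_id: str

# Stand-in for the route module's in-memory store.
pipelines_db = {"p1": Pipeline(user_id="u42")}

def initialize_execution(pipeline_id, user_id, context):
    # Stub: the real function creates and tracks an execution record.
    return {"pipeline_id": pipeline_id, "user_id": user_id, "context": context}

def trigger_from_airflow(pipeline_id: str) -> dict:
    if pipeline_id not in pipelines_db:
        raise KeyError(f"Pipeline not found: {pipeline_id}")
    pipeline = pipelines_db[pipeline_id]      # added by this commit
    return initialize_execution(
        pipeline_id,
        user_id=pipeline.user_id,             # added by this commit
        context={"triggered_by": "airflow"},
    )
```

Without the lookup, Airflow-triggered executions had no owner; attributing them to `pipeline.user_id` ties each execution back to the pipeline's creator.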
