## Context
PR #320 by @sienkiewiczPat fixed several real issues in the SDP streaming SQL documentation, but the skill was restructured in #368 and the target files no longer exist. Closing #320 but porting the fixes here.
Supersedes #304 (which reported the same broken TVFs).
## What needs to be added

The following corrections from #320 should be applied to the current `databricks-skills/databricks-spark-declarative-pipelines/references/sql/` directory:
- Kafka ingestion: Use the `read_kafka()` TVF (not `read_stream()`), with correct parameter names (`bootstrapServers`, backtick-quoted dot-options like `` `kafka.security.protocol` ``), and the `serviceCredential` auth option
- Kinesis ingestion: Use the `read_kinesis()` TVF with correct parameter names (`streamName`, `initialPosition`) and all three auth patterns (explicit credentials, IAM role, environment variables)
- Event Hub ingestion: Use `read_kafka()` with a SASL/SSL JAAS config (no `read_eventhub` TVF exists; Event Hubs exposes a Kafka-compatible endpoint)
- Secret syntax: Use the `secret('scope', 'key')` function; the `{{secrets/scope/key}}` template syntax is not valid in SDP SQL
- Advanced config: Add the `pipelines.reset.allowed` config key
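For the Kafka bullet, the ported docs could include a minimal sketch of the corrected form. This is a sketch, not the exact text from #320: the table name, topic, broker host, and credential name are placeholders, and the `subscribe` parameter is an assumption not spelled out in the issue.

```sql
-- Streaming table fed by the read_kafka() TVF using named-parameter syntax.
-- Dot-separated Kafka options must be backtick-quoted.
CREATE OR REFRESH STREAMING TABLE kafka_events AS
SELECT *
FROM STREAM read_kafka(
  bootstrapServers => 'broker1.example.com:9092',
  subscribe => 'events',                           -- topic name (placeholder)
  `kafka.security.protocol` => 'SASL_SSL',         -- dot-option, backtick-quoted
  serviceCredential => 'my_uc_service_credential'  -- UC service credential (placeholder)
);
```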
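The Kinesis bullet names three auth patterns; a hedged sketch of the explicit-credentials variant might look like the following. Only `streamName` and `initialPosition` come from the issue; `region`, `awsAccessKey`, `awsSecretKey`, and `roleArn` are assumed option names, and the stream, scope, and key names are placeholders.

```sql
-- Explicit-credentials pattern (option names beyond streamName and
-- initialPosition are assumptions; verify against the #320 diff).
CREATE OR REFRESH STREAMING TABLE kinesis_events AS
SELECT *
FROM STREAM read_kinesis(
  streamName => 'my-stream',
  initialPosition => 'latest',
  region => 'us-east-1',
  awsAccessKey => secret('aws_scope', 'access-key'),
  awsSecretKey => secret('aws_scope', 'secret-key')
);

-- The IAM-role pattern would swap the two key options for a role ARN,
-- e.g. roleArn => 'arn:aws:iam::123456789012:role/kinesis-reader'
-- (roleArn is likewise an assumed name).
```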
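For Event Hubs, the docs should show `read_kafka()` pointed at the namespace's Kafka-compatible endpoint (port 9093) with a SASL JAAS config. A sketch, with placeholder namespace, hub, and secret names; the shaded `kafkashaded.` class prefix is an assumption about the Databricks runtime:

```sql
-- No read_eventhub TVF exists; Event Hubs is consumed through its
-- Kafka-compatible endpoint with SASL_SSL + PLAIN auth.
CREATE OR REFRESH STREAMING TABLE eventhub_events AS
SELECT *
FROM STREAM read_kafka(
  bootstrapServers => 'mynamespace.servicebus.windows.net:9093',
  subscribe => 'my-event-hub',
  `kafka.security.protocol` => 'SASL_SSL',
  `kafka.sasl.mechanism` => 'PLAIN',
  `kafka.sasl.jaas.config` =>
    'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="'
    || secret('eh_scope', 'connection-string') || '";'
);
```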
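The secret-syntax bullet lends itself to a before/after pair in the docs (scope and key names are placeholders):

```sql
-- Valid in SDP SQL: the secret() scalar function
SELECT secret('my_scope', 'kafka-password');

-- NOT valid in SDP SQL: {{secrets/my_scope/kafka-password}}
-- (that template form belongs to Spark config references, not SQL)
```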
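For the advanced-config bullet, `pipelines.reset.allowed` is typically set as a table property; a sketch, assuming the `TBLPROPERTIES` form (table and source names are placeholders):

```sql
-- Prevent a full refresh from clearing this streaming table's state
CREATE OR REFRESH STREAMING TABLE audit_events
TBLPROPERTIES (pipelines.reset.allowed = false)
AS SELECT * FROM STREAM source_events;  -- placeholder source
```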
## Reference
See PR #320 diff for the exact corrections: https://github.com/databricks-solutions/ai-dev-kit/pull/320/files