feat: add dbt-fabricspark adapter support (ELE-5295) #964
Conversation
Add `fabricspark__` prefixed macros that delegate to `spark__` implementations. dbt-fabricspark speaks Spark SQL but doesn't declare `dependencies=["spark"]` in its `AdapterPlugin`, so dispatch falls through to `default__` instead of `spark__`.

Macros added across 22 files covering:
- Cross-DB utils (timestamps, dateadd/diff, to_char, safe_cast, etc.)
- Data types (string, bool, timestamp, type lists, normalization)
- Table operations (create, delete/insert, temp relations, etc.)
- System utils (buckets CTE)
- Metadata collection (information schema)
- Test utils (clean test tables)

Also adds a defensive fallback in `get_elementary_relation.sql` using `api.Relation.create()` when `adapter.get_relation()` returns None, working around a bug in dbt-fabricspark's `list_relations_without_caching`.

Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
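Each wrapper is a one-line delegation to the existing Spark implementation. A representative sketch (the `edr_dateadd` name and its argument list are illustrative, not copied from the repo):

```sql
{# Delegation wrapper: route the fabricspark dialect to the existing spark__
   implementation instead of letting dispatch fall through to default__.
   Macro name and signature are assumptions for illustration. #}
{% macro fabricspark__edr_dateadd(date_part, number, from_date) %}
    {{ return(spark__edr_dateadd(date_part, number, from_date)) }}
{% endmacro %}
```

This mirrors the pattern the project already uses for `sqlserver__` macros delegating to `fabric__` implementations.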
Per reviewer feedback, removing the `api.Relation.create()` fallback since callers rely on `None` returns to detect missing relations. Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
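The reason the fallback had to go: returning a synthesized relation would mask the missing-relation signal. A hedged sketch of the caller pattern that depends on `None` (names illustrative, not the exact elementary source):

```sql
{# Callers treat a none return from adapter.get_relation() as "relation does
   not exist" and branch on it, e.g. to create the table before querying it.
   A fabricated relation from api.Relation.create() would break this check. #}
{% set relation = adapter.get_relation(database=database, schema=schema, identifier=table_name) %}
{% if relation is none %}
    {# relation is missing — create it instead of selecting from it #}
{% endif %}
```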
Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
…bric Livy API params Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
Actionable comments posted: 1
Inline comments:

In `macros/utils/cross_db_utils/generate_elementary_profile_args.sql`, around lines 314–332: in the `fabricspark__generate_elementary_profile_args` macro, the `"lakehouse"` parameter is mistakenly set to `elementary_schema`. Change it to reference `elementary_database` so the macro uses the passed-in database; keep `"schema"` as `elementary_schema` and leave the other parameters as-is.
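The suggested fix, sketched below. The `_parameter` helper and the parameter names come from the review comment; the macro's full signature and any other parameters are assumptions:

```sql
{# Sketch of the corrected mapping: "lakehouse" must come from
   elementary_database, not elementary_schema. Signature is assumed. #}
{% macro fabricspark__generate_elementary_profile_args(method, elementary_database, elementary_schema) %}
    {% do return([
        _parameter("lakehouse", elementary_database),
        _parameter("schema", elementary_schema),
    ]) %}
{% endmacro %}
```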
📒 Files selected for processing (1)
macros/utils/cross_db_utils/generate_elementary_profile_args.sql
…_schema Co-Authored-By: Itamar Hartstein <haritamar@gmail.com>
feat: add dbt-fabricspark adapter support (ELE-5295)
Summary
Adds `fabricspark__` prefixed macros to support the `dbt-fabricspark` adapter (Microsoft Fabric Lakehouse via Spark/Livy API). The `dbt-fabricspark` adapter dispatches as `fabricspark` but does not declare `dependencies=["spark"]` in its `AdapterPlugin`, so the dispatch chain is `fabricspark__ → default__` instead of `fabricspark__ → spark__ → default__`. Since fabricspark speaks Spark SQL (not T-SQL), the existing `fabric__` macros are wrong for it and the `default__` macros often produce invalid SQL.

Each new `fabricspark__` macro is a trivial one-line delegation to the corresponding `spark__` implementation, following the same pattern used for `sqlserver__ → fabric__`. Only macros where `spark__` differs from `default__` needed overrides — 21 files across cross-db utils, data types, table operations, system utils, metadata collection, and test utils.

Updates since last revision

Removed the fallback in `get_elementary_relation.sql` per reviewer feedback. Multiple callers rely on `None` returns to detect missing relations, so the `api.Relation.create()` fallback was dropped entirely. The PR now contains only the `fabricspark__` macro delegations with no behavioral changes to existing code paths.

Review & Testing Checklist for Human

- Macro coverage: any overridable macro missed here means `fabricspark` would fall through to `default__` and produce invalid Spark SQL. The ticket notes the list is "non-exhaustive" — macros without a `spark__` override were intentionally skipped since Spark also uses `default__` for those.
- `fabricspark__edr_type_bool` / `fabricspark__edr_type_timestamp`: These delegate to `default__` (not `spark__`) since Spark has no override for them. Confirm this is correct for the fabricspark dialect.
- `fabricspark__generate_elementary_profile_args`: Delegates to `spark__`, which produces Databricks-oriented connection params (`http_path`, `token`). These may not match Fabric Spark's actual connection model — consider whether a dedicated implementation is needed (or accept as a known limitation for now).
- Testing: the new macros are one-line delegations to `spark__` implementations. Recommend a manual smoke test with an actual Fabric Lakehouse environment if available, or at minimum confirming existing CI (Spark, Fabric) still passes to ensure no regressions.

Notes

- The upstream fix would be for `dbt-fabricspark` to add `dependencies=["spark"]` to their `AdapterPlugin`, which would make most of these overrides unnecessary long-term.
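The dispatch chain described above can be modeled with a small sketch. This is a toy model of dbt's adapter-dispatch search order, not dbt-core's actual resolver; macro names and the `dispatch` function are illustrative:

```python
# Toy model of dbt's adapter dispatch: try adapter__, then each declared
# dependency__, then default__. Illustrative only — real logic lives in
# dbt-core's macro resolver.

MACROS = {
    "default__edr_dateadd",   # generic fallback implementation
    "spark__edr_dateadd",     # Spark SQL implementation
}

def dispatch(macro_name: str, adapter: str, dependencies: tuple = ()) -> str:
    """Return the name of the first matching implementation."""
    for prefix in (adapter, *dependencies, "default"):
        candidate = f"{prefix}__{macro_name}"
        if candidate in MACROS:
            return candidate
    raise KeyError(macro_name)

# dbt-fabricspark declares no dependencies=["spark"], so with no explicit
# override the lookup skips spark__ and lands on default__:
print(dispatch("edr_dateadd", "fabricspark"))              # default__edr_dateadd
print(dispatch("edr_dateadd", "fabricspark", ("spark",)))  # spark__edr_dateadd

# Adding an explicit fabricspark__ wrapper (what this PR does) fixes the
# lookup without needing the dependency declaration:
MACROS.add("fabricspark__edr_dateadd")
print(dispatch("edr_dateadd", "fabricspark"))              # fabricspark__edr_dateadd
```

Either change fixes dispatch; the PR takes the wrapper route because the dependency declaration would have to land upstream in `dbt-fabricspark`.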