Describe the bug
When bumping the adapter to dbt-databricks v1.11.5, I started to receive this error:
Compilation Error in model my_model (models/my_model.sql)
macro dateadd not implemented for datepart ~ 'DAY' ~ on Spark
> in macro spark__dateadd (macros/utils/dateadd.sql)
> called by macro databricks__dateadd (macros/utils/dateadd.sql)
> called by macro dateadd (macros/utils/dateadd.sql)
> called by macro default__date_spine (macros/sql/date_spine.sql)
> called by macro date_spine (macros/sql/date_spine.sql)
I started to get suspicious because the adapter was calling some Spark macros.
After further investigation, the culprit seems to be that {%- if adapter.has_dbr_capability('timestampdiff') -%} evaluated to false, so the adapter fell back to spark__dateadd, leading to this error.
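For context, judging from the stack trace, the adapter's databricks__dateadd appears to follow roughly this shape (a paraphrase based on the error above, not the exact adapter source):
{% macro databricks__dateadd(datepart, interval, from_date_or_timestamp) %}
    {%- if adapter.has_dbr_capability('timestampdiff') -%}
        timestampadd({{ datepart }}, {{ interval }}, {{ from_date_or_timestamp }})
    {%- else -%}
        {# falls back to the Spark implementation, which raises the compilation error above #}
        {{ spark__dateadd(datepart, interval, from_date_or_timestamp) }}
    {%- endif -%}
{% endmacro %}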
Overriding the macro to
{% macro databricks__dateadd(datepart, interval, from_date_or_timestamp) %}
timestampadd({{datepart}}, {{interval}}, {{from_date_or_timestamp}})
{%- endmacro %}
fixes the issue.
I'm using a serverless SQL warehouse and I believe there is a bug in how has_dbr_capability is evaluated.
Steps To Reproduce
Create a model that uses:
with date_spine AS (
SELECT *
FROM (
{{
dbt_utils.date_spine(
datepart="DAY",
start_date="CAST('2024-01-01' AS DATE)",
end_date="CAST(CONVERT_TIMEZONE('Europe/Berlin', CURRENT_TIMESTAMP()) AS DATE)"
)
}} -- end_date handle snowflake config time being in LA time
)
)
select * from date_spine
This should cause the databricks__dateadd macro to be invoked.
Expected behavior
adapter.has_dbr_capability('timestampdiff') should evaluate to true for serverless SQL warehouses.
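If the capability were detected correctly, each dateadd call inside the date spine would render via timestampadd, along these lines (a simplified illustration of a single rendered call, not the full date_spine output):
timestampadd(DAY, 1, CAST('2024-01-01' AS DATE))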
Screenshots and log output
Provided above inside the bug description
System information
The output of dbt --version:
Core:
- installed: 1.11.6
- latest: 1.11.6 - Up to date!
Plugins:
- databricks: 1.11.5 - Up to date!
- spark: 1.10.1 - Up to date!
The operating system you're using: MacOS 26.3 / Linux via Docker (python:3.12-slim-bullseye)
The output of python --version: Python 3.12.8
Additional context
Based on an initial discussion with @benc-db, it looks like the culprit is has_dbr_capability, which should evaluate to true for SQL warehouses. Additionally, I only experience this with the timestampdiff capability; I'm not sure whether other capabilities are affected.
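A quick way to confirm the flag's value on a given warehouse might be to compile a scratch model like the following and inspect the compiled output (assuming adapter.has_dbr_capability is reachable from model Jinja the same way it is from the macro; I have not verified that):
-- scratch model: renders the capability flag into the compiled SQL
-- (hypothetical check; the model name and column alias are arbitrary)
select '{{ adapter.has_dbr_capability("timestampdiff") }}' as has_timestampdiff_capability
Running dbt compile -s <scratch_model> should then show whether the flag resolves to True or False on the serverless warehouse.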