
Commit e656261

Cut new release 3.4.1
Co-authored-by: Isaac
Signed-off-by: Madhavendra Rathore <madhavendra.rathore@databricks.com>
1 parent 4a0d20d commit e656261

15 files changed

Lines changed: 72 additions & 41 deletions

File tree

CHANGELOG.md

Lines changed: 26 additions & 0 deletions
@@ -1,5 +1,31 @@
 # Version Changelog
 
+## [v3.4.1] - 2026-04-22
+
+### Added
+- Added `CallableStatement` support with IN parameters. `Connection.prepareCall()` now returns a working `DatabricksCallableStatement` that supports positional parameter binding and execution via the `{call proc(?)}` JDBC escape syntax. OUT/INOUT parameters and named parameters throw `SQLFeatureNotSupportedException`.
+- Added AI coding agent detection to the User-Agent header. When the driver is invoked by a known AI coding agent (e.g., Claude Code, Cursor, Gemini CLI), `agent/<product>` is appended to the User-Agent string.
+
+### Updated
+- **[Breaking Change]** Thrift-mode metadata operations (`getTables`, `getColumns`, `getSchemas`, `getFunctions`, `getPrimaryKeys`, `getImportedKeys`, `getCrossReference`) on **SQL Warehouses** now use SQL SHOW commands by default instead of native Thrift RPCs, aligning behavior with Statement Execution API (SEA) mode. All-Purpose Clusters are unaffected and continue to use native Thrift RPCs. The `UseQueryForMetadata` connection property default changed from `0` to `1`. To revert to native Thrift RPCs, set `UseQueryForMetadata=0`. Key behavioral changes:
+  - The catalog parameter is now treated as a literal identifier (not a wildcard pattern), per the JDBC spec. Use `null` to search across all catalogs.
+  - Methods that previously threw exceptions for null/empty edge-case inputs now return empty result sets.
+  - `getFunctions` now works correctly (it was broken via native Thrift RPC).
+  - Result columns (TABLE_CATALOG, etc.) return stored values (lowercase) instead of preserving input case.
+- Connection properties `EnableShowCommandForGetFunctions` and `TreatMetadataCatalogNameAsPattern` are now redundant when `UseQueryForMetadata=1` (the new default).
+
+### Fixed
+- Fixed `EnableBatchedInserts` silently falling back to individual execution when table or schema names contain special characters (e.g., hyphens) inside backtick-quoted identifiers. A warning is now logged when the fallback occurs.
+- Fixed an `IntervalConverter` crash (`IllegalArgumentException: Invalid interval metadata`) when INTERVAL columns are returned via CloudFetch. Arrow metadata from CloudFetch uses the underscored format (`INTERVAL_YEAR_MONTH`, `INTERVAL_DAY_TIME`), which the driver's regex did not accept.
+- Fixed `Statement` being prematurely closed after queries that return inline results, which prevented re-execution, `getResultSet()`, and `getExecutionResult()` from working. Statements now remain open and reusable until explicitly closed by the caller.
+- Fixed primitive types within complex types (ARRAY, MAP, STRUCT) not being parsed correctly when Arrow serialization uses alternate formats: TIMESTAMP/TIMESTAMP_NTZ as epoch microseconds or component arrays, and BINARY as base64-encoded strings.
+- Fixed `PARSE_SYNTAX_ERROR` for column names containing special characters (e.g., dots) when `EnableBatchedInserts` is enabled, by re-quoting column names with backticks in reconstructed multi-row INSERT statements.
+- Fixed Volume ingestion in SEA mode, which was broken because the statement was closed prematurely.
+- Fixed escaped pattern characters in catalogName for `getSchemas`; the returned catalogName should be unescaped.
+- Fixed `getColumnClassName()` returning null for VARIANT columns in SEA mode by adding VARIANT to the type system.
+- Fixed `getColumns()` returning `DATA_TYPE=0` (NULL) for GEOMETRY/GEOGRAPHY columns in Thrift mode. It now returns `Types.VARCHAR` (12) when geospatial support is disabled and `Types.OTHER` (1111) when enabled, consistent with SEA mode.
+- Fixed `getCrossReference()` returning 0 rows when parent arguments are passed in uppercase. The client-side filter used a case-sensitive comparison against server-returned lowercase names.
+
 ## [v3.3.1] - 2026-03-17
 
 ### Added
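The new `CallableStatement` entry above can be exercised roughly as follows. This is a hedged sketch: only `prepareCall`, positional parameter binding, and the `{call ...}` escape syntax come from the changelog; the procedure name, the `callSyntax` helper, and the class name are illustrative.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

public class CallableStatementDemo {

    // Build the JDBC escape-syntax call string for a procedure
    // with the given number of positional parameters.
    static String callSyntax(String procName, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procName).append("(");
        for (int i = 0; i < paramCount; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        return sb.append(")}").toString();
    }

    // Usage against a live connection (not executed here; requires a workspace).
    static void runProc(Connection conn) throws SQLException {
        try (CallableStatement cs = conn.prepareCall(callSyntax("main.default.my_proc", 1))) {
            cs.setInt(1, 42); // positional IN parameter binding
            cs.execute();
            // Per the changelog, OUT/INOUT parameters and named parameters
            // throw SQLFeatureNotSupportedException in this release.
        }
    }

    public static void main(String[] args) {
        System.out.println(callSyntax("main.default.my_proc", 2));
    }
}
```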

NEXT_CHANGELOG.md

Lines changed: 0 additions & 18 deletions
@@ -3,28 +3,10 @@
 ## [Unreleased]
 
 ### Added
-- Added `CallableStatement` support with IN parameters. `Connection.prepareCall()` now returns a working `DatabricksCallableStatement` that supports positional parameter binding and execution via the `{call proc(?)}` JDBC escape syntax. OUT/INOUT parameters and named parameters throw `SQLFeatureNotSupportedException`.
-- Added AI coding agent detection to the User-Agent header. When the driver is invoked by a known AI coding agent (e.g., Claude Code, Cursor, Gemini CLI), `agent/<product>` is appended to the User-Agent string.
 
 ### Updated
-- **[Breaking Change]** Thrift-mode metadata operations (`getTables`, `getColumns`, `getSchemas`, `getFunctions`, `getPrimaryKeys`, `getImportedKeys`, `getCrossReference`) on **SQL Warehouses** now use SQL SHOW commands by default instead of native Thrift RPCs, aligning behavior with Statement Execution API (SEA) mode. All-Purpose Clusters are unaffected and continue to use native Thrift RPCs. The `UseQueryForMetadata` connection property default changed from `0` to `1`. To revert to native Thrift RPCs, set `UseQueryForMetadata=0`. Key behavioral changes:
-  - The catalog parameter is now treated as a literal identifier (not a wildcard pattern), per the JDBC spec. Use `null` to search across all catalogs.
-  - Methods that previously threw exceptions for null/empty edge-case inputs now return empty result sets.
-  - `getFunctions` now works correctly (it was broken via native Thrift RPC).
-  - Result columns (TABLE_CATALOG, etc.) return stored values (lowercase) instead of preserving input case.
-- Connection properties `EnableShowCommandForGetFunctions` and `TreatMetadataCatalogNameAsPattern` are now redundant when `UseQueryForMetadata=1` (the new default).
 
 ### Fixed
-- Fixed `EnableBatchedInserts` silently falling back to individual execution when table or schema names contain special characters (e.g., hyphens) inside backtick-quoted identifiers. A warning is now logged when the fallback occurs.
-- Fixed an `IntervalConverter` crash (`IllegalArgumentException: Invalid interval metadata`) when INTERVAL columns are returned via CloudFetch. Arrow metadata from CloudFetch uses the underscored format (`INTERVAL_YEAR_MONTH`, `INTERVAL_DAY_TIME`), which the driver's regex did not accept.
-- Fixed `Statement` being prematurely closed after queries that return inline results, which prevented re-execution, `getResultSet()`, and `getExecutionResult()` from working. Statements now remain open and reusable until explicitly closed by the caller.
-- Fixed primitive types within complex types (ARRAY, MAP, STRUCT) not being parsed correctly when Arrow serialization uses alternate formats: TIMESTAMP/TIMESTAMP_NTZ as epoch microseconds or component arrays, and BINARY as base64-encoded strings.
-- Fixed `PARSE_SYNTAX_ERROR` for column names containing special characters (e.g., dots) when `EnableBatchedInserts` is enabled, by re-quoting column names with backticks in reconstructed multi-row INSERT statements.
-- Fixed Volume ingestion in SEA mode, which was broken because the statement was closed prematurely.
-- Fixed escaped pattern characters in catalogName for `getSchemas`; the returned catalogName should be unescaped.
-- Fixed `getColumnClassName()` returning null for VARIANT columns in SEA mode by adding VARIANT to the type system.
-- Fixed `getColumns()` returning `DATA_TYPE=0` (NULL) for GEOMETRY/GEOGRAPHY columns in Thrift mode. It now returns `Types.VARCHAR` (12) when geospatial support is disabled and `Types.OTHER` (1111) when enabled, consistent with SEA mode.
-- Fixed `getCrossReference()` returning 0 rows when parent arguments are passed in uppercase. The client-side filter used a case-sensitive comparison against server-returned lowercase names.
 
 ---
 *Note: When making changes, please add your change under the appropriate section
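The GEOMETRY/GEOGRAPHY `getColumns()` fix listed in the changelog maps onto the standard `java.sql.Types` constants. A minimal sketch of that documented mapping — the helper method is illustrative, not part of the driver's API:

```java
import java.sql.Types;

public class GeoTypeMapping {

    // DATA_TYPE reported by getColumns() for GEOMETRY/GEOGRAPHY columns,
    // per the 3.4.1 changelog: Types.OTHER (1111) when geospatial support
    // is enabled, Types.VARCHAR (12) when it is disabled.
    static int geoDataType(boolean geospatialEnabled) {
        return geospatialEnabled ? Types.OTHER : Types.VARCHAR;
    }

    public static void main(String[] args) {
        System.out.println(geoDataType(false)); // 12
        System.out.println(geoDataType(true));  // 1111
    }
}
```

Previously the driver reported `DATA_TYPE=0` (`Types.NULL`) for these columns in Thrift mode; the fix makes the Thrift-mode mapping match SEA mode.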

README.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ Add the following dependency to your `pom.xml`:
 <dependency>
   <groupId>com.databricks</groupId>
   <artifactId>databricks-jdbc</artifactId>
-  <version>3.3.2-SNAPSHOT</version>
+  <version>3.4.1</version>
 </dependency>
 ```

assembly-thin/pom.xml

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@
   <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-parent</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.4.1</version>
   </parent>
 
   <artifactId>databricks-jdbc-thin</artifactId>
@@ -51,7 +51,7 @@
     <dependency>
       <groupId>com.databricks</groupId>
      <artifactId>databricks-jdbc-core</artifactId>
-      <version>3.3.2-SNAPSHOT</version>
+      <version>3.4.1</version>
     </dependency>
   </dependencies>

assembly-uber/pom.xml

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@
   <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-parent</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.4.1</version>
   </parent>
 
   <artifactId>databricks-jdbc</artifactId>
@@ -51,7 +51,7 @@
     <dependency>
       <groupId>com.databricks</groupId>
       <artifactId>databricks-jdbc-core</artifactId>
-      <version>3.3.2-SNAPSHOT</version>
+      <version>3.4.1</version>
     </dependency>
   </dependencies>

jdbc-core/pom.xml

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
   <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-parent</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.4.1</version>
   </parent>
 
   <artifactId>databricks-jdbc-core</artifactId>

pom.xml

Lines changed: 2 additions & 2 deletions
@@ -5,7 +5,7 @@
   <groupId>com.databricks</groupId>
   <artifactId>databricks-jdbc-parent</artifactId>
   <!-- This value may be modified by a release script to reflect the current version of the driver. -->
-  <version>3.3.2-SNAPSHOT</version>
+  <version>3.4.1</version>
   <packaging>pom</packaging>
   <name>Databricks JDBC Parent</name>
   <description>Parent POM for Databricks JDBC Driver.</description>
@@ -63,7 +63,7 @@
   <build-helper-maven-plugin.version>3.6.1</build-helper-maven-plugin.version>
 
   <!-- Dependency Versions -->
-  <databricks-jdbc-version>3.3.2-SNAPSHOT</databricks-jdbc-version>
+  <databricks-jdbc-version>3.4.1</databricks-jdbc-version>
   <arrow.version>18.3.0</arrow.version>
   <commons-lang3.version>3.18.0</commons-lang3.version>
   <commons-configuration.version>2.10.1</commons-configuration.version>

release-notes.txt

Lines changed: 23 additions & 0 deletions
@@ -6,6 +6,29 @@ The release notes summarize enhancements, new features, known issues, workflow c
 
 Version History ==============================================================
 
+3.4.1 ========================================================================
+Released 2026-04-22
+
+Enhancements & New Features
+
+* Added `CallableStatement` support with IN parameters. `Connection.prepareCall()` now returns a working `DatabricksCallableStatement` that supports positional parameter binding and execution via the `{call proc(?)}` JDBC escape syntax. OUT/INOUT parameters and named parameters throw `SQLFeatureNotSupportedException`.
+* Added AI coding agent detection to the User-Agent header. When the driver is invoked by a known AI coding agent (e.g., Claude Code, Cursor, Gemini CLI), `agent/<product>` is appended to the User-Agent string.
+* [Breaking Change] Thrift-mode metadata operations (`getTables`, `getColumns`, `getSchemas`, `getFunctions`, `getPrimaryKeys`, `getImportedKeys`, `getCrossReference`) on SQL Warehouses now use SQL SHOW commands by default instead of native Thrift RPCs, aligning behavior with Statement Execution API (SEA) mode. All-Purpose Clusters are unaffected and continue to use native Thrift RPCs. The `UseQueryForMetadata` connection property default changed from `0` to `1`. Key behavioral changes: the catalog parameter is now a literal identifier (not a wildcard pattern), null/empty edge-case inputs return empty result sets, `getFunctions` now works correctly, and result columns return stored (lowercase) values.
+* Connection properties `EnableShowCommandForGetFunctions` and `TreatMetadataCatalogNameAsPattern` are now redundant when `UseQueryForMetadata=1` (the new default).
+
+Resolved Issues
+
+* Fixed `EnableBatchedInserts` silently falling back to individual execution when table or schema names contain special characters (e.g., hyphens) inside backtick-quoted identifiers. A warning is now logged when the fallback occurs.
+* Fixed an `IntervalConverter` crash (`IllegalArgumentException: Invalid interval metadata`) when INTERVAL columns are returned via CloudFetch.
+* Fixed `Statement` being prematurely closed after queries that return inline results, which prevented re-execution, `getResultSet()`, and `getExecutionResult()` from working.
+* Fixed primitive types within complex types (ARRAY, MAP, STRUCT) not being parsed correctly when Arrow serialization uses alternate formats: TIMESTAMP/TIMESTAMP_NTZ as epoch microseconds or component arrays, and BINARY as base64-encoded strings.
+* Fixed `PARSE_SYNTAX_ERROR` for column names containing special characters (e.g., dots) when `EnableBatchedInserts` is enabled.
+* Fixed Volume ingestion in SEA mode, which was broken because the statement was closed prematurely.
+* Fixed escaped pattern characters in catalogName for `getSchemas`; the returned catalogName should be unescaped.
+* Fixed `getColumnClassName()` returning null for VARIANT columns in SEA mode by adding VARIANT to the type system.
+* Fixed `getColumns()` returning `DATA_TYPE=0` (NULL) for GEOMETRY/GEOGRAPHY columns in Thrift mode. It now returns `Types.VARCHAR` (12) when geospatial support is disabled and `Types.OTHER` (1111) when enabled, consistent with SEA mode.
+* Fixed `getCrossReference()` returning 0 rows when parent arguments are passed in uppercase.
+
 3.3.1 ========================================================================
 Released 2026-03-17
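As a sketch of opting out of the breaking metadata change described above: the `UseQueryForMetadata` property name and its `0`/`1` values come from the release notes, while the helper method, class name, and token placeholder are illustrative.

```java
import java.util.Properties;

public class MetadataModeConfig {

    // Build connection properties that revert SQL Warehouse metadata calls
    // to native Thrift RPCs (the pre-3.4.1 behavior); the default is now 1.
    static Properties revertToThriftMetadata() {
        Properties props = new Properties();
        props.setProperty("UseQueryForMetadata", "0");
        return props;
    }

    public static void main(String[] args) {
        Properties p = revertToThriftMetadata();
        System.out.println(p.getProperty("UseQueryForMetadata"));
        // With the new default (UseQueryForMetadata=1), pass null -- not "%" --
        // as the catalog argument to DatabaseMetaData.getTables() and friends
        // to search across all catalogs, since catalog is now treated as a
        // literal identifier rather than a wildcard pattern.
    }
}
```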

src/main/java/com/databricks/jdbc/api/impl/DatabricksDatabaseMetaData.java

Lines changed: 2 additions & 2 deletions
@@ -27,8 +27,8 @@ public class DatabricksDatabaseMetaData implements DatabaseMetaData {
   public static final String DRIVER_NAME = "DatabricksJDBC";
   public static final String PRODUCT_NAME = "SparkSQL";
   public static final int DATABASE_MAJOR_VERSION = 3;
-  public static final int DATABASE_MINOR_VERSION = 3;
-  public static final int DATABASE_PATCH_VERSION = 2;
+  public static final int DATABASE_MINOR_VERSION = 4;
+  public static final int DATABASE_PATCH_VERSION = 1;
   public static final Integer MAX_NAME_LENGTH = 128;
   public static final String NUMERIC_FUNCTIONS =
       "ABS,ACOS,ASIN,ATAN,ATAN2,CEILING,COS,COT,DEGREES,EXP,FLOOR,LOG,LOG10,MOD,PI,POWER,RADIANS,RAND,ROUND,SIGN,SIN,SQRT,TAN,TRUNCATE";

src/main/java/com/databricks/jdbc/common/util/DriverUtil.java

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@
 public class DriverUtil {
 
   private static final JdbcLogger LOGGER = JdbcLoggerFactory.getLogger(DriverUtil.class);
-  private static final String DRIVER_VERSION = "3.3.2-SNAPSHOT";
+  private static final String DRIVER_VERSION = "3.4.1";
   private static final String DRIVER_NAME = "oss-jdbc";
   private static final String JDBC_VERSION = "4.3";
